Gianni Amati
@gianniamati.bsky.social
Scientist
Better GPUs mean faster response times, allowing more users to be served per hour. However, CoT (chain-of-thought) reasoning increases processing time, which means OpenAI has to spend more on NVIDIA hardware to maintain its service.
February 10, 2025 at 6:01 PM
All using just under 5 GB of memory.
February 10, 2025 at 5:59 PM
Then I ran the query and got first the CoT preamble and then the proper answer.
February 10, 2025 at 5:59 PM
I chose deepseek-r1 as my model. In AnythingLLM, I set it as the preferred LLM in the model settings.
February 10, 2025 at 5:57 PM
The one-line code example was just to explain what I meant by the "full data lifecycle". I come from logic, and I like a lambda-calculus style of programming.
December 27, 2024 at 9:34 AM
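That lambda-calculus flavour can be made concrete in R itself, since functions are first-class values and can be composed before any data is supplied. A minimal sketch (`compose2` is a hypothetical helper written for illustration, not a standard function):

```r
# compose2(f, g) returns the function x -> f(g(x))
compose2 <- function(f, g) function(x) f(g(x))

# fold a list of functions into a single pipeline, applied right to left
pipeline <- Reduce(compose2, list(sqrt, abs))

pipeline(-16)  # sqrt(abs(-16)) = 4
```

The pipeline is built entirely by composition; the data only appears at the final application, which is the lambda-calculus style of programming referred to above.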
You can do it with one line of code, passing the data through a pipeline of functions.
December 26, 2024 at 6:58 PM
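For example, a single line using base R's native `|>` operator (the built-in `mtcars` dataset and the miles-per-gallon conversion factor are purely illustrative):

```r
# one line: filter rows, derive a column, reduce to a single summary value
mtcars |> subset(cyl == 4) |> transform(kpl = mpg * 0.425144) |> with(mean(kpl))
```

Each function consumes the output of the previous one, so the whole data flow fits on one line.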
However, mutate and group_by seem to be slower than the equivalent operations in base R.
December 26, 2024 at 9:59 AM
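A sketch of the kind of comparison meant here: the same grouped mean computed with dplyr and with base R's `tapply` (actual timings depend on data size and machine, so this only shows the two idioms side by side):

```r
library(dplyr)

# dplyr version: grouped mean via group_by() + summarise()
mtcars |> group_by(cyl) |> summarise(m = mean(mpg))

# base R version: tapply() computes the same grouped mean directly
tapply(mtcars$mpg, mtcars$cyl, mean)
```

To measure the difference properly one would time both calls on a larger dataset, e.g. with the microbenchmark package.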
The pipeline operator enables clear and concise chaining of operations, which lies at the heart of the functional style of programming. Indeed, the tidyverse is designed to support the full data lifecycle, providing tools and functions for each stage of a typical data analysis workflow.
December 26, 2024 at 9:54 AM
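A minimal sketch of such a chained workflow, using the built-in `mtcars` dataset with dplyr (the thresholds and derived column are just for illustration):

```r
library(dplyr)

mtcars |>
  filter(hp > 100) |>                      # keep the more powerful cars
  mutate(kpl = mpg * 0.425144) |>          # derive km-per-litre from mpg
  group_by(cyl) |>                         # one group per cylinder count
  summarise(mean_kpl = mean(kpl), .groups = "drop") |>
  arrange(desc(mean_kpl))                  # most efficient groups first
```

Each step reads left to right as a verb applied to the data, which is what makes the chain clear and concise.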