#databs #databricks #fabric
#databs
If you work in data, you have probably come across a version of this headline this past week. A small disclaimer: I have not used SDF, nor do I have a solid understanding of the tech that sits behind it, so take what I say with a grain of salt.
#dataBS
www.brooklyndata.co/ideas/2025/0...
#datasky #databs
It's true that there is no shortage of tools when it comes to data ingestion. But before you open your wallet to one of the many options out there, it might be worth doing thorough due diligence based on your current and future needs.
dbt has been a game changer for many #data teams, mainly for writing reusable and version-controlled transformation logic. We are also witnessing an explosion of tools that want to become the next #dbt. Which is better, dbt Core or dbt Cloud?
The standard definition you get is that it is a unified governance solution built into Databricks. That is accurate, but it was not intuitive to me when I started building on UC. See 🧵 for some additional context on UC.
In a nutshell, it is an abstraction that simplifies incremental ingestion by monitoring the files that arrive in cloud storage, supporting reliable and resilient data pipelines cost-efficiently.
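If the feature described above is Databricks Auto Loader (which the description matches), a minimal PySpark sketch might look like the following. The paths, file format, and table name are illustrative assumptions, `spark` is the ambient Databricks session, and this only runs inside a Databricks/Spark environment:

```python
# Sketch of incremental file ingestion with Auto Loader (the "cloudFiles"
# streaming source). All paths and names below are made-up examples.
df = (
    spark.readStream
    .format("cloudFiles")                       # Auto Loader source
    .option("cloudFiles.format", "json")        # format of the landing files
    .option("cloudFiles.schemaLocation", "/tmp/_schemas/events")  # schema tracking
    .load("/mnt/landing/events/")               # monitored cloud storage path
)

(
    df.writeStream
    .option("checkpointLocation", "/tmp/_checkpoints/events")  # exactly-once progress tracking
    .trigger(availableNow=True)                 # process all new files, then stop
    .toTable("bronze.events")                   # write to a managed table
)
```

The checkpoint is what makes the pipeline resilient: on restart, Auto Loader picks up only files it has not yet processed.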
www.anthropic.com/research/bui...
Without attaching a name to them, I had tested all of these except the agent workflow in 2024.
Currently running orchestrator-worker, prompt-chaining, and routing workflows across a handful of projects in production.
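The routing pattern mentioned above can be sketched in plain Python. Everything here (function names, categories) is a made-up illustration, with a keyword heuristic standing in where a real system would put an LLM classifier:

```python
# Routing workflow: classify the request, then dispatch it to a
# specialised handler for that category.
def classify(query: str) -> str:
    # Stand-in classifier; a production setup would call an LLM here.
    q = query.lower()
    if "refund" in q or "invoice" in q:
        return "billing"
    if "error" in q or "crash" in q:
        return "technical"
    return "general"

# One handler per category, each of which could be its own prompt/model.
HANDLERS = {
    "billing": lambda q: f"[billing] {q}",
    "technical": lambda q: f"[technical] {q}",
    "general": lambda q: f"[general] {q}",
}

def route(query: str) -> str:
    return HANDLERS[classify(query)](query)
```

The appeal of routing is separation of concerns: each handler can be tuned for its category without touching the others.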
- Unpacking an iterable*
- Use copy method*
- Using dict constructor*
- Dictionary comprehension*
- Using deepcopy (from copy)
The first three have similar performance, followed by dictionary comprehension, and finally deep copy.
*shallow copy
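A quick sketch of the five approaches, including the practical consequence of shallow vs deep copying (the variable names are mine):

```python
import copy

original = {"a": 1, "b": [1, 2]}

unpacked = {**original}                          # unpacking an iterable (shallow)
via_method = original.copy()                     # copy method (shallow)
via_ctor = dict(original)                        # dict constructor (shallow)
via_comp = {k: v for k, v in original.items()}   # dict comprehension (shallow)
deep = copy.deepcopy(original)                   # deep copy

# Shallow copies share nested objects with the original; deepcopy does not.
original["b"].append(3)
print(unpacked["b"])  # the shared inner list reflects the mutation
print(deep["b"])      # the deep copy is unaffected
```

This is why deep copy is slower: it recursively duplicates every nested object instead of copying references.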
Check the 🧵 for the one that I use from time to time.