Dylan
dylantartarini.bsky.social
Data Scientist 👨🏻‍💻🐍

Based in Bologna 🇮🇹🇪🇺

MSc in Economics 📊

Passionate about books, mountains, politics and policies, coding, artificial intelligence and quality conversations. 📖⛰️👾
Reposted by Dylan
This excellent article showcasing how to build sophisticated LLM applications by @dylantartarini.bsky.social is a prime example of the in-depth, application-focused #LLM insights that we are currently seeking.
Build and Query Knowledge Graphs with LLMs | Towards Data Science
Going from document ingestion to smart queries — all with open tools and guided setup
towardsdatascience.com
May 21, 2025 at 9:27 PM
this is what that script looks like in the UI btw
April 12, 2025 at 5:41 PM
Not really.
Create an app.py with Streamlit (or whatever you want to name it) at the root (same level as your src folder).
Then import whatever is needed from src to mimic a chat (I suppose similar to what you already do in src/main.py).

Here's an example just to give you an idea: it's just Python
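A minimal sketch of such an app.py, assuming src/main.py exposes some answer-generating function (the name `generate_answer` is hypothetical — adapt it to your own module):

```python
# app.py — minimal Streamlit chat sketch, placed at the project root.
# Assumes src/main.py exposes generate_answer(prompt: str) -> str;
# that name is an assumption, swap in your own function.

def append_turn(history: list, role: str, content: str) -> list:
    """Append one chat message to the session history and return it."""
    history.append({"role": role, "content": content})
    return history

def run_app() -> None:
    import streamlit as st  # imported lazily so the helper stays importable
    # from src.main import generate_answer  # hypothetical import

    st.title("Chat")
    if "messages" not in st.session_state:
        st.session_state.messages = []

    # replay previous turns so the conversation persists across reruns
    for msg in st.session_state.messages:
        with st.chat_message(msg["role"]):
            st.markdown(msg["content"])

    if prompt := st.chat_input("Ask something"):
        append_turn(st.session_state.messages, "user", prompt)
        with st.chat_message("user"):
            st.markdown(prompt)
        answer = "echo: " + prompt  # replace with generate_answer(prompt)
        append_turn(st.session_state.messages, "assistant", answer)
        with st.chat_message("assistant"):
            st.markdown(answer)

if __name__ == "__main__":
    run_app()
```

Run it with `streamlit run app.py`; `st.session_state` keeps the message history alive across Streamlit's reruns.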
April 12, 2025 at 5:38 PM
Great!
In my experience, writing Streamlit can become frustrating for complex projects, but chatbots are really easy to build nowadays, even with callbacks and memory -> docs.streamlit.io/develop/tuto... you'll find some examples there :)

Hope I helped, this really is an interesting project!
Build a basic LLM chat app - Streamlit Docs
docs.streamlit.io
April 12, 2025 at 5:08 PM
Have you considered using something like Streamlit Community Cloud (streamlit.io/cloud) or Hugging Face Spaces (huggingface.co/spaces)?
It's minimal effort but might be worth it to showcase the project to a broader audience imho. In both cases the platform would take care of the hosting for you.
Streamlit Community Cloud • Streamlit
Deploy, manage, and share your Streamlit apps — all for free.
streamlit.io
April 12, 2025 at 4:49 PM
Hi Martina, I had the chance to take a look at the tool and was wondering: are you planning on releasing a front end of some sort? Not all academics are computer scientists, so it might make sense for spreading the tool's usage.
April 12, 2025 at 4:09 PM
Clearly,

- nr. 1 needs domain knowledge and time;
- nr. 2 is computationally expensive and risks entropy, but is grounded in the docs;
- nr. 3 risks entropy even more, and is less grounded.
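Option nr. 1 can be sketched as a small allow-list that the extraction step validates against — the schema and names below are purely illustrative, not the project's actual code:

```python
# Sketch of a predefined ontology as an allow-list that filters
# LLM-extracted triples. Entity and relation names are hypothetical.
from dataclasses import dataclass

ALLOWED_ENTITIES = {"Person", "Organization", "Document"}
ALLOWED_RELATIONS = {
    ("Person", "WORKS_FOR", "Organization"),
    ("Person", "AUTHORED", "Document"),
}

@dataclass(frozen=True)
class Triple:
    subject_type: str
    relation: str
    object_type: str

def validate(triples: list) -> list:
    """Keep only triples that conform to the predefined ontology."""
    return [
        t for t in triples
        if t.subject_type in ALLOWED_ENTITIES
        and (t.subject_type, t.relation, t.object_type) in ALLOWED_RELATIONS
    ]
```

The upside of this shape is that anything the LLM hallucinates outside the schema is silently dropped before it reaches the graph; the downside is exactly the domain knowledge and time cost noted above.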
March 10, 2025 at 6:34 PM
I'd be interested in how to manage the Ontology when building and evolving the graph from documents:

Current options in my mind are
1. predefine the Ontology
2. infer it from docs
3. leave the Agent in charge of extracting entities and relationships free to do whatever it wants
March 10, 2025 at 6:34 PM
My workaround at the moment (my problem statement was: I don't want to spend money) was to start using Groq (console.groq.com/docs/overview) with online endpoints of open-source LLMs, available for free under certain rate/token limits per day.
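A minimal sketch of that setup, hitting Groq's OpenAI-compatible chat-completions endpoint with only the standard library — the model name is one of Groq's hosted open-source models at the time of writing, so check console.groq.com for current values:

```python
# Sketch of calling Groq's OpenAI-compatible endpoint with an
# open-source model. Endpoint and model name taken from Groq's docs;
# verify both against console.groq.com before relying on them.
import json
import urllib.request

GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_payload(prompt: str, model: str = "llama-3.1-8b-instant") -> dict:
    """Build a chat-completions request body for a single user turn."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def ask_groq(prompt: str, api_key: str) -> str:
    """Send the prompt and return the assistant's reply text."""
    req = urllib.request.Request(
        GROQ_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the endpoint is OpenAI-compatible, the official `openai` client also works by pointing its `base_url` at Groq.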
GroqCloud
Experience the fastest inference in the world
console.groq.com
March 9, 2025 at 5:22 PM
I am working with some complex prompts due to the use case (Knowledge representation using Graphs).
On my M1 MacBook Air I find that Llama3.2 (1B) works fine for simple/medium tasks at reasonable speed.
However, for some harder tasks I have yet to find a good alternative that could be hosted locally.
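For reference, one common way to serve such a model locally is Ollama — an assumption, since the post doesn't name a runtime — which exposes a small REST API on localhost:

```python
# Sketch of querying a locally served model via Ollama's default
# REST endpoint. The runtime choice and model tag are assumptions;
# "llama3.2:1b" matches the 1B model mentioned above.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3.2:1b") -> dict:
    """Build the JSON body for a single, non-streamed generation."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local(prompt: str) -> str:
    """POST the prompt to the local server and return the completion."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

With `stream` set to `False` the server returns one JSON object whose `response` field holds the full completion, which keeps the client trivial.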
March 9, 2025 at 5:20 PM
Reposted by Dylan
This chart is from our Spanish Instagram account: www.instagram.com/ourworldinda...

Read more about our translation efforts: ourworldindata.org/instagram-in...
www.instagram.com
March 7, 2025 at 12:11 PM