huggingface.co/diffbot/Llam...
Download Diffbot LLM. Run it off your own GPU. Congrats, your on-prem #AI is smarter than #Perplexity.
3. We open sourced Diffbot LLM. Perplexity chose to keep theirs secret.
What IS significant is how we got here vs. Perplexity.
1. Diffbot LLM is a side project. Sonar is Perplexity's entire business.
The next morning, we beat Sonar Pro.
The SimpleQA benchmark they used is open source and LLM-judged...
We look forward to building a future of grounded AI with you all.
And we are excited to share that we are releasing Diffbot LLM open source on #Github, with weights available for download on #Huggingface.
github.com/diffbot/diff...
Knowledge is best retrieved at inference, outside of model weights.
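The idea of retrieving knowledge at inference time, rather than baking it into model weights, can be sketched roughly like this. Everything below is a hypothetical illustration (the toy corpus, function names, and URLs are made up, not the Diffbot API): the point is that each returned fact carries its source.

```python
# Hypothetical sketch of inference-time retrieval with per-fact attribution.
# Not the Diffbot API: corpus, URLs, and function names are invented.
import re
from dataclasses import dataclass


@dataclass
class Document:
    url: str
    text: str


# Toy in-memory corpus standing in for a live web index.
CORPUS = [
    Document("https://example.com/a", "The Eiffel Tower is 330 metres tall."),
    Document("https://example.com/b", "Paris is the capital of France."),
]


def retrieve(query: str) -> list[Document]:
    """Naive keyword overlap; a real system would query a search index."""
    terms = set(re.findall(r"\w+", query.lower()))
    return [
        d for d in CORPUS
        if terms & set(re.findall(r"\w+", d.text.lower()))
    ]


def answer_with_citations(query: str) -> str:
    docs = retrieve(query)
    if not docs:
        return "No grounded answer found."
    # Each statement is paired with its source URL, so every fact
    # remains independently verifiable by the reader.
    return " ".join(f"{d.text} [{d.url}]" for d in docs)


print(answer_with_citations("How tall is the Eiffel Tower?"))
```

Because the answer is assembled from retrieved documents at inference, updating the corpus updates the answers with no retraining, and each claim can be checked against its cited page.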
Not only is credit provided to publishers, but every fact is also independently verifiable.
Naturally, this means Diffbot LLM always provides full attribution to its cited sources.