#Ollama
Wonder how well Ollama would run on this thing... and the price, lol https://store.steampowered.com/sale/steammachine
Steam Machine
Your games on the big screen
store.steampowered.com
November 12, 2025 at 7:39 PM
SearXNG integration safeguards your privacy—no tracking, no profiles—while Ollama Cloud offers power for high-volume inference, keeping business logic accurate and up to date. #AI #LLM #Ollama #OllamaCloud #Searxng #UnPerplexedSpready
matasoft.hr/qtrendcontro...
(Un)Perplexed Spready with web search enabled Ollama models
Configure web search-enabled Ollama models in (Un)Perplexed Spready. Complete tutorial for connecting SearXNG to local AI models for enhanced spreadsheet automation.
matasoft.hr
November 12, 2025 at 7:22 PM
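A minimal Python sketch of the pattern these Spready posts describe: pull fresh results from a SearXNG metasearch instance, then summarize them with an Ollama model. This is not Spready's own code; the localhost URLs and the llama3.1 model name are placeholder assumptions, and SearXNG only answers in JSON if its json output format is enabled.

```python
# Sketch: fetch SearXNG results and summarize them with a local Ollama model.
# Assumes SearXNG at localhost:8080 (JSON format enabled) and Ollama at localhost:11434.
import requests

SEARXNG_URL = "http://localhost:8080/search"    # assumed local SearXNG instance
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3.1"                              # assumed: any locally pulled model

def web_search(query: str, limit: int = 5) -> list[dict]:
    """Query SearXNG and return the top result titles and snippets."""
    resp = requests.get(SEARXNG_URL, params={"q": query, "format": "json"}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("results", [])[:limit]

def summarize(query: str) -> str:
    """Feed the search snippets to Ollama and ask for a short summary."""
    results = web_search(query)
    context = "\n".join(f"- {r.get('title', '')}: {r.get('content', '')}" for r in results)
    prompt = f"Summarize the following search results about '{query}':\n{context}"
    resp = requests.post(OLLAMA_URL,
                         json={"model": MODEL, "prompt": prompt, "stream": False},
                         timeout=300)
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(summarize("Ollama Cloud pricing"))
```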
🔥 Hot Repo! 🔥 (100+ new stars)

📦 open-webui / open-webui
⭐ 115,016 (+133)
🗒 JavaScript

User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
GitHub - open-webui/open-webui: User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
User-friendly AI Interface (Supports Ollama, OpenAI API, ...) - open-webui/open-webui
github.com
November 12, 2025 at 7:02 PM
Build a private, offline RAG with Ollama + FAISS. Ingest docs, chunk, embed, and cite answers—no APIs, no cloud, full control over sensitive data. #rag
Building a RAG System That Runs Completely Offline
hackernoon.com
November 12, 2025 at 5:06 PM
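The pipeline the post above outlines (ingest, chunk, embed, retrieve, answer with citations) fits in a few dozen lines of Python. This is an illustrative sketch rather than the article's code; it assumes a local Ollama with the nomic-embed-text embedding model and a llama3.1 chat model already pulled, plus the faiss-cpu, numpy, and requests packages.

```python
# Minimal offline RAG sketch: chunk -> embed (Ollama) -> index (FAISS) -> retrieve -> answer.
# Not the article's implementation; model names and chunk size are illustrative assumptions.
import faiss
import numpy as np
import requests

OLLAMA = "http://localhost:11434"
EMBED_MODEL = "nomic-embed-text"   # assumed embedding model
CHAT_MODEL = "llama3.1"            # assumed chat model

def embed(text: str) -> np.ndarray:
    """Return a float32 embedding vector from the local Ollama server."""
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": EMBED_MODEL, "prompt": text}, timeout=120)
    r.raise_for_status()
    return np.array(r.json()["embedding"], dtype="float32")

def chunk(doc: str, size: int = 500) -> list[str]:
    """Naive fixed-size character chunking."""
    return [doc[i:i + size] for i in range(0, len(doc), size)]

docs = ["...your document text here..."]        # placeholder corpus
chunks = [c for d in docs for c in chunk(d)]
vectors = np.stack([embed(c) for c in chunks])

index = faiss.IndexFlatL2(vectors.shape[1])     # exact L2 search; fine for small corpora
index.add(vectors)

def ask(question: str, k: int = 3) -> str:
    """Retrieve the k nearest chunks and ask the chat model to answer with citations."""
    _, ids = index.search(embed(question).reshape(1, -1), k)
    context = "\n\n".join(f"[{i}] {chunks[i]}" for i in ids[0])
    prompt = (f"Answer using only the numbered context and cite chunk numbers.\n\n"
              f"Context:\n{context}\n\nQuestion: {question}")
    r = requests.post(f"{OLLAMA}/api/generate",
                      json={"model": CHAT_MODEL, "prompt": prompt, "stream": False},
                      timeout=300)
    r.raise_for_status()
    return r.json()["response"]

print(ask("What does the document say about pricing?"))
```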
I built something called Code Genie - basically an AI coding assistant that runs locally using Ollama instead of sending your code to the cloud.

Link: github.com/Sherin-SEF-A...
GitHub - Sherin-SEF-AI/code-genie: 🧞 Advanced AI Coding Agent | Autonomous Workflows | Multi-Agent System | Natural Language Programming | Privacy-First | Runs Locally with Ollama | Open Source
🧞 Advanced AI Coding Agent | Autonomous Workflows | Multi-Agent System | Natural Language Programming | Privacy-First | Runs Locally with Ollama | Open Source - Sherin-SEF-AI/code-genie
github.com
November 12, 2025 at 3:20 PM
From Windows to Linux, (Un)Perplexed Spready’s web-enabled intelligence runs anywhere, delivering streaming answers, hybrid data outputs, and flexible integration for every spreadsheet user. #AI #LLM #Ollama #OllamaCloud #Searxng #UnPerplexedSpready
matasoft.hr/qtrendcontro...
(Un)Perplexed Spready with web search enabled Ollama models
Configure web search-enabled Ollama models in (Un)Perplexed Spready. Complete tutorial for connecting SearXNG to local AI models for enhanced spreadsheet automation.
matasoft.hr
November 12, 2025 at 2:20 PM
Writing an LLM benchmark for GPU servers with NVIDIA cards in Ollama: how we at HOSTKEY built our own LLM benchmark for GPU servers with NVIDIA graphics cards running Ollama. A detailed look at the testing methodology, the bash code, the results, and the performance patterns we observed. Read more

#ollama #gpu #server #nvidia #llm #deepseek #nvidia-smi #cuda #benchmarking

habr.com
November 12, 2025 at 11:41 AM
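The article's benchmark itself is written in bash; the core measurement, though, amounts to reading the eval_count and eval_duration fields Ollama reports for each generation. A rough Python equivalent of that idea, with the model list and prompt as placeholder assumptions rather than the article's exact setup:

```python
# Rough throughput check: request a fixed completion from each model and compute tokens/sec
# from the eval_count / eval_duration (nanoseconds) fields in Ollama's response.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"
MODELS = ["llama3.1:8b", "deepseek-r1:14b"]   # placeholders: whatever is pulled locally
PROMPT = "Explain what a GPU does, in about 200 words."

for model in MODELS:
    r = requests.post(OLLAMA_URL,
                      json={"model": model, "prompt": PROMPT, "stream": False},
                      timeout=600)
    r.raise_for_status()
    data = r.json()
    tokens = data.get("eval_count", 0)
    seconds = data.get("eval_duration", 1) / 1e9   # Ollama reports durations in nanoseconds
    print(f"{model}: {tokens} tokens in {seconds:.1f}s -> {tokens / seconds:.1f} tok/s")
```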
Reins - Ollama on mobile without running Ollama on mobile korben.info/reins-ollama...
Reins - Ollama on mobile without running Ollama on mobile | Le site de Korben
Want to use Ollama on your iPhone or Android to run your small local LLMs? Wouldn't that be super cool? Well, I've got some bad ...
korben.info
November 12, 2025 at 10:18 AM
Every review, headline, or trend is now within spreadsheet reach—Spready pulls and processes web results safely, and summarizes findings using Ollama LLMs locally or in the cloud. #AI #LLM #Ollama #OllamaCloud #Searxng #UnPerplexedSpready
matasoft.hr/qtrendcontro...
(Un)Perplexed Spready with web search enabled Ollama models
Configure web search-enabled Ollama models in (Un)Perplexed Spready. Complete tutorial for connecting SearXNG to local AI models for enhanced spreadsheet automation.
matasoft.hr
November 12, 2025 at 10:02 AM
Spready provides secure access to advanced LLMs (Ollama Cloud/local) and live Internet data (SearXNG), enabling AI-driven factual enrichment, comparison, extraction, categorization and reporting in spreadsheets. #AI #LLM #Ollama #OllamaCloud #Searxng #UnPerplexedSpready
matasoft.hr/qtrendcontro...
(Un)Perplexed Spready with web search enabled Ollama models
Configure web search-enabled Ollama models in (Un)Perplexed Spready. Complete tutorial for connecting SearXNG to local AI models for enhanced spreadsheet automation.
matasoft.hr
November 12, 2025 at 8:56 AM
Essential read for DPOs: using local AI (Ollama/LM Studio/N8N) to automate #GDPR #RODO compliance work without data leaks. Practical guide with Polish model recommendations: pawel.rosol.pl/posts/ai-w-p...
November 12, 2025 at 8:26 AM
Create an Ollama-compliant model and make it accessible to the world! Sharing is caring. :-) youtu.be/grCeXX-N_Gg #ollama #python #llm #machinelearning #ai #unsloth
Ollama Model File: create a GGUF model and push it to Ollama or Hugging Face
YouTube video by Mike Møller Nielsen
youtu.be
November 12, 2025 at 8:20 AM
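The flow the video covers is the standard one: a Modelfile that points at the GGUF weights, then ollama create and ollama push. A hedged sketch of that flow, wrapped in Python to stay consistent with the other examples here; the GGUF file name and the youruser namespace are placeholders, and pushing requires an ollama.com account with that namespace.

```python
# Sketch of the usual GGUF -> Ollama publishing flow via the CLI.
# File name and "youruser" namespace are placeholders; `ollama push` needs an ollama.com account.
import subprocess
from pathlib import Path

GGUF_PATH = "model-q4_K_M.gguf"      # placeholder: your exported/quantized GGUF file
MODEL_NAME = "youruser/my-model"     # placeholder namespace/name on ollama.com

# Minimal Modelfile: point Ollama at the local GGUF weights.
Path("Modelfile").write_text(f"FROM ./{GGUF_PATH}\n")

subprocess.run(["ollama", "create", MODEL_NAME, "-f", "Modelfile"], check=True)
subprocess.run(["ollama", "push", MODEL_NAME], check=True)
```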
Anyone who delegates tasks to chatbots like ChatGPT & Co. often has to entrust sensitive data to their operators. With the open-source project Ollama, you can do it locally. #ArtificialIntelligence
heise+ | Mac as a local AI system: here's how
Anyone who delegates tasks to chatbots like ChatGPT & Co. often has to entrust sensitive data to their operators. With the open-source project Ollama, you can do it locally.
www.heise.de
November 12, 2025 at 7:04 AM
Get the latest figures and summaries with hybrid AI-driven spreadsheet formulas—combining SearXNG metasearch and scalable LLM processing, all fully inside your workflow. #AI #LLM #Ollama #OllamaCloud #Searxng #UnPerplexedSpready
matasoft.hr/qtrendcontro...
matasoft.hr
November 12, 2025 at 7:03 AM
Run powerful open-source LLMs locally with Ollama. FreeCodeCamp shows how to install, pull models, and chat directly on your computer—no cloud required. Perfect for privacy, experimentation, and learning AI hands-on. Your machine becomes an AI lab, whether for coding, writing, or research.
November 12, 2025 at 6:31 AM
If you want to use an LLM, you don't need a cloud subscription or massive server. You can run it right on your personal computer - and Manish shows you how. You'll learn how to install Ollama via the UI and command line, set everything up, & then run it locally.
www.freecodecamp.org/news/how-to-...
November 12, 2025 at 1:01 AM
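Once Ollama is installed and a model has been pulled (for example with ollama pull on the command line, as the freeCodeCamp guide shows), the local server answers plain HTTP on port 11434. A minimal example of calling it from Python; the llama3.1 model name is an assumption, substitute whatever you pulled.

```python
# Minimal chat call against a running local Ollama server (default port 11434).
# Assumes a model such as llama3.1 has already been pulled with `ollama pull`.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.1",   # assumed: any model you have pulled locally
        "messages": [{"role": "user", "content": "Give me one tip for learning AI hands-on."}],
        "stream": False,
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```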
Ollama or LM Studio. I use it on a local server I have at home; pretty good as an Alexa replacement.
Added RAG using Kernel Memory for all of our recipes and personal documents, so I can just ask it "what next" when we are cooking.
November 12, 2025 at 12:42 AM
Is anyone running a local LLM in their #Homelab? If so, what does the setup look like? I'm considering getting a Minisforum PC with an NVIDIA A2000, then passing the card through in PVE to a VM for #Ollama. I'd like to use it for #HomeAssistant, and possibly for other apps later.

social.teqqy.de
November 11, 2025 at 11:19 PM
How to Download, Quantize, and Run an LLM Model Locally with Ollama. Running a large language model (LLM) on your own computer can seem like something out of this world, and, to be honest, I...

#llm #ai #python #machinelearning

How to Download, Quantize, and Run an LLM Model Locally with Ollama
Running a large language model (LLM) on your own computer can seem like something out of this world...
dev.to
November 11, 2025 at 11:33 PM
@simon Was there an llm plugin that let you serve any models you had registered with the llm CLI as an OpenAI/Ollama-compatible server? I seem to recall seeing this but can't find it now.

ecoevo.social
November 11, 2025 at 9:02 PM
Whether you automate pricing updates, product research, or scientific reviews, Spready’s approach lets you tap into real-time verified data via SearXNG, plus context-rich LLM answers from Ollama Cloud. #AI #LLM #Ollama #OllamaCloud #Searxng #UnPerplexedSpready
matasoft.hr/qtrendcontro...
(Un)Perplexed Spready with web search enabled Ollama models
Configure web search-enabled Ollama models in (Un)Perplexed Spready. Complete tutorial for connecting SearXNG to local AI models for enhanced spreadsheet automation.
matasoft.hr
November 11, 2025 at 8:31 PM