Soft spot for local LLMs and multi-agent scenarios
github.com/ollama/ollam...
Fixed for me through the custom model file linked to above, which can be imported with: ollama create qwq-fix-stop:latest -f qwq-fix-stop-modelfile.md (the model file starts from FROM qwq:latest)
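For reference, a minimal sketch of what such a model file might contain; the exact stop tokens below are an assumption based on the Qwen-style chat template, so check the linked modelfile for the actual values:

FROM qwq:latest
# assumed stop tokens so generation halts at the end of an assistant turn
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"

After creating it, run the patched model with: ollama run qwq-fix-stop:latest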
See if this helps
github.com/elsewhat/adv...