{
  "models": [
    {
      "model": "deepseek-coder:6.7b",
      "provider": "ollama",
      "baseurl": "http://localhost:11434",
      "title": "DeepSeek"
    }
  ]
}
You can add more configuration here, but this will do for now.
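If you also want inline tab completions from the same local model, Continue's config.json accepts a tabAutocompleteModel entry alongside models. The fragment below is a sketch; the key names follow Continue's documented schema, so verify them against the version of the extension you have installed:

```json
{
  "tabAutocompleteModel": {
    "title": "DeepSeek Autocomplete",
    "provider": "ollama",
    "model": "deepseek-coder:6.7b"
  }
}
```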
ollama serve
The command starts the Ollama server, which serves the model at localhost:11434.
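Once the server is running, you can also query the model directly over HTTP, which is a handy way to check that everything works outside the editor. Here is a minimal sketch using only Python's standard library against Ollama's /api/generate endpoint (the sending step is commented out because it needs `ollama serve` running):

```python
import json
import urllib.request

# Default Ollama endpoint; /api/generate is Ollama's one-shot completion API.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "deepseek-coder:6.7b") -> urllib.request.Request:
    """Build a POST request for Ollama; stream=False asks for a single JSON reply."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# With `ollama serve` running, you could send the request like this:
# with urllib.request.urlopen(build_request("Write hello world in Python")) as resp:
#     print(json.loads(resp.read())["response"])
```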
Different models are available; choose one that fits your needs and your hardware specs.
www.righto.com/2018/01/xero...