I tried to use Hashbrown with Angular, but when I integrate with a local LLM I receive the following error when using the llama3.1:8b model:
error:ResponseError: registry.ollama.ai/library/llama3.1:8b does not support thinking
or, when I try with the deepseek-r1:1.5b model:
error:ResponseError: think value "high" is not supported for this model
It seems like there are some issues when trying to use local Ollama models.
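
To help narrow this down, here is a minimal sketch that bypasses Hashbrown and sends chat requests straight to a local Ollama instance. It assumes Ollama is running on the default port 11434 and that the errors above come from Ollama rejecting the `think` option; the `probe` helper and the expected outcomes in the comments are my guesses, not confirmed Hashbrown behavior. If the direct requests fail the same way, the problem is likely that the Hashbrown Ollama integration always sends a `think` value even for models that do not support it (or do not support levels like "high").

```ts
// Minimal sketch, assuming a local Ollama server on the default port.
// Sends a chat request with and without the "think" option to check
// whether the errors are reproduced by Ollama itself, independent of Hashbrown.
async function probe(model: string, think?: boolean | string) {
  const res = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model,
      messages: [{ role: 'user', content: 'Hello' }],
      stream: false,
      // Only include "think" when a value is given, so the baseline request stays clean.
      ...(think !== undefined ? { think } : {}),
    }),
  });
  console.log(model, think, res.status, await res.text());
}

await probe('llama3.1:8b');              // baseline, likely succeeds
await probe('llama3.1:8b', true);        // likely reproduces "does not support thinking"
await probe('deepseek-r1:1.5b', 'high'); // likely reproduces 'think value "high" is not supported'
```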