Ollama
Ollama is a tool that allows you to run LLMs locally. We use it to run Llama 2, Mistral, and other models locally.

Install Ollama
Install Ollama from their website, then run the following command to run Ollama in the background:
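A minimal sketch of this step, assuming the standard `ollama` CLI (the model name here is just an example, not prescribed by this document):

```shell
# Start the Ollama server in the background
# (it serves an API on http://localhost:11434 by default)
ollama serve &

# In another terminal, pull and chat with a model, e.g. Llama 2
ollama run llama2
```

`ollama run` downloads the model on first use; any model from the Ollama library can be substituted.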
Add Ollama Model to Quivr
Now that you have your model running locally, you need to add it to Quivr so that users can choose an Ollama model. Go to Supabase and, in the user_settings table, add the following value to the models column, either as the table default or for your specific user:
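The exact value depends on your Quivr version; as an illustration (the model identifiers below are assumptions, not taken from this document), the models column holds a JSON array of model names:

```json
["ollama/llama2", "ollama/mistral"]
```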
If you set it as the default, you need to drop the entire user_settings table for the change to take effect, with the following command:
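A sketch of that reset step, assuming it is run in the Supabase SQL editor; the exact statement is an assumption, and note that it is destructive:

```sql
-- Assumed statement: removes the table so it can be recreated with the new defaults.
-- Warning: this deletes all existing per-user settings.
DROP TABLE user_settings;
```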