Quickstart
Get your own assistant in less than 5 seconds
If you need to quickly start talking to your list of files, here are the steps.
- Add your API Keys to your environment variables
Check our `.env.example` file to see the possible environment variables you can configure. Quivr supports APIs from Anthropic, OpenAI, and Mistral. It also supports local models using Ollama.
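For example, a minimal sketch that loads the keys from a local `.env` file with python-dotenv, assuming you use OpenAI (the variable value below is a placeholder):

```python
import os

from dotenv import load_dotenv  # pip install python-dotenv

# Load the variables defined in your local .env file
# (see .env.example for the full list of supported variables)
load_dotenv()

# Or set a key directly in the environment
os.environ["OPENAI_API_KEY"] = "sk-..."  # replace with your own key
```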
- Create a Brain with Quivr default configuration
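A minimal sketch, assuming `quivr_core` exposes a `Brain` class with a `from_files` constructor that takes a name and a list of file paths (the file names here are placeholders):

```python
from quivr_core import Brain

# Build a brain over your own documents with the default configuration
brain = Brain.from_files(
    name="my smart brain",
    file_paths=["./my_first_doc.pdf", "./my_second_doc.txt"],
)
```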
- Launch a Chat
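For instance, a small question/answer loop, assuming the `brain` from the previous step and that `Brain.ask` returns a response object with an `answer` attribute:

```python
# Simple chat loop against the brain created above
while True:
    question = input("Question (type 'exit' to quit): ")
    if question.lower() == "exit":
        break
    response = brain.ask(question)
    print("Quivr Assistant:", response.answer)
```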
And now you are all set up to talk with your brain!
Custom Brain
If you want to change the language model or the embedding model, you can modify the parameters of the brain.
Let’s say you want to use an LLM from Mistral and a specific embedding model:
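A sketch of what this could look like. The `quivr_core` names below (`LLMEndpoint`, `LLMEndpointConfig`, and the `llm`/`embedder` parameters of `Brain.from_files`) are assumptions that may differ between versions, so check the API reference; the Mistral and OpenAI model names are only examples:

```python
from langchain_openai import OpenAIEmbeddings

from quivr_core import Brain
from quivr_core.llm import LLMEndpoint           # assumed import path
from quivr_core.config import LLMEndpointConfig  # assumed import path

brain = Brain.from_files(
    name="my smart brain",
    file_paths=["./my_first_doc.pdf", "./my_second_doc.txt"],
    # Use a Mistral-hosted LLM instead of the default model
    llm=LLMEndpoint.from_config(LLMEndpointConfig(model="mistral-small-latest")),
    # Use a specific embedding model through LangChain
    embedder=OpenAIEmbeddings(model="text-embedding-3-small"),
)
```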
Note: The Embeddings class from LangChain lets you choose from a large variety of embedding models. You can configure different models like OpenAI, Cohere, or HuggingFace embeddings based on your needs.
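For example, a local HuggingFace sentence-transformers model could be swapped in like this (assuming the langchain-huggingface package is installed; the model name is illustrative):

```python
from langchain_huggingface import HuggingFaceEmbeddings

# Any LangChain Embeddings implementation can be passed as the brain's embedder
embedder = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
```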
Launch with Chainlit
If you want to quickly launch an interface with Chainlit, you can simply run the following at the root of the project:
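Something along these lines, assuming the Chainlit app lives in `examples/chatbot/main.py` (the path referenced in the note below) and that Chainlit and the example's dependencies are installed:

```bash
cd examples/chatbot
pip install chainlit  # plus the example's own dependencies
chainlit run main.py
```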
For more details, check out our Chatbot Example guide.
Note: Modify the Brain configuration directly in `examples/chatbot/main.py`.