Use the Chroma DB vector database in a RAG application using LlamaIndex & the DeepSeek R1 local model
In the previous article, I explained step by step how to install the DeepSeek R1 model locally and build a RAG application with the locally downloaded models. In this article, I will explain, step by step, how to use the Chroma DB vector database in a RAG application using LlamaIndex and the DeepSeek R1 local model with Ollama. Chroma DB is an open-source vector database that can store the vector embeddings of documents and query those embeddings against a user's query. I have used the same example as in the previous article and added only the Chroma DB integration code.

The prerequisites for this example are as follows:

- Visual Studio Code
- Python
- Ollama

Open Visual Studio Code and create a file named "sample.py". Then go to the Terminal menu and click the New Terminal link to open a new terminal. In the terminal, enter the command below to install the LlamaIndex library, the Chroma DB library, and the LlamaIndex Ollama LLM and embedding libraries on your Mac.
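The exact command is cut off above; as a sketch, assuming the standard PyPI package names for the libraries mentioned (plus llama-index-vector-stores-chroma, which provides the Chroma integration used in the code), it would look something like this:

pip install llama-index chromadb llama-index-llms-ollama llama-index-embeddings-ollama llama-index-vector-stores-chroma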
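For orientation, here is a minimal sketch of what the Chroma DB integration in sample.py can look like with LlamaIndex and Ollama. The model names ("deepseek-r1" for the LLM, "nomic-embed-text" for embeddings), the "./data" document folder, the "./chroma_db" persistence path, and the collection name are illustrative assumptions, not the exact values from the previous article:

import chromadb
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, StorageContext, Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.vector_stores.chroma import ChromaVectorStore

# Point LlamaIndex at the local models served by Ollama (model names are assumptions)
Settings.llm = Ollama(model="deepseek-r1", request_timeout=300.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# Create a persistent Chroma DB client and a collection to hold the document embeddings
chroma_client = chromadb.PersistentClient(path="./chroma_db")
chroma_collection = chroma_client.get_or_create_collection("rag_collection")

# Wrap the Chroma collection in a LlamaIndex vector store
vector_store = ChromaVectorStore(chroma_collection=chroma_collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Load the documents, embed them, and store the vectors in Chroma DB
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# Query the index; relevant embeddings are retrieved from Chroma DB and the answer is generated by DeepSeek R1
query_engine = index.as_query_engine()
response = query_engine.query("What is this document about?")
print(response)

Because the Chroma client is persistent, the stored embeddings survive restarts, so on later runs the index can be built from the existing collection instead of re-embedding the documents.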