One Nexus Point to access and chat with multiple LLM providers — Mistral, Ollama, Groq, TogetherAI — with GitHub & PDF RAG capabilities.
Supports memory, chat history, persistent messages, and file retrieval from GitHub.
| Layer | Tech |
|---|---|
| Framework | FastAPI |
| LLM Framework | LlamaIndex |
| Validation | Pydantic |
| Database | Redis, PostgreSQL, Qdrant |
| Containerization | Docker |
| Deployment | Azure App Service |
| Git Integration | PyGithub |
- 🔁 Chat with multiple LLM providers
- 💾 Persistent chat history (PostgreSQL, Redis)
- 🧠 Session memory support
- 🔄 Dynamic model switching (Ollama, Groq, Mistral, etc.)
- 📂 GitHub Integration: Fetch files from any public or private repository by username, repo name, branch, and commit
- 📄 PDF & Document RAG: Upload PDFs and retrieve context-aware answers
- 🔍 Vector search via Qdrant for fast retrieval
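The dynamic model-switching feature above can be sketched as a simple provider registry. The provider names mirror the list above, but the factory functions and the `resolve_provider` helper are illustrative stand-ins, not the repo's actual code:

```python
# Sketch of dynamic model switching: a registry maps a provider name to a
# factory that would build the corresponding LLM client (e.g. via LlamaIndex).
# The factories here return plain strings as stand-ins for real client objects.

PROVIDERS = {
    "mistral":  lambda model: f"mistral:{model}",
    "ollama":   lambda model: f"ollama:{model}",
    "groq":     lambda model: f"groq:{model}",
    "together": lambda model: f"together:{model}",
}

def resolve_provider(name: str, model: str):
    """Look up a provider by name and build a client for the given model."""
    try:
        factory = PROVIDERS[name.lower()]
    except KeyError:
        raise ValueError(f"Unknown provider: {name!r}") from None
    return factory(model)
```

Switching providers mid-session then amounts to calling the resolver with a new provider name, leaving the rest of the chat pipeline unchanged.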
```bash
git clone https://github.com/DineshThumma9/centralGPT-backend.git
cd centralGPT-backend
pip install -r requirements.txt
```

Create a `.env` file in the root directory:
```env
DATABASE_URL=
REDIS_URL=
QDRANT_URL=
QDRANT_API_KEY=
GROQ_API_KEY=
COHERE_API_KEY=
HUGGINGFACE_HUB_TOKEN=
GITHUB_TOKEN=
```
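A minimal sketch of how these variables might be validated at startup (after something like python-dotenv has loaded the file). The `load_settings` helper and the choice of which keys are mandatory are assumptions, not the repo's actual config code:

```python
import os

# Which keys are mandatory is an assumption; adjust to your deployment.
REQUIRED = ["DATABASE_URL", "REDIS_URL", "QDRANT_URL", "GROQ_API_KEY"]

def load_settings(required=REQUIRED):
    """Read required variables from the environment, failing fast if any are missing."""
    missing = [k for k in required if not os.environ.get(k)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {k: os.environ[k] for k in required}
```

Failing fast here surfaces a misconfigured `.env` at boot rather than as an opaque connection error on the first request.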
Clone and set up the frontend:
```bash
git clone https://github.com/DineshThumma9/centralGPT.git
```

Follow the frontend repo instructions to configure and run it.
Run the development server:

```bash
uvicorn src.main:app --reload
```
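The persistent chat history and session memory features can be sketched as a per-session message store. In the real service this is backed by Redis and PostgreSQL; the `ChatHistory` class and its method names here are illustrative only:

```python
from collections import defaultdict

class ChatHistory:
    """In-memory stand-in for a Redis/PostgreSQL-backed chat history.

    Messages are grouped by session id, and a sliding window of the most
    recent turns is what would be fed back to the LLM as context.
    """

    def __init__(self):
        self._sessions = defaultdict(list)

    def append(self, session_id: str, role: str, content: str) -> None:
        # Each turn is stored as a role/content pair, mirroring common chat APIs.
        self._sessions[session_id].append({"role": role, "content": content})

    def window(self, session_id: str, last_n: int = 10) -> list:
        # Only the last N messages are returned, to bound prompt size.
        return self._sessions[session_id][-last_n:]
```

Keying everything by session id is what lets one backend serve many concurrent conversations without their histories bleeding into each other.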