A chatbot application for querying geological data using LLMs (Llama 3.2, Deepseek 1.5, OpenAI) via Ollama.
- Operating System: Windows 10/11
- Python Version: 3.13
- Recommended Environment: Virtual Environment (`venv`)
The application uses a Data folder to store:
- 📝 Uploaded files → `uploaded_document.txt`
- 📝 Processed content → `processed_content.txt`
- 💬 Chat history → `ChatSessions/`
- 📩 Feedback data → `feedback_dataset.json`
- 📊 Expected query responses → `expected_query_responses.xlsx`
- 🔍 Query logs → `query_responses.xlsx`
```bash
git clone https://github.com/hlin-0420/GEO-bot-prototype.git
cd GEO-bot-prototype
python -m venv venv

# Windows
venv\Scripts\activate

# macOS/Linux
source venv/bin/activate

pip install -r requirements.txt
```

- Download and install Ollama from the Ollama Official Site.
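To confirm that `pip install -r requirements.txt` completed, installed distributions can be checked programmatically. This is a hypothetical helper, not part of the repo; the function name `missing_packages` is an assumption.

```python
from importlib import metadata

def missing_packages(required: list[str]) -> list[str]:
    """Return the names from `required` that are not installed
    in the active environment (e.g. the venv created above)."""
    missing = []
    for name in required:
        try:
            metadata.version(name)  # raises if the package is absent
        except metadata.PackageNotFoundError:
            missing.append(name)
    return missing
```

Feeding it the names from requirements.txt gives a quick diagnostic before launching the app.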
This project uses Llama 3.2, Deepseek 1.5, and OpenAI models. Install them using:
```bash
ollama pull llama3.2:latest
ollama pull deepseek1.5
```
⚠️ Note: OpenAI models are not available via `ollama pull`.
Instead, configure the OpenAI API by setting an environment variable:

```bash
# macOS/Linux
export OPENAI_API_KEY="your-api-key-here"

# Windows
set OPENAI_API_KEY="your-api-key-here"
```
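A fail-fast check for the environment variable can save a confusing startup error later. This sketch is not part of the GEO-bot code — the app reads the variable however it chooses; `require_openai_key` is an assumed name.

```python
import os

def require_openai_key(env=None) -> str:
    """Return the OpenAI API key, raising a clear error if the
    OPENAI_API_KEY environment variable was never set."""
    env = os.environ if env is None else env
    key = env.get("OPENAI_API_KEY", "").strip()
    if not key:
        raise RuntimeError("OPENAI_API_KEY is not set; export it before starting the app.")
    return key
```

Calling it once at startup turns a silent misconfiguration into an immediate, explicit failure.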
Run the application:
```bash
python offline-app.py
```

- Wait until the localhost link appears.
- Open the link in your browser: http://127.0.0.1:5000/
✅ Before running, make sure Ollama is running so the chatbot can connect to the model.
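Whether Ollama is up can be checked by probing its default local port (11434). This is an illustrative sketch, not part of the app; `ollama_is_up` is an assumed name.

```python
import socket

def ollama_is_up(host: str = "127.0.0.1", port: int = 11434,
                 timeout: float = 2.0) -> bool:
    """Return True if something is listening on Ollama's default port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False, start the Ollama application (or `ollama serve`) before launching the chatbot.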
Sphinx generates the documentation through the following steps:
- Install Sphinx and its theme:

```bash
pip install sphinx sphinx-rtd-theme
```

- Navigate to the `docs` folder from your project root:

```bash
cd docs
```
Build the documentation with the command for your OS:
- For Windows (PowerShell or CMD):

```bash
.\make.bat html
```

- For macOS/Linux:

```bash
make html
```
- Move the built documentation file to the Flask templates directory.
Ensure Ollama is running:
- Open Ollama from the Windows search bar (or Terminal for macOS/Linux).
- Verify the installed models using:

```bash
ollama list
```

- If no models appear, re-run:

```bash
ollama pull llama3.2:latest
ollama pull deepseek1.5
ollama pull tinyllama:latest
ollama pull gemma3:1b
```
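The same model check can be done from Python against Ollama's REST API, whose `/api/tags` endpoint lists locally installed models. The endpoint and default port are real Ollama behavior; the function names here are illustrative.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local address

def parse_model_names(raw_json: str) -> list[str]:
    """Extract model names from the JSON returned by /api/tags."""
    return [m["name"] for m in json.loads(raw_json).get("models", [])]

def installed_models(base: str = OLLAMA_URL) -> list[str]:
    """Ask the running Ollama server for its local models.

    Raises URLError when Ollama is not up, which doubles as the
    'is Ollama running?' check."""
    with urllib.request.urlopen(f"{base}/api/tags", timeout=5) as resp:
        return parse_model_names(resp.read().decode())
```

If `"llama3.2:latest"` is missing from the returned list, re-run the pull commands above.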
- Run the command:

```bash
python offline-app.py
```

- If you see `ModuleNotFoundError`, try:

```bash
pip install -r requirements.txt
```

- Ensure Flask is running and open http://127.0.0.1:5000/ in your browser.
Upon successful execution, you should see:
```
* Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)
```
Copy and paste this link into your browser.
- 📚 Read the [Documentation] by selecting the 📖 Documentation option from the menu page.
- 🛠️ Customize models in the configuration.
- 🚀 Extend functionality with additional APIs.