
Add ollama-local support to Microbots#76

Closed
Copilot wants to merge 2 commits into bala/add_ollama_local_model from
copilot/sub-pr-73-another-one

Conversation

Contributor

Copilot AI commented Dec 1, 2025

Adds Ollama Local as an LLM provider for Microbots, enabling locally hosted language models to be used alongside Azure OpenAI.

Changes

New OllamaLocal provider (src/microbots/llm/ollama_local.py)

  • Implements LLMInterface for local Ollama server communication
  • JSON extraction handling for responses with extra text
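Local models often wrap their JSON output in extra prose, so the provider has to extract the object before parsing. A minimal sketch of that idea, assuming a regex-based approach (the helper name `extract_json` and its exact strategy are illustrative, not the PR's actual implementation):

```python
import json
import re

def extract_json(text: str) -> dict:
    """Pull the first JSON object out of a model response that may
    contain surrounding prose. Hypothetical helper; the real
    OllamaLocal provider may extract JSON differently."""
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in response")
    return json.loads(match.group(0))
```

For example, a response like `Sure! {"answer": 42} Hope that helps.` would yield the dict `{"answer": 42}`.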

Test infrastructure (test/llm/)

  • Pytest fixtures for Ollama server lifecycle management
  • Unit tests with mocked responses
  • Integration tests marked with @pytest.mark.ollama_local

Usage

from microbots import MicroBot

bot = MicroBot(
    model="ollama-local/codellama:latest",
    folder_to_mount="/path/to/repo"
)

Environment variables: LOCAL_MODEL_NAME, LOCAL_MODEL_PORT (default: 11434)
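The variable names and the default port above come from the PR description; a minimal sketch of how the provider might read them (the function name and the fallback model name are assumptions for illustration):

```python
import os

def load_ollama_config() -> tuple[str, int]:
    """Read Ollama connection settings from the environment.
    LOCAL_MODEL_NAME / LOCAL_MODEL_PORT are named in the PR; the
    model-name fallback here is a hypothetical placeholder."""
    name = os.environ.get("LOCAL_MODEL_NAME", "codellama:latest")
    port = int(os.environ.get("LOCAL_MODEL_PORT", "11434"))  # PR default
    return name, port
```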

Code review fixes

  • Grammar/capitalization corrections in comments and prompts
  • Fixed an incorrect default value ({}"") in missing-response handling
  • Removed unused imports
  • Simplified test assertions


Co-authored-by: 0xba1a <2942888+0xba1a@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Add ollama-local support to Microbots" to "Add ollama-local support to Microbots" on Dec 1, 2025
Copilot AI requested a review from 0xba1a December 1, 2025 14:45
0xba1a closed this Dec 3, 2025