Local AI diagnostics report Ollama as not running and required models as missing #1293

@Al629176

Description

Summary

Local AI diagnostics report Ollama as not running and required models as missing.

Problem

In Settings > Local AI diagnostics, running diagnostics shows an Ollama error. The server is reported as not running, the binary is not found, and expected chat/embedding models are missing.

Expected behavior is that diagnostics should either detect and start/repair Ollama correctly, or provide clear setup actions for installing Ollama and downloading required models.

Steps to reproduce:

  1. Open OpenHuman desktop app.
  2. Go to Settings > Local AI / diagnostics.
  3. Click Run Diagnostics.
  4. Observe that the Ollama server is reported as not running or unreachable at localhost:11434, and that the required models are marked missing.

Environment:

  • Desktop app
  • Settings > Local AI diagnostics
  • Backend: Ollama
  • Expected endpoint: http://localhost:11434
  • Platform/version unknown
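
For manual verification against the environment above: the Ollama REST API lists locally installed models at GET http://localhost:11434/api/tags. A minimal sketch of the missing-model check the diagnostics could run against that response (the helper name and required-model list are illustrative assumptions taken from the screenshot, not the app's actual code):

```typescript
// Shape of the relevant part of Ollama's GET /api/tags response:
// { "models": [{ "name": "all-minilm:latest" }, ...] }
interface TagsResponse {
  models: { name: string }[];
}

// Models the diagnostics screenshot shows as required (chat + embedding).
const REQUIRED_MODELS = ["gemma3:1b-it-qat", "all-minilm:latest"];

// Pure check: given the server's model list, report which required tags are absent.
// An Ollama tag without an explicit version implies ":latest", so normalize both sides.
function findMissingModels(tags: TagsResponse, required: string[] = REQUIRED_MODELS): string[] {
  const normalize = (name: string) => (name.includes(":") ? name : `${name}:latest`);
  const installed = new Set(tags.models.map((m) => normalize(m.name)));
  return required.filter((r) => !installed.has(normalize(r)));
}

// Example: only the embedding model is installed.
console.log(findMissingModels({ models: [{ name: "all-minilm:latest" }] }));
// expected: ["gemma3:1b-it-qat"]
```

Running this check against a live `/api/tags` response (rather than the server's mere reachability) would distinguish "server down" from "server up but models missing", which the current diagnostics appear to conflate.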

Solution (optional)

Check Ollama binary detection, configured binary path, bootstrap/resume flow, and model download checks. Improve diagnostics to show one-click repair actions for installing/starting Ollama and downloading missing models.
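
One way to structure the repair flow described above is a pure decision function over the diagnostic result. A sketch, assuming a hypothetical state shape and action names (not the app's actual API):

```typescript
// Hypothetical snapshot produced by the Run Diagnostics step.
interface OllamaDiagnostics {
  binaryFound: boolean;
  serverRunning: boolean;
  missingModels: string[];
}

// Map the snapshot to an ordered list of one-click repair actions.
// Order matters: the binary must exist before the server can start,
// and the server must be running before models can be pulled.
function repairActions(d: OllamaDiagnostics): string[] {
  const actions: string[] = [];
  if (!d.binaryFound) actions.push("install-ollama");
  if (!d.serverRunning) actions.push("start-server");
  for (const model of d.missingModels) actions.push(`pull:${model}`);
  return actions;
}

// Example matching the screenshot: nothing installed, both models missing.
console.log(
  repairActions({
    binaryFound: false,
    serverRunning: false,
    missingModels: ["gemma3:1b-it-qat", "all-minilm:latest"],
  }),
);
// expected: ["install-ollama", "start-server", "pull:gemma3:1b-it-qat", "pull:all-minilm:latest"]
```

Keeping the decision pure like this also makes the regression coverage in the acceptance criteria straightforward to unit-test.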

Acceptance criteria

  • Repro gone - Diagnostics no longer report false "Ollama not running" / "binary missing" errors when Ollama is installed and reachable.
  • Regression safety - Unit, integration, or E2E coverage added or updated for local AI diagnostics and bootstrap behavior.
  • Diff coverage >= 80% - Fix PR meets changed-lines coverage gate.
  • Clear repair flow - If Ollama is genuinely missing or stopped, UI provides clear install/start/download actions.
  • Model detection - Required chat and embedding models show accurate installed/missing status.
  • No dead-end state - Bootstrap/resume or force re-bootstrap can recover the local AI setup without manual terminal work where possible.

Related

Screenshot provided showing Ollama Diagnostics with server not running, binary not found, and missing gemma3:1b-it-qat / all-minilm:latest models.

Metadata

Labels

  • local-ai - Local model runtime, Ollama integration, and local AI config.
  • priority: high - Important work that should be pulled forward.
  • react-ui - React app work in app/src: pages, components, providers, store, and UX.
