Loci is a plugin-governed local AI runtime and management plane built as a Rust workspace.
The refactored repository treats the new workspace architecture as authoritative:
- `crates/core`: runtime kernel, model loading, governance seams, plugin inventory, management service
- `crates/cli`: `loci` binary for plugin discovery and management HTTP serving
- `crates/plugin-api`: stable manifest and capability types shared by plugins and host code
- `crates/ffi`: stable public C ABI and native integration surface
- `crates/legacy-plugin-api` and `crates/legacy-plugin-compat`: bounded compatibility island for legacy text plugins
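Given the members above, the root `Cargo.toml` plausibly looks something like the sketch below. This is an illustration only: the member paths mirror the crates listed above, but the resolver setting and everything else are assumptions, not the repository's actual manifest.

```toml
# Hypothetical root Cargo.toml sketch. Member paths follow the crates
# listed above; the resolver choice is an assumption.
[workspace]
resolver = "2"
members = [
    "crates/core",
    "crates/cli",
    "crates/plugin-api",
    "crates/ffi",
    "crates/legacy-plugin-api",
    "crates/legacy-plugin-compat",
]
```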
Loci is not positioned as a chat shell. It is the runtime substrate that host products can embed behind desktop apps, IDE assistants, local agent systems, and enterprise automation products.
The new architecture is organized around governable seams instead of hardcoded feature paths. Core behavior can be rewritten by plugins at the component boundary level.
Current core rewriter seams:
`inference`, `model`, `hardware`, `workflow`, `event_bus`, `plugin_manager`, `ui_host`
Plugin bundles are manifest-first. Native manifests are the mainline path. Legacy plugin support remains available only through explicit compatibility bridges.
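To illustrate the manifest-first idea, a native bundle manifest might declare the plugin's identity and the seams it intends to rewrite. The format and every field name below are hypothetical, not the actual schema; the bundles under `plugins/` are the authoritative examples.

```json
{
  "name": "example-inference",
  "version": "0.1.0",
  "capabilities": ["inference"],
  "rewriters": ["inference"]
}
```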
```
loci/
|-- crates/
|   |-- cli/
|   |-- core/
|   |-- ffi/
|   |-- plugin-api/
|   |-- legacy-plugin-api/
|   `-- legacy-plugin-compat/
|-- plugins/          # example plugin bundle manifests for the new architecture
|-- include/          # public C header for the stable ABI
|-- deps/llama.cpp/   # pinned submodule used by the optional `llama` feature
|-- docs/
|   |-- ARCHITECTURE.md
|   |-- MANAGEMENT_API.md
|   |-- PRODUCT_STRATEGY_2026.md
|   `-- architecture/
|-- scripts/
|   `-- full_test.ps1
`-- wasm-plugin-sdk/
```
Clone with the llama.cpp submodule:

```shell
git clone https://github.com/decade-afk/loci.git
cd loci
git submodule update --init --recursive
```

Build the CLI with real llama.cpp backend support:

```shell
cargo build -p loci-cli --release --features llama
```

Start the management server with bundled example manifests:

```shell
cargo run -p loci-cli --features llama -- \
  --plugin-dir plugins \
  --management-bind 127.0.0.1:8080
```

Basic checks:
```shell
curl http://127.0.0.1:8080/health
curl http://127.0.0.1:8080/v1/runtime
curl http://127.0.0.1:8080/v1/core/rewriters/inventory
```

Load a model through the control plane:
```shell
curl http://127.0.0.1:8080/v1/model/load \
  -H "Content-Type: application/json" \
  -d "{\"backend_name\":\"llama.cpp\",\"config\":{\"model_path\":\"D:/models/qwen.gguf\"}}"
```

Run generation:

```shell
curl http://127.0.0.1:8080/v1/inference/generate \
  -H "Content-Type: application/json" \
  -d "{\"prompt\":\"hello from loci\"}"
```

Loci now prefers manifest bundles over ad hoc runtime loading contracts.
- Load a whole plugin directory with `--plugin-dir <path>`
- Load a bundle or directory over HTTP with `POST /v1/plugins/load`
- Activate rewriter ownership with `POST /v1/core/rewriters/activate`
- Activate legacy text compatibility only when required with `POST /v1/legacy-text/activate`
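Following the same curl pattern as the model-load call, rewriter activation over HTTP might look like the sketch below. The request body fields (`seam`, `plugin`) are assumptions for illustration, not the documented schema; consult docs/MANAGEMENT_API.md for the real contract.

```shell
# Hypothetical activation request; body field names are assumptions.
curl http://127.0.0.1:8080/v1/core/rewriters/activate \
  -H "Content-Type: application/json" \
  -d "{\"seam\":\"inference\",\"plugin\":\"example-inference\"}"
```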
Example manifests live in:
- `plugins/example-inference`
- `plugins/example-infra`
- `plugins/example-agent`
- `plugins/example-ui-shell`
More detail is in PLUGIN_GUIDE.md.
Workspace validation:
```shell
cargo test -q
```

llama.cpp integration validation:

```shell
cargo test -q -p loci-core --features llama
cargo test -q -p loci-cli --features llama
```

Windows helper script:

```shell
powershell -ExecutionPolicy Bypass -File scripts/full_test.ps1
```

The workspace migration is authoritative. Old root-level monolith sources, old example programs, and old serve/generate-era docs are no longer the source of truth.
What is production-oriented today:
- plugin-governed runtime core
- management HTTP control plane
- runtime snapshot and plugin inventory
- rewriter activation for inference, model, hardware, workflow, event bus, plugin manager, and UI host
- real model load governance and optional `llama.cpp` backend integration
- stable public C ABI in `crates/ffi` with header `include/loci.h`
- bounded legacy text plugin compatibility
What is intentionally bounded:
- legacy plugin bridging stays isolated in compat crates
Licensed under either of:
- Apache License, Version 2.0 (LICENSE-APACHE)
- MIT license (LICENSE-MIT)
at your option.