memodb-io/Acontext
Acontext

Simple Context Storage for Cloud Agents

🌐 Website | 📚 Docs

Twitter | Discord

❓ What is Acontext

Why Acontext - Problems and Solutions
Acontext is context storage for your agent products:
  • it stores messages, artifacts, agent skills, and sandboxes
  • it provides observability over your context data
  • it ships with many out-of-the-box context engineering methods

💡 Core Features

  • Context Storage

    • Session: unified message storage for any LLM, any modality.
    • Disk: save and download artifacts by file path.
    • Agent Skills: extend your agents with reusable skills.
    • Sandbox: run code, analyze data, export artifacts.
  • Context Observability

    • Session Summary: asynchronously summarize agent's progress and user feedback.
    • State Tracking: collect agent's working status in near real-time.
  • View everything in one dashboard

Dashboard of Agent Success Rate and Other Metrics

🚀 Connect to Acontext

  1. Go to Acontext.io and claim your free credits.
  2. Complete the one-click onboarding to get your API key: sk-ac-xxx
💻 Self-host Acontext

We provide acontext-cli for quick proofs of concept. Install it first in your terminal:

curl -fsSL https://install.acontext.io | sh

You need Docker installed and an OpenAI API key to start an Acontext backend on your machine:

mkdir acontext_server && cd acontext_server
acontext server up

Make sure your LLM supports tool calling. By default, Acontext uses gpt-4.1.

acontext server up creates (or reuses) .env and config.yaml for Acontext, and creates a db folder to persist data.

Once it's up, the API (localhost:8029) and the web dashboard (localhost:3000) are available.

Step-by-step Quickstart

We maintain Python (PyPI) and TypeScript (npm) SDKs. The snippets below use Python.

See the docs for the TypeScript SDK quickstart.

Install SDKs

pip install acontext

Initialize Client

import os
from acontext import AcontextClient

# For cloud:
client = AcontextClient(
    api_key=os.getenv("ACONTEXT_API_KEY"),
)

# For self-hosted:
client = AcontextClient(
    base_url="http://localhost:8029/api/v1",
    api_key="sk-ac-your-root-api-bearer-token",
)

Store & Get Messages

Docs

Store messages in OpenAI, Anthropic, or Gemini format. Auto-converts on retrieval.

# Create session and store messages
session = client.sessions.create()

# Store text, image, file, etc.
client.sessions.store_message(
    session_id=session.id,
    blob={"role": "user", "content": "Hello!"},
    format="openai"
)

# Retrieve in any format (auto-converts)
result = client.sessions.get_messages(session_id=session.id, format="anthropic")
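Auto-conversion means you can write in one provider's schema and read back in another's. As a conceptual illustration only (this is not Acontext's actual converter), mapping a simple OpenAI-style text message into Anthropic's content-block shape looks roughly like:

```python
def openai_to_anthropic(msg: dict) -> dict:
    """Illustrative sketch: OpenAI-style message -> Anthropic-style message."""
    content = msg["content"]
    # Anthropic represents content as a list of typed blocks,
    # so a bare string becomes a single text block.
    if isinstance(content, str):
        content = [{"type": "text", "text": content}]
    return {"role": msg["role"], "content": content}

converted = openai_to_anthropic({"role": "user", "content": "Hello!"})
# {"role": "user", "content": [{"type": "text", "text": "Hello!"}]}
```

Acontext handles this (plus images, files, and tool calls) for you on `get_messages`, so your storage format never locks you into one provider.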

Context Engineering

Session Summary | Context Editing

Compress context with summaries and edit strategies; the original stored messages stay unchanged.

# Session summary for prompt injection
summary = client.sessions.get_session_summary(session_id)
system_prompt = f"Previous tasks:\n{summary}\n\nContinue helping."

# Context editing - limit tokens on retrieval
result = client.sessions.get_messages(
    session_id=session_id,
    edit_strategies=[
        {"type": "remove_tool_result", "params": {"keep_recent_n_tool_results": 3}},
        {"type": "token_limit", "params": {"limit_tokens": 30000}}
    ]
)
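Conceptually, the two edit strategies above trim the retrieved view of a session: drop old tool results, then cut to a token budget from the most recent message backwards. A self-contained sketch of that idea (illustrative only, not Acontext's server-side implementation; the rough token count is an assumption):

```python
def remove_tool_results(messages: list[dict], keep_recent_n: int) -> list[dict]:
    """Drop tool-result messages except the most recent N (illustrative)."""
    tool_idx = [i for i, m in enumerate(messages) if m["role"] == "tool"]
    drop = set(tool_idx[:-keep_recent_n] if keep_recent_n else tool_idx)
    return [m for i, m in enumerate(messages) if i not in drop]

def token_limit(messages: list[dict], limit_tokens: int,
                count=lambda m: len(str(m)) // 4) -> list[dict]:
    """Keep the most recent messages that fit a rough token budget (illustrative)."""
    kept, used = [], 0
    for m in reversed(messages):  # walk newest -> oldest
        used += count(m)
        if used > limit_tokens:
            break
        kept.append(m)
    return list(reversed(kept))
```

Chaining them in that order mirrors the `edit_strategies` list: tool-result pruning frees budget before the hard token cut is applied.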

Agent Storage Tools

Disk Tool

Tool Docs | SDK Docs

Persistent file storage for agents. Supports read, write, grep, glob.

import json

from acontext.agent.disk import DISK_TOOLS
from openai import OpenAI

disk = client.disks.create()
ctx = DISK_TOOLS.format_context(client, disk.id)

# Pass to LLM
response = OpenAI().chat.completions.create(
    model="gpt-4.1",
    messages=[
        {"role": "system", "content": f"You have disk access.\n\n{ctx.get_context_prompt()}"},
        {"role": "user", "content": "Create a todo.md with 3 tasks"}
    ],
    tools=DISK_TOOLS.to_openai_tool_schema()
)

# Execute tool calls
for tc in response.choices[0].message.tool_calls:
    result = DISK_TOOLS.execute_tool(ctx, tc.function.name, json.loads(tc.function.arguments))
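The execute loop above dispatches each tool call by name and parses its JSON arguments. The same pattern, sketched self-contained with a stubbed tool registry standing in for DISK_TOOLS (the tool names and payloads here are made up for illustration):

```python
import json

# Stub registry standing in for DISK_TOOLS.execute_tool (illustrative)
TOOLS = {
    "write_file": lambda args: f"wrote {args['path']}",
    "read_file": lambda args: f"contents of {args['path']}",
}

def dispatch(tool_calls: list[dict]) -> list[dict]:
    """Run each call and collect OpenAI-style tool-result messages."""
    results = []
    for tc in tool_calls:
        args = json.loads(tc["arguments"])          # arguments arrive as a JSON string
        out = TOOLS[tc["name"]](args)               # dispatch by tool name
        results.append({"role": "tool", "tool_call_id": tc["id"], "content": out})
    return results

results = dispatch([{"id": "1", "name": "write_file", "arguments": '{"path": "todo.md"}'}])
```

In a real agent loop you would append these tool-result messages to the conversation and call the model again until it stops requesting tools.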
Sandbox Tool

Tool Docs | SDK Docs

Isolated code execution environment with bash, Python, and common tools.

import json

from acontext.agent.sandbox import SANDBOX_TOOLS
from openai import OpenAI

sandbox = client.sandboxes.create()
disk = client.disks.create()
ctx = SANDBOX_TOOLS.format_context(client, sandbox.sandbox_id, disk.id)

# Pass to LLM
response = OpenAI().chat.completions.create(
    model="gpt-4.1",
    messages=[
        {"role": "system", "content": f"You have sandbox access.\n\n{ctx.get_context_prompt()}"},
        {"role": "user", "content": "Run a Python hello world script"}
    ],
    tools=SANDBOX_TOOLS.to_openai_tool_schema()
)

# Execute tool calls
for tc in response.choices[0].message.tool_calls:
    result = SANDBOX_TOOLS.execute_tool(ctx, tc.function.name, json.loads(tc.function.arguments))
Sandbox with Skills

Tool Docs | SDK Docs

Mount reusable Agent Skills into the sandbox at /skills/{name}/. Download the xlsx skill.

from acontext import FileUpload

# Upload a skill ZIP (e.g., web-artifacts-builder.zip)
with open("web-artifacts-builder.zip", "rb") as f:
    skill = client.skills.create(file=FileUpload(filename="web-artifacts-builder.zip", content=f.read()))

# Mount into sandbox
ctx = SANDBOX_TOOLS.format_context(
    client, sandbox.sandbox_id, disk.id,
    mount_skills=[skill.id]  # Available at /skills/{skill.name}/
)

# Context prompt includes skill instructions
response = OpenAI().chat.completions.create(
    model="gpt-4.1",
    messages=[
        {"role": "system", "content": f"You have sandbox access.\n\n{ctx.get_context_prompt()}"},
        {"role": "user", "content": "Create an Excel file with a simple budget spreadsheet"}
    ],
    tools=SANDBOX_TOOLS.to_openai_tool_schema()
)

# Execute tool calls
for tc in response.choices[0].message.tool_calls:
    result = SANDBOX_TOOLS.execute_tool(ctx, tc.function.name, json.loads(tc.function.arguments))

You can download a full interactive skill demo with acontext-cli:

acontext create my-skill --template-path "python/interactive-agent-skill"

🧐 Use Acontext to Build Agents

Download end-to-end scripts with acontext:

Python

acontext create my-proj --template-path "python/openai-basic"

More examples on Python:

  • python/openai-agent-basic: OpenAI Agents SDK template
  • python/agno-basic: Agno framework template
  • python/openai-agent-artifacts: an agent that edits and downloads artifacts

TypeScript

acontext create my-proj --template-path "typescript/openai-basic"

More examples in TypeScript:

  • typescript/vercel-ai-basic: an agent built with @vercel/ai-sdk

Note

Check our example repo for more templates: Acontext-Examples.

We're cooking more full-stack Agent Applications! Tell us what you want!

πŸ” Document

To understand what Acontext can do better, please view our docs

❤️ Stay Updated

Star Acontext on GitHub to show your support and receive release notifications.


πŸ—οΈ Architecture

graph TB
    subgraph "Client Layer"
        PY["pip install acontext"]
        TS["npm i @acontext/acontext"]
    end
    
    subgraph "Acontext Backend"
      subgraph " "
          API["API<br/>localhost:8029"]
          CORE["Core"]
          API -->|FastAPI & MQ| CORE
      end
      
      subgraph " "
          Infrastructure["Infrastructure"]
          PG["PostgreSQL"]
          S3["S3"]
          REDIS["Redis"]
          MQ["RabbitMQ"]
      end
    end
    
    subgraph "Dashboard"
        UI["Web Dashboard<br/>localhost:3000"]
    end
    
    PY -->|RESTful API| API
    TS -->|RESTful API| API
    UI -->|RESTful API| API
    API --> Infrastructure
    CORE --> Infrastructure

    Infrastructure --> PG
    Infrastructure --> S3
    Infrastructure --> REDIS
    Infrastructure --> MQ
    
    
    style PY fill:#3776ab,stroke:#fff,stroke-width:2px,color:#fff
    style TS fill:#3178c6,stroke:#fff,stroke-width:2px,color:#fff
    style API fill:#00add8,stroke:#fff,stroke-width:2px,color:#fff
    style CORE fill:#ffd43b,stroke:#333,stroke-width:2px,color:#333
    style UI fill:#000,stroke:#fff,stroke-width:2px,color:#fff
    style PG fill:#336791,stroke:#fff,stroke-width:2px,color:#fff
    style S3 fill:#ff9900,stroke:#fff,stroke-width:2px,color:#fff
    style REDIS fill:#dc382d,stroke:#fff,stroke-width:2px,color:#fff
    style MQ fill:#ff6600,stroke:#fff,stroke-width:2px,color:#fff

🤝 Stay Together

Join the community for support and discussions.

🌟 Contributing

🥇 Badges


[![Made with Acontext](https://assets.memodb.io/Acontext/badge-made-with-acontext.svg)](https://acontext.io)

[![Made with Acontext](https://assets.memodb.io/Acontext/badge-made-with-acontext-dark.svg)](https://acontext.io)

📑 LICENSE

This project is licensed under the Apache License 2.0.