
Releases: provnai/provnzero

ProvnZero v1.1.0-beta - Modernized & Hardened

22 Mar 12:34


This release continues the ProvnZero beta with updates to modern LLM APIs, additional security hardening, and improvements to distribution and CI reliability.

🛡️ Key Improvements

  • Chat API Migration
    • Migrated OpenAI, Anthropic, and DeepSeek integrations to modern chat-based APIs.
    • Updated endpoints to support newer models and avoid breakage from deprecated legacy APIs.
  • Security Hardening
    • Added stricter per-endpoint rate limits for initialization routes.
    • Reinforced memory zeroization for intermediate prompt buffers.
  • Improved Distribution
    • Optimized the binary for static linking using rustls-tls.
    • Improved compatibility with musl-based environments such as Railway and other minimal containers.
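The proxy's rate limiting is implemented in Rust with tower-governor, but the underlying technique can be sketched as a per-endpoint token bucket. The sketch below is illustrative only: the route paths, burst sizes, and refill rates are hypothetical, not the proxy's real configuration.

```typescript
// Minimal per-endpoint token-bucket limiter (illustrative sketch).
// Route names and limits below are hypothetical examples, not the
// proxy's actual configuration.
class TokenBucket {
  private tokens: number;
  private lastRefillMs: number;

  constructor(
    private readonly capacity: number,     // maximum burst size
    private readonly refillPerSec: number, // sustained requests/sec
    nowMs: number,
  ) {
    this.tokens = capacity;
    this.lastRefillMs = nowMs;
  }

  // Returns true if a request at time `nowMs` is allowed.
  tryAcquire(nowMs: number): boolean {
    const elapsedSec = (nowMs - this.lastRefillMs) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSec * this.refillPerSec);
    this.lastRefillMs = nowMs;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// Initialization routes get a stricter bucket than inference routes.
const limits: Record<string, TokenBucket> = {
  "/v1/init": new TokenBucket(2, 0.5, 0),  // burst of 2, then 1 req / 2 s
  "/v1/seal": new TokenBucket(20, 10, 0),  // burst of 20, then 10 req / s
};

function allow(path: string, nowMs: number): boolean {
  const bucket = limits[path];
  return bucket ? bucket.tryAcquire(nowMs) : true;
}
```

Passing the clock in explicitly keeps the bucket deterministic and easy to test; a real middleware would read a monotonic clock and key buckets per client as well as per route.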

✅ Verified Status

  • All CI checks passing across the automated integration suite.
  • Verified HPKE encryption flow.
  • Verified VEX receipt signing.
  • Verified multi-provider compatibility across supported LLM APIs.

ProvnZero v1.0.0-beta

21 Mar 22:58


Zero knowledge. Zero retention. Zero compromise.

This is the first public beta release of ProvnZero. The project provides a zero-retention proxy and TypeScript SDK that seals prompts on the client, decrypts only in memory for inference, and immediately destroys plaintext after the response is returned.

🔥 What's Included

  • RFC 9180 HPKE Sealing
    • All prompts are encrypted client-side using HPKE.
    • The proxy only processes encrypted payloads and never sees plaintext over the wire.
  • Hardware-Grade Memory Safety
    • Implemented in Rust.
    • Plaintext is stored only in secure memory buffers and zeroized immediately after inference completes.
  • VEX Audit Trails
    • No plaintext logging.
    • Each request generates an Ed25519-signed VEX receipt proving secure handling.
  • Universal Adapter
    • Supports OpenAI, Anthropic, DeepSeek, and any OpenAI-compatible endpoint (Groq, vLLM, Ollama) via OPENAI_BASE_URL.
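The seal-then-open flow above follows the standard HPKE pattern: an ephemeral key agreement derives a symmetric key that protects the payload, and only the recipient's private key can recover it. The sketch below shows that pattern with Node's built-in X25519, HKDF, and AES-GCM primitives. It is a simplified illustration, not an RFC 9180 implementation (no KEM labels or key-schedule context), and the `"demo"` info string is an arbitrary placeholder; real clients should use the SDK or an HPKE library.

```typescript
import {
  generateKeyPairSync,
  diffieHellman,
  hkdfSync,
  createCipheriv,
  createDecipheriv,
  randomBytes,
} from "node:crypto";

// Recipient's long-term key pair (standing in for the proxy's HPKE key).
const recipient = generateKeyPairSync("x25519");

// Seal: ephemeral X25519 agreement -> HKDF -> AES-256-GCM.
// The ephemeral public key travels alongside the ciphertext.
function seal(plaintext: Buffer) {
  const eph = generateKeyPairSync("x25519");
  const shared = diffieHellman({ privateKey: eph.privateKey, publicKey: recipient.publicKey });
  const key = Buffer.from(hkdfSync("sha256", shared, Buffer.alloc(0), Buffer.from("demo"), 32));
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ct = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  return { ephPub: eph.publicKey, iv, ct, tag: cipher.getAuthTag() };
}

// Open: rederive the same key from the recipient's private half.
function open(sealed: ReturnType<typeof seal>): Buffer {
  const shared = diffieHellman({ privateKey: recipient.privateKey, publicKey: sealed.ephPub });
  const key = Buffer.from(hkdfSync("sha256", shared, Buffer.alloc(0), Buffer.from("demo"), 32));
  const d = createDecipheriv("aes-256-gcm", key, sealed.iv);
  d.setAuthTag(sealed.tag);
  return Buffer.concat([d.update(sealed.ct), d.final()]);
}
```

Because AES-GCM is authenticated, tampering with the ciphertext or tag causes `open` to throw rather than return corrupted plaintext.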
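An Ed25519-signed receipt can be sketched with Node's built-in signing API. The `Receipt` fields below are hypothetical stand-ins; the real VEX receipt schema is defined by the proxy and is not reproduced here.

```typescript
import { generateKeyPairSync, sign, verify, createHash } from "node:crypto";

// Signing key pair (standing in for the proxy's attestation key).
const { publicKey, privateKey } = generateKeyPairSync("ed25519");

// Hypothetical receipt fields, for illustration only.
interface Receipt {
  requestHash: string; // SHA-256 of the sealed request, hex
  zeroized: boolean;   // attestation that plaintext buffers were wiped
  timestamp: number;
}

function signReceipt(receipt: Receipt): string {
  // Ed25519 hashes internally, so the digest argument is null.
  const payload = Buffer.from(JSON.stringify(receipt));
  return sign(null, payload, privateKey).toString("base64");
}

function verifyReceipt(receipt: Receipt, sigB64: string): boolean {
  const payload = Buffer.from(JSON.stringify(receipt));
  return verify(null, payload, publicKey, Buffer.from(sigB64, "base64"));
}

const receipt: Receipt = {
  requestHash: createHash("sha256").update("sealed-bytes").digest("hex"),
  zeroized: true,
  timestamp: 1711111111,
};
```

Any change to a signed field invalidates the signature, which is what lets a client audit that the receipt it holds describes the exact request it made.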
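The universal adapter works because OpenAI-compatible backends share one request shape: only the base URL differs. A minimal sketch of that idea is a pure request builder; the function and field names here are illustrative, not the SDK's real API.

```typescript
// Build an OpenAI-compatible chat request from a configurable base URL.
// Illustrative sketch only; not the SDK's actual interface.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(
  baseUrl: string, // e.g. the value of OPENAI_BASE_URL
  apiKey: string,
  model: string,
  messages: ChatMessage[],
) {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/chat/completions`,
    method: "POST" as const,
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, messages }),
  };
}

// The same builder targets Groq, vLLM, or a local Ollama instance
// purely by swapping the base URL.
const req = buildChatRequest(
  "http://localhost:11434/v1/",
  "unused-for-local",
  "llama3",
  [{ role: "user", content: "hello" }],
);
```

Keeping the builder pure (no network I/O) makes the adapter trivially testable against every supported backend's URL shape.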

🛡️ Validation & Testing

  • 12/12 integration tests passed against live APIs under rapid load.
  • Integrated tower-governor rate limiting for abuse protection.
  • Zero compiler warnings under strict Rust CI configuration.

📦 Quick Start

  1. Install TypeScript SDK
    npm install provnzero-sdk

  2. Pull Docker Proxy
    docker pull provnai/provnzero-proxy:v1.0.0-beta

Full setup instructions are available in the README. Security and zero-data-retention guarantees are documented in the Security Policy.