jmkraus/picochat

PicoChat

A lightweight terminal chat client for local and OpenAI-compatible LLM backends.

Introduction

PicoChat provides an interactive CLI workflow with multiline input, session history, structured output options, and image-capable prompts.

Key Features

  • Interactive multiline chat input (Ctrl+D submit, Esc/Ctrl+C cancel)
  • Session history save/load
  • Multiple backend protocols (ollama, openai, responses)
  • Output formatting (plain, json, json-pretty, yaml)
  • Structured content generation via JSON schema
  • Image prompt support
  • Clipboard helpers and runtime commands

Installation

Build from source:

go mod download
go mod tidy
go test ./...
go build

On macOS, unsigned binaries may need quarantine removal:

sudo xattr -rd com.apple.quarantine ./picochat

Quick Start

Interactive mode:

./picochat

Pipe mode:

echo "Write a Haiku about Cheese" | ./picochat -quiet

Documentation

Detailed guides are available in the /docs directory.

Acknowledgements

License

MIT

About

A CLI chat client for LLMs, written mainly for Pico AI Server on macOS, but it also works well with Ollama.
