diff --git a/docs/dev/README.md b/docs/dev/README.md
index 31626e1..84c9019 100644
--- a/docs/dev/README.md
+++ b/docs/dev/README.md
@@ -54,6 +54,10 @@ changing a contract that other features depend on.
- [Journal](./journal.md) — daily automatic +
user journal entries, the journaling agent, the
`journal_*` tools, and the Journal modal.
+- [Wiki](./wiki.md) — flat encyclopedic articles about
+ the user, the autonomous wiki agent, the per-article
+ manual update flow, the `wiki_*` tools, and the Wiki
+ drawer tab.
- [Cookbook](./cookbook.md) — `recipes` store + Cooklang
parser + the recipe_* tools + the Cookbook modal and
drawer tab.
diff --git a/docs/dev/wiki.md b/docs/dev/wiki.md
new file mode 100644
index 0000000..f0f7d39
--- /dev/null
+++ b/docs/dev/wiki.md
@@ -0,0 +1,399 @@
+# Wiki
+
+Flat encyclopedic articles about the user. A fourth peer to chats,
+memories, and journal entries. The user authors articles directly
+through the Wiki drawer tab; an autonomous background agent maintains
+articles by reading conversations a day after they settle.
+
+## Role
+
+Three knowledge surfaces with deliberately different shapes:
+
+- **Memory** (`docs/dev/memory.md`) - atomic labelled facts, surfaced
+ inline by the chat-loop's opening recall.
+- **Journal** (`docs/dev/journal.md`) - dated reflective prose, with
+ today's automatic entry included in the opening-turn system prompt.
+- **Wiki** (this doc) - longer-form encyclopedic articles, **never
+ auto-injected** into the chat. The main LLM reaches them only
+ through the always-on `wiki_search` tool.
+
+Articles are titled, single-level (no nesting), and unique per
+`(user_id, title)`. The voice is encyclopedic third-person prose -
+intentionally different from chat-style or journal-reflective
+registers.
+
+## Files
+
+Schema:
+
+- `supabase/schema.sql` - the "User Wiki" block defines
+ `wiki_articles`, the `clear_wiki_embedding_on_change` trigger,
+ RLS policies, three new `threads` columns
+ (`last_wiki_processed_msg_id`, `wiki_claim_holder`,
+ `wiki_claim_expires_at`), and five RPCs:
+ `claim_next_thread_for_wiki`,
+ `mark_thread_wiki_processed_if_claimed`,
+ `claim_next_pending_wiki_article`,
+ `save_wiki_article_embedding_if_claimed`,
+ `search_wiki_articles_by_embedding`.
+
+Data layer (main thread + workers):
+
+- `src/lib/supabase.ts` - `WikiArticle` interface,
+ `coerceWikiArticle`, plus the `SupabaseService` methods:
+ `listWikiArticles`, `getWikiArticleById`, `getWikiArticleByTitle`,
+ `createWikiArticle`, `updateWikiArticle`, `deleteWikiArticle`,
+ `searchWikiArticles`, `claimNextThreadForWiki`,
+ `markThreadWikiProcessedIfClaimed`,
+ `claimNextPendingWikiArticle`, `saveWikiArticleEmbedding`. The
+ `UserSettings` interface gains `wikiAutomaticEnabled?: boolean`.
+- `src/lib/wiki.ts` - the search helper
+ (`searchWikiArticlesSemantic`) plus the `MAX_WIKI_TITLE_CHARS`
+ (200) and `MAX_WIKI_CONTENT_CHARS` (16000) ceilings.
+- `src/lib/wiki-store.svelte.ts` - the shared `wikiStore`,
+ `runWikiSearch`, and the `patchWikiRow` / `removeWikiRow` /
+ `addWikiRow` mutators the panel and tools call.
+- `src/lib/wiki-events.ts` - the `WIKI_CHANGE_EVENT` window-event
+ bus parallel to `journal-events.ts` / `cookbook-events.ts`.
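
A minimal sketch of the bus's shape, modeled on the journal/cookbook buses. `emitWikiChange` is named elsewhere in this doc; the subscribe helper's name and the event-name string are assumptions, and a local `EventTarget` stands in for `window` so the sketch runs headless:

```typescript
// Hypothetical event name; the real constant lives in src/lib/wiki-events.ts.
const WIKI_CHANGE_EVENT = 'nak:wiki-change';

// The real bus is `window`; a plain EventTarget keeps this runnable
// outside the browser.
const bus = new EventTarget();

export function emitWikiChange(): void {
  bus.dispatchEvent(new Event(WIKI_CHANGE_EVENT));
}

// Returns an unsubscribe function so components can clean up on destroy.
export function onWikiChange(handler: () => void): () => void {
  bus.addEventListener(WIKI_CHANGE_EVENT, handler);
  return () => bus.removeEventListener(WIKI_CHANGE_EVENT, handler);
}
```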
+
+Tools:
+
+- `src/lib/tools/wiki_search.{schema.,}ts` - the always-on recall
+ tool (registered in `src/lib/tools/index.ts`'s
+ `alwaysOnToolbox`).
+- `src/lib/tools/wiki_create.{schema.,}ts`,
+ `wiki_update.{schema.,}ts`, `wiki_delete.{schema.,}ts` - the
+ agent-only write tools.
+- `src/lib/tools/wiki_toolbox.ts` - the agent toolbox that bundles
+ the four tools above with lazy-loaded schemas, parallel to
+ `memory_toolbox.ts`.
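
The lazy-schema bundling can be sketched as below. Only the four tool names come from this doc; the `ToolDef` shape and the dynamic-import layout are assumptions modeled on the description of `memory_toolbox.ts`:

```typescript
interface ToolDef {
  name: string;
  // Deferred so the schema modules load only when the agent toolbox
  // is actually used, not on chat start-up.
  loadSchema: () => Promise<object>;
}

export const wikiToolboxSketch: { tools: ToolDef[] } = {
  tools: ['wiki_search', 'wiki_create', 'wiki_update', 'wiki_delete'].map(
    (name) => ({
      name,
      // Hypothetical path; the real schemas are the wiki_*.schema.ts files.
      loadSchema: () => import(`./${name}.schema`).then((m) => m.default),
    })
  ),
};
```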
+
+Embeddings:
+
+- `src/lib/embeddings/sources/wiki.ts` - the `EmbeddingSource`
+ adapter. The generic worker (`src/lib/embeddings/worker.ts`)
+ picks it up alongside memories, threads, samskara substrate, and
+ journal entries.
+
+Autonomous agent:
+
+- `src/lib/agents/wiki/types.ts` - `WikiInput`, `WikiOutput`.
+- `src/lib/agents/wiki/prompt.ts` - `WIKI_AUTONOMOUS_PROMPT` and
+ `WIKI_MANUAL_PROMPT`.
+- `src/lib/agents/wiki/agent.ts` - the `WikiAgent` class. Two
+ entry points: `run()` for the worker path and `updateOne()` for
+ the main-thread per-article manual flow.
+- `src/lib/agents/wiki/loop.ts` - cycle driver (acquire ->
+ claim -> run -> mark).
+- `src/lib/agents/wiki/worker.ts` - Web Worker entry point.
+ Lease partition `'wiki'`.
+- `src/lib/agents/wiki/manager.ts` - `BaseWorkerManager`
+ subclass. Lock name `nak:wiki-worker`, logger source
+ `wiki-worker`. Bubbles `progress: 'processed'` to
+ `emitWikiChange()`.
+
+Model registry:
+
+- `src/lib/models/index.ts` - `AgentRole` adds `'wiki'`,
+ `AGENT_MODELS.wiki = 'deepseek-v4-flash'` (same model as journal
+ and reflection; rationale documented inline above the table).
+
+Main-thread plumbing:
+
+- `src/lib/state.svelte.ts` - lazy-imports the manager,
+ `app.wikiAutomaticEnabled`, `setWikiAutomaticEnabled`,
+ `persistWikiAutomaticEnabled`, hooks into
+ `applyServerSettings`, `startBackgroundWorkers`,
+ `setJournalTimezone` (live tz update), and `lock()`.
+- `src/lib/routing.svelte.ts` - extends `DrawerTab` with
+ `'wiki'` and `Route` with `wiki_article_id`.
+- `src/lib/chat-prompt.ts` - `WIKI_BLOCK` after `JOURNAL_BLOCK`
+ in the section list.
+
+UI:
+
+- `src/components/WikiList.svelte` - drawer listing. Search
+ input + alphabetical sort.
+- `src/screens/Wiki.svelte` - main-panel article view, edit
+ form, create form, delete confirmation, and the "ask agent
+ to update" preview/accept/cancel flow.
+- `src/screens/Chat.svelte` - new tab, drawer branch,
+ main-panel branch, top-bar branch, change-event listener.
+- `src/screens/Settings.svelte` - new "Wiki" group with the
+ `wikiAutomaticEnabled` toggle.
+
+Docs:
+
+- `docs/user/wiki.md` (this feature's user-facing manual).
+- `docs/dev/wiki.md` (this file).
+
+## Entry points
+
+- `wikiManager.start({ supabase, config, timezone })` - called
+ from `state.svelte.ts:startBackgroundWorkers` when
+ `app.wikiAutomaticEnabled === true`. Spawns the worker
+ inside the `nak:wiki-worker` cross-tab Web Lock.
+- `WikiAgent.run({ input: { threadId, terminalMsgId }, ... })`
+ - the worker's per-cycle entry. Slices thread history at
+ `terminalMsgId`, appends `WIKI_AUTONOMOUS_PROMPT` as the
+ final user turn, runs `runHeadlessToolLoop` against
+ `wikiToolbox`. Side effects (the `wiki_*` tool calls) ARE
+ the output; final text is discarded.
+- `WikiAgent.updateOne({ articleId, currentTitle,
+ currentContent, userInstructions, signal })` - the
+ main-thread per-article manual entry. Single Venice
+ completion with `response_format: {type: 'json_object'}`,
+ no tool loop. Returns `{ kind: 'preview', title, content }`
+ or `{ kind: 'noop', reason }`.
+- `wiki_search` tool - registered in
+ `alwaysOnToolbox.tools` so every chat request can reach
+ it without a toolbox toggle.
+
+## Data model
+
+`wiki_articles`:
+
+- `id uuid pk default gen_random_uuid()`
+- `user_id uuid not null references auth.users on delete cascade`
+- `title text not null`
+- `content text not null`
+- `embedding vector(2048)` - padded by the generic embeddings
+ worker, same shape as memories and journal entries.
+- `embedding_model text`, `embedding_claim_holder text`,
+ `embedding_claim_expires timestamptz` - same claim-protocol
+ columns as memories and journal entries (note: `_expires`
+ not `_expires_at`, matching the existing convention).
+- `created_at`, `updated_at timestamptz default now()`
+- `unique (user_id, title)` - the agent's `wiki_create` tool
+ surfaces a unique-violation as actionable text so the
+ autonomous agent reads the conflict and falls through to
+ `wiki_search` + `wiki_update`.
+- Index `(user_id, lower(title))` for the alphabetical drawer
+ listing.
+- Trigger `clear_wiki_embedding_on_change` nulls the embedding
+ and claim columns on title or content change; the embedding
+ worker re-embeds on its next poll.
+
+`threads` extension columns:
+
+- `last_wiki_processed_msg_id uuid references messages(id) on
+ delete set null` - pointer the autonomous agent advances
+ after each cycle.
+- `wiki_claim_holder text`, `wiki_claim_expires_at timestamptz`
+ - per-thread claim columns (note: `_at` suffix here, matching
+ the existing journal claim columns).
+
+These are independent of the memory-reflection
+(`last_reflected_msg_id`) and journal
+(`last_journaled_msg_id`) pointers. All three workers can run
+concurrently against the same thread.
+
+### Eligibility predicate
+
+`claim_next_thread_for_wiki` differs from
+`claim_next_thread_for_journal` in two specific ways:
+
+1. **Newest-message lateral.** The journal RPC reads
+ `threads.updated_at` for the cooldown bucket. The wiki RPC
+ reads the newest message's `created_at` directly via a
+ second lateral. Both columns move on every insert, but
+ reading from `messages.created_at` is more honest about
+ "when did the conversation actually last move" - a future
+ bump to `threads.updated_at` from an unrelated write would
+ shift the gate.
+2. **Strict-yesterday gate.** The eligibility predicate is
+ `(newest.created_at at time zone p_timezone)::date <
+ (now() at time zone p_timezone)::date` - newest message
+ must land on a calendar day strictly before today in the
+ user's tz. Effect: chat Monday -> eligible Tuesday; user
+ resumes Wednesday -> the new newest msg lands on Wednesday
+ and the inequality fails again until Thursday.
+
+Same depth guard (>= 2 user messages) and `for update of t
+skip locked` fairness as the journal RPC.
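
For illustration, the strict-yesterday gate rendered in TypeScript (the real predicate is SQL inside `claim_next_thread_for_wiki`; here `Intl.DateTimeFormat` stands in for `at time zone`, and the function name is hypothetical):

```typescript
function localDay(d: Date, timeZone: string): string {
  // en-CA formats as YYYY-MM-DD, which compares correctly as a string.
  return new Intl.DateTimeFormat('en-CA', {
    timeZone,
    year: 'numeric',
    month: '2-digit',
    day: '2-digit',
  }).format(d);
}

// Eligible only when the newest message's calendar day, in the user's
// timezone, is strictly before today's.
export function isWikiEligible(
  newestMsgAt: Date,
  now: Date,
  timeZone: string
): boolean {
  return localDay(newestMsgAt, timeZone) < localDay(now, timeZone);
}
```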
+
+## Contracts
+
+### Claim/mark atomicity
+
+The autonomous agent's loop is:
+
+1. `claimNextThreadForWiki(holderId, ttl, tz)` - returns
+ `{ threadId, terminalMsgId, title, newestMsgAt }` or null.
+2. `WikiAgent.run({ ... })` - tool calls are the side effects.
+3. `markThreadWikiProcessedIfClaimed(threadId, holderId,
+ terminalMsgId)` - returns true on success, false on
+ claim-lost.
+
+Mark is **unconditional on `done`**. Even a no-op cycle
+(agent decided no topic warranted a wiki update) advances the
+pointer so the same conversation isn't re-processed every
+cycle. New turns added later trigger eligibility again via
+the next-day predicate.
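
One cycle of the loop above, sketched with the three steps as injected stubs (signatures simplified; holder-id, TTL, and timezone plumbing omitted):

```typescript
async function runWikiCycle(deps: {
  claim: () => Promise<{ threadId: string; terminalMsgId: string } | null>;
  run: (threadId: string, terminalMsgId: string) => Promise<'done' | 'error' | 'aborted'>;
  mark: (threadId: string, terminalMsgId: string) => Promise<boolean>;
}): Promise<'idle' | 'processed' | 'skipped'> {
  const claimed = await deps.claim();
  if (!claimed) return 'idle'; // nothing eligible this cycle

  const stopped = await deps.run(claimed.threadId, claimed.terminalMsgId);
  // Pointer-advance is unconditional on 'done' - even a zero-tool-call
  // run marks the thread so it isn't re-processed every cycle.
  if (stopped !== 'done') return 'skipped';

  const marked = await deps.mark(claimed.threadId, claimed.terminalMsgId);
  return marked ? 'processed' : 'skipped'; // false means the claim was lost
}
```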
+
+This differs from the journal flow, which uses an atomic
+`upsert_journal_entry_and_mark_thread` RPC because the entry
+write and the pointer advance must happen in lockstep. The
+wiki agent's writes are independent tool calls landing
+through the main `wiki_create`/`wiki_update`/`wiki_delete`
+RPCs - those rows are owned by the user, not the claim, so a
+claim-lost during the cycle leaves any already-landed writes
+intact and just drops the pointer-advance for that cycle.
+The next claim will reprocess the conversation.
+
+### Embedding pipeline
+
+`wiki_articles.embedding` is populated by the generic
+embeddings worker. Its source adapter
+(`src/lib/embeddings/sources/wiki.ts`) builds the input
+string as `${title}\n\n${content}` (mirroring memories'
+label-and-data shape), truncates content to
+`MAX_WIKI_CONTENT_CHARS = 16000`, and calls
+`claimNextPendingWikiArticle` / `saveWikiArticleEmbedding`.
+
+The same `text-embedding-bge-m3` model and 2048-dim padded
+vectors as memories and journal entries.
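
A sketch of the input-string builder described above (the helper name is an assumption; the 16,000-character ceiling matches `MAX_WIKI_CONTENT_CHARS` in `src/lib/wiki.ts`):

```typescript
const MAX_WIKI_CONTENT_CHARS = 16000;

// Mirrors memories' label-and-data shape: title, blank line, content,
// with the content truncated to the embedding ceiling.
export function wikiEmbeddingInput(title: string, content: string): string {
  const body =
    content.length > MAX_WIKI_CONTENT_CHARS
      ? content.slice(0, MAX_WIKI_CONTENT_CHARS)
      : content;
  return `${title}\n\n${body}`;
}
```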
+
+### Autonomous vs manual agent split
+
+Two distinct flows share the `WikiAgent` class:
+
+| Aspect | Autonomous (`run`) | Manual (`updateOne`) |
+| ----------- | ------------------ | -------------------- |
+| Runs in | Web Worker | Main thread |
+| Trigger | Day-after thread | User clicks button |
+| Inputs | Whole conversation | One article + instructions |
+| Tools | Yes (`wikiToolbox`) | No |
+| Output | Tool side effects | JSON preview |
+| Persistence | Tool calls write | UI persists on Accept |
+| Prompt | `WIKI_AUTONOMOUS_PROMPT` | `WIKI_MANUAL_PROMPT` |
+
+Both share `agentModel('wiki').id` (deepseek-v4-flash), the
+encyclopedic-third-person voice, and the "preserve facts
+unless explicitly contradicted" discipline. They differ on
+scope (whole wiki vs one article), input shape (conversation
+vs explicit instructions), and output shape (tool calls vs
+JSON).
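
The manual path's result is a discriminated union the UI can switch on (shape copied from `agent.ts`; the handler below is illustrative, not the real `Wiki.svelte` logic):

```typescript
type WikiUpdateOneResult =
  | { kind: 'preview'; title: string; content: string }
  | { kind: 'noop'; reason: string };

export function describeResult(r: WikiUpdateOneResult): string {
  switch (r.kind) {
    case 'preview':
      // UI offers Accept / Try again / Cancel over this draft.
      return `Preview ready: ${r.title}`;
    case 'noop':
      // UI shows the agent's reasoning instead of an error banner.
      return `No change applied: ${r.reason}`;
  }
}
```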
+
+### Tool toolbox split
+
+- `alwaysOnToolbox` includes only `wiki_search`. The main
+ LLM never gets a path to write to the wiki - that's the
+ spec's "articles are never auto-injected; recall is
+ pulled, never pushed" stance applied symmetrically.
+- `wikiToolbox` (in `src/lib/tools/wiki_toolbox.ts`) bundles
+ search + create + update + delete for the autonomous
+ agent. The agent receives delete because consolidation
+ (subsuming a stale duplicate into another article it just
+ updated) is a legitimate wiki-maintenance operation. The
+ prompt explicitly forbids deleting on the basis of "the
+ user said something different today" alone.
+
+## Interactions
+
+- **Memory** (`docs/dev/memory.md`) - the wiki's embedding
+ shape, claim-protocol columns, and "polymorphic adapter"
+ worker pattern are clones of memories. Both feature docs
+ reference the canonical adapter contract in
+ `embeddings.md`.
+- **Journal** (`docs/dev/journal.md`) - the wiki's manager,
+ worker, loop, and `wikiAutomaticEnabled` setting all clone
+ the journal subsystem's shape. The eligibility predicate
+ in `claim_next_thread_for_wiki` is the deliberate
+ divergence; the rest is parallel structure. Both share the
+ user's `journalTimezone` preference.
+- **Embeddings** (`docs/dev/embeddings.md`) - the generic
+ worker now polls five sources in round-robin: memories,
+ threads, samskara substrate, journal entries, and wiki
+ articles. Adding a source is a one-line append to
+ `worker.ts`'s sources array plus a new file under
+ `sources/`.
+- **Chat-prompt** (search `WIKI_BLOCK` in
+ `src/lib/chat-prompt.ts`) - the WIKI_BLOCK paragraph names
+ the wiki and the recall tool but stays short, since the
+ framing is "articles are pulled, never pushed".
+- **Settings** (`docs/dev/settings.md`) - the `wiki` group
+ exposes only the toggle. The journal pane owns the
+ user-tz preference, which the wiki shares.
+
+## Gotchas
+
+- **Use `messages.created_at` for the day-gate, not
+ `threads.updated_at`.** The journal RPC reads
+ `threads.updated_at` because journals fired on a same-day
+ cooldown predicate that already matched the journal's
+ semantics. The wiki gate is "newest message's calendar day
+ is strictly before today" - reading off the messages
+ lateral keeps the predicate stable against future bumps to
+ `threads.updated_at` from unrelated writes.
+- **`unique(user_id, title)` + ON CONFLICT in the agent.**
+ The autonomous agent is told to always `wiki_search`
+ before writing, but a near-duplicate title can still slip
+ through (the search returned an unrelated article, or
+ caching missed). The unique constraint surfaces the
+ collision as a tool error the agent reads as "fall through
+ to wiki_search + wiki_update". Removing the constraint
+ would silently allow duplicate articles.
+- **Manual agent must NOT discard facts unless told to.**
+ The "rewrite for tone" / "fix paragraph 2" / "add a
+ sentence" patterns all preserve the rest of the article.
+ This is encoded in the `WIKI_MANUAL_PROMPT` and is
+ load-bearing for the trust contract with the user.
+ Reviewer note: a future change that broadens the prompt to
+ "make it better" would silently rewrite parts the user
+ wanted left alone.
+- **Pointer-advance is unconditional on `done`.** Even a
+ no-op cycle (agent issued zero tool calls) advances the
+ pointer. Without this, every cycle would re-process the
+ same "the model decided this conversation has nothing
+ worth wiki-ing" conversation forever.
+- **`wiki_create` rephrases unique-violations.** The
+ autonomous agent reads tool-error text as guidance; the
+ raw Postgres `duplicate key value violates unique
+ constraint` message is opaque. The tool's `execute`
+ rephrases as "An article titled X already exists. Run
+ wiki_search to find its id, then call wiki_update."
+- **`embedding_claim_expires` (no `_at`).** Schema
+ convention for the embedding-side claim columns matches
+ memories and journal_entries. The thread-side claim
+ columns (`wiki_claim_expires_at`) DO have the suffix,
+ matching `journal_claim_expires_at`. Easy to flip when
+ cloning; both are canonical.
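
The unique-violation rephrasing from the `wiki_create` gotcha above, as a sketch (the error shape is the supabase-js `{ code, message }` convention, simplified; `23505` is Postgres's `unique_violation` code):

```typescript
export function rephraseWikiCreateError(
  err: { code?: string; message: string },
  title: string
): string {
  if (err.code === '23505') {
    // Actionable guidance the autonomous agent can follow, instead of
    // the opaque "duplicate key value violates unique constraint".
    return (
      `An article titled "${title}" already exists. ` +
      `Run wiki_search to find its id, then call wiki_update.`
    );
  }
  return err.message; // other errors pass through untouched
}
```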
+
+## Verification
+
+End-to-end manual smoke test (mirroring the plan's
+verification list):
+
+1. `mise run sync` against a dev Supabase. Confirm
+ `wiki_articles`, the trigger, the five RPCs, and the
+ three new `threads` columns land. Re-run for
+ idempotency.
+2. **Drawer alphabetical sort.** Add "Zebra", "Apple",
+ "Mango" via the panel. Tab reads Apple, Mango, Zebra.
+3. **Search.** Type "ze" -> "Zebra" only. Clear ->
+ alphabetical returns.
+4. **Create / edit / delete** round-trip via the panel.
+ The drawer reflects each change via `WIKI_CHANGE_EVENT`.
+5. **Ask agent to update - preview / accept / cancel /
+ try again.** Open an article, type instructions ->
+ preview populates -> Accept persists, Cancel dismisses,
+ Try again regenerates.
+6. **Autonomous agent fires the day after.** Substantive
+ conversation today: `claim_next_thread_for_wiki` returns
+ nothing. Thread whose newest message is yesterday-in-tz:
+ agent runs, `wiki_search` then `wiki_create` /
+ `wiki_update` lands, `last_wiki_processed_msg_id`
+ advances.
+7. **Eligibility re-opens after continuation.** New
+ message in that thread today -> RPC returns nothing
+ again until tomorrow.
+8. **Recall tool.** Ask the chat "what do you know about
+ my green tea preference?" - the model issues
+ `wiki_search` and grounds its answer.
+9. **Embeddings filled.** New article -> `embedding is
+ null` initially, populates ~30s later.
+10. **Settings toggle.** Disable "Automatic wiki" ->
+ worker stops, no claims. Enable -> resumes.
+11. `mise run check` green; no `(!)` build warnings or
+ `plugin:vite:reporter` chunking warnings introduced.
diff --git a/docs/user/README.md b/docs/user/README.md
index c452761..338a8d2 100644
--- a/docs/user/README.md
+++ b/docs/user/README.md
@@ -40,6 +40,10 @@ You can reach these pages two ways:
- [Memory](./memory.md) — the long-term store Nak builds up about you
across conversations: what gets remembered, how to correct or
forget something, what's scoped to your account.
+- [Wiki](./wiki.md) — a flat encyclopedia about you: titled articles
+ about projects, people, places, and topics, maintained by both you
+ and a background agent. The assistant reaches them through the
+ always-on `wiki_search` tool.
- [Intuition](./intuition.md) — the subconscious read Nak forms of
each conversation: how the brain icon next to the mood emoji works,
and what the inline cards mean.
diff --git a/docs/user/wiki.md b/docs/user/wiki.md
new file mode 100644
index 0000000..466c1a2
--- /dev/null
+++ b/docs/user/wiki.md
@@ -0,0 +1,161 @@
+# Wiki
+
+The Wiki is a flat encyclopedia about you. Every entry is a titled
+article in encyclopedic third-person prose - about a project, a person
+in your life, a place, an interest, a recurring situation. There's no
+nesting; everything sits at the same level and the drawer lists
+articles alphabetically.
+
+The wiki is a peer to Memory and Journal. Memories are atomic facts
+the assistant references inline. Journal entries are dated reflections
+about how a conversation went. Wiki articles are the longer-form
+topical pages that cover what something IS - "the recipe project",
+"Maya", "Lisbon trip planning". An article sits across many
+conversations.
+
+## Opening the Wiki
+
+Click the **Wiki** tab in the left drawer. The sidebar shows the
+alphabetical listing with a search bar at the top; the main panel
+opens whichever article you click.
+
+When the Wiki tab is open with no article selected, the panel shows
+an empty-state hint plus an "add a new one" link that opens the
+inline create form.
+
+## What goes in an article
+
+Articles are encyclopedic - they read like Wikipedia lead paragraphs,
+not chat replies. Third person, present tense, neutral. They're meant
+to summarize what you'd want to come back to later, not transcribe a
+conversation.
+
+Each article has just two fields:
+
+- **Title** - the topic name. This is the alphabetical sort key in
+ the drawer and must be unique. The title cap is 200 characters.
+- **Content** - the article body, in Markdown. Capped at 16,000
+ characters.
+
+## Searching
+
+The search bar above the listing filters the drawer in place. It uses
+the same semantic-search pipeline the assistant uses for `wiki_search`
+(see "How the assistant uses the wiki" below) - typing a phrase finds
+articles by meaning, not just by literal substring. Substring matches
+are merged in too, so an article you wrote ten seconds ago (before the
+embedding worker has caught up) still surfaces.
+
+Clearing the search returns the alphabetical listing.
+
+## Adding an article
+
+Click the empty-state link **add a new one**, or click **Wiki** in the
+drawer with no article selected. The inline form takes a title and
+content; **Save** persists immediately and surfaces the new article in
+the panel.
+
+Titles are unique per user. If you try to create an article with a
+title that already exists you get a clear error and can either rename
+the new draft or open the existing article and edit it.
+
+## Editing
+
+Open an article and click **Edit**. The view flips to a form with the
+title and content fields. The form shows "Unsaved changes" the moment
+you diverge from the stored row; **Save** persists, **Cancel** drops
+the draft.
+
+Saving an article nulls its embedding - the background embedding
+worker will re-compute on its next poll (within ~30 seconds). Search
+falls back to substring matches in the meantime.
+
+## Deleting
+
+Click **Delete** on the open article. A confirmation strip appears -
+**Delete** is the destructive action, **Cancel** dismisses the prompt.
+
+Deletes are hard - the article doesn't move to a trash bin. If you
+delete by mistake the easy recovery is to ask the assistant to
+reconstruct it from the relevant conversations and call `wiki_create`,
+or to recreate it manually.
+
+## Asking the agent to update an article
+
+Each article has an **Ask agent to update** button that opens an
+instructions textarea. Type what you want the agent to do ("add a
+sentence noting that Maya prefers green tea", "fix the date in
+paragraph two", "rewrite the second paragraph for tone but keep the
+facts") and click **Ask agent**.
+
+The agent runs on the spot and shows a preview. You then have three
+choices:
+
+- **Accept** - persist the agent's version. The article updates and
+ the listing reflects the change.
+- **Try again** - throw away the preview and run the agent again.
+ Useful if the agent's interpretation didn't quite match what you
+ wanted.
+- **Cancel** - close the preview without changing anything.
+
+The agent is told to do exactly what you ask AND to preserve every
+fact already in the article unless you explicitly say to remove or
+replace it. So "add a sentence" adds without rewriting; "rewrite the
+second paragraph" rewrites only that paragraph; "fix the date in
+paragraph 2" patches that single value.
+
+If your instructions don't actually require a change ("looks fine",
+"no edits"), or are too ambiguous to act on without the agent
+inventing facts, the agent emits a "no change applied" note with its
+reasoning instead of a preview. You can Try Again with sharper
+instructions or close the dialog.
+
+## The autonomous background agent
+
+A background agent reads conversations a day after they settle and
+either updates an existing article or creates a new one. The
+specifics:
+
+- A conversation becomes eligible the day **after** its newest
+ message lands (in your timezone). A conversation that wraps Monday
+ evening is eligible Tuesday morning.
+- If you continue the conversation, it becomes eligible again the day
+ after that. So Monday -> agent runs Tuesday -> you continue
+ Wednesday -> agent runs again Thursday on the new turns.
+- The agent decides per topic whether to update, create, or do
+ nothing. The bar for creating an article is "would the user later
+ look this up?", not "did this come up at all?".
+- When updating an existing article, the agent preserves every fact
+ unless the conversation directly contradicts it. The wiki accretes
+ rather than churns.
+
+You can disable the autonomous agent in **Settings -> Wiki**. Manual
+edits and the per-article "Ask agent to update" flow keep working
+when it's off.
+
+## How the assistant uses the wiki
+
+Articles are **never** auto-injected into the chat. The assistant
+reaches them only through `wiki_search`, an always-on tool registered
+on every conversation. When you mention something topical or factual
+about yourself - a project, a person, a place, a habit - the
+assistant will call `wiki_search` to pull the relevant article so its
+reply is grounded rather than guessing.
+
+This is the deliberate split between the three knowledge surfaces:
+
+- **Memory** - atomic facts, may be primed inline at the start of a
+ conversation.
+- **Journal** - dated reflections; today's automatic entry is included
+ in the system prompt on the first turn.
+- **Wiki** - encyclopedic articles, never auto-included. Always
+ retrieved on demand.
+
+If you want the assistant to use a particular article, mention the
+topic (or the title) directly - that's the cue for `wiki_search`.
+
+## Settings controls
+
+The **Settings -> Wiki** pane has one toggle: whether the autonomous
+wiki agent runs in the background. The wiki uses the same day-boundary
+timezone you set on the Journal pane.
diff --git a/src/components/WikiList.svelte b/src/components/WikiList.svelte
new file mode 100644
index 0000000..8fd3ade
--- /dev/null
+++ b/src/components/WikiList.svelte
@@ -0,0 +1,133 @@
+
+
+
+ {#if sorted.length === 0}
+ {#if wikiStore.query.trim().length > 0}
+ No matches.
+ {:else}
+ No wiki articles yet. The background agent writes them as you
+ chat, or you can add your own.
+ {/if}
+
+ {:else}
+ {#each sorted as a (a.id)}
+
+
+
+ {/each}
+ {/if}
+
+
+
diff --git a/src/lib/agents/wiki/agent.ts b/src/lib/agents/wiki/agent.ts
new file mode 100644
index 0000000..4e5799b
--- /dev/null
+++ b/src/lib/agents/wiki/agent.ts
@@ -0,0 +1,295 @@
+/**
+ * Wiki agent. Drives two distinct flows over the user's wiki:
+ *
+ * - **Autonomous**: `run()` is called by the worker loop. Reads a
+ * settled thread, appends `WIKI_AUTONOMOUS_PROMPT` as the final
+ * user turn, and runs the headless tool loop with `wikiToolbox`.
+ * The loop's side effects (wiki_search / wiki_create /
+ * wiki_update / wiki_delete calls) ARE the output; final text is
+ * discarded after being captured for logs.
+ *
+ * - **Manual**: `updateOne()` runs synchronously on the main thread
+ * when the user clicks "Ask agent to update" on a single article.
+ * One Venice completion, response_format pinned to JSON, no tool
+ * loop. Returns a structured preview the UI displays before
+ * persisting.
+ *
+ * The two paths share the same model (`agentModel('wiki').id`) and
+ * voice (encyclopedic third-person), but the prompts are distinct -
+ * autonomous reads a conversation and decides per-topic, manual
+ * applies explicit instructions to one article.
+ *
+ * The agent does NOT acquire or release the lease, claim or mark the
+ * thread, or spawn its own worker. Those live in `./loop.ts` and
+ * `./worker.ts` respectively. This class is pure logic.
+ */
+import type { Agent, AgentRunRequest, AgentRunResult } from '../types';
+import type { SupabaseService, Message } from '../../supabase';
+import type { VeniceClient, VeniceMessage, ResponseFormat } from '../../venice';
+import { wikiToolbox } from '../../tools/wiki_toolbox';
+import { runHeadlessToolLoop } from '../../tools/run';
+import { sanitizeToolCallIdForWire, sanitizeToolCallsForWire } from '../../tools/wire';
+import { agentModel } from '../../models';
+import { createLogger } from '../../logger.svelte';
+import { WIKI_AUTONOMOUS_PROMPT, WIKI_MANUAL_PROMPT } from './prompt';
+import type { WikiInput, WikiOutput } from './types';
+
+const log = createLogger('wiki-worker');
+
+/**
+ * Pin response_format=json_object on the manual path. The autonomous
+ * path is tool-driven (no JSON output expected) so we leave
+ * responseFormat unset there.
+ */
+const WIKI_MANUAL_RESPONSE_FORMAT: ResponseFormat = { type: 'json_object' };
+
+function messageToVenice(m: Message): VeniceMessage {
+ if (m.role === 'tool') {
+ return {
+ role: 'tool',
+ content: m.content,
+ tool_call_id:
+ m.tool_call_id != null
+ ? sanitizeToolCallIdForWire(m.tool_call_id)
+ : undefined,
+ name: m.name ?? undefined,
+ };
+ }
+ const out: VeniceMessage = { role: m.role, content: m.content };
+ if (m.role === 'assistant' && m.tool_calls && m.tool_calls.length > 0) {
+ out.tool_calls = sanitizeToolCallsForWire(m.tool_calls);
+ }
+ return out;
+}
+
+/**
+ * Result shape for `updateOne()`. Discriminated union so the UI can
+ * tell "model produced an updated article to preview" apart from
+ * "model decided no change is warranted" without sniffing strings.
+ * Genuine errors (parse failure, abort, network) still throw.
+ */
+export type WikiUpdateOneResult =
+ | { kind: 'preview'; title: string; content: string }
+ | { kind: 'noop'; reason: string };
+
+/**
+ * Parsed shape the manual path expects. `update` carries title +
+ * content; `noop` carries a one-sentence reason. Either action ships
+ * with optional fields the parser is tolerant about.
+ */
+interface ManualDecision {
+ action: 'update' | 'noop';
+ title: string | null;
+ content: string | null;
+ reason: string | null;
+}
+
+/**
+ * Parse the manual-agent's final-text JSON. Tolerant of a markdown
+ * fence wrapping the body - some training data carries the fence even
+ * when response_format=json_object is set, and stripping it here is
+ * cheaper than re-prompting.
+ */
+export function parseManualDecision(text: string): ManualDecision | null {
+ const trimmed = text.trim();
+ if (trimmed.length === 0) return null;
+ const fence = trimmed.match(/^```(?:json)?\s*([\s\S]*?)\s*```$/);
+ const payload = fence ? fence[1] : trimmed;
+ let parsed: unknown;
+ try {
+ parsed = JSON.parse(payload);
+ } catch {
+ return null;
+ }
+ if (!parsed || typeof parsed !== 'object') return null;
+ const obj = parsed as Record<string, unknown>;
+ const actionRaw = obj.action;
+ const action: 'update' | 'noop' =
+ actionRaw === 'update' || actionRaw === 'noop' ? actionRaw : 'noop';
+ const title = typeof obj.title === 'string' ? obj.title.trim() : null;
+ const content = typeof obj.content === 'string' ? obj.content : null;
+ const reason =
+ typeof obj.reason === 'string' && obj.reason.trim().length > 0
+ ? obj.reason.trim()
+ : null;
+ return { action, title, content, reason };
+}
+
+export class WikiAgent implements Agent<WikiInput, WikiOutput> {
+ readonly name = 'wiki';
+ readonly model: string;
+ readonly toolbox = wikiToolbox;
+
+ constructor(
+ private venice: VeniceClient,
+ private supabase: SupabaseService,
+ /**
+ * Optional model override. Defaults to the registry's `wiki`
+ * slot (currently deepseek-v4-flash). Useful for tests.
+ */
+ modelId?: string
+ ) {
+ this.model = modelId ?? agentModel('wiki').id;
+ }
+
+ async run(
+ req: AgentRunRequest<WikiInput>
+ ): Promise<AgentRunResult<WikiOutput>> {
+ const signal = req.signal ?? new AbortController().signal;
+
+ if (signal.aborted) {
+ return {
+ output: { finalText: '', inputMessageCount: 0 },
+ toolCalls: 0,
+ stoppedReason: 'aborted',
+ };
+ }
+
+ try {
+ const allMessages = await this.supabase.listMessages(req.input.threadId);
+ const terminalIdx = allMessages.findIndex(
+ (m) => m.id === req.input.terminalMsgId
+ );
+ const slice =
+ terminalIdx >= 0 ? allMessages.slice(0, terminalIdx + 1) : allMessages;
+
+ if (slice.length === 0) {
+ // Pathological: no messages. Mark and move on (loop's
+ // pointer-advance is unconditional on `done`).
+ return {
+ output: { finalText: '', inputMessageCount: 0 },
+ toolCalls: 0,
+ stoppedReason: 'done',
+ };
+ }
+
+ const convo: VeniceMessage[] = slice.map(messageToVenice);
+ convo.push({ role: 'user', content: WIKI_AUTONOMOUS_PROMPT });
+
+ log.info(
+ `asking model about thread ${req.input.threadId} ` +
+ `(${slice.length} messages)`
+ );
+
+ const result = await runHeadlessToolLoop({
+ venice: this.venice,
+ model: this.model,
+ messages: convo,
+ toolbox: this.toolbox,
+ toolCtx: {
+ supabase: this.supabase,
+ venice: this.venice,
+ userId: req.userId,
+ threadId: req.input.threadId,
+ },
+ signal,
+ reasoningEffort: 'low',
+ });
+
+ return {
+ output: {
+ finalText: result.finalText,
+ inputMessageCount: slice.length,
+ },
+ toolCalls: result.toolCalls,
+ stoppedReason: signal.aborted ? 'aborted' : 'done',
+ };
+ } catch (err) {
+ return {
+ output: { finalText: '', inputMessageCount: 0 },
+ toolCalls: 0,
+ stoppedReason: 'error',
+ error: err instanceof Error ? err.message : String(err),
+ };
+ }
+ }
+
+ /**
+ * User-initiated "ask agent to update this article" flow. Runs on
+ * the main thread, single completion, structured-JSON output. Does
+ * NOT write to the DB - the caller (Wiki.svelte) shows a preview
+ * and persists on Accept via `supabase.updateWikiArticle`.
+ *
+ * Throws on parse failure, abort, or network error so the UI can
+ * offer a retry. A clean "the instructions don't require a change"
+ * decision returns `kind: 'noop'` (not a throw) so the UI can show
+ * the reason without an error banner.
+ */
+ async updateOne(args: {
+ articleId: string;
+ currentTitle: string;
+ currentContent: string;
+ userInstructions: string;
+ signal?: AbortSignal;
+  }): Promise<
+    | { kind: 'noop'; reason: string }
+    | { kind: 'preview'; title: string; content: string }
+  > {
+ const signal = args.signal ?? new AbortController().signal;
+ if (signal.aborted) throw new Error('Update aborted before start.');
+ const instructions = args.userInstructions.trim();
+ if (instructions.length === 0) {
+ throw new Error('Instructions are required.');
+ }
+
+    // The system prompt carries the editing contract; the article and
+    // the user's instructions arrive together as the user turn. The
+    // user-content block restates the article so the model has the
+    // exact baseline text to preserve.
+ const userTurn = [
+ 'Article to edit:',
+ '',
+ `Title: ${args.currentTitle}`,
+ '',
+ 'Content:',
+ args.currentContent,
+ '',
+ '---',
+ '',
+ 'Instructions:',
+ instructions,
+ ].join('\n');
+
+ const convo: VeniceMessage[] = [
+ { role: 'system', content: WIKI_MANUAL_PROMPT },
+ { role: 'user', content: userTurn },
+ ];
+
+ log.info(
+ `manual update on article ${args.articleId} ` +
+ `(${args.currentContent.length} chars in, ${instructions.length} chars instructions)`
+ );
+
+ const completion = await this.venice.completeChat({
+ model: this.model,
+ messages: convo,
+ signal,
+ responseFormat: WIKI_MANUAL_RESPONSE_FORMAT,
+ reasoningEffort: 'low',
+ });
+ if (signal.aborted) throw new Error('Update aborted mid-stream.');
+
+ const decision = parseManualDecision(completion.text);
+ if (decision === null) {
+ throw new Error(
+ "The model returned a response we couldn't parse. Try again."
+ );
+ }
+ if (decision.action === 'noop') {
+ return {
+ kind: 'noop',
+ reason: decision.reason ?? 'No change applied.',
+ };
+ }
+ // action === 'update'
+ const finalTitle =
+ decision.title && decision.title.length > 0
+ ? decision.title
+ : args.currentTitle;
+ const finalContent =
+ decision.content && decision.content.length > 0 ? decision.content : '';
+ if (finalContent.length === 0) {
+ throw new Error(
+ 'The model returned an update with empty content. Try again.'
+ );
+ }
+ return { kind: 'preview', title: finalTitle, content: finalContent };
+ }
+}
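A minimal standalone sketch of the fence-stripping decision parse used above. Names here (`parseDecision`, `Decision`, `FENCE`) are illustrative, not the shipped identifiers; the real helper is `parseManualDecision`.

```typescript
// Sketch of the manual-decision parse: strip an optional code fence,
// JSON.parse, then coerce fields defensively. Illustrative restatement,
// not the shipped code.
const FENCE = '`'.repeat(3); // avoid a literal fence inside this doc
const FENCE_RE = new RegExp(
  '^' + FENCE + '(?:json)?\\s*([\\s\\S]*?)\\s*' + FENCE + '$'
);

interface Decision {
  action: 'update' | 'noop';
  title: string | null;
  content: string | null;
  reason: string | null;
}

function parseDecision(raw: string): Decision | null {
  const trimmed = raw.trim();
  const fence = trimmed.match(FENCE_RE);
  const payload = fence ? fence[1] : trimmed;
  let parsed: unknown;
  try {
    parsed = JSON.parse(payload);
  } catch {
    return null; // not JSON at all -> caller shows a retry banner
  }
  if (!parsed || typeof parsed !== 'object') return null;
  const obj = parsed as Record<string, unknown>;
  return {
    action: obj.action === 'update' ? 'update' : 'noop', // unknown -> noop
    title: typeof obj.title === 'string' ? obj.title.trim() : null,
    content: typeof obj.content === 'string' ? obj.content : null,
    reason:
      typeof obj.reason === 'string' && obj.reason.trim().length > 0
        ? obj.reason.trim()
        : null,
  };
}
```

The defensive coercion (unknown `action` becomes `noop`, blank `reason` becomes null) keeps a slightly-off model response from crashing the UI path.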
diff --git a/src/lib/agents/wiki/loop.ts b/src/lib/agents/wiki/loop.ts
new file mode 100644
index 0000000..459859b
--- /dev/null
+++ b/src/lib/agents/wiki/loop.ts
@@ -0,0 +1,160 @@
+/**
+ * Single-cycle driver for the wiki worker. Mirrors
+ * `../reflection/loop.ts` and `../journal/loop.ts`: acquire lease ->
+ * claim thread -> run agent -> mark pointer.
+ *
+ * The pointer-advance is unconditional on `done`. The autonomous
+ * agent's `wiki_*` tool calls land their side effects directly; even
+ * a no-op cycle (the agent decided no topic warranted a wiki update)
+ * advances the pointer so the same conversation is not re-processed
+ * every cycle. New turns added to the thread will reset eligibility
+ * via the next-day predicate in `claim_next_thread_for_wiki`.
+ */
+import type { Agent } from '../types';
+import type { SupabaseService } from '../../supabase';
+import type { LeaseCoordinator } from '../../embeddings/lease';
+import type { WikiInput, WikiOutput } from './types';
+import { createLogger } from '../../logger.svelte';
+
+const log = createLogger('wiki-worker');
+
+export type CycleResult =
+ | 'acquired-lease'
+ | 'polling'
+ | 'empty-queue'
+ | 'processed'
+ | 'claim-lost'
+ | 'error';
+
+export interface CycleContext {
+  agent: Agent<WikiInput, WikiOutput>;
+ supabase: SupabaseService;
+ coordinator: LeaseCoordinator;
+ holderId: string;
+ userId: string;
+ /**
+ * IANA timezone the next-day eligibility predicate buckets against.
+ * May be null (user has not set a preference); the SQL fallback
+ * coerces null to UTC so the day-gate still works.
+ */
+ timezone: string | null;
+ threadClaimTtlSeconds: number;
+ signal: AbortSignal;
+ onLeaseLost: () => void;
+}
+
+export async function runOneCycle(ctx: CycleContext): Promise<CycleResult> {
+ if (ctx.signal.aborted) return 'empty-queue';
+
+ if (!ctx.coordinator.isHolding) {
+ const acquired = await ctx.coordinator.acquire();
+ if (!acquired) return 'polling';
+ ctx.coordinator.startHeartbeat(ctx.onLeaseLost);
+ return 'acquired-lease';
+ }
+
+ let claim: {
+ threadId: string;
+ terminalMsgId: string;
+ title: string | null;
+ newestMsgAt: string;
+ } | null = null;
+ try {
+ claim = await ctx.supabase.claimNextThreadForWiki(
+ ctx.holderId,
+ ctx.threadClaimTtlSeconds,
+ ctx.timezone
+ );
+ } catch {
+ return 'error';
+ }
+ if (!claim) return 'empty-queue';
+
+ const titleTag = claim.title ? `"${claim.title}"` : '[untitled]';
+ log.info(
+ `picked up thread ${claim.threadId} @ msg ${claim.terminalMsgId} ${titleTag}`
+ );
+
+ let runResult;
+ try {
+ runResult = await ctx.agent.run({
+ input: {
+ threadId: claim.threadId,
+ terminalMsgId: claim.terminalMsgId,
+ },
+ userId: ctx.userId,
+ threadId: claim.threadId,
+ signal: ctx.signal,
+ });
+ } catch (err) {
+ log.debug(
+ `thread ${claim.threadId} threw unexpectedly`,
+ err instanceof Error ? err.message : String(err)
+ );
+ return 'error';
+ }
+
+ if (runResult.stoppedReason === 'aborted') return 'empty-queue';
+
+ if (runResult.stoppedReason === 'error') {
+ // Side effects from any wiki_* tool calls already landed (the
+ // wiki tools are owned by the user, not the claim). Don't
+ // advance the pointer - next cycle will retry.
+ log.info(
+ `thread ${claim.threadId} agent reported error: ` +
+ `${runResult.error ?? '(no message)'} ${titleTag}`
+ );
+ return 'error';
+ }
+
+ // Done. Advance the pointer regardless of whether the agent wrote
+ // anything - a no-op cycle (model decided no topic warranted an
+ // article) still consumed the conversation, and re-processing it
+ // every cycle would be wasted Venice spend.
+ let marked = false;
+ try {
+ marked = await ctx.supabase.markThreadWikiProcessedIfClaimed(
+ claim.threadId,
+ ctx.holderId,
+ claim.terminalMsgId
+ );
+ } catch (err) {
+ log.debug(
+ `mark RPC threw for thread ${claim.threadId}`,
+ err instanceof Error ? err.message : String(err)
+ );
+ return 'error';
+ }
+ if (marked) {
+ log.info(
+ `finished thread ${claim.threadId} ` +
+ `(${runResult.toolCalls} tool calls over ${runResult.output.inputMessageCount} messages) ${titleTag}`
+ );
+ return 'processed';
+ }
+ log.debug(
+ `claim lost on thread ${claim.threadId} - another device took over ${titleTag}`
+ );
+ return 'claim-lost';
+}
+
+export interface NapConfig {
+ leasePollMs: number;
+ idleIntervalMs: number;
+ errorBackoffMs: number;
+}
+
+export function napForResult(result: CycleResult, config: NapConfig): number {
+ switch (result) {
+ case 'acquired-lease':
+ case 'processed':
+ case 'claim-lost':
+ return 0;
+ case 'polling':
+ return config.leasePollMs;
+ case 'empty-queue':
+ return config.idleIntervalMs;
+ case 'error':
+ return config.errorBackoffMs;
+ }
+}
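Under the manager's shipped defaults, the mapping above yields a concrete cadence; a self-contained sketch (config values copied from `manager.ts`, names re-declared so the snippet stands alone):

```typescript
// Sketch: the nap each cycle outcome earns under the shipped defaults.
// Behavior mirrors napForResult above.
type Result =
  | 'acquired-lease'
  | 'polling'
  | 'empty-queue'
  | 'processed'
  | 'claim-lost'
  | 'error';

const defaults = {
  leasePollMs: 20_000,
  idleIntervalMs: 30_000,
  errorBackoffMs: 10_000,
};

function napMs(result: Result): number {
  switch (result) {
    case 'acquired-lease':
    case 'processed':
    case 'claim-lost':
      return 0; // keep momentum - run the next cycle immediately
    case 'polling':
      return defaults.leasePollMs; // another device holds the lease
    case 'empty-queue':
      return defaults.idleIntervalMs; // nothing eligible yet
    case 'error':
      return defaults.errorBackoffMs; // back off before retrying
  }
}
```

The zero-nap cases are what lets one device drain a backlog quickly after acquiring the lease, while the idle interval keeps a quiet tab cheap.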
diff --git a/src/lib/agents/wiki/manager.ts b/src/lib/agents/wiki/manager.ts
new file mode 100644
index 0000000..175a406
--- /dev/null
+++ b/src/lib/agents/wiki/manager.ts
@@ -0,0 +1,94 @@
+/**
+ * Main-thread supervisor for the wiki Web Worker. Mirrors
+ * `../journal/manager.ts`. Lifecycle (cross-tab lock, start/stop,
+ * log routing, auth bridging) lives in `BaseWorkerManager`; this
+ * file carries only the wiki-specific bits:
+ *
+ * - lock name `nak:wiki-worker`
+ * - StartOpts add `timezone` (re-uses the journal preference per
+ * the spec - one user-tz preference covers both)
+ * - `progress: 'processed'` bubbles up to `emitWikiChange()` so an
+ * open Wiki drawer / panel refetches when the worker writes
+ *
+ * Activation is gated on `app.wikiAutomaticEnabled` -
+ * state.svelte.ts skips the `start()` call entirely when false;
+ * Settings calls `stop()` / `start()` when the user flips the
+ * toggle mid-session.
+ */
+import type { Session } from '@supabase/supabase-js';
+import { agentModel } from '../../models';
+import { emitWikiChange } from '../../wiki-events';
+import { BaseWorkerManager, type BaseStartOpts } from '../base-manager';
+
+export interface WikiStartOpts extends BaseStartOpts {
+ /** IANA timezone; worker falls back to UTC if null. */
+ timezone: string | null;
+}
+
+/**
+ * Same timing defaults as the journal/reflection workers. Wiki-
+ * eligible threads appear at roughly the same rate (one calendar
+ * day after a settled conversation) so identical knobs are fine.
+ */
+const WORKER_DEFAULTS = {
+ leaseTtlSeconds: 45,
+ leaseHeartbeatMs: 20_000,
+ threadClaimTtlSeconds: 600,
+ leasePollMs: 20_000,
+ idleIntervalMs: 30_000,
+ errorBackoffMs: 10_000,
+};
+
+class WikiManager extends BaseWorkerManager<WikiStartOpts> {
+ protected readonly lockName = 'nak:wiki-worker';
+ protected readonly loggerSource = 'wiki-worker';
+
+ protected createWorker(): Worker {
+ return new Worker(new URL('./worker.ts', import.meta.url), {
+ type: 'module',
+ name: 'nak-wiki',
+ });
+ }
+
+ protected buildStartPayload(
+ opts: WikiStartOpts,
+ session: Session
+  ): Record<string, unknown> {
+ return {
+ supabaseUrl: opts.config.supabaseUrl,
+ supabaseAnonKey: opts.config.supabaseAnonKey,
+ accessToken: session.access_token,
+ refreshToken: session.refresh_token,
+ userId: session.user.id,
+ veniceApiKey: opts.config.veniceApiKey,
+ wikiModel: agentModel('wiki').id,
+ timezone: opts.timezone,
+ ...WORKER_DEFAULTS,
+ };
+ }
+
+  protected onWorkerMessage(data: Record<string, unknown>): boolean {
+ if (data.type === 'progress' && data.result === 'processed') {
+ emitWikiChange();
+ return true;
+ }
+ return false;
+ }
+
+ /**
+ * Live-update the worker's timezone without a restart. Called by
+ * state.svelte.ts when `setJournalTimezone` runs (the wiki worker
+ * shares the journal tz preference). No-op when the worker isn't
+ * running; the next `start()` picks the new value off StartOpts.
+ */
+ setTimezone(timezone: string | null): void {
+ if (!this.worker) return;
+ this.worker.postMessage({ type: 'timezone', timezone });
+ }
+}
+
+/**
+ * Single app-wide instance. Imported by state.svelte.ts and nowhere
+ * else.
+ */
+export const wikiManager = new WikiManager();
diff --git a/src/lib/agents/wiki/prompt.ts b/src/lib/agents/wiki/prompt.ts
new file mode 100644
index 0000000..9a7d407
--- /dev/null
+++ b/src/lib/agents/wiki/prompt.ts
@@ -0,0 +1,149 @@
+/**
+ * Prompts for the wiki agent - both the autonomous (background, tool-
+ * driven) and manual (per-article, single-completion) paths.
+ *
+ * The voice and "preserve facts" discipline are shared across both;
+ * the framing differs because the inputs differ. Autonomous reads a
+ * conversation and decides what topics warrant updates; manual reads
+ * one article + the user's instructions and applies them.
+ */
+
+/**
+ * Autonomous-agent system prompt. Appended as the final user turn in
+ * a messages array whose prefix IS the original conversation - the
+ * model sees itself as the prior assistant, which is a better angle
+ * for spotting topical content worth committing than reading a
+ * third-party transcript. Same idiom as REFLECTION_PROMPT.
+ *
+ * Framing layers:
+ * - Discarded reply: the value is the side effects (wiki_*
+ * calls), not any final text. Prevents AI-assistant filler.
+ * - Voice: encyclopedic third-person, present tense, neutral.
+ * Refer to the subject directly (their first name, the project
+ * name) rather than "the user" so articles read as encyclopedia
+ * entries rather than session-scoped notes.
+ * - "Search before write" workflow: every per-topic decision
+ * starts with wiki_search; existing relevant article -> update
+ * (preserving facts); no relevant article -> create; create's
+ * unique-violation -> search again -> update. This is the
+ * difference between a wiki that grows coherently and one that
+ * accretes near-duplicates.
+ * - "Preserve facts unless contradicted" is load-bearing: a model
+ * prone to rewriting will overwrite established information
+ * each cycle. The wiki is meant to accrete, not churn.
+ * - Conservative-on-create: the bar is "would the user later look
+ * this up", not "did this come up". A throwaway question about
+ * the weather should not produce a "weather" article.
+ */
+export const WIKI_AUTONOMOUS_PROMPT = [
+ "You've just finished the conversation above. Now step out of that",
+ "role. You're not talking to the user anymore - nobody will read this",
+ 'reply. Your job is to maintain the long-term wiki the user keeps',
+ 'about themselves and the topics they care about, using the wiki tools',
+ 'below.',
+ '',
+ 'The wiki is a flat collection of titled articles (no nesting). Each',
+ 'article is encyclopedic third-person prose about one topic - a',
+ 'project, a person in their life, a place, an interest, a recurring',
+ 'situation. Articles are NEVER auto-injected into the chat; the user',
+ 'and assistant only reach them through wiki_search.',
+ '',
+ '**Voice and tone**:',
+ '',
+ '- Encyclopedic, third-person, present tense, neutral. Like the lead',
+ ' paragraph of a Wikipedia article.',
+ '- Refer to the subject directly when possible (a first name, the',
+ ' project name, the place). Avoid "the user" except when the article',
+ ' topic IS the user themselves.',
+ "- No first person, no second person, no chat phrasing. Don't write",
+ ' "you mentioned" or "I noted"; write the fact directly.',
+ '- One topic per article. If a conversation surfaces multiple topics,',
+ ' consider multiple separate updates.',
+ '',
+ '**Workflow for each topic the conversation surfaced**:',
+ '',
+ '1. Call wiki_search with a query that captures the topic. ALWAYS',
+ ' search first - the unique-key constraint and the wiki-coherence',
+ " discipline both depend on you knowing what's already there.",
+ '2. If a relevant article exists, consider wiki_update. **Preserve',
+ ' every existing fact unless the conversation explicitly',
+ ' contradicts it.** Add new information; do not rewrite for tone',
+ ' or condense. The wiki accretes.',
+ '3. If no relevant article exists, call wiki_create. If create',
+ " raises a unique-violation (the title collides with one you didn't",
+ ' surface), call wiki_search again with the exact title and then',
+ ' wiki_update on the result.',
+ '4. wiki_delete is only for consolidation: when an article you just',
+ ' updated now strictly subsumes another one. Never delete on the',
+ ' basis of "the user said something different today" alone - in',
+ ' that case, update.',
+ '',
+ '**Do not fabricate.** Only assert facts that appear in the',
+ 'conversation above or in existing articles you read via wiki_search.',
+ "Don't import outside knowledge.",
+ '',
+ '**Be conservative.** The bar for creating an article is "would the',
+ 'user later look this up?", not "did this come up at all?". A',
+ 'one-off mention does not warrant an article. Fewer high-signal',
+ 'articles beat many noisy ones.',
+ '',
+ 'When you have nothing more to write, reply with a single word. The',
+ 'word is discarded - only the tool calls matter.',
+].join('\n');
+
+/**
+ * Manual-agent ("ask agent to update this article") system prompt.
+ * Different from the autonomous prompt in three ways:
+ * - Scope: ONE article, not the whole wiki.
+ * - Input: explicit user instructions, not a conversation to
+ * reason from.
+ * - Output shape: a single response_format=json_object completion,
+ * no tool calls. The handler parses and returns a preview; the
+ * UI persists on Accept.
+ *
+ * The "do not discard facts" discipline is intentional and load-
+ * bearing here too - "rewrite for tone" should keep facts; "fix the
+ * date in paragraph 2" should patch only that.
+ */
+export const WIKI_MANUAL_PROMPT = [
+ 'You are editing one article in the user\'s personal wiki, in',
+ 'response to explicit instructions from them.',
+ '',
+ '**Voice**: encyclopedic, third-person, present tense, neutral - the',
+ 'same register as a Wikipedia lead paragraph. No first or second',
+ 'person. Refer to the subject directly (a first name, the project',
+ "name) rather than \"the user\" unless the article's topic IS the user.",
+ '',
+ '**Rules**:',
+ '',
+ "- Do exactly what the user asks. Their instructions are the binding",
+ ' constraint.',
+ "- Do NOT discard existing facts unless the user explicitly asks for",
+ ' that fact to be removed or replaced. "Add" means add. "Fix" means',
+ ' patch the specified part, leaving the rest alone. "Rewrite for',
+ ' tone" means keep facts and only rewrite the prose.',
+ '- Do NOT fabricate. Any new fact must come from the user\'s',
+ ' instructions. If the instructions imply information you don\'t',
+ ' have, ask via the noop path (see below) rather than inventing.',
+ '- Title is editable but discouraged. Only rename when the user',
+ ' asks for it directly.',
+ '',
+ '**Output**: a single JSON object with these fields:',
+ '',
+ ' {',
+ ' "action": "update" | "noop",',
+  '    "title": <string or null>,',
+  '    "content": <string or null>,',
+  '    "reason": <string or null>',
+ ' }',
+ '',
+ 'Use `action: "noop"` when the instructions do not actually require',
+ 'a change ("looks fine", "no edits"), when they are too ambiguous to',
+ 'act on without inventing facts, or when they ask for content you',
+ 'cannot supply faithfully. Include `reason` so the UI can show the',
+ 'user why no change was made.',
+ '',
+ 'On `action: "update"`, include the FULL final article in `content`,',
+ 'not a diff or a patch. The UI will preview your output and the user',
+ 'will accept or reject.',
+].join('\n');
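The output contract the manual prompt pins down, restated as a TypeScript shape for orientation. This is a sketch only; the runtime validation lives in `parseManualDecision`, and the type name here is hypothetical.

```typescript
// Sketch of the manual-path JSON contract. On 'update', content carries
// the FULL final article (never a diff); on 'noop', reason explains why
// no change was made.
type ManualWikiResponse = {
  action: 'update' | 'noop';
  title: string | null;   // usually unchanged; renames are discouraged
  content: string | null; // full article text when action is 'update'
  reason: string | null;  // surfaced to the user when action is 'noop'
};

const noopExample: ManualWikiResponse = {
  action: 'noop',
  title: null,
  content: null,
  reason: 'The instructions did not require a change.',
};
```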
diff --git a/src/lib/agents/wiki/types.ts b/src/lib/agents/wiki/types.ts
new file mode 100644
index 0000000..e52cfd4
--- /dev/null
+++ b/src/lib/agents/wiki/types.ts
@@ -0,0 +1,35 @@
+/**
+ * Request/response shapes for the autonomous wiki agent. Kept in
+ * their own file so the loop and worker can import the types without
+ * reaching into the agent class (which pulls in `runHeadlessToolLoop`
+ * and the wikiToolbox).
+ */
+
+export interface WikiInput {
+ /** Thread to process - claimed by the worker before this runs. */
+ threadId: string;
+ /**
+ * Terminal assistant message the claim was made against. The agent
+ * slices thread history at this id so a race where the user added
+ * more turns mid-run simply queues the thread for the next cycle.
+ * Also stamped into `last_wiki_processed_msg_id` by the loop's
+ * mark-step after the agent returns.
+ */
+ terminalMsgId: string;
+}
+
+export interface WikiOutput {
+ /**
+ * The model's final (post-tool-loop) text. Always discarded by
+ * production callers - "reply with a single word" per the prompt -
+ * but returned here for debug logs and test assertions.
+ */
+ finalText: string;
+ /**
+ * Number of messages fed to the model on round 1, before the
+ * model's own turns extended the conversation. Surface it for
+ * observability - a wiki run over 50 messages is meaningfully
+ * different from one over 5.
+ */
+ inputMessageCount: number;
+}
diff --git a/src/lib/agents/wiki/worker.ts b/src/lib/agents/wiki/worker.ts
new file mode 100644
index 0000000..c9152e4
--- /dev/null
+++ b/src/lib/agents/wiki/worker.ts
@@ -0,0 +1,215 @@
+/**
+ * Wiki Web Worker entry point. Mirrors `../journal/worker.ts` and
+ * `../reflection/worker.ts`: build clients from the `start` message,
+ * instantiate WikiAgent, drive `runOneCycle` until abort.
+ *
+ * - Lease partition is `'wiki'` (distinct from 'journal',
+ * 'reflection', 'embedding') so a single device can hold every
+ * lease at once.
+ * - Carries the user's timezone through (re-using the
+ * `journalTimezone` setting per the spec) so the next-day
+ * eligibility predicate buckets against the user's calendar.
+ */
+import { createClient, type SupabaseClient } from '@supabase/supabase-js';
+import { VeniceClient } from '../../venice';
+import { SupabaseService } from '../../supabase';
+import { LeaseCoordinator } from '../../embeddings/lease';
+import { WikiAgent } from './agent';
+import {
+ runOneCycle,
+ napForResult,
+ type CycleContext,
+ type NapConfig,
+} from './loop';
+
+interface StartMessage {
+ type: 'start';
+ supabaseUrl: string;
+ supabaseAnonKey: string;
+ accessToken: string;
+ refreshToken: string;
+ userId: string;
+ veniceApiKey: string;
+ veniceBaseUrl?: string;
+ wikiModel: string;
+ /** IANA timezone or null (the SQL falls back to UTC). */
+ timezone: string | null;
+ holderId: string;
+ threadClaimTtlSeconds: number;
+ leaseTtlSeconds: number;
+ leaseHeartbeatMs: number;
+ leasePollMs: number;
+ idleIntervalMs: number;
+ errorBackoffMs: number;
+}
+
+interface StopMessage {
+ type: 'stop';
+}
+
+interface SessionMessage {
+ type: 'session';
+ accessToken: string;
+ refreshToken: string;
+}
+
+interface TimezoneMessage {
+ type: 'timezone';
+ timezone: string | null;
+}
+
+type InboundMessage =
+ | StartMessage
+ | StopMessage
+ | SessionMessage
+ | TimezoneMessage;
+
+interface LogOutbound {
+ type: 'log';
+ level: 'info' | 'warn' | 'error';
+ message: string;
+}
+
+interface ProgressOutbound {
+ type: 'progress';
+ result: string;
+ threadId?: string;
+}
+
+const workerGlobal = self as unknown as DedicatedWorkerGlobalScope;
+
+function post(msg: LogOutbound | ProgressOutbound): void {
+ workerGlobal.postMessage(msg);
+}
+
+let currentClient: SupabaseClient | null = null;
+const tzHolder: { value: string | null } = { value: null };
+
+function sleep(ms: number, signal: AbortSignal): Promise<void> {
+ if (ms <= 0) return Promise.resolve();
+ return new Promise((resolve) => {
+ const t = setTimeout(resolve, ms);
+ signal.addEventListener(
+ 'abort',
+ () => {
+ clearTimeout(t);
+ resolve();
+ },
+ { once: true }
+ );
+ });
+}
+
+async function runWorker(msg: StartMessage, signal: AbortSignal): Promise<void> {
+ const client: SupabaseClient = createClient(msg.supabaseUrl, msg.supabaseAnonKey, {
+ auth: {
+ persistSession: false,
+ autoRefreshToken: false,
+ detectSessionInUrl: false,
+ },
+ });
+ const { error: sessionError } = await client.auth.setSession({
+ access_token: msg.accessToken,
+ refresh_token: msg.refreshToken,
+ });
+ if (sessionError) {
+ post({
+ type: 'log',
+ level: 'error',
+ message: `wiki worker setSession failed: ${sessionError.message}`,
+ });
+ return;
+ }
+ currentClient = client;
+ tzHolder.value = msg.timezone;
+
+ const supabase = new SupabaseService(
+ { supabaseUrl: msg.supabaseUrl, supabaseAnonKey: msg.supabaseAnonKey },
+ { client }
+ );
+ const venice = new VeniceClient({
+ apiKey: msg.veniceApiKey,
+ baseUrl: msg.veniceBaseUrl,
+ });
+ const coordinator = new LeaseCoordinator(supabase, 'wiki', msg.holderId, {
+ ttlSeconds: msg.leaseTtlSeconds,
+ heartbeatMs: msg.leaseHeartbeatMs,
+ });
+
+ const agent = new WikiAgent(venice, supabase, msg.wikiModel);
+
+ const napConfig: NapConfig = {
+ leasePollMs: msg.leasePollMs,
+ idleIntervalMs: msg.idleIntervalMs,
+ errorBackoffMs: msg.errorBackoffMs,
+ };
+
+ try {
+ while (!signal.aborted) {
+ const ctx: CycleContext = {
+ agent,
+ supabase,
+ coordinator,
+ holderId: msg.holderId,
+ userId: msg.userId,
+ timezone: tzHolder.value,
+ threadClaimTtlSeconds: msg.threadClaimTtlSeconds,
+ signal,
+ onLeaseLost: () => {
+ post({
+ type: 'log',
+ level: 'warn',
+ message: 'wiki lease lost - re-entering polling',
+ });
+ },
+ };
+ const result = await runOneCycle(ctx);
+ post({ type: 'progress', result });
+ const nap = napForResult(result, napConfig);
+ if (nap > 0) await sleep(nap, signal);
+ }
+ } finally {
+ currentClient = null;
+ await coordinator.release();
+ }
+}
+
+const controller = new AbortController();
+
+workerGlobal.addEventListener('message', (evt: MessageEvent<InboundMessage>) => {
+ const msg = evt.data;
+ if (msg.type === 'start') {
+ runWorker(msg, controller.signal)
+ .catch((err: Error) => {
+ post({
+ type: 'log',
+ level: 'error',
+ message: `wiki worker loop crashed: ${err.message}`,
+ });
+ })
+ .finally(() => {
+ workerGlobal.close();
+ });
+ } else if (msg.type === 'stop') {
+ controller.abort();
+ } else if (msg.type === 'session') {
+ if (!currentClient) return;
+ void currentClient.auth
+ .setSession({
+ access_token: msg.accessToken,
+ refresh_token: msg.refreshToken,
+ })
+ .catch((err: Error) => {
+ post({
+ type: 'log',
+ level: 'warn',
+ message: `forwarded setSession failed: ${err.message}`,
+ });
+ });
+ } else if (msg.type === 'timezone') {
+ tzHolder.value = msg.timezone;
+ }
+});
+
+export { sleep };
+export type { StartMessage, StopMessage };
diff --git a/src/lib/chat-prompt.ts b/src/lib/chat-prompt.ts
index 2e4e99e..6d94537 100644
--- a/src/lib/chat-prompt.ts
+++ b/src/lib/chat-prompt.ts
@@ -134,6 +134,22 @@ Explore their boundaries and assumptions.
Do NOT hallucinate or invent psychological concepts or insights that have no basis in science or established philosophical traditions.
`;
+// Wiki framing. The user maintains a flat encyclopedia about themselves
+// and the topics they care about - articles by title, written in third
+// person, never auto-injected into the chat. wiki_search is the only
+// path the assistant has to reach this layer: when the user references
+// something topical or factual about themselves (a project name, a
+// person, a place, a recurring habit), call wiki_search to pull the
+// relevant article so the reply is grounded rather than guessing.
+// Distinct from memory (atomic facts) and journal (dated reflections):
+// the wiki carries curated topical articles that span many
+// conversations.
+const WIKI_BLOCK = `\
+The application also maintains a user wiki: a flat collection of titled, encyclopedic articles the user (and a background agent) curate alongside memories and the journal.
+Articles are NEVER auto-injected into the chat - call wiki_search whenever the user references something topical or factual about themselves, a project, a person, or a place to retrieve the relevant article.
+The wiki is the right surface for "what is X" lookups against the user's own knowledge graph; memories carry atomic facts, the journal carries dated reflections, and the wiki carries the longer-form topical entries.
+`;
+
// Toolbox framing. The model sees the catalog below with (on)/(off) marks
// on the gated toolboxes; always-on tools (every read path, plus web search,
// update_title, analyze_image, the recall pair, and the toggle meta-tool)
@@ -329,6 +345,7 @@ export function buildSystemPrompt(opts: SystemPromptOptions = {}): string {
VOICE_BLOCK,
RECALL_BLOCK,
JOURNAL_BLOCK,
+ WIKI_BLOCK,
TOOLBOX_FRAMING_BLOCK,
ACTIVITY_BLOCK,
buildCatalog(enabled),
diff --git a/src/lib/embeddings/sources/wiki.ts b/src/lib/embeddings/sources/wiki.ts
new file mode 100644
index 0000000..c1b77a6
--- /dev/null
+++ b/src/lib/embeddings/sources/wiki.ts
@@ -0,0 +1,54 @@
+/**
+ * EmbeddingSource adapter for the `wiki_articles` table. Lets the
+ * background embeddings worker populate the `embedding` column on new
+ * and edited articles so `wiki_search` can do semantic search.
+ *
+ * Thin wrapper over two SupabaseService RPCs (`claimNextPendingWikiArticle`
+ * and `saveWikiArticleEmbedding`) plus a string-builder helper. Same
+ * shape as `./memories.ts` and `./journal.ts`; the generic loop in
+ * `../loop.ts` drives every source the worker registers.
+ */
+import type { SupabaseService } from '../../supabase';
+import type { EmbeddingSource, PendingItem } from '../types';
+import { MAX_WIKI_CONTENT_CHARS } from '../../wiki';
+
+/**
+ * Compose the text Venice actually embeds. Title carries the topical
+ * load (an article titled "kombucha" with a paragraph of body should
+ * match a query for "fermented tea" via the title alone), so we lead
+ * with title verbatim and double-newline before the body to give the
+ * embedding model a soft boundary.
+ *
+ * Defensive truncation mirrors the memories and journal adapters:
+ * the tool boundary already caps content, but historical rows or
+ * direct DB writes might not have, and silent truncation keeps the
+ * worker from looping forever on a too-long row Venice rejects.
+ *
+ * Exported for direct unit testing.
+ */
+export function buildWikiEmbedInput(title: string, content: string): string {
+ const body =
+ content.length > MAX_WIKI_CONTENT_CHARS
+ ? content.slice(0, MAX_WIKI_CONTENT_CHARS)
+ : content;
+ return `${title}\n\n${body}`;
+}
+
+export function createWikiSource(supabase: SupabaseService): EmbeddingSource {
+ return {
+ name: 'wiki',
+    async claimNext(holderId: string, ttlSeconds: number): Promise<PendingItem | null> {
+ const row = await supabase.claimNextPendingWikiArticle(holderId, ttlSeconds);
+ if (!row) return null;
+ return { id: row.id, input: buildWikiEmbedInput(row.title, row.content) };
+ },
+    async save(
+      id: string,
+      holderId: string,
+      embedding: number[],
+      model: string
+    ): Promise<boolean> {
+ return supabase.saveWikiArticleEmbedding(id, holderId, embedding, model);
+ },
+ };
+}
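The embed-input composition can be sketched standalone. The `20_000` cap below is an assumed placeholder, not the real `MAX_WIKI_CONTENT_CHARS` value, and the function name is illustrative.

```typescript
// Sketch of buildWikiEmbedInput: the title leads (it carries most of
// the topical signal), a blank line gives the embedding model a soft
// boundary, and over-long bodies are truncated defensively.
const MAX_CHARS = 20_000; // assumption - the real cap is MAX_WIKI_CONTENT_CHARS

function embedInput(title: string, content: string): string {
  const body =
    content.length > MAX_CHARS ? content.slice(0, MAX_CHARS) : content;
  return `${title}\n\n${body}`;
}
```

Truncating here rather than erroring keeps the worker from looping forever on a historical row the embedding endpoint would reject.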
diff --git a/src/lib/embeddings/worker.ts b/src/lib/embeddings/worker.ts
index a6e3493..f5420da 100644
--- a/src/lib/embeddings/worker.ts
+++ b/src/lib/embeddings/worker.ts
@@ -31,6 +31,7 @@ import { createMemoriesSource } from './sources/memories';
import { createThreadsSource } from './sources/threads';
import { createSamskaraSubstrateSource } from './sources/samskara-substrate';
import { createJournalSource } from './sources/journal';
+import { createWikiSource } from './sources/wiki';
import { LeaseCoordinator } from './lease';
import {
runOneCycle,
@@ -175,6 +176,7 @@ async function runWorker(msg: StartMessage, signal: AbortSignal): Promise<void>
createThreadsSource(supabase),
createSamskaraSubstrateSource(supabase),
createJournalSource(supabase),
+ createWikiSource(supabase),
];
const napConfig: NapConfig = {
leasePollMs: msg.leasePollMs,
diff --git a/src/lib/models/index.ts b/src/lib/models/index.ts
index 49324e0..3e1d655 100644
--- a/src/lib/models/index.ts
+++ b/src/lib/models/index.ts
@@ -298,6 +298,7 @@ export const DEFAULT_TIER: ModelTier = 'balanced';
export type AgentRole =
| 'journal'
| 'reflection'
+ | 'wiki'
| 'webSearch'
| 'researchDocs'
| 'intuition'
@@ -378,6 +379,16 @@ export type AgentRole =
* thread, make some judgments, call the memory tools. Big-window
* model is the win - the entire conversation is the context.
*
+ * wiki - deepseek-v4-flash. Autonomous wiki agent: read a settled
+ * thread the day after, decide which topics warrant a new article
+ * or an update to an existing one, and dispatch wiki_search /
+ * wiki_create / wiki_update / wiki_delete tool calls. The same
+ * model also runs the synchronous "ask agent to update" flow
+ * from the per-article UI (single completion, response_format
+ * pinned to JSON, no tool loop). Same rationale as reflection -
+ * big window swallows the conversation, capacity isolation from
+ * foreground tiers, and the JSON pin works on the manual path.
+ *
* webSearch - deepseek-v4-flash. The `web_search` tool's sub-
* completion summarises Venice-provided results into 2-4
* sentences with citation markers. Bounded synthesis; the call
@@ -422,6 +433,7 @@ export type AgentRole =
export const AGENT_MODELS = {
journal: 'deepseek-v4-flash',
reflection: 'deepseek-v4-flash',
+ wiki: 'deepseek-v4-flash',
webSearch: 'deepseek-v4-flash',
researchDocs: 'deepseek-v4-flash',
intuition: 'mistral-small-3-2-24b-instruct',
diff --git a/src/lib/routing.svelte.ts b/src/lib/routing.svelte.ts
index 00e4107..d9a2a28 100644
--- a/src/lib/routing.svelte.ts
+++ b/src/lib/routing.svelte.ts
@@ -48,7 +48,7 @@
*/
export type Modal = 'settings' | 'help' | 'samskara' | 'intuition';
-export type DrawerTab = 'chats' | 'recipes' | 'journal' | 'memories';
+export type DrawerTab = 'chats' | 'recipes' | 'journal' | 'memories' | 'wiki';
export interface Route {
cid: string | null;
@@ -67,6 +67,14 @@ export interface Route {
* with content.
*/
memory: string | null;
+ /**
+ * Wiki article id for the focused article panel. Absent means the
+ * panel shows an empty-state hint and the sidebar listing is the
+ * only surface with content. Same shape as `memory` - the Wiki tab
+ * mirrors the Memories tab's "list in drawer, single card in
+ * panel" pattern.
+ */
+ wiki_article_id: string | null;
}
const ROUTED_KEYS = [
@@ -77,6 +85,7 @@ const ROUTED_KEYS = [
'doc',
'journal_date',
'memory',
+ 'wiki_article_id',
] as const;
const MODAL_VALUES: readonly Modal[] = [
'settings',
@@ -89,6 +98,7 @@ const DRAWER_VALUES: readonly DrawerTab[] = [
'recipes',
'journal',
'memories',
+ 'wiki',
];
export const route = $state({
@@ -99,6 +109,7 @@ export const route = $state({
doc: null,
journal_date: null,
memory: null,
+ wiki_article_id: null,
});
function readEnum(
@@ -126,6 +137,7 @@ export function parseUrl(search: string = typeof location !== 'undefined' ? loca
doc: readString(params, 'doc'),
journal_date: readString(params, 'journal_date'),
memory: readString(params, 'memory'),
+ wiki_article_id: readString(params, 'wiki_article_id'),
};
}
@@ -148,6 +160,7 @@ export function buildSearch(
if (r.doc) params.set('doc', r.doc);
if (r.journal_date) params.set('journal_date', r.journal_date);
if (r.memory) params.set('memory', r.memory);
+ if (r.wiki_article_id) params.set('wiki_article_id', r.wiki_article_id);
const s = params.toString();
return s ? `?${s}` : '';
}
@@ -195,6 +208,13 @@ function applyPatch(patch: Partial<Route>): boolean {
route.memory = patch.memory;
changed = true;
}
+ if (
+ patch.wiki_article_id !== undefined &&
+ patch.wiki_article_id !== route.wiki_article_id
+ ) {
+ route.wiki_article_id = patch.wiki_article_id;
+ changed = true;
+ }
return changed;
}
@@ -260,6 +280,7 @@ export const __test = {
route.doc = null;
route.journal_date = null;
route.memory = null;
+ route.wiki_article_id = null;
},
syncFromUrl,
};
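The `wiki_article_id` param rides the same URL round-trip as `memory`. A minimal sketch of that round-trip, using simplified stand-ins for `buildSearch` / `parseUrl` (the real exports also carry the other routed keys and the enum validation):

```typescript
// Simplified stand-ins for the routing module's buildSearch/parseUrl;
// only the two string-valued keys relevant here are modelled.
type MiniRoute = { memory: string | null; wiki_article_id: string | null };

function buildSearch(r: MiniRoute): string {
  const params = new URLSearchParams();
  if (r.memory) params.set('memory', r.memory);
  if (r.wiki_article_id) params.set('wiki_article_id', r.wiki_article_id);
  const s = params.toString();
  return s ? `?${s}` : '';
}

function parseUrl(search: string): MiniRoute {
  const params = new URLSearchParams(search);
  return {
    memory: params.get('memory'),
    wiki_article_id: params.get('wiki_article_id'),
  };
}

// A set key survives the round-trip; an absent key parses back to null.
const round = parseUrl(buildSearch({ memory: null, wiki_article_id: 'abc-123' }));
```

Absent-means-empty-state falls out of this shape for free: a null key is simply never written into the query string.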
diff --git a/src/lib/state.svelte.ts b/src/lib/state.svelte.ts
index 42c7ddc..2ea7a68 100644
--- a/src/lib/state.svelte.ts
+++ b/src/lib/state.svelte.ts
@@ -116,6 +116,9 @@ const samskara = lazyManager(() =>
const journal = lazyManager(() =>
import('./agents/journal/manager').then((m) => m.journalManager)
);
+const wiki = lazyManager(() =>
+ import('./agents/wiki/manager').then((m) => m.wikiManager)
+);
export type AppPhase = 'loading' | 'setup' | 'locked' | 'unlocked' | 'edit-config';
@@ -187,6 +190,14 @@ interface AppState {
* `profiles.settings.journalAutomaticEnabled` on unlock.
*/
journalAutomaticEnabled: boolean;
+ /**
+ * User wiki feature: background wiki worker runs unless this is
+ * explicitly false. Same default-on semantics and persistence
+ * shape as `journalAutomaticEnabled`. Seeded to true on
+ * activate(); overwritten from Supabase
+ * `profiles.settings.wikiAutomaticEnabled` on unlock.
+ */
+ wikiAutomaticEnabled: boolean;
/**
* IANA timezone used by the journaling feature to bucket entries.
* Seeded from the browser's detected zone on activate() so a
@@ -234,6 +245,7 @@ export const app = $state({
emphasisMarkdown: false,
notifyOnComplete: false,
journalAutomaticEnabled: true,
+ wikiAutomaticEnabled: true,
journalTimezone: detectTimezone(),
userName: '',
userLocation: '',
@@ -325,14 +337,38 @@ export function setJournalAutomaticEnabled(enabled: boolean): void {
/**
* Apply the journal timezone in memory and push the new zone to the
- * journaling worker so it starts bucketing entries into the right
- * days immediately. Does NOT persist; user-driven changes route
- * through `persistJournalTimezone`. Caller is responsible for
- * normalizing user-supplied input to a valid IANA name first.
+ * journaling and wiki workers so they start bucketing on the right
+ * days immediately. Both background workers share this preference
+ * (per the wiki spec: one user-tz preference covers both). Does NOT
+ * persist; user-driven changes route through `persistJournalTimezone`.
+ * Caller is responsible for normalizing user-supplied input to a
+ * valid IANA name first.
*/
export function setJournalTimezone(tz: string): void {
app.journalTimezone = tz;
journal.whenLoaded((m) => m.setTimezone(tz || null));
+ wiki.whenLoaded((m) => m.setTimezone(tz || null));
+}
+
+/**
+ * Flip the background wiki worker on/off in the current session.
+ * Mirrors `setJournalAutomaticEnabled` - the toggle is the live
+ * switch, not a passive preference. Does NOT persist; user-driven
+ * changes from Settings.svelte route through
+ * `persistWikiAutomaticEnabled`.
+ */
+export function setWikiAutomaticEnabled(enabled: boolean): void {
+ app.wikiAutomaticEnabled = enabled;
+ if (!app.supabase || !app.config) return;
+ if (enabled) {
+ wiki.start({
+ supabase: app.supabase,
+ config: app.config,
+ timezone: app.journalTimezone || null,
+ });
+ } else {
+ wiki.stop();
+ }
}
/**
@@ -497,6 +533,18 @@ export async function persistJournalAutomaticEnabled(enabled: boolean): Promise<void> {
}
}
+export async function persistWikiAutomaticEnabled(enabled: boolean): Promise<void> {
+ if (!app.supabase) throw new Error(NOT_CONNECTED);
+ const prev = app.wikiAutomaticEnabled;
+ setWikiAutomaticEnabled(enabled);
+ try {
+ await app.supabase.updateSettings({ wikiAutomaticEnabled: enabled });
+ } catch (err) {
+ setWikiAutomaticEnabled(prev);
+ throw err;
+ }
+}
+
/**
* Save the journal-day timezone. Caller is responsible for
* normalizing user input to a valid IANA name before calling -
@@ -576,6 +624,7 @@ export function applyServerSettings(s: UserSettings): void {
// false in the blob disables; absent key falls through to the
// seed (also true).
setJournalAutomaticEnabled(s.journalAutomaticEnabled ?? true);
+ setWikiAutomaticEnabled(s.wikiAutomaticEnabled ?? true);
if (s.journalTimezone) setJournalTimezone(s.journalTimezone);
// Profile: empty string is the "not set" sentinel; always
// assign so explicit absence in the blob clears any value
@@ -622,6 +671,13 @@ function startBackgroundWorkers(config: AppConfig): void {
userLocation: app.userLocation,
});
}
+ if (app.wikiAutomaticEnabled) {
+ wiki.start({
+ supabase: app.supabase,
+ config,
+ timezone: app.journalTimezone || null,
+ });
+ }
}
/**
@@ -654,6 +710,7 @@ export function activate(config: AppConfig, opts: { persist?: boolean } = {}): v
app.emphasisMarkdown = false;
app.notifyOnComplete = false;
app.journalAutomaticEnabled = true;
+ app.wikiAutomaticEnabled = true;
app.journalTimezone = detectTimezone();
app.userName = '';
app.userLocation = '';
@@ -702,6 +759,7 @@ export function lock(): void {
attachmentExpiry.stop();
samskara.stop();
journal.stop();
+ wiki.stop();
// Tear down the usage poller and wipe the cache so rows billed
// against the previous API key don't leak into a subsequent
// unlock-with-different-config.
@@ -719,6 +777,7 @@ export function lock(): void {
// them from the new account's Supabase settings rather than
// inheriting the previous account's choices.
app.journalAutomaticEnabled = true;
+ app.wikiAutomaticEnabled = true;
app.journalTimezone = detectTimezone();
// Profile: same rationale - never leak the previous account's
// name/location across a lock-then-unlock-as-someone-else flow.
diff --git a/src/lib/supabase.ts b/src/lib/supabase.ts
index 43ef03f..bc4475e 100644
--- a/src/lib/supabase.ts
+++ b/src/lib/supabase.ts
@@ -509,6 +509,36 @@ function coerceJournalEntry(raw: Record<string, unknown>): JournalEntry {
};
}
+/**
+ * One topical article in the user's wiki. Flat list (no nesting), one
+ * article per `(user_id, title)` (the schema enforces uniqueness so the
+ * autonomous agent's `wiki_create` can fall through to `wiki_update` on
+ * conflict). Articles are written in encyclopedic third-person prose
+ * and are never auto-injected into the chat - the main LLM reaches
+ * them only through the always-on `wiki_search` tool.
+ */
+export interface WikiArticle {
+ id: string;
+ title: string;
+ content: string;
+ created_at: string;
+ updated_at: string;
+ /** Populated only by `searchWikiArticlesByEmbedding`. */
+ similarity?: number;
+}
+
+function coerceWikiArticle(raw: Record<string, unknown>): WikiArticle {
+ return {
+ id: String(raw.id),
+ title: typeof raw.title === 'string' ? raw.title : '',
+ content: typeof raw.content === 'string' ? raw.content : '',
+ created_at: String(raw.created_at ?? raw.updated_at ?? ''),
+ updated_at: String(raw.updated_at ?? raw.created_at ?? ''),
+ similarity:
+ typeof raw.similarity === 'number' ? (raw.similarity as number) : undefined,
+ };
+}
+
export interface Message {
id: string;
thread_id: string;
@@ -732,6 +762,18 @@ export interface UserSettings {
* wrong day.
*/
journalTimezone?: string;
+ /**
+ * User wiki feature: when true, the background wiki agent processes
+ * settled threads (one calendar day after the newest message in the
+ * user's tz) and updates / creates encyclopedic articles about
+ * topics the conversation surfaced. Same default-on semantics as
+ * `journalAutomaticEnabled` - absent means on; only present when the
+ * user has explicitly disabled. False stops the manager from
+ * starting the worker at unlock and stops it mid-session when
+ * flipped. Manual edits and the per-article "ask agent to update"
+ * button are unaffected by this flag.
+ */
+ wikiAutomaticEnabled?: boolean;
/**
* Free-form display name the user wants the model to address them
* by. Optional - absent / empty string means "no name supplied,
@@ -824,6 +866,9 @@ export function coerceSettings(raw: unknown): UserSettings {
if (typeof r.journalAutomaticEnabled === 'boolean') {
out.journalAutomaticEnabled = r.journalAutomaticEnabled;
}
+ if (typeof r.wikiAutomaticEnabled === 'boolean') {
+ out.wikiAutomaticEnabled = r.wikiAutomaticEnabled;
+ }
if (
typeof r.journalTimezone === 'string' &&
r.journalTimezone.length > 0 &&
@@ -1004,6 +1049,13 @@ export class SupabaseService {
merged.journalAutomaticEnabled = patch.journalAutomaticEnabled;
}
}
+ if ('wikiAutomaticEnabled' in patch) {
+ if (patch.wikiAutomaticEnabled === undefined) {
+ delete merged.wikiAutomaticEnabled;
+ } else if (typeof patch.wikiAutomaticEnabled === 'boolean') {
+ merged.wikiAutomaticEnabled = patch.wikiAutomaticEnabled;
+ }
+ }
if ('journalTimezone' in patch) {
if (patch.journalTimezone === undefined) delete merged.journalTimezone;
else if (
@@ -2698,6 +2750,267 @@ export class SupabaseService {
return data === true;
}
+ // User wiki -------------------------------------------------------------
+
+ /**
+ * Alphabetical listing of every wiki article for the current user.
+ * Sort key is `lower(title)` so case differences ("Apple" vs
+ * "apple") fold together. Limit defaults to 500, matching journal
+ * and memories - a single user is unlikely to author thousands of
+ * encyclopedic articles, and pagination would complicate the
+ * client-side store filtering pattern.
+ */
+ async listWikiArticles(opts: { limit?: number } = {}): Promise<WikiArticle[]> {
+ const { data, error } = await this.client
+ .from('wiki_articles')
+ .select('id, title, content, created_at, updated_at')
+ .order('title', { ascending: true })
+ .limit(opts.limit ?? 500);
+ if (error) throw new SupabaseError(error.message);
+ return (data ?? []).map((row) => coerceWikiArticle(row as Record<string, unknown>));
+ }
+
+ async getWikiArticleById(id: string): Promise<WikiArticle | null> {
+ const { data, error } = await this.client
+ .from('wiki_articles')
+ .select('id, title, content, created_at, updated_at')
+ .eq('id', id)
+ .maybeSingle();
+ if (error) throw new SupabaseError(error.message);
+ if (!data) return null;
+ return coerceWikiArticle(data as Record<string, unknown>);
+ }
+
+ /**
+ * Title-keyed lookup. The autonomous agent uses this to resolve a
+ * candidate title to an existing article id when `wiki_create`
+ * raised a unique-violation - the agent then calls `wiki_update`
+ * against the resolved id.
+ */
+ async getWikiArticleByTitle(title: string): Promise<WikiArticle | null> {
+ const { data, error } = await this.client
+ .from('wiki_articles')
+ .select('id, title, content, created_at, updated_at')
+ .eq('title', title)
+ .maybeSingle();
+ if (error) throw new SupabaseError(error.message);
+ if (!data) return null;
+ return coerceWikiArticle(data as Record<string, unknown>);
+ }
+
+ async createWikiArticle(args: {
+ title: string;
+ content: string;
+ }): Promise<WikiArticle> {
+ const session = await this.getSession();
+ if (!session) throw new SupabaseError('Not authenticated.');
+ const { data, error } = await this.client
+ .from('wiki_articles')
+ .insert({
+ user_id: session.user.id,
+ title: args.title,
+ content: args.content,
+ })
+ .select('id, title, content, created_at, updated_at')
+ .single();
+ if (error) throw new SupabaseError(error.message);
+ return coerceWikiArticle(data as Record<string, unknown>);
+ }
+
+ /**
+ * Patch an article's title or content. RLS owner-scopes the update.
+ * The schema trigger `clear_wiki_embedding_on_change` nulls the
+ * embedding + claim columns when title or content changes so the
+ * worker re-embeds on its next poll.
+ */
+ async updateWikiArticle(
+ id: string,
+ patch: { title?: string; content?: string }
+ ): Promise<WikiArticle> {
+ const { data, error } = await this.client
+ .from('wiki_articles')
+ .update({ ...patch, updated_at: new Date().toISOString() })
+ .eq('id', id)
+ .select('id, title, content, created_at, updated_at')
+ .single();
+ if (error) throw new SupabaseError(error.message);
+ return coerceWikiArticle(data as Record<string, unknown>);
+ }
+
+ async deleteWikiArticle(id: string): Promise<void> {
+ const { error } = await this.client.from('wiki_articles').delete().eq('id', id);
+ if (error) throw new SupabaseError(error.message);
+ }
+
+ /**
+ * Semantic + substring search over wiki articles. Same shape as
+ * `searchJournalEntries`: vector hits first (RPC), then unembedded
+ * ILIKE hits, deduped by id. Empty `query` returns the alphabetical
+ * listing without embedding. `queryEmbedding` may be null - callers
+ * without Venice get ILIKE-only results.
+ */
+ async searchWikiArticles(opts: {
+ query: string;
+ queryEmbedding: number[] | null;
+ limit?: number;
+ }): Promise<WikiArticle[]> {
+ const query = opts.query.trim();
+ const limit = opts.limit ?? 20;
+ if (query.length === 0) return this.listWikiArticles({ limit });
+
+ const safe = query.replace(/([,()])/g, '\\$1');
+ const pattern = `%${safe}%`;
+
+ const ilikePromise = this.client
+ .from('wiki_articles')
+ .select('id, title, content, created_at, updated_at')
+ .or(`title.ilike.${pattern},content.ilike.${pattern}`)
+ .order('title', { ascending: true })
+ .limit(limit);
+
+ const semanticPromise = opts.queryEmbedding
+ ? this.client.rpc('search_wiki_articles_by_embedding', {
+ query_embedding: opts.queryEmbedding,
+ match_limit: limit,
+ })
+ : Promise.resolve({ data: [] as unknown[], error: null });
+
+ const [ilikeRes, semRes] = await Promise.all([ilikePromise, semanticPromise]);
+ if (ilikeRes.error) throw new SupabaseError(ilikeRes.error.message);
+ const ilikeRows = (ilikeRes.data ?? []).map((row) =>
+ coerceWikiArticle(row as Record<string, unknown>)
+ );
+ const semanticRows =
+ semRes.error !== null
+ ? []
+ : ((semRes.data ?? []) as unknown[]).map((row) =>
+ coerceWikiArticle(row as Record<string, unknown>)
+ );
+
+ const out: WikiArticle[] = [];
+ const seen = new Set<string>();
+ // Semantic first - meaning matches outrank substring matches.
+ for (const a of semanticRows) {
+ if (seen.has(a.id)) continue;
+ seen.add(a.id);
+ out.push(a);
+ if (out.length >= limit) return out;
+ }
+ for (const a of ilikeRows) {
+ if (seen.has(a.id)) continue;
+ seen.add(a.id);
+ out.push(a);
+ if (out.length >= limit) return out;
+ }
+ return out;
+ }
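The semantic-first, dedupe-by-id merge at the end of `searchWikiArticles` is the crux of the method. Extracted as a standalone sketch (field set trimmed to what the merge actually touches):

```typescript
// Vector hits first, then ILIKE hits; duplicates collapse onto their
// first (semantic) occurrence, and the result is capped at `limit`.
interface Hit { id: string; title: string }

function mergeHits(semantic: Hit[], ilike: Hit[], limit: number): Hit[] {
  const out: Hit[] = [];
  const seen = new Set<string>();
  for (const a of [...semantic, ...ilike]) {
    if (seen.has(a.id)) continue;
    seen.add(a.id);
    out.push(a);
    if (out.length >= limit) break;
  }
  return out;
}

// 'b' appears in both result sets; only the semantic copy survives,
// and the limit of 2 leaves no room for the ILIKE-only 'c'.
const merged = mergeHits(
  [{ id: 'a', title: 'Apple' }, { id: 'b', title: 'Bread' }],
  [{ id: 'b', title: 'Bread' }, { id: 'c', title: 'Cheese' }],
  2
);
```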
+
+ // Wiki background pipeline ---------------------------------------------
+
+ /**
+ * Claim the oldest thread eligible for the wiki agent. Differs from
+ * `claimNextThreadForJournal` in two ways:
+ * (1) The eligibility predicate gates on the newest message's
+ * calendar day in the user's tz being strictly before today
+ * (the "next-day" rule the spec asks for).
+ * (2) The lateral that finds the bucket key reads
+ * `messages.created_at` directly rather than
+ * `threads.updated_at`, so a future bump to threads.updated_at
+ * from an unrelated write can't shift the gate.
+ * Returns null when the queue is empty.
+ */
+ async claimNextThreadForWiki(
+ holderId: string,
+ ttlSeconds: number,
+ timezone: string | null
+ ): Promise<{
+ threadId: string;
+ terminalMsgId: string;
+ title: string | null;
+ /** ISO 8601 timestamp of the newest message in the claimed thread. */
+ newestMsgAt: string;
+ } | null> {
+ const { data, error } = await this.client.rpc('claim_next_thread_for_wiki', {
+ p_holder_id: holderId,
+ p_ttl_seconds: ttlSeconds,
+ // Same null-coerce as the journal claim - PostgREST passes
+ // explicit nulls through, and `at time zone null` returns null
+ // which would null out the WHERE predicate.
+ p_timezone: timezone ?? 'UTC',
+ });
+ if (error) throw new SupabaseError(error.message);
+ const rows = (data ?? []) as {
+ thread_id: string;
+ terminal_msg_id: string;
+ title: string | null;
+ newest_msg_at: string;
+ }[];
+ if (rows.length === 0) return null;
+ return {
+ threadId: rows[0].thread_id,
+ terminalMsgId: rows[0].terminal_msg_id,
+ title: rows[0].title ?? null,
+ newestMsgAt: rows[0].newest_msg_at,
+ };
+ }
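The "next-day" gate lives in the `claim_next_thread_for_wiki` SQL, but its calendar logic can be mirrored client-side for illustration. A hedged sketch, assuming the same rule the doc comment states (newest message's calendar day in the user's tz strictly before today):

```typescript
// Format both instants as YYYY-MM-DD in the target zone (en-CA yields
// that ISO-like ordering), then compare lexicographically. Strictly
// earlier day means the thread has settled and is wiki-eligible.
function isSettled(newestMsgAt: string, now: Date, timeZone: string): boolean {
  const day = (d: Date) =>
    new Intl.DateTimeFormat('en-CA', {
      timeZone,
      year: 'numeric',
      month: '2-digit',
      day: '2-digit',
    }).format(d);
  return day(new Date(newestMsgAt)) < day(now);
}
```

Note the strictness: a thread whose last message landed one minute before midnight becomes eligible one minute later, while a thread touched today always waits.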
+
+ /**
+ * Advance `threads.last_wiki_processed_msg_id` to `msgId` IF our
+ * claim is still ours. Returns false on claim-lost; caller drops
+ * the cycle. Called unconditionally after every agent run so a
+ * no-op cycle (agent decided no topic warranted a wiki update)
+ * still advances the pointer past the terminal message.
+ */
+ async markThreadWikiProcessedIfClaimed(
+ threadId: string,
+ holderId: string,
+ msgId: string
+ ): Promise<boolean> {
+ const { data, error } = await this.client.rpc(
+ 'mark_thread_wiki_processed_if_claimed',
+ {
+ p_thread_id: threadId,
+ p_holder_id: holderId,
+ p_msg_id: msgId,
+ }
+ );
+ if (error) throw new SupabaseError(error.message);
+ return data === true;
+ }
+
+ async claimNextPendingWikiArticle(
+ holderId: string,
+ ttlSeconds: number
+ ): Promise<{ id: string; title: string; content: string } | null> {
+ const { data, error } = await this.client.rpc('claim_next_pending_wiki_article', {
+ p_holder_id: holderId,
+ p_ttl_seconds: ttlSeconds,
+ });
+ if (error) throw new SupabaseError(error.message);
+ const rows = (data ?? []) as { id: string; title: string; content: string }[];
+ if (rows.length === 0) return null;
+ return rows[0];
+ }
+
+ async saveWikiArticleEmbedding(
+ id: string,
+ holderId: string,
+ embedding: number[],
+ model: string
+ ): Promise<boolean> {
+ const { data, error } = await this.client.rpc(
+ 'save_wiki_article_embedding_if_claimed',
+ {
+ p_id: id,
+ p_holder_id: holderId,
+ p_embedding: embedding,
+ p_embedding_model: model,
+ }
+ );
+ if (error) throw new SupabaseError(error.message);
+ return data === true;
+ }
+
// Background-worker pipeline --------------------------------------------
//
// Methods in this block drive the background workers in
diff --git a/src/lib/tools/index.ts b/src/lib/tools/index.ts
index 38725ed..caadf07 100644
--- a/src/lib/tools/index.ts
+++ b/src/lib/tools/index.ts
@@ -100,6 +100,7 @@ import { journalListSchema } from './journal_list.schema';
import { journalReadSchema } from './journal_read.schema';
import { journalSearchSchema } from './journal_search.schema';
import { journalDeleteSchema } from './journal_delete.schema';
+import { wikiSearchSchema } from './wiki_search.schema';
// Agent-only toolbox re-exports moved to the bottom of the file
// alongside other re-exports. Direct `export ... from` rather than
@@ -234,6 +235,11 @@ const journalDelete = lazyTool(
() => import('./journal_delete'),
'journalDelete'
);
+const wikiSearch = lazyTool(
+ wikiSearchSchema,
+ () => import('./wiki_search'),
+ 'wikiSearch'
+);
/**
* Always-on toolbox. Rides with every request regardless of the
@@ -262,6 +268,9 @@ const journalDelete = lazyTool(
* conversation titles + summaries.
* - `journal_list` / `journal_read` / `journal_search` - date-
* based and meaning-based reads over the user's daily journal.
+ * - `wiki_search` - semantic search over the user's flat wiki
+ * (encyclopedic articles about topics in their life). Articles
+ * are never auto-injected; this is the only path to reach them.
* - `recipe_list` / `recipe_get` - browse and fetch the user's
* saved recipes.
* - `research_docs` - bounded sub-agent that answers
@@ -291,6 +300,7 @@ export const alwaysOnToolbox: Toolbox = {
journalList,
journalRead,
journalSearch,
+ wikiSearch,
recipeList,
recipeGet,
researchDocs,
@@ -518,6 +528,7 @@ export async function executeToolCall(
// / agent entry points import the toolboxes directly via their
// source paths, not through this barrel.
export { memoryToolbox } from './memory_toolbox';
+export { wikiToolbox } from './wiki_toolbox';
export { recallToolbox } from './recall_toolbox';
export { conversationRecallToolbox } from './conversation_recall_toolbox';
diff --git a/src/lib/tools/wiki_create.schema.ts b/src/lib/tools/wiki_create.schema.ts
new file mode 100644
index 0000000..20537ca
--- /dev/null
+++ b/src/lib/tools/wiki_create.schema.ts
@@ -0,0 +1,33 @@
+/**
+ * Schema-only export for wiki_create. Impl lives in `./wiki_create`.
+ */
+import { MAX_WIKI_TITLE_CHARS, MAX_WIKI_CONTENT_CHARS } from '../wiki';
+
+export const wikiCreateSchema = {
+ name: 'wiki_create',
+ description:
+ "Create a new article in the user's wiki. title is the topic name " +
+ `(1-${MAX_WIKI_TITLE_CHARS} chars, must be unique per user); content is ` +
+ `the article body in encyclopedic third-person prose (max ${MAX_WIKI_CONTENT_CHARS} chars). ` +
+ 'Throws on a title collision; if that happens, run wiki_search and call wiki_update on the existing id.',
+ shortDescription: 'add a wiki article',
+ parameters: {
+ type: 'object',
+ properties: {
+ title: {
+ type: 'string',
+ minLength: 1,
+ maxLength: MAX_WIKI_TITLE_CHARS,
+ description: 'Article title (the topic name).',
+ },
+ content: {
+ type: 'string',
+ minLength: 1,
+ maxLength: MAX_WIKI_CONTENT_CHARS,
+ description: 'Article body, encyclopedic third-person prose.',
+ },
+ },
+ required: ['title', 'content'],
+ additionalProperties: false,
+ },
+} as const;
diff --git a/src/lib/tools/wiki_create.ts b/src/lib/tools/wiki_create.ts
new file mode 100644
index 0000000..f3d5aa1
--- /dev/null
+++ b/src/lib/tools/wiki_create.ts
@@ -0,0 +1,51 @@
+/**
+ * Persist a new wiki article. Returns the created row so the caller
+ * (autonomous wiki agent or the chat-side tool path - though this tool
+ * is agent-only today) can reference its id in a follow-up update.
+ *
+ * Title uniqueness is enforced at the DB level (unique (user_id, title)).
+ * The Postgres unique-violation surfaces here as a SupabaseError; we
+ * detect and rephrase it as actionable text the agent can read so it
+ * knows to fall back to wiki_search + wiki_update.
+ */
+import type { ToolDef } from './types';
+import { MAX_WIKI_TITLE_CHARS, MAX_WIKI_CONTENT_CHARS } from '../wiki';
+import { wikiCreateSchema } from './wiki_create.schema';
+import { emitWikiChange } from '../wiki-events';
+
+export const wikiCreate: ToolDef = {
+ ...wikiCreateSchema,
+ async execute(args, ctx) {
+ const title = typeof args.title === 'string' ? args.title.trim() : '';
+ const content = typeof args.content === 'string' ? args.content : '';
+ if (!title) throw new Error('title is required');
+ if (!content) throw new Error('content is required');
+ if (title.length > MAX_WIKI_TITLE_CHARS) {
+ throw new Error(
+ `title exceeds ${MAX_WIKI_TITLE_CHARS}-char limit (got ${title.length})`
+ );
+ }
+ if (content.length > MAX_WIKI_CONTENT_CHARS) {
+ throw new Error(
+ `content exceeds ${MAX_WIKI_CONTENT_CHARS}-char limit (got ${content.length}); split or trim`
+ );
+ }
+ try {
+ const article = await ctx.supabase.createWikiArticle({ title, content });
+ emitWikiChange();
+ return article;
+ } catch (err) {
+ // Postgres unique-violation reads as code 23505 in PostgREST's
+ // error wrapper. We don't have the code here, just the message,
+ // so we sniff for the substring. Rephrase as agent-readable so
+ // the autonomous agent flips to wiki_update without retrying.
+ const msg = err instanceof Error ? err.message : String(err);
+ if (/duplicate key|unique constraint|23505/i.test(msg)) {
+ throw new Error(
+ `An article titled "${title}" already exists. Run wiki_search to find its id, then call wiki_update to integrate.`
+ );
+ }
+ throw err;
+ }
+ },
+};
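The unique-violation sniff in that catch block is worth isolating, since it is deliberately a heuristic. A sketch (the message strings below are representative Postgres phrasing, not captured output):

```typescript
// PostgREST's error wrapper gives us only the message here, not the
// SQLSTATE, so wiki_create sniffs for the substrings Postgres uses
// for unique_violation (code 23505).
const isUniqueViolation = (msg: string): boolean =>
  /duplicate key|unique constraint|23505/i.test(msg);
```

Anything that matches gets rephrased into the agent-readable "run wiki_search, then wiki_update" hint; everything else rethrows untouched.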
diff --git a/src/lib/tools/wiki_delete.schema.ts b/src/lib/tools/wiki_delete.schema.ts
new file mode 100644
index 0000000..94347c8
--- /dev/null
+++ b/src/lib/tools/wiki_delete.schema.ts
@@ -0,0 +1,23 @@
+/**
+ * Schema-only export for wiki_delete. Impl lives in `./wiki_delete`.
+ */
+export const wikiDeleteSchema = {
+ name: 'wiki_delete',
+ description:
+ 'Delete a wiki article by id. Use only for consolidation - when one ' +
+ 'article is now strictly subsumed by another article you just updated. ' +
+ 'Never delete on the basis of "the user said something contradictory ' +
+ 'today" alone; in that case, update the article to reflect the new view.',
+ shortDescription: 'delete a wiki article',
+ parameters: {
+ type: 'object',
+ properties: {
+ id: {
+ type: 'string',
+ description: 'UUID of the article (from wiki_search).',
+ },
+ },
+ required: ['id'],
+ additionalProperties: false,
+ },
+} as const;
diff --git a/src/lib/tools/wiki_delete.ts b/src/lib/tools/wiki_delete.ts
new file mode 100644
index 0000000..bdf602f
--- /dev/null
+++ b/src/lib/tools/wiki_delete.ts
@@ -0,0 +1,20 @@
+/**
+ * Hard-delete a wiki article. Both the autonomous wiki agent (for
+ * consolidation only) and the user via the Wiki panel can land here -
+ * the panel uses `supabase.deleteWikiArticle` directly rather than
+ * dispatching this tool, but the tool's contract is the same.
+ */
+import type { ToolDef } from './types';
+import { wikiDeleteSchema } from './wiki_delete.schema';
+import { emitWikiChange } from '../wiki-events';
+
+export const wikiDelete: ToolDef = {
+ ...wikiDeleteSchema,
+ async execute(args, ctx) {
+ const id = typeof args.id === 'string' ? args.id : '';
+ if (!id) throw new Error('id is required');
+ await ctx.supabase.deleteWikiArticle(id);
+ emitWikiChange();
+ return { id, deleted: true };
+ },
+};
diff --git a/src/lib/tools/wiki_search.schema.ts b/src/lib/tools/wiki_search.schema.ts
new file mode 100644
index 0000000..ee5cc31
--- /dev/null
+++ b/src/lib/tools/wiki_search.schema.ts
@@ -0,0 +1,35 @@
+/**
+ * Schema-only export for wiki_search. Impl lives in `./wiki_search`.
+ */
+export const WIKI_SEARCH_DEFAULT_LIMIT = 8;
+export const WIKI_SEARCH_MAX_LIMIT = 25;
+
+export const wikiSearchSchema = {
+ name: 'wiki_search',
+ description:
+ "Semantic search over the user's wiki - flat encyclopedic articles " +
+ 'about topics, people, places, and projects in their life (with ' +
+ 'substring fallback for rows the embeddings worker has not yet ' +
+ 'processed). Returns {id, title, content, updated_at, similarity?}[] ' +
+ 'ranked by relevance. Articles are NEVER auto-injected into the chat; ' +
+ 'this is the only way to surface them.',
+ shortDescription: 'look up a wiki article by topic',
+ parameters: {
+ type: 'object',
+ properties: {
+ query: {
+ type: 'string',
+ minLength: 1,
+ description: 'Natural-language query, topic, or article title.',
+ },
+ limit: {
+ type: 'integer',
+ minimum: 1,
+ maximum: WIKI_SEARCH_MAX_LIMIT,
+ description: `Max results (default ${WIKI_SEARCH_DEFAULT_LIMIT}, max ${WIKI_SEARCH_MAX_LIMIT}).`,
+ },
+ },
+ required: ['query'],
+ additionalProperties: false,
+ },
+} as const;
diff --git a/src/lib/tools/wiki_search.ts b/src/lib/tools/wiki_search.ts
new file mode 100644
index 0000000..4711091
--- /dev/null
+++ b/src/lib/tools/wiki_search.ts
@@ -0,0 +1,44 @@
+/**
+ * Semantic + substring search over the user's wiki articles. Same
+ * shape as `journal_search` and `memory_search`: embed the query via
+ * Venice, run vector cosine search against the stored embeddings,
+ * merge with an ILIKE fallback so freshly-written articles still
+ * participate before the embedding worker reaches them.
+ *
+ * Wiki articles are NEVER auto-injected into the chat - this tool is
+ * the only path the assistant has to reach them.
+ */
+import type { ToolDef } from './types';
+import { searchWikiArticlesSemantic } from '../wiki';
+import {
+ wikiSearchSchema,
+ WIKI_SEARCH_DEFAULT_LIMIT,
+ WIKI_SEARCH_MAX_LIMIT,
+} from './wiki_search.schema';
+
+export const wikiSearch: ToolDef = {
+ ...wikiSearchSchema,
+ async execute(args, ctx) {
+ const query = typeof args.query === 'string' ? args.query.trim() : '';
+ if (!query) throw new Error('query is required');
+ const rawLimit =
+ typeof args.limit === 'number' ? args.limit : WIKI_SEARCH_DEFAULT_LIMIT;
+ const limit = Math.max(
+ 1,
+ Math.min(WIKI_SEARCH_MAX_LIMIT, Math.floor(rawLimit))
+ );
+
+ const rows = await searchWikiArticlesSemantic(query, limit, {
+ supabase: ctx.supabase,
+ venice: ctx.venice,
+ signal: ctx.signal,
+ });
+ return rows.map((a) => ({
+ id: a.id,
+ title: a.title,
+ content: a.content,
+ updated_at: a.updated_at,
+ ...(typeof a.similarity === 'number' ? { similarity: a.similarity } : {}),
+ }));
+ },
+};
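The limit-clamping line above compresses three normalizations into one expression; unrolled for illustration:

```typescript
// Floor non-integers, then clamp into [1, WIKI_SEARCH_MAX_LIMIT] -
// so a model passing 0, 7.9, or 100 still yields a sane page size.
const WIKI_SEARCH_MAX_LIMIT = 25;

function clampLimit(raw: number): number {
  return Math.max(1, Math.min(WIKI_SEARCH_MAX_LIMIT, Math.floor(raw)));
}
```

Clamping rather than throwing keeps the tool loop moving: an out-of-range `limit` from the model degrades to a usable value instead of a failed call.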
diff --git a/src/lib/tools/wiki_toolbox.ts b/src/lib/tools/wiki_toolbox.ts
new file mode 100644
index 0000000..75ecd57
--- /dev/null
+++ b/src/lib/tools/wiki_toolbox.ts
@@ -0,0 +1,40 @@
+/**
+ * Toolbox for the autonomous wiki agent. Factored out of
+ * `./index.ts` into its own leaf file for the same reason
+ * `memory_toolbox.ts` exists - importing `./index.ts` from a worker
+ * bundle transitively pulls in `research_docs` and other chat-only
+ * tools, blowing up the worker build.
+ *
+ * Tool impls are lazy-loaded via `lazyTool`; only the schemas are
+ * eagerly imported here. The autonomous wiki agent runs
+ * `runHeadlessToolLoop` against this toolbox (search-then-mutate
+ * shape: search first to find existing articles, then update or
+ * create based on the conversation; delete only for consolidation).
+ *
+ * The chat-side toolbox does NOT include these write tools - the
+ * user's main LLM only reaches wiki_search (registered in the
+ * always-on toolbox in ./index.ts). Article mutations come from
+ * either the autonomous agent here or the per-article "ask agent to
+ * update" UI flow on Wiki.svelte.
+ */
+import type { Toolbox } from './types';
+import { lazyTool } from './lazy';
+import { wikiSearchSchema } from './wiki_search.schema';
+import { wikiCreateSchema } from './wiki_create.schema';
+import { wikiUpdateSchema } from './wiki_update.schema';
+import { wikiDeleteSchema } from './wiki_delete.schema';
+
+export const wikiToolbox: Toolbox = {
+ name: 'wiki',
+ description:
+ "Read, create, update, and delete the signed-in user's wiki articles. " +
+ 'Vector + text search via wiki_search. Article voice is encyclopedic ' +
+ 'third-person prose. Preserve existing facts unless the conversation ' +
+ 'contradicts them; delete only for consolidation.',
+ tools: [
+ lazyTool(wikiSearchSchema, () => import('./wiki_search'), 'wikiSearch'),
+ lazyTool(wikiCreateSchema, () => import('./wiki_create'), 'wikiCreate'),
+ lazyTool(wikiUpdateSchema, () => import('./wiki_update'), 'wikiUpdate'),
+ lazyTool(wikiDeleteSchema, () => import('./wiki_delete'), 'wikiDelete'),
+ ],
+};
diff --git a/src/lib/tools/wiki_update.schema.ts b/src/lib/tools/wiki_update.schema.ts
new file mode 100644
index 0000000..5c35bab
--- /dev/null
+++ b/src/lib/tools/wiki_update.schema.ts
@@ -0,0 +1,36 @@
+/**
+ * Schema-only export for wiki_update. Impl lives in `./wiki_update`.
+ */
+import { MAX_WIKI_TITLE_CHARS, MAX_WIKI_CONTENT_CHARS } from '../wiki';
+
+export const wikiUpdateSchema = {
+ name: 'wiki_update',
+ description:
+ 'Update a wiki article by id. Omit a field to leave it unchanged. ' +
+ `title capped at ${MAX_WIKI_TITLE_CHARS} chars (must remain unique per user); ` +
+ `content capped at ${MAX_WIKI_CONTENT_CHARS} chars. Use wiki_search to find ` +
+ 'the id. Returns the updated row. Preserve existing facts unless the ' +
+ 'user has explicitly contradicted them.',
+ shortDescription: 'edit a wiki article',
+ parameters: {
+ type: 'object',
+ properties: {
+ id: {
+ type: 'string',
+ description: 'UUID of the article (from wiki_search).',
+ },
+ title: {
+ type: 'string',
+ minLength: 1,
+ maxLength: MAX_WIKI_TITLE_CHARS,
+ },
+ content: {
+ type: 'string',
+ minLength: 1,
+ maxLength: MAX_WIKI_CONTENT_CHARS,
+ },
+ },
+ required: ['id'],
+ additionalProperties: false,
+ },
+} as const;
diff --git a/src/lib/tools/wiki_update.ts b/src/lib/tools/wiki_update.ts
new file mode 100644
index 0000000..a53c06e
--- /dev/null
+++ b/src/lib/tools/wiki_update.ts
@@ -0,0 +1,42 @@
+/**
+ * Patch an existing wiki article's title and/or content. Either field
+ * can be omitted. Any change to title or content fires the schema
+ * trigger that nulls the embedding, sending the row back to the
+ * worker's pending queue.
+ */
+import type { ToolDef } from './types';
+import { MAX_WIKI_TITLE_CHARS, MAX_WIKI_CONTENT_CHARS } from '../wiki';
+import { wikiUpdateSchema } from './wiki_update.schema';
+import { emitWikiChange } from '../wiki-events';
+
+export const wikiUpdate: ToolDef = {
+ ...wikiUpdateSchema,
+ async execute(args, ctx) {
+ const id = typeof args.id === 'string' ? args.id : '';
+ if (!id) throw new Error('id is required');
+ const patch: { title?: string; content?: string } = {};
+ if (typeof args.title === 'string' && args.title.trim().length > 0) {
+ const title = args.title.trim();
+ if (title.length > MAX_WIKI_TITLE_CHARS) {
+ throw new Error(
+ `title exceeds ${MAX_WIKI_TITLE_CHARS}-char limit (got ${title.length})`
+ );
+ }
+ patch.title = title;
+ }
+ if (typeof args.content === 'string' && args.content.length > 0) {
+ if (args.content.length > MAX_WIKI_CONTENT_CHARS) {
+ throw new Error(
+ `content exceeds ${MAX_WIKI_CONTENT_CHARS}-char limit (got ${args.content.length}); split or trim`
+ );
+ }
+ patch.content = args.content;
+ }
+ if (Object.keys(patch).length === 0) {
+ throw new Error('provide at least one of title or content');
+ }
+ const article = await ctx.supabase.updateWikiArticle(id, patch);
+ emitWikiChange();
+ return article;
+ },
+};
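The field-by-field gating in `execute` can be read as a standalone rule: a field enters the patch only when present and non-empty, and an all-empty call is rejected rather than silently no-oping. A hypothetical `buildPatch` helper (not part of the diff) that isolates it:

```typescript
// Illustrative extraction of wiki_update's patch-building rule.
function buildPatch(args: { title?: unknown; content?: unknown }): {
  title?: string;
  content?: string;
} {
  const patch: { title?: string; content?: string } = {};
  // Title: present, non-blank after trimming.
  if (typeof args.title === 'string' && args.title.trim().length > 0) {
    patch.title = args.title.trim();
  }
  // Content: present, non-empty (whitespace-significant, so no trim).
  if (typeof args.content === 'string' && args.content.length > 0) {
    patch.content = args.content;
  }
  if (Object.keys(patch).length === 0) {
    throw new Error('provide at least one of title or content');
  }
  return patch;
}
```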
diff --git a/src/lib/wiki-events.ts b/src/lib/wiki-events.ts
new file mode 100644
index 0000000..f82d9a4
--- /dev/null
+++ b/src/lib/wiki-events.ts
@@ -0,0 +1,25 @@
+/**
+ * Window-level event bus for cross-surface notification of wiki
+ * writes. Parallel to `journal-events.ts`. Fired whenever a wiki
+ * article is created/updated/deleted - by the user via Wiki.svelte,
+ * by the LLM via the wiki_create/_update/_delete tools, or by the
+ * autonomous wiki agent. The drawer listing and the open article
+ * panel both listen and refetch.
+ *
+ * Single-tab consistency only - Supabase realtime is not subscribed
+ * for `wiki_articles`. A future realtime subscriber can fire this
+ * same event from INSERT/UPDATE/DELETE without consumers changing.
+ */
+export const WIKI_CHANGE_EVENT = 'nak:wiki-change';
+
+export function emitWikiChange(): void {
+ if (typeof window === 'undefined') return;
+ window.dispatchEvent(new CustomEvent(WIKI_CHANGE_EVENT));
+}
+
+export function onWikiChange(handler: () => void): () => void {
+ if (typeof window === 'undefined') return () => undefined;
+ const listener = (): void => handler();
+ window.addEventListener(WIKI_CHANGE_EVENT, listener);
+ return () => window.removeEventListener(WIKI_CHANGE_EVENT, listener);
+}
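`wiki-events.ts` only does anything in a browser. The same contract can be sketched against a local `EventTarget` so it runs anywhere; the key property is that the subscribe function hands back an unsubscriber, so listeners cannot outlive a component:

```typescript
// Minimal re-sketch of the event-bus contract using a local
// EventTarget instead of window; the real module dispatches on window.
const bus = new EventTarget();
const EVENT = 'nak:wiki-change';

function emit(): void {
  bus.dispatchEvent(new Event(EVENT));
}

function on(handler: () => void): () => void {
  const listener = (): void => handler();
  bus.addEventListener(EVENT, listener);
  // Returning the teardown lets callers pair subscribe/unsubscribe
  // with a component lifecycle (e.g. inside a Svelte $effect).
  return () => bus.removeEventListener(EVENT, listener);
}
```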
diff --git a/src/lib/wiki-store.svelte.ts b/src/lib/wiki-store.svelte.ts
new file mode 100644
index 0000000..b673a35
--- /dev/null
+++ b/src/lib/wiki-store.svelte.ts
@@ -0,0 +1,103 @@
+/**
+ * Reactive store shared by the Wiki drawer tab. Sidebar
+ * (`WikiList.svelte`) and main panel (`Wiki.svelte`) both bind
+ * against `wikiStore`, so a search keystroke in the sidebar filters
+ * the drawer listing and a panel-side mutation (create / edit /
+ * delete) updates the drawer without a refetch.
+ *
+ * Parallel to `memories-store.svelte.ts` and
+ * `journal-store.svelte.ts`. Search semantics call
+ * `searchWikiArticlesSemantic` so the drawer matches the assistant's
+ * `wiki_search` tool exactly.
+ */
+import type { SupabaseService, WikiArticle } from './supabase';
+import type { VeniceClient } from './venice';
+import { searchWikiArticlesSemantic } from './wiki';
+
+interface WikiStore {
+ results: WikiArticle[];
+ loading: boolean;
+ /** Set true after the first `runWikiSearch` resolves, success or error. */
+ loaded: boolean;
+ error: string | null;
+ /** Bound to the sidebar search input. */
+ query: string;
+}
+
+export const wikiStore = $state<WikiStore>({
+ results: [],
+ loading: false,
+ loaded: false,
+ error: null,
+ query: '',
+});
+
+// Match the assistant's `wiki_search` per-call cap so the human UI
+// never hides articles the assistant can reach.
+export const WIKI_LIST_LIMIT = 100;
+
+let currentAbort: AbortController | null = null;
+
+/**
+ * Run a fresh search against `wikiStore.query`. Callers should
+ * debounce - this runs immediately. Cancels any in-flight request so
+ * a stale result can't clobber the latest query.
+ */
+export async function runWikiSearch(
+ supabase: SupabaseService,
+ venice: VeniceClient | null,
+): Promise<void> {
+ if (currentAbort) currentAbort.abort();
+ const ctl = new AbortController();
+ currentAbort = ctl;
+ wikiStore.loading = true;
+ wikiStore.error = null;
+ try {
+ const hits = await searchWikiArticlesSemantic(
+ wikiStore.query.trim(),
+ WIKI_LIST_LIMIT,
+ { supabase, venice, signal: ctl.signal },
+ );
+ if (ctl.signal.aborted) return;
+ wikiStore.results = hits;
+ } catch (err) {
+ if (ctl.signal.aborted) return;
+ wikiStore.error = err instanceof Error ? err.message : String(err);
+ } finally {
+ if (currentAbort === ctl) currentAbort = null;
+ if (!ctl.signal.aborted) {
+ wikiStore.loading = false;
+ wikiStore.loaded = true;
+ }
+ }
+}
+
+/**
+ * Replace one row in `results` without re-querying. Called from the
+ * panel after `updateWikiArticle` so the list doesn't visually
+ * reorder while the user is mid-edit.
+ */
+export function patchWikiRow(id: string, patch: Partial<WikiArticle>): void {
+ wikiStore.results = wikiStore.results.map((a) =>
+ a.id === id ? { ...a, ...patch } : a,
+ );
+}
+
+/** Drop one row from the list. */
+export function removeWikiRow(id: string): void {
+ wikiStore.results = wikiStore.results.filter((a) => a.id !== id);
+}
+
+/**
+ * Insert a freshly-created article into the list at its alphabetical
+ * position (case-insensitive on title) so the listing stays sorted
+ * without a refetch.
+ */
+export function addWikiRow(row: WikiArticle): void {
+ const next = [...wikiStore.results];
+ const lc = row.title.toLowerCase();
+ let insertAt = next.findIndex((a) => a.title.toLowerCase() > lc);
+ if (insertAt < 0) insertAt = next.length;
+ next.splice(insertAt, 0, row);
+ wikiStore.results = next;
+}
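The insertion rule `addWikiRow` relies on can be extracted as a pure function: find the first title that sorts after the new one (case-insensitive) and splice the row in there, which keeps an already-sorted list sorted without a refetch. A self-contained version of that rule:

```typescript
// Pure-function sketch of addWikiRow's insertion position logic.
interface TitledRow {
  id: string;
  title: string;
}

function insertSorted<T extends TitledRow>(rows: T[], row: T): T[] {
  const next = [...rows]; // never mutate the caller's array
  const lc = row.title.toLowerCase();
  let insertAt = next.findIndex((a) => a.title.toLowerCase() > lc);
  if (insertAt < 0) insertAt = next.length; // new title sorts last
  next.splice(insertAt, 0, row);
  return next;
}
```

Note this is simple linear insertion; at the `WIKI_LIST_LIMIT = 100` cap, a binary search would buy nothing.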
diff --git a/src/lib/wiki.ts b/src/lib/wiki.ts
new file mode 100644
index 0000000..c99b272
--- /dev/null
+++ b/src/lib/wiki.ts
@@ -0,0 +1,89 @@
+/**
+ * Shared semantic-search pipeline for the user wiki. Parallel to
+ * `src/lib/memories.ts` and the `src/lib/journal-events.ts` family - the
+ * UI (`src/components/WikiList.svelte`, `src/screens/Wiki.svelte`) and
+ * the LLM-facing tool (`src/lib/tools/wiki_search.ts`) both call
+ * `searchWikiArticlesSemantic` so the user finds what the assistant
+ * finds.
+ *
+ * The merge contract lives in `SupabaseService.searchWikiArticles`:
+ * vector hits first (RPC), then ILIKE hits the vector path missed,
+ * deduped by id and capped at `limit`. This module's only job is to
+ * embed the query via Venice (when available) and hand the embedding
+ * to the supabase method.
+ *
+ * Why the silent Venice fallback: a transient Venice error here would
+ * otherwise blank the drawer. ILIKE-only is strictly better than a
+ * hard error from the user's POV.
+ */
+
+import type { SupabaseService, WikiArticle } from './supabase';
+import type { VeniceClient } from './venice';
+import { VENICE_EMBEDDING_MODEL, padEmbeddingForStorage } from './models';
+
+export interface SearchWikiDeps {
+ supabase: SupabaseService;
+ venice: VeniceClient | null;
+ signal?: AbortSignal;
+}
+
+/**
+ * Length ceiling on the article title. Defensive cap so a corrupt or
+ * model-generated title can't balloon the prompt or break the
+ * sidebar layout. The `(user_id, title)` unique index has no length
+ * limit at the DB layer; this is the application-side bound.
+ */
+export const MAX_WIKI_TITLE_CHARS = 200;
+
+/**
+ * Length ceiling on the article body. Mirrors the journal's
+ * `MAX_JOURNAL_CONTENT_CHARS = 16000`. The embedding source includes
+ * title + content so the cap also protects the embedding input from
+ * blowing past the embedding model's window.
+ */
+export const MAX_WIKI_CONTENT_CHARS = 16000;
+
+export async function searchWikiArticlesSemantic(
+ query: string,
+ limit: number,
+ deps: SearchWikiDeps,
+): Promise<WikiArticle[]> {
+ const { supabase, venice, signal } = deps;
+ const trimmed = query.trim();
+
+ // Empty query: alphabetical listing. searchWikiArticles short-
+ // circuits to listWikiArticles in this branch.
+ if (trimmed.length === 0) {
+ return supabase.searchWikiArticles({ query: '', queryEmbedding: null, limit });
+ }
+
+ // No Venice client: ILIKE-only.
+ if (!venice) {
+ return supabase.searchWikiArticles({ query: trimmed, queryEmbedding: null, limit });
+ }
+
+ let rawEmbedding: number[] | undefined;
+ try {
+ const response = await venice.embed({
+ model: VENICE_EMBEDDING_MODEL,
+ input: trimmed,
+ signal,
+ });
+ rawEmbedding = response.data[0]?.embedding;
+ } catch {
+ // Silent fallback - see file-level comment.
+ return supabase.searchWikiArticles({ query: trimmed, queryEmbedding: null, limit });
+ }
+
+ if (!rawEmbedding || rawEmbedding.length === 0) {
+ return supabase.searchWikiArticles({ query: trimmed, queryEmbedding: null, limit });
+ }
+
+ // Pad to the column's storage dim (2048). Cosine-invariant.
+ const queryEmbedding = padEmbeddingForStorage(rawEmbedding);
+ return supabase.searchWikiArticles({
+ query: trimmed,
+ queryEmbedding,
+ limit,
+ });
+}
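The "Cosine-invariant" note deserves a line of justification: appending zeros changes neither the dot product nor either vector's norm, so cosine similarity between zero-padded vectors equals that of the originals. A quick demonstration, where `pad` is a stand-in for `padEmbeddingForStorage`:

```typescript
// Cosine similarity of two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0,
    na = 0,
    nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Stand-in for padEmbeddingForStorage: zero-fill up to the storage dim.
function pad(v: number[], dim: number): number[] {
  return [...v, ...new Array(dim - v.length).fill(0)];
}
```

This is why padding a shorter query embedding to the column's 2048-dim storage width is safe for vector search, provided the stored vectors were padded the same way.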
diff --git a/src/screens/Chat.svelte b/src/screens/Chat.svelte
index 7ab5441..3218ae7 100644
--- a/src/screens/Chat.svelte
+++ b/src/screens/Chat.svelte
@@ -111,6 +111,7 @@
type CookbookComponent = typeof import('./Cookbook.svelte').default;
type JournalComponent = typeof import('./Journal.svelte').default;
type MemoriesComponent = typeof import('./Memories.svelte').default;
+ type WikiComponent = typeof import('./Wiki.svelte').default;
type SettingsComponent = typeof import('./Settings.svelte').default;
type HelpComponent = typeof import('./Help.svelte').default;
type SamskaraComponent = typeof import('./Samskara.svelte').default;
@@ -118,6 +119,7 @@
import RecipeList from '../components/RecipeList.svelte';
import JournalList from '../components/JournalList.svelte';
import MemoryList from '../components/MemoryList.svelte';
+ import WikiList from '../components/WikiList.svelte';
import IntuitionPill from '../components/IntuitionPill.svelte';
import IntuitionCard from '../components/IntuitionCard.svelte';
import {
@@ -134,6 +136,11 @@
memoriesStore,
runMemoriesSearch,
} from '$lib/memories-store.svelte';
+ import {
+ wikiStore,
+ runWikiSearch,
+ } from '$lib/wiki-store.svelte';
+ import { onWikiChange } from '$lib/wiki-events';
import { todayInZone, shiftDay } from '$lib/journal-day';
import { moodState } from '$lib/samskara/mood.svelte';
import { bandIndexFor, columnFor } from '$lib/samskara/events';
@@ -195,6 +202,7 @@
let CookbookComp: CookbookComponent | null = $state(null);
let JournalComp: JournalComponent | null = $state(null);
let MemoriesComp: MemoriesComponent | null = $state(null);
+ let WikiComp: WikiComponent | null = $state(null);
let SettingsComp: SettingsComponent | null = $state(null);
let HelpComp: HelpComponent | null = $state(null);
let SamskaraComp: SamskaraComponent | null = $state(null);
@@ -233,6 +241,11 @@
void import('./Memories.svelte').then((m) => (MemoriesComp = m.default));
}
});
+ $effect(() => {
+ if (drawerTab === 'wiki' && !WikiComp) {
+ void import('./Wiki.svelte').then((m) => (WikiComp = m.default));
+ }
+ });
$effect(() => {
if (showSettings && !SettingsComp) {
void import('./Settings.svelte').then((m) => (SettingsComp = m.default));
@@ -269,7 +282,7 @@
* use replaceState so a tab flip doesn't fill history with
* UI-chrome entries.
*/
- const drawerTab = $derived<'chats' | 'recipes' | 'journal' | 'memories'>(
+ const drawerTab = $derived<'chats' | 'recipes' | 'journal' | 'memories' | 'wiki'>(
route.drawer ?? 'chats'
);
// Recipe, journal, and memory search/listing state has moved to the
@@ -307,6 +320,17 @@
}
}
+ // Wiki drawer tab. Same lazy-load shape as memories - WikiList's
+ // $effect fires the first search via the shared `wikiStore`, but
+ // kicking it on tab-pick lets a deep-linked panel land on a non-empty
+ // listing.
+ function onPickWikiTab(): void {
+ navigate({ drawer: 'wiki' }, { replace: true });
+ if (app.supabase && !wikiStore.loaded && !wikiStore.loading) {
+ void runWikiSearch(app.supabase, app.venice);
+ }
+ }
+
// When the user (or a popstate pop) lands on `?drawer=recipes`
// without having gone through onPickRecipesTab, still make sure the
// recipe list is fetched so the drawer isn't blank.
@@ -340,6 +364,26 @@
void runMemoriesSearch(app.supabase, app.venice);
});
+ // Parallel for the wiki tab. Same `loaded`-gate rationale - an
+ // account with zero articles would re-fire the load forever
+ // otherwise.
+ $effect(() => {
+ if (route.drawer !== 'wiki') return;
+ if (!app.supabase) return;
+ if (wikiStore.loaded || wikiStore.loading) return;
+ void runWikiSearch(app.supabase, app.venice);
+ });
+
+ // Wiki cross-surface change channel. The chat-side wiki_* tool calls
+ // and the autonomous wiki worker both fire WIKI_CHANGE_EVENT after a
+ // write; refresh the drawer's listing so the new/updated row shows
+ // up without the user navigating away and back.
+ function onWikiStoreChanged(): void {
+ if (!app.supabase) return;
+ if (!wikiStore.loaded) return;
+ void runWikiSearch(app.supabase, app.venice);
+ }
+
function onJournalStoreChanged(): void {
// Any journal write (tool path, worker path, modal compose save)
// invalidates the list - only reload if we've already loaded it,
@@ -1349,10 +1393,12 @@
// fresh unlock that never opened those tabs stays lazy.
const offCookbook = onCookbookChange(onCookbookStoreChanged);
const offJournal = onJournalChange(onJournalStoreChanged);
+ const offWiki = onWikiChange(onWikiStoreChanged);
return () => {
unsubscribe();
offCookbook();
offJournal();
+ offWiki();
};
});
@@ -3949,6 +3995,16 @@
onclick={() => onPickMemoriesTab()}
>Memories</button>
+          <button
+            onclick={() => onPickWikiTab()}
+          >Wiki</button>
{#if drawerTab === 'chats'}
@@ -4158,11 +4214,16 @@
Clicking a date navigates to that day in the main panel.
onSelect mirrors the recipe + thread flow on mobile. -->
- {:else}
+ {:else if drawerTab === 'memories'}
+ {:else}
+   <WikiList />
{/if}