diff --git a/SQLITE-PLAN.md b/SQLITE-PLAN.md new file mode 100644 index 0000000000..fdc1396bbc --- /dev/null +++ b/SQLITE-PLAN.md @@ -0,0 +1,320 @@ +# SQLite Support Plan (Prisma Next) + +This document is a handoff plan for implementing first-class SQLite support in Prisma Next, including a duplicated SQLite demo app with all existing demo queries ported. + +It is written to be executable by a coding agent with minimal additional context. + +--- + +## Goals + +1. Add a new SQL target: **SQLite**. +1. Keep the architecture contract-first and target-agnostic in core: **dialect logic stays in targets/adapters/drivers**. +1. Support a pluggable SQLite driver backend: + - Node 24: `node:sqlite` + - Bun: `bun:sqlite` +1. Support the existing query surface used by the demos: + - SQL lane: `select`, `where`, joins, `limit`, `orderBy`, `includeMany` + - DML: `insert`, `update`, `delete`, `returning()` + - ORM lane queries in `examples/prisma-next-demo/src/queries/*` + - Kysely integration queries in `examples/prisma-next-demo/src/kysely/*` (via `@prisma-next/integration-kysely`) +1. Duplicate `examples/prisma-next-demo` and port all queries/commands to SQLite. + +## Non-Goals (Explicitly Out of Scope Unless You Decide Otherwise) + +1. Perfect feature parity with Postgres across the full codebase. +1. Multi-tenant schema namespacing in SQLite beyond "one DB file per contract" (see ADR 122). +1. Optimizing for extreme concurrency (SQLite is single-writer; handle correctness first). +1. Duplicating `examples/prisma-orm-demo` for SQLite parity. + +--- + +## Architectural Boundaries To Respect (Read These First) + +### Domains / Layers / Planes + +Use the repo’s layering and plane rules as the primary constraint system: + +- **Framework domain** (`packages/1-framework/**`): target-agnostic core types/runtime tooling. +- **SQL family domain** (`packages/2-sql/**`): dialect-agnostic SQL lane/runtime/family tooling. 
+- **Targets domain** (`packages/3-targets/**`): concrete dialect packs: target descriptor, adapter, driver. +- **Extensions domain** (`packages/3-extensions/**`): optional packs (e.g. pgvector). + +Key boundary rules (see `docs/architecture docs/Package-Layering.md` + `architecture.config.json`): + +- No “target branches” in core: keep dialect-specific behavior out of SQL lanes/runtime. +- Migration plane must not import runtime-plane code. +- Shared plane must not import runtime/migration-plane code. + +### Where SQLite Code Must Live + +Implement SQLite as three packages mirroring Postgres: + +- `packages/3-targets/3-targets/sqlite` → `@prisma-next/target-sqlite` + - **Plane**: migration + runtime entrypoints + - **Owns**: migration planner/runner for SQLite (DDL strategy, checks), target descriptor/pack ref +- `packages/3-targets/6-adapters/sqlite` → `@prisma-next/adapter-sqlite` + - **Plane**: shared + migration + runtime entrypoints + - **Owns**: dialect lowering, capabilities, codecs, control-plane introspection + default normalization, error normalization +- `packages/3-targets/7-drivers/sqlite` → `@prisma-next/driver-sqlite` + - **Plane**: migration + runtime entrypoints + - **Owns**: transport/connection to SQLite (file DB), query/execute/explain plumbing, streaming strategy + +Update `architecture.config.json` to map these packages to the correct domain/layer/plane globs. + +--- + +## What Must Change In Existing Code (Extension Points) + +### 1. Marker Storage Is Currently Postgres-Specific In SQL Family Runtime + +Observed: + +- `packages/2-sql/5-runtime/src/sql-marker.ts` hardcodes Postgres-only DDL/types (`create schema`, `jsonb`, `timestamptz`, `now()`, `$1`). +- `packages/2-sql/5-runtime/src/sql-family-adapter.ts` uses `readContractMarker()` from the Postgres-shaped marker module for *all* SQL targets. +- `packages/2-sql/3-tooling/family/src/core/verify.ts` also hardcodes `from prisma_contract.marker where id = $1`. 
+ +Why this blocks SQLite: + +- ADR 021 specifies SQLite marker lives in **`prisma_contract_marker`** (no schemas) and must use SQLite-compatible column types + SQL. + +Required change (choose one approach; prefer A): + +**A. Push marker SQL into adapters (recommended; matches ADR 021 + ADR 005):** + +- Runtime: make SQL family runtime read marker via the **runtime adapter instance** (dialect-owned). +- Control plane: make SQL family control-plane verify/sign/readMarker use the **control adapter instance** (dialect-owned). + +**B. Branch on target inside SQL family for marker only (acceptable fallback):** + +- Add `if (contract.target === 'sqlite') ...` branches in SQL family runtime/control to generate target-specific marker SQL. +- This violates “thin core, fat targets” more than A; document the debt in “Architectural Challenges”. + +Acceptance criteria for this workstream: + +- Runtime verification works for both Postgres and SQLite contracts. +- `pnpm lint:deps` passes (no new plane violations). + +Implementation notes (current repo state): + +- Marker reads are adapter-owned at runtime (SQL family runtime calls an adapter hook to obtain the marker read statement). +- Control-plane marker DDL/reads are target-aware to support SQLite’s flat namespace marker table. + +### 2. includeMany Gating Uses `lateral` + `jsonAgg` + +Observed: + +- SQL lane gates `includeMany()` on `contract.capabilities[contract.target].lateral === true` and `jsonAgg === true` (`packages/2-sql/4-lanes/sql-lane/src/utils/capabilities.ts`). +- Postgres implements includeMany via `LEFT JOIN LATERAL ... json_agg(...)`. + +SQLite reality: + +- SQLite does **not** have `LATERAL`, but *can* implement includeMany using correlated subqueries plus JSON aggregation (JSON1) if available. + +Required change: + +- Either: + - Keep the existing capability keys but reinterpret `lateral` as “supports includeMany strategy” (not ideal; document it). + - Or introduce a new capability key (recommended), e.g. 
`includeMany: true`, and update lane gating + docs accordingly. + +Acceptance criteria: + +- `includeMany()` works on SQLite demo queries and returns `[]` for no children (see `decodeRow()` behavior in `packages/2-sql/5-runtime/src/codecs/decoding.ts`). + +Implementation notes (current repo state): + +- We kept the existing gating keys for now and set `capabilities.sqlite.lateral = true` and `capabilities.sqlite.jsonAgg = true` on the SQLite demo contract so `includeMany()` is enabled. +- SQLite lowering implements includeMany via a correlated subquery plus JSON1 (`json_group_array/json_object`). + +### 3. Control-Plane Introspection Must Be Implemented For SQLite + +Observed: + +- SQL family control instance delegates introspection to the adapter’s `SqlControlAdapter.introspect()` implementation (Postgres exists). + +Required: + +- Add `SqliteControlAdapter` in `@prisma-next/adapter-sqlite`: + - tables: `sqlite_master` + - columns: `pragma_table_info` + - indexes: `pragma_index_list` + `pragma_index_info` + - foreign keys: `pragma_foreign_key_list` + - defaults: use `dflt_value` and implement `parseSqliteDefault` normalization + +Acceptance criteria: + +- `prisma-next db introspect` and `db schema-verify` work with SQLite demo DB. + +Implementation notes (current repo state): + +- SQLite introspection normalizes `nativeType` to lower-case (`INTEGER` → `integer`) to match contract native types. +- SQLite introspection excludes target-owned control tables (`prisma_contract_marker`, `prisma_contract_ledger`) so strict schema verification does not fail on internal tables. + +### 4. Migration Planner/Runner For SQLite Must Exist (At Least For “db init”) + +You need a `@prisma-next/target-sqlite` planner/runner analogous to Postgres: + +- Planner: additive-only init planner (empty DB → contract schema) + - SQLite DDL limitations: ALTER TABLE is limited; prefer “init from empty DB” as MVP. 
+- Runner: execute plan, verify schema, write marker + ledger (SQLite-flavored tables). +- Locks: SQLite has no advisory locks (ADR 043). Either implement lease table lock for correctness or document the limitation explicitly. + +Acceptance criteria: + +- The duplicated SQLite demo can run end-to-end: init DB, seed, execute queries. + +Documentation requirement: + +- Add `SQLITE_MIGRATIONS.md` describing the additive-only MVP and a concrete future strategy for non-additive diffs (table rebuild). + +### 5. Driver: Implement SQLite Transport (Runtime + Control) + +Add `@prisma-next/driver-sqlite`: + +- Control driver (`./control`): implements `ControlDriverInstance<'sql','sqlite'>` with `query()`. +- Runtime driver (`./runtime`): implements `SqlDriver` with: + - `connect()` + - `acquireConnection()` / `beginTransaction()` / commit/rollback + - `execute()` as `AsyncIterable` (chunked iteration is fine; see ADR 125) + - optional `explain()` (SQLite `EXPLAIN QUERY PLAN ...`) + +Chosen in this repo (current state): + +- Support both: + - Node 24’s built-in `node:sqlite` module (`DatabaseSync`) + - Bun’s `bun:sqlite` +- Work around bundlers that strip `node:` prefixes by loading the module via `createRequire()` with a runtime-built specifier (see `packages/3-targets/7-drivers/sqlite/src/node-sqlite.ts`). +- Raw SQL lane `$1` placeholders are normalized to SQLite `?1` placeholders in the SQLite driver. +- Prisma-style `file:./dev.db` connection strings must be resolved relative to `process.cwd()` (the URL constructor is not sufficient). + +### 6. Adapter: Implement SQLite Lowering + Codecs + +Add `@prisma-next/adapter-sqlite`: + +- Capabilities: at minimum: `orderBy`, `limit`, `returning`, `jsonAgg` (if JSON1), and whatever you decide for includeMany gating. +- Lowering: + - Identifiers quoted with `"` (SQLite compatible). + - Params: prefer `?{n}` placeholders (ADR 065 baseline). + - SELECT: joins/where/order/limit. 
+ - includeMany: correlated subquery producing JSON array string: + - `SELECT (SELECT json_group_array(json_object(...)) FROM child WHERE ... ORDER BY ... LIMIT ...) AS posts` + - DML: INSERT/UPDATE/DELETE with `RETURNING` when capability is enabled. +- Codecs: + - int → number + - text → string + - datetime/timestamp → string (or Date, but be consistent; Postgres demo currently treats timestamp codecs as string) + - bool: store as integer 0/1 (encode/decode) + +### 7. Kysely Integration Must Support SQLite + +The native demo includes a Kysely example (`examples/prisma-next-demo/src/kysely/*`) using +`@prisma-next/integration-kysely`. Porting the demo to SQLite therefore requires SQLite support +in the Kysely integration extension. + +Required change: + +- Extend `packages/3-extensions/integration-kysely` so `KyselyPrismaDialect` supports `contract.target === 'sqlite'`: + - Use Kysely’s SQLite dialect primitives (`SqliteAdapter`, `SqliteIntrospector`, `SqliteQueryCompiler`). + - Ensure the runtime driver’s placeholder strategy works with Kysely (SQLite uses `?`/`?1` placeholders). + +Acceptance criteria: + +- SQLite demo commands `user-kysely` and `user-transaction-kysely` work end-to-end. + +### 8. Duplicate Demo Apps + Port Queries + +Create new example(s) under `examples/`: + +1. `examples/prisma-next-demo-sqlite/` (native lane demo) + +Porting tasks: + +- New `prisma-next.config.ts` using sqlite target/adapter/driver and a SQLite connection string (likely a file path). +- New contract definition using sqlite column types and `@prisma-next/target-sqlite/pack`. +- Update runtime factory to use sqlite driver options. +- Seed script: + - If you cannot support vector similarity in SQLite, you must still “port” the query by either: + - Implementing a SQLite vector extension pack (preferred), or + - Replacing similarity search with a SQLite-available operation and documenting the semantic change. 
+ +Important: “Port all queries” means every demo query module has a SQLite equivalent and runs: + +- `examples/prisma-next-demo/src/queries/*.ts` +- `examples/prisma-next-demo/src/kysely/*.ts` (requires `@prisma-next/integration-kysely` SQLite support) + +Implementation notes (current repo state): + +- `examples/prisma-next-demo-sqlite/` exists and has all demo queries ported. +- Vector similarity is supported via `@prisma-next/extension-sqlite-vector` (stores vectors as JSON text) plus a SQLite UDF (`cosine_distance`) registered by the demo runtime. + +### 9. Tests + +Add SQLite-focused tests at three levels: + +- Package tests: + - `@prisma-next/adapter-sqlite`: lowering golden tests (AST → SQL), includeMany SQL shape tests. + - `@prisma-next/driver-sqlite`: execute/query/transaction behavior tests. + - `@prisma-next/target-sqlite`: planner/runner unit tests. +- Example tests: + - Copy `examples/prisma-next-demo/test/*` patterns but use a SQLite temp DB file fixture instead of `withDevDatabase`. +- Integration/e2e: + - Add at least one CLI test covering `db init` and `db verify` on SQLite. + +--- + +## Concrete Implementation Steps (Recommended Order) + +1. Scaffold packages: + - `@prisma-next/target-sqlite` + - `@prisma-next/adapter-sqlite` + - `@prisma-next/driver-sqlite` + - Wire `architecture.config.json` mappings + - Add `tsconfig` references (see `tsconfig.base.json` pattern) +1. Implement `@prisma-next/driver-sqlite` (runtime + control). +1. Implement `@prisma-next/adapter-sqlite`: + - codecs + - lowering for select/join/where/order/limit + - DML + returning + - includeMany lowering strategy + - control adapter introspection + default normalization +1. Fix marker plumbing to support both Postgres and SQLite (prefer adapter-owned marker statements). +1. Implement `@prisma-next/target-sqlite` migrations init planner/runner. +1. 
Duplicate `examples/prisma-next-demo` → `examples/prisma-next-demo-sqlite` and port: + - contract + - config + - runtime wiring + - seed + - queries +1. Add tests and make `pnpm test:packages` + `pnpm test:examples` green. +1. Update docs: + - `docs/reference/capabilities.md` (ensure it matches actual capability namespaces/keys) + - Add a short `packages/3-targets/**/sqlite/README.md` trilogy similar to Postgres. + +--- + +## Architectural Challenges (Write This As You Implement) + +As you implement, maintain a running list under this section (in this file or a PR description) of architectural problems you encounter, with: + +1. The exact file(s) involved +1. What boundary/ADR/rule it violates or stresses +1. The minimal viable fix +1. The “correct” long-term fix +1. Any rule/doc updates required + +You should expect at least these issues to come up: + +- SQL marker code (`packages/2-sql/5-runtime/src/sql-marker.ts`) is Postgres-only but lives in SQL family runtime. +- SQL family tooling core is mapped as “shared plane” but currently imports runtime code (`@prisma-next/sql-runtime`) in `packages/2-sql/3-tooling/family/src/core/control-instance.ts`. +- Capability namespaces/keys in code vs `docs/reference/capabilities.md` diverge (code uses `contract.capabilities[contract.target]` with keys like `lateral`, not `sql.lateral`). +- includeMany capability key (`lateral`) does not generalize to SQLite’s correlated-subquery implementation. +- `node:sqlite` is only available under the `node:` scheme, but some bundlers strip `node:` prefixes in output. The SQLite driver must avoid static imports from `node:sqlite` or compensate. +- `bun:sqlite` differs from `node:sqlite` in parameter binding semantics. In particular, Bun supports positional binding for `?1` placeholders, while Node requires numeric binding objects; the driver must implement backend-specific binding behavior. +- `bun:sqlite` does not support registering JS UDFs in the same way as `node:sqlite`. 
Avoid designs that require UDF registration for core/demo functionality (e.g. prefer pure SQL lowerings for extension packs when feasible). +- Prisma-style SQLite connection strings like `file:./dev.db` are not valid standard file URLs. Do not rely on `new URL()` to resolve them; resolve relative paths against `process.cwd()`. +- SQLite has no schema namespace, so strict schema verification must ignore target-owned control tables (`prisma_contract_*`) or they will be reported as “extra tables”. + +If you decide rules must change: + +- List the exact `.cursor/rules/*.mdc` file(s) and the minimal change. +- Update the corresponding docs/ADR references where appropriate. diff --git a/SQLITE_MIGRATIONS.md b/SQLITE_MIGRATIONS.md new file mode 100644 index 0000000000..6942e361e5 --- /dev/null +++ b/SQLITE_MIGRATIONS.md @@ -0,0 +1,136 @@ +# SQLite Migrations (Prisma Next) + +This document describes the current SQLite migration scope in Prisma Next and a future roadmap for broader migration support. + +It is intentionally **target-owned**: the implementation lives in `@prisma-next/target-sqlite` and must not leak dialect-specific logic into SQL family lanes/runtime. 
+ +## Current Scope (MVP) + +SQLite migrations are currently **additive-only** and optimized for the primary MVP flow: + +- `prisma-next db init` on an empty SQLite database file +- contract marker + ledger tables written in the same DB + +The SQLite migration planner/runner live in: + +- `packages/3-targets/3-targets/sqlite/src/core/migrations/planner.ts` +- `packages/3-targets/3-targets/sqlite/src/core/migrations/runner.ts` + +### Supported Operations + +- Create missing tables +- Create missing columns (nullable only) +- Create indexes / unique indexes (where supported by SQLite) +- Create foreign keys as part of **new table creation** +- Create and maintain target-owned control tables: + - `prisma_contract_marker` + - `prisma_contract_ledger` + +### Explicitly Unsupported (For Now) + +These changes require a table rebuild strategy and are expected to fail fast with actionable errors: + +- Dropping columns +- Changing column types or nullability +- Changing default expressions +- Adding/removing/changing foreign keys on existing tables +- Renaming tables/columns (unless represented as drop+add with rebuild) +- Altering primary keys + +## Why This Is Hard In SQLite + +SQLite has limited `ALTER TABLE` support. Many schema changes require: + +- Creating a new table with the desired schema +- Copying data over +- Dropping the old table +- Renaming the new table +- Recreating indexes and constraints + +This is feasible, but it has non-trivial edge cases and needs a careful, deterministic planner so `db init` and `db verify` remain reliable. + +## Roadmap: Full Diff Migrations (Plan B) + +This section describes a concrete future approach to support broader diffs while preserving Prisma Next architectural boundaries. 
+ +### 1) Detect Which Tables Need Rebuild + +During planning, classify per-table diffs into: + +- **Additive** (can be done with CREATE TABLE / ADD COLUMN / CREATE INDEX) +- **Rebuild-required** (anything that SQLite cannot express as a safe ALTER) + +Examples of rebuild-required diffs: + +- Drop/rename column +- Type change +- Not-null change +- PK change +- FK change on existing table + +### 2) Rebuild Algorithm (Single Table) + +For each table `T` that must be rebuilt: + +1. Compute `T_new` schema from the **desired contract**. +1. Create a temp table, e.g. `__prisma_next_new_T`: + - Use the desired column list, PK, FKs, and constraints. +1. Copy data: + - `INSERT INTO __prisma_next_new_T(colA, colB, ...) SELECT oldA, oldB, ... FROM T;` + - For dropped/renamed columns, omit or map them. + - For new NOT NULL columns, require a default or fail the plan. +1. Drop old table `T`. +1. Rename temp table to `T`. +1. Recreate indexes (and any triggers/views if Prisma Next ever owns them). + +Transaction strategy: + +- Prefer one transaction per rebuild wave. +- Use `BEGIN IMMEDIATE` to avoid mid-migration write races. +- Temporarily disable FK enforcement if required for the drop/rename steps: + - `PRAGMA foreign_keys = OFF` (then re-enable and validate at the end). + +### 3) Dependency Ordering (Multiple Tables) + +When multiple tables require rebuild, plan in waves: + +- Rebuild tables without inbound FKs first (or disable FKs temporarily). +- Rebuild referenced tables before referencing tables if enforcing FKs during copy. + +If FK disable is used, do a final validation phase: + +- `PRAGMA foreign_key_check` +- Fail with a structured error that includes the violating row/table. + +### 4) Data Safety and Explicit Failures + +A rebuild plan must fail fast when it cannot guarantee correctness, for example: + +- Dropping a column would lose required data and there is no explicit mapping. +- Type conversion is lossy or invalid for existing values. 
+- New NOT NULL column has no default and no mapping. + +These failures must be reported as stable errors with: + +- the table/column involved +- what diff triggered the rebuild +- the required user action (e.g. "provide a default" or "write a manual migration") + +### 5) How This Fits Prisma Next Boundaries + +- The **planner/runner** remain fully within `@prisma-next/target-sqlite`. +- SQL family runtime/lane remains dialect-agnostic (no branching on `sqlite`). +- Introspection stays adapter-owned (`@prisma-next/adapter-sqlite`) and should not embed migration logic. +- Marker and ledger schema remain target-owned and must be excluded from strict schema verification. + +## Testing Strategy + +When implementing Plan B, add targeted tests under `@prisma-next/target-sqlite`: + +- Unit tests for diff classification (additive vs rebuild) +- Golden tests for planned SQL statements +- Integration tests with a real temp DB file: + - seed schema v1 + data + - migrate to v2 via rebuild + - verify data preservation and FK correctness + diff --git a/architecture.config.json b/architecture.config.json index c0db6cb767..7107ed3f67 100644 --- a/architecture.config.json +++ b/architecture.config.json @@ -108,6 +108,18 @@ "layer": "targets", "plane": "runtime" }, + { + "glob": "packages/3-targets/3-targets/sqlite/src/exports/control.ts", + "domain": "extensions", + "layer": "targets", + "plane": "migration" + }, + { + "glob": "packages/3-targets/3-targets/sqlite/src/exports/runtime.ts", + "domain": "extensions", + "layer": "targets", + "plane": "runtime" + }, { "glob": "packages/3-targets/6-adapters/postgres/src/core/**", "domain": "targets", @@ -126,6 +138,24 @@ "layer": "adapters", "plane": "runtime" }, + { + "glob": "packages/3-targets/6-adapters/sqlite/src/core/**", + "domain": "targets", + "layer": "adapters", + "plane": "shared" + }, + { + "glob": "packages/3-targets/6-adapters/sqlite/src/exports/control.ts", + "domain": "targets", + "layer": "adapters", + "plane": 
"migration" + }, + { + "glob": "packages/3-targets/6-adapters/sqlite/src/exports/runtime.ts", + "domain": "targets", + "layer": "adapters", + "plane": "runtime" + }, { "glob": "packages/3-targets/7-drivers/postgres/src/exports/control.ts", "domain": "targets", @@ -138,6 +168,18 @@ "layer": "drivers", "plane": "runtime" }, + { + "glob": "packages/3-targets/7-drivers/sqlite/src/exports/control.ts", + "domain": "targets", + "layer": "drivers", + "plane": "migration" + }, + { + "glob": "packages/3-targets/7-drivers/sqlite/src/exports/runtime.ts", + "domain": "targets", + "layer": "drivers", + "plane": "runtime" + }, { "glob": "packages/3-extensions/compat-prisma/**", "domain": "extensions", @@ -179,6 +221,54 @@ "domain": "extensions", "layer": "adapters", "plane": "shared" + }, + { + "glob": "packages/3-extensions/sqlite-vector/src/core/**", + "domain": "extensions", + "layer": "adapters", + "plane": "shared" + }, + { + "glob": "packages/3-extensions/sqlite-vector/src/types/**", + "domain": "extensions", + "layer": "adapters", + "plane": "shared" + }, + { + "glob": "packages/3-extensions/sqlite-vector/src/exports/control.ts", + "domain": "extensions", + "layer": "adapters", + "plane": "migration" + }, + { + "glob": "packages/3-extensions/sqlite-vector/src/exports/runtime.ts", + "domain": "extensions", + "layer": "adapters", + "plane": "runtime" + }, + { + "glob": "packages/3-extensions/sqlite-vector/src/exports/codec-types.ts", + "domain": "extensions", + "layer": "adapters", + "plane": "shared" + }, + { + "glob": "packages/3-extensions/sqlite-vector/src/exports/operation-types.ts", + "domain": "extensions", + "layer": "adapters", + "plane": "shared" + }, + { + "glob": "packages/3-extensions/sqlite-vector/src/exports/column-types.ts", + "domain": "extensions", + "layer": "adapters", + "plane": "shared" + }, + { + "glob": "packages/3-extensions/sqlite-vector/src/exports/pack.ts", + "domain": "extensions", + "layer": "adapters", + "plane": "shared" } ], "rules": { diff 
--git a/docs/reference/capabilities.md b/docs/reference/capabilities.md index 4491cbb727..df6151a888 100644 --- a/docs/reference/capabilities.md +++ b/docs/reference/capabilities.md @@ -4,6 +4,26 @@ This document defines the canonical capability keys and reserved namespaces used Capabilities describe **what the database environment can do**. Adapters report capabilities at connect time, and the runtime negotiates them with extension packs. The contract only **declares requirements** (`contract.capabilities`) and pins the resulting `profileHash`; it does not define capabilities. +## Implementation Note (Current Prototype) + +The current Prisma Next implementation stores contract capability requirements **per target**: + +- Lanes gate features via `contract.capabilities[contract.target]` (keys like `returning`, `lateral`, `jsonAgg`). +- Extension packs typically declare required flags under the same target key (e.g. `sqlitevector/cosine` under `sqlite`). + +The `sql.*` namespace model described below is the long-term intended shape for adapter capability advertisement and negotiation. It is not yet the shape used by lane gating. + +Example contract capabilities (today): + +```json +{ + "capabilities": { + "postgres": { "returning": true, "lateral": true, "jsonAgg": true, "pgvector/cosine": true }, + "sqlite": { "returning": true, "lateral": true, "jsonAgg": true, "sqlitevector/cosine": true } + } +} +``` + ## Adapter (database) capabilities Adapter-reported features of the database runtime. These are not contract-owned; they are discovered and negotiated. diff --git a/examples/prisma-next-demo-sqlite/README.md b/examples/prisma-next-demo-sqlite/README.md new file mode 100644 index 0000000000..b1e28e06c6 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/README.md @@ -0,0 +1,130 @@ +# Prisma Next Demo (SQLite) + +This example demonstrates **Prisma Next in its native form**, using the Prisma Next APIs directly without the compatibility layer. 
+ + ## Purpose + + This demo shows: + - Using Prisma Next's query lanes (SQL DSL, Raw SQL, etc.) + - Creating Plans and executing them via the Runtime + - Contract verification and marker management + - Native Prisma Next patterns and best practices + - **Two workflows**: Emit workflow (JSON-based) and No-Emit workflow (TypeScript-based) + + ## Comparison + + - **`prisma-next-demo-sqlite`** (this example): Shows Prisma Next native APIs on SQLite + - **`prisma-orm-demo`**: Shows using Prisma Next via the compatibility layer (mimics legacy Prisma Client API) + + ## Workflows + + This demo includes two runtime implementations demonstrating different approaches: + + ### 1. Emit Workflow (Default) + + Uses emitted `contract.json` and `contract.d.ts` files: + + - **Files**: `src/prisma/runtime.ts`, `src/prisma/query.ts`, `src/main.ts` + - **Contract source**: `src/prisma/contract.json` (emitted from `prisma/contract.ts`) + - **Usage**: `pnpm start -- [command]` + - **Benefits**: + - Contract is validated and normalized at emit time + - JSON can be loaded from external sources + - Type definitions are separate from runtime code + + **Setup**: +```bash +# Emit contract artifacts first +pnpm emit + +# Then run the app +pnpm start -- users +``` + +### 2.
No-Emit Workflow + +Uses contract directly from TypeScript: + +- **Files**: `src/prisma/runtime-no-emit.ts`, `src/prisma/query-no-emit.ts`, `src/main-no-emit.ts` +- **Contract source**: `prisma/contract.ts` (direct import) +- **Usage**: `pnpm start:no-emit -- [command]` +- **Benefits**: + - No emit step required - contract is used directly + - Full type safety from TypeScript + - Simpler workflow for development + +**Usage**: +```bash +# No emit step needed - just run the app +pnpm start:no-emit -- users +``` + +## Architecture + +```mermaid +flowchart LR + Contract[Contract] --> Stack[ExecutionStack] + Stack --> StackI[ExecutionStackInstance] + StackI --> Context[ExecutionContext] + Context --> QueryRoots[Query Roots] + AppConfig[App Config] --> Runtime[Runtime] + StackI --> Runtime + Context --> Runtime +``` + +## Related Docs + +- **[Query Lanes](../../docs/architecture%20docs/subsystems/3.%20Query%20Lanes.md)** — DSL and ORM authoring surfaces +- **[Runtime & Plugin Framework](../../docs/architecture%20docs/subsystems/4.%20Runtime%20&%20Plugin%20Framework.md)** — Runtime execution pipeline + +## Setup + +1. Install dependencies: + ```bash + pnpm install + ``` + +2. Set up your database connection: + - Create a `.env` file + - Set `DATABASE_URL` to a SQLite database file path or a `file:` URL: + - `DATABASE_URL=/absolute/path/to/prisma_next_demo.sqlite` + - `DATABASE_URL=file:///absolute/path/to/prisma_next_demo.sqlite` + - `DATABASE_URL=file:./prisma_next_demo.sqlite` (Prisma-style, relative to current working directory) + - This demo uses `@prisma-next/extension-sqlite-vector` (vectors stored as JSON text). No database-side extension install or runtime UDF registration is required. + +3. Initialize the database schema and write the contract marker: + ```bash + pnpm exec prisma-next db init + ``` + +4. Seed the database: + ```bash + pnpm seed + ``` + +5. 
Run tests: + ```bash + pnpm test + ``` + +## Key Files + +- `prisma/contract.ts` - Contract definition (source of truth) +- `src/prisma/contract.json` - Emitted contract (emit workflow only) +- `src/prisma/contract.d.ts` - Emitted types (emit workflow only) +- `src/prisma/execution-context.ts` - Env-free execution stack/context (emit workflow) +- `src/prisma/query.ts` - Env-free query roots (emit workflow) +- `src/prisma-no-emit/query-no-emit.ts` - Env-free execution stack/context + query roots (no-emit workflow) +- `src/prisma/runtime.ts` - Runtime factory (emit workflow) +- `src/prisma-no-emit/runtime-no-emit.ts` - Runtime factory (no-emit workflow) +- `src/main.ts` - App entrypoint with arktype config validation (emit workflow) +- `src/main-no-emit.ts` - App entrypoint with arktype config validation (no-emit workflow) +- `scripts/drop-db.ts` - Deletes the SQLite database file (and `-wal`/`-shm` sidecars) +- `scripts/seed.ts` - Database seeding (includes vector embeddings) +- `src/queries/similarity-search.ts` - Example vector similarity search query +- `test/` - Integration tests demonstrating Prisma Next usage + +## Features Demonstrated + +- **Vector Similarity Search**: The demo includes a `similarity-search.ts` query that demonstrates cosine distance operations using the SQLite vector extension pack. +- **Extension Packs**: Shows how to configure and use extension packs (`@prisma-next/extension-sqlite-vector`) in `prisma-next.config.ts`. 
diff --git a/examples/prisma-next-demo-sqlite/biome.jsonc new file mode 100644 index 0000000000..b8994a7330 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/biome.jsonc @@ -0,0 +1,4 @@ +{ + "$schema": "https://biomejs.dev/schemas/2.3.11/schema.json", + "extends": "//" +} diff --git a/examples/prisma-next-demo-sqlite/index.html new file mode 100644 index 0000000000..03ca078a18 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/index.html @@ -0,0 +1,101 @@ +[index.html markup lost in extraction; the page is a "Prisma Next Contract Viewer" titled "Prisma Next Demo" that loads contract.json, shows the note "This page is loaded from contract.json. Edit contract.ts to see changes.", and displays "Loading contract..." until the contract loads.]
+ + + diff --git a/examples/prisma-next-demo-sqlite/package.json b/examples/prisma-next-demo-sqlite/package.json new file mode 100644 index 0000000000..c53548d808 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/package.json @@ -0,0 +1,53 @@ +{ + "name": "prisma-next-demo-sqlite", + "private": true, + "type": "module", + "engines": { + "node": ">=24" + }, + "scripts": { + "dev": "vite", + "emit": "prisma-next contract emit", + "db:drop": "tsx scripts/drop-db.ts", + "seed": "tsx scripts/seed.ts", + "start": "tsx src/main.ts", + "start:no-emit": "tsx src/main-no-emit.ts", + "test": "vitest run --config vitest.config.ts", + "typecheck": "tsc --project tsconfig.json --noEmit", + "lint": "biome check . --error-on-warnings" + }, + "dependencies": { + "@prisma-next/adapter-sqlite": "workspace:*", + "@prisma-next/contract": "workspace:*", + "@prisma-next/core-execution-plane": "workspace:*", + "@prisma-next/driver-sqlite": "workspace:*", + "@prisma-next/extension-sqlite-vector": "workspace:*", + "@prisma-next/family-sql": "workspace:*", + "@prisma-next/sql-contract": "workspace:*", + "@prisma-next/sql-contract-ts": "workspace:*", + "@prisma-next/sql-lane": "workspace:*", + "@prisma-next/sql-orm-lane": "workspace:*", + "@prisma-next/sql-relational-core": "workspace:*", + "@prisma-next/sql-runtime": "workspace:*", + "@prisma-next/target-sqlite": "workspace:*", + "@prisma-next/integration-kysely": "workspace:*", + "arktype": "^2.1.29", + "dotenv": "^16.4.5", + "kysely": "catalog:" + }, + "devDependencies": { + "@prisma-next/cli": "workspace:*", + "@prisma-next/core-control-plane": "workspace:*", + "@prisma-next/emitter": "workspace:*", + "@prisma-next/sql-contract-emitter": "workspace:*", + "@prisma-next/test-utils": "workspace:*", + "@prisma-next/tsconfig": "workspace:*", + "@prisma-next/vite-plugin-contract-emit": "workspace:*", + "@types/node": "catalog:", + "tsup": "catalog:", + "tsx": "^4.19.2", + "typescript": "catalog:", + "vite": "catalog:", + "vitest": "catalog:" 
+ } +} diff --git a/examples/prisma-next-demo-sqlite/prisma-next.config.ts b/examples/prisma-next-demo-sqlite/prisma-next.config.ts new file mode 100644 index 0000000000..bc7d15ee82 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/prisma-next.config.ts @@ -0,0 +1,24 @@ +import 'dotenv/config'; +import sqliteAdapter from '@prisma-next/adapter-sqlite/control'; +import { defineConfig } from '@prisma-next/cli/config-types'; +import sqliteDriver from '@prisma-next/driver-sqlite/control'; +import sqlitevector from '@prisma-next/extension-sqlite-vector/control'; +import sql from '@prisma-next/family-sql/control'; +import sqlite from '@prisma-next/target-sqlite/control'; +import { contract } from './prisma/contract'; + +export default defineConfig({ + family: sql, + target: sqlite, + driver: sqliteDriver, + adapter: sqliteAdapter, + extensionPacks: [sqlitevector], + contract: { + source: contract, + output: 'src/prisma/contract.json', + }, + db: { + // biome-ignore lint/style/noNonNullAssertion: loaded from .env + connection: process.env['DATABASE_URL']!, + }, +}); diff --git a/examples/prisma-next-demo-sqlite/prisma/contract.ts b/examples/prisma-next-demo-sqlite/prisma/contract.ts new file mode 100644 index 0000000000..05106798cb --- /dev/null +++ b/examples/prisma-next-demo-sqlite/prisma/contract.ts @@ -0,0 +1,93 @@ +import type { CodecTypes } from '@prisma-next/adapter-sqlite/codec-types'; +import { datetimeColumn, intColumn, textColumn } from '@prisma-next/adapter-sqlite/column-types'; +import type { CodecTypes as SqliteVectorCodecTypes } from '@prisma-next/extension-sqlite-vector/codec-types'; +import { vectorColumn } from '@prisma-next/extension-sqlite-vector/column-types'; +import sqlitevector from '@prisma-next/extension-sqlite-vector/pack'; +import { defineContract } from '@prisma-next/sql-contract-ts/contract-builder'; +import sqlitePack from '@prisma-next/target-sqlite/pack'; + +type AllCodecTypes = CodecTypes & SqliteVectorCodecTypes; + +export const 
contract = defineContract() + .target(sqlitePack) + .table('user', (t) => + t + .column('id', { + type: intColumn, + nullable: false, + default: { kind: 'function', expression: 'autoincrement()' }, + }) + .column('email', { type: textColumn, nullable: false }) + .column('createdAt', { + type: datetimeColumn, + nullable: false, + default: { kind: 'function', expression: 'now()' }, + }) + .primaryKey(['id']), + ) + .table('post', (t) => + t + .column('id', { + type: intColumn, + nullable: false, + default: { kind: 'function', expression: 'autoincrement()' }, + }) + .column('title', { type: textColumn, nullable: false }) + .column('userId', { type: intColumn, nullable: false }) + .column('createdAt', { + type: datetimeColumn, + nullable: false, + default: { kind: 'function', expression: 'now()' }, + }) + .column('embedding', { type: vectorColumn, nullable: true }) + .primaryKey(['id']) + .foreignKey(['userId'], { table: 'user', columns: ['id'] }, 'post_userId_fkey'), + ) + .model('User', 'user', (m) => + m + .field('id', 'id') + .field('email', 'email') + .field('createdAt', 'createdAt') + .relation('posts', { + toModel: 'Post', + toTable: 'post', + cardinality: '1:N', + on: { + parentTable: 'user', + parentColumns: ['id'], + childTable: 'post', + childColumns: ['userId'], + }, + }), + ) + .model('Post', 'post', (m) => + m + .field('id', 'id') + .field('title', 'title') + .field('userId', 'userId') + .field('embedding', 'embedding') + .field('createdAt', 'createdAt') + .relation('user', { + toModel: 'User', + toTable: 'user', + cardinality: 'N:1', + on: { + parentTable: 'post', + parentColumns: ['userId'], + childTable: 'user', + childColumns: ['id'], + }, + }), + ) + .extensionPacks({ sqlitevector }) + .capabilities({ + sqlite: { + lateral: true, + jsonAgg: true, + returning: true, + 'sqlitevector/cosine': true, + 'defaults.autoincrement': true, + 'defaults.now': true, + }, + }) + .build(); diff --git a/examples/prisma-next-demo-sqlite/prisma/schema.prisma 
b/examples/prisma-next-demo-sqlite/prisma/schema.prisma new file mode 100644 index 0000000000..7f6f4c6e62 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/prisma/schema.prisma @@ -0,0 +1,26 @@ +// This is illustrative only. The actual contract is in prisma/contract.ts. + +generator client { + provider = "prisma-client-js" +} + +datasource db { + provider = "sqlite" + url = env("DATABASE_URL") +} + +model User { + id Int @id @default(autoincrement()) + email String @unique + createdAt DateTime @default(now()) + posts Post[] +} + +model Post { + id Int @id @default(autoincrement()) + title String + userId Int + user User @relation(fields: [userId], references: [id]) + createdAt DateTime @default(now()) +} + diff --git a/examples/prisma-next-demo-sqlite/scripts/drop-db.ts b/examples/prisma-next-demo-sqlite/scripts/drop-db.ts new file mode 100644 index 0000000000..c27a73a6c8 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/scripts/drop-db.ts @@ -0,0 +1,60 @@ +import 'dotenv/config'; +import { existsSync, rmSync } from 'node:fs'; +import { resolve as resolvePath } from 'node:path'; +import { fileURLToPath } from 'node:url'; + +function resolveSqliteFilename(urlOrPath: string): string { + if (!urlOrPath.startsWith('file:')) { + return urlOrPath; + } + + if (urlOrPath.startsWith('file://')) { + return fileURLToPath(new URL(urlOrPath)); + } + + const rest = urlOrPath.slice('file:'.length); + const pathPart = rest.split('?', 1)[0] ?? rest; + if (pathPart === ':memory:') { + return ':memory:'; + } + + return resolvePath(process.cwd(), decodeURIComponent(pathPart)); +} + +function dropSqliteDatabaseFile(urlOrPath: string): void { + const filename = resolveSqliteFilename(urlOrPath); + + // Clean up sqlite sidecar files too.
+ const files = [filename, `${filename}-wal`, `${filename}-shm`, `${filename}-journal`]; + + let removedAny = false; + for (const file of files) { + if (!existsSync(file)) { + continue; + } + rmSync(file); + removedAny = true; + // eslint-disable-next-line no-console + console.log(`✔ Removed ${file}`); + } + + if (!removedAny) { + // eslint-disable-next-line no-console + console.log('No database file found to remove.'); + } +} + +const databaseUrl = process.env['DATABASE_URL']; +if (!databaseUrl) { + // eslint-disable-next-line no-console + console.error('DATABASE_URL environment variable is required'); + process.exit(1); +} + +if (databaseUrl === ':memory:') { + // eslint-disable-next-line no-console + console.log('DATABASE_URL is :memory:, nothing to remove'); + process.exit(0); +} + +dropSqliteDatabaseFile(databaseUrl); diff --git a/examples/prisma-next-demo-sqlite/scripts/seed.ts b/examples/prisma-next-demo-sqlite/scripts/seed.ts new file mode 100644 index 0000000000..1fb736a2f1 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/scripts/seed.ts @@ -0,0 +1,170 @@ +/** + * Database Seed Script + * + * Populates the demo database with sample data using Prisma Next's SQL DSL. + * Demonstrates INSERT with RETURNING clause and parameterized queries.
+ * + * Run with: pnpm seed + * + * Creates: + * - 2 users (alice, bob) + * - 3 posts with vector embeddings (for similarity search demos) + * + * Prerequisites: + * - DATABASE_URL environment variable set + * - Database schema already applied (run pnpm db:push first) + */ +import 'dotenv/config'; + +import { param } from '@prisma-next/sql-relational-core/param'; +import type { ResultType } from '@prisma-next/sql-relational-core/types'; +import { schema, sql } from '../src/prisma/query'; +import { getRuntime } from '../src/prisma/runtime'; + +async function main() { + // biome-ignore lint/style/noNonNullAssertion: don't care about type safety in seed script + const runtime = getRuntime(process.env['DATABASE_URL']!); + + try { + const tables = schema.tables; + const userTable = tables.user; + const postTable = tables.post; + const userColumns = userTable.columns; + const postColumns = postTable.columns; + + // Insert users + const alicePlan = sql + .insert(userTable, { + id: param('id'), + email: param('email'), + createdAt: param('createdAt'), + }) + .returning(userColumns.id, userColumns.email) + .build({ + params: { + id: 1, + email: 'alice@example.com', + createdAt: new Date(), + }, + }); + + const alice = (await runtime.execute(alicePlan).toArray())[0]; + + const bobPlan = sql + .insert(userTable, { + id: param('id'), + email: param('email'), + createdAt: param('createdAt'), + }) + .returning(userColumns.id, userColumns.email) + .build({ + params: { + id: 2, + email: 'bob@example.com', + createdAt: new Date(), + }, + }); + + const bob = (await runtime.execute(bobPlan).toArray())[0]; + + if (!alice || !bob) { + throw new Error('Failed to create users'); + } + + type UserRow = ResultType<typeof alicePlan>; + const aliceUser = alice as UserRow; + const bobUser = bob as UserRow; + + console.log(`Created user: ${aliceUser.email} (id: ${aliceUser.id})`); + console.log(`Created user: ${bobUser.email} (id: ${bobUser.id})`); + + // Generate sample embedding vectors (1536 dimensions,
matching common embedding models) + const generateEmbedding = (seed: number): number[] => { + const embedding: number[] = []; + for (let i = 0; i < 1536; i++) { + embedding.push(Math.sin(seed + i) * 0.1); + } + return embedding; + }; + + // Insert posts with embeddings + const post1Plan = sql + .insert(postTable, { + id: param('id'), + title: param('title'), + userId: param('userId'), + embedding: param('embedding'), + createdAt: param('createdAt'), + }) + .returning(postColumns.id, postColumns.title, postColumns.userId) + .build({ + params: { + id: 1, + title: 'First Post', + userId: alice.id, + embedding: generateEmbedding(1), + createdAt: new Date(), + }, + }); + + const post1 = (await runtime.execute(post1Plan).toArray())[0]; + + const post2Plan = sql + .insert(postTable, { + id: param('id'), + title: param('title'), + userId: param('userId'), + embedding: param('embedding'), + createdAt: param('createdAt'), + }) + .returning(postColumns.id, postColumns.title, postColumns.userId) + .build({ + params: { + id: 2, + title: 'Second Post', + userId: alice.id, + embedding: generateEmbedding(2), + createdAt: new Date(), + }, + }); + + const post2 = (await runtime.execute(post2Plan).toArray())[0]; + + const post3Plan = sql + .insert(postTable, { + id: param('id'), + title: param('title'), + userId: param('userId'), + embedding: param('embedding'), + createdAt: param('createdAt'), + }) + .returning(postColumns.id, postColumns.title, postColumns.userId) + .build({ + params: { + id: 3, + title: 'Third Post', + userId: bob.id, + embedding: generateEmbedding(3), + createdAt: new Date(), + }, + }); + + const post3 = (await runtime.execute(post3Plan).toArray())[0]; + + if (post1) + console.log(`Created post: ${post1.title} (id: ${post1.id}, userId: ${post1.userId})`); + if (post2) + console.log(`Created post: ${post2.title} (id: ${post2.id}, userId: ${post2.userId})`); + if (post3) + console.log(`Created post: ${post3.title} (id: ${post3.id}, userId: ${post3.userId})`); + + 
console.log('Seed completed successfully!'); + } finally { + await runtime.close(); + } +} + +main().catch((e) => { + console.error('Error seeding database:', e); + process.exitCode = 1; +}); diff --git a/examples/prisma-next-demo-sqlite/src/entry.ts b/examples/prisma-next-demo-sqlite/src/entry.ts new file mode 100644 index 0000000000..dd32408568 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/entry.ts @@ -0,0 +1,168 @@ +/** + * Browser Application Entry Point (Contract Visualization) + * + * This is a Vite-powered browser application that renders the emitted + * contract.json as an interactive HTML visualization. It demonstrates: + * + * - Machine-readable contracts: The JSON structure can be consumed by tools + * - Hot Module Replacement: Edit contract.ts, re-emit, and watch it update live + * - Contract introspection: Models, tables, relations, capabilities, extensions + * + * Run with: pnpm dev (starts Vite dev server with HMR) + * + * See also: + * - main.ts: CLI app using the same emitted contract + * - main-no-emit.ts: CLI app using inline contract definition + */ +import type { ModelDefinition, SqlContract, SqlStorage } from '@prisma-next/sql-contract/types'; +import contractJson from './prisma/contract.json'; + +type Relation = { + readonly cardinality: string; + readonly to: string; + readonly on: { readonly parentCols: readonly string[]; readonly childCols: readonly string[] }; +}; + +// Temporary demo-only shim: today the validated Contract type doesn't fully reflect the +// traversable IR shape we visualize here (target/relations/capabilities/extensionPacks). 
+// TML-1831 will make this redundant by aligning Contract with validateContract() output and
+// moving derived mappings onto ExecutionContext:
+// https://linear.app/prisma-company/issue/TML-1831/runtime-dx-ir-shaped-contract-mappings-on-executioncontext
+type ContractIR = SqlContract<SqlStorage, Record<string, ModelDefinition>> & {
+  target: string;
+  relations: Record<string, Record<string, Relation>>;
+  capabilities: Record<string, Record<string, boolean>>;
+  extensionPacks: Record<string, unknown>;
+};
+
+function renderContract(c: ContractIR): string {
+  const models = Object.entries(c.models)
+    .map(([name, model]) => {
+      const tableName = model.storage.table;
+      const tableRelations = c.relations[tableName] ?? {};
+
+      const fields = Object.entries(model.fields)
+        .map(([fieldName, field]) => {
+          return `
+            <div class="field">
+              <span class="field-name">${fieldName}</span>
+              <span class="field-column">→ ${field?.column}</span>
+            </div>
+          `;
+        })
+        .join('');
+
+      const relations = Object.entries(tableRelations)
+        .map(([relName, rel]) => {
+          const arrow = rel.cardinality === '1:N' ? '⇉' : '→';
+          return `
+            <div class="relation">
+              <span class="relation-name">${arrow} ${relName}</span>
+              <span class="relation-target">${rel.cardinality} → ${rel.to}</span>
+            </div>
+          `;
+        })
+        .join('');
+
+      return `
+        <div class="model-card">
+          <div class="model-header">
+            <span class="model-name">🧩 ${name}</span>
+            <span class="model-table">table: ${tableName}</span>
+          </div>
+          <div class="model-body">
+            ${fields}
+            ${relations ? `<div class="model-relations">${relations}</div>` : ''}
+          </div>
+        </div>
+      `;
+    })
+    .join('');
+
+  const tables = Object.entries(c.storage.tables)
+    .map(([name, table]) => {
+      const pk = table.primaryKey?.columns ?? [];
+      const columns = Object.entries(table.columns)
+        .map(([colName, col]) => {
+          const isPk = pk.includes(colName);
+          const nullable = col.nullable ? 'nullable' : '';
+          return `
+            <div class="column">
+              <span class="column-name">${isPk ? '🔑 ' : ''}${colName}</span>
+              <span class="column-type">${col.nativeType}</span>
+              <span class="column-nullable">${nullable}</span>
+            </div>
+          `;
+        })
+        .join('');
+
+      const fks = (table.foreignKeys ?? [])
+        .map(
+          (fk) =>
+            `<div class="fk"><span>→ ${fk.columns.join(', ')}</span><span>→ ${fk.references.table}(${fk.references.columns.join(', ')})</span></div>`,
+        )
+        .join('');
+
+      return `
+        <div class="table-card">
+          <div class="table-header">
+            <span class="table-name">📦 ${name}</span>
+            <span class="table-pk">PK: ${pk.join(', ')}</span>
+          </div>
+          <div class="table-body">${columns}${fks}</div>
+        </div>
+      `;
+    })
+    .join('');
+
+  const caps = Object.entries(c.capabilities)
+    .flatMap(([ns, flags]) =>
+      Object.entries(flags)
+        .filter(([, v]) => v)
+        .map(([k]) => `<span class="capability">${ns}/${k}</span>`),
+    )
+    .join('');
+
+  const extensions = Object.keys(c.extensionPacks)
+    .map((ext) => `<span class="extension">${ext}</span>`)
+    .join('');
+
+  return `
+    <div class="contract-hash">
+      <span>Contract Hash:</span>
+      <span>${c.coreHash}</span>
+    </div>
+    <div class="contract-target">Target: ${c.target}</div>
+    <div class="section">
+      <h2>Models</h2>
+      ${models}
+    </div>
+    <div class="section">
+      <h2>Tables</h2>
+      ${tables}
+    </div>
+    <div class="section">
+      <h2>Capabilities</h2>
+      <div>${caps || 'None'}</div>
+    </div>
+    <div class="section">
+      <h2>Extensions</h2>
+      <div>${extensions || 'None'}</div>
+    </div>
+  `;
+}
+
+const app = document.getElementById('contract-view');
+if (app) {
+  app.innerHTML = renderContract(contractJson as unknown as ContractIR);
+}
+
+if (import.meta.hot) {
+  import.meta.hot.accept('./prisma/contract.json', (newContract) => {
+    if (app && newContract) {
+      app.innerHTML = renderContract(newContract as unknown as ContractIR);
+    }
+  });
+}
diff --git a/examples/prisma-next-demo-sqlite/src/kysely/get-user-by-id.ts b/examples/prisma-next-demo-sqlite/src/kysely/get-user-by-id.ts
new file mode 100644
index 0000000000..7cceb650c7
--- /dev/null
+++ b/examples/prisma-next-demo-sqlite/src/kysely/get-user-by-id.ts
@@ -0,0 +1,18 @@
+import { type KyselifyContract, KyselyPrismaDialect } from '@prisma-next/integration-kysely';
+import type { Runtime } from '@prisma-next/sql-runtime';
+import { Kysely } from 'kysely';
+import { executionContext } from '../prisma/execution-context';
+
+export async function getUserById(userId: number, runtime: Runtime) {
+  const contract = executionContext.contract;
+  const kysely = new Kysely<KyselifyContract<typeof contract>>({
+    dialect: new KyselyPrismaDialect({ runtime, contract }),
+  });
+
+  return kysely
+    .selectFrom('user')
+    .selectAll()
+    .where('id', '=', userId)
+    .limit(1)
+    .executeTakeFirstOrThrow();
+}
diff --git a/examples/prisma-next-demo-sqlite/src/kysely/insert-user-transaction.ts b/examples/prisma-next-demo-sqlite/src/kysely/insert-user-transaction.ts
new file mode 100644
index 0000000000..7734ad16ce
--- /dev/null
+++ b/examples/prisma-next-demo-sqlite/src/kysely/insert-user-transaction.ts
@@ -0,0 +1,36 @@
+import { type KyselifyContract, KyselyPrismaDialect } from '@prisma-next/integration-kysely';
+import type { Runtime } from '@prisma-next/sql-runtime';
+import { Kysely } from 'kysely';
+import { executionContext } from '../prisma/execution-context';
+
+export async function insertUserTransaction(userId: number, runtime: Runtime) {
+  const contract = executionContext.contract;
+  const kysely = new Kysely<KyselifyContract<typeof contract>>({
+    dialect: new
KyselyPrismaDialect({ runtime, contract }),
+  });
+
+  await kysely
+    .insertInto('user')
+    .values({ id: userId, email: 'jane@doe.com', createdAt: new Date().toISOString() })
+    .execute();
+
+  await kysely
+    .transaction()
+    .execute(async (trx) => {
+      await trx
+        .updateTable('user')
+        .set({ email: 'john@doe.com' })
+        .where('id', '=', userId)
+        .execute();
+
+      throw new Error('Simulated error to trigger rollback');
+    })
+    .catch((err) => {
+      if (err.message !== 'Simulated error to trigger rollback') {
+        // Unexpected error: rethrow it
+        throw err;
+      }
+      // The simulated error is expected; the transaction has already rolled back.
+    });
+
+  return kysely.selectFrom('user').selectAll().where('id', '=', userId).executeTakeFirst();
+}
diff --git a/examples/prisma-next-demo-sqlite/src/main-no-emit.ts b/examples/prisma-next-demo-sqlite/src/main-no-emit.ts
new file mode 100644
index 0000000000..774e1da1dd
--- /dev/null
+++ b/examples/prisma-next-demo-sqlite/src/main-no-emit.ts
@@ -0,0 +1,96 @@
+/**
+ * CLI Application Entry Point (No-Emit Workflow)
+ *
+ * This is a command-line demo that showcases the "no-emit" workflow where the
+ * contract is defined inline in TypeScript (prisma-no-emit/query-no-emit.ts)
+ * rather than emitted to separate JSON/d.ts files.
+ * + * This workflow is useful for: + * - Rapid prototyping without a build step + * - Simpler projects that don't need contract serialization + * - Understanding the contract structure before committing to emission + * + * Run with: pnpm start:no-emit -- [args] + * + * Available commands: + * - users [limit] List users with optional limit + * - user Get user by ID + * - posts Get posts for a user + * - users-with-posts [limit] Users with nested posts + * + * See also: + * - main.ts: Full CLI using emitted contract.json + contract.d.ts + * - entry.ts: Browser app for visualizing contract.json + */ +import 'dotenv/config'; +import { type as arktype } from 'arktype'; +import { getRuntime } from './prisma-no-emit/runtime-no-emit'; +import { getUserById } from './queries/get-user-by-id-no-emit'; +import { getUserPosts } from './queries/get-user-posts-no-emit'; +import { getUsers } from './queries/get-users-no-emit'; +import { getUsersWithPosts } from './queries/get-users-with-posts-no-emit'; + +const appConfigSchema = arktype({ + DATABASE_URL: 'string', +}); + +function loadAppConfig() { + const result = appConfigSchema({ + DATABASE_URL: process.env['DATABASE_URL'], + }); + if (result instanceof arktype.errors) { + const message = result.map((p: { message: string }) => p.message).join('; '); + throw new Error(`Invalid app configuration: ${message}`); + } + const parsed = result as { DATABASE_URL: string }; + return { databaseUrl: parsed.DATABASE_URL }; +} + +const argv = process.argv.slice(2).filter((arg) => arg !== '--'); +const [cmd, ...args] = argv; + +async function main() { + const { databaseUrl } = loadAppConfig(); + const runtime = getRuntime(databaseUrl); + try { + if (cmd === 'users') { + const limit = args[0] ? 
Number.parseInt(args[0], 10) : 10; + const users = await getUsers(runtime, limit); + console.log(JSON.stringify(users, null, 2)); + } else if (cmd === 'user') { + const [userIdStr] = args; + if (!userIdStr) { + console.error('Usage: pnpm start:no-emit -- user '); + process.exit(1); + } + const userId = Number.parseInt(userIdStr, 10); + const user = await getUserById(userId, runtime); + console.log(JSON.stringify(user, null, 2)); + } else if (cmd === 'posts') { + const [userIdStr] = args; + if (!userIdStr) { + console.error('Usage: pnpm start:no-emit -- posts '); + process.exit(1); + } + const userId = Number.parseInt(userIdStr, 10); + const posts = await getUserPosts(userId, runtime); + console.log(JSON.stringify(posts, null, 2)); + } else if (cmd === 'users-with-posts') { + const limit = args[0] ? Number.parseInt(args[0], 10) : 10; + const users = await getUsersWithPosts(runtime, limit); + console.log(JSON.stringify(users, null, 2)); + } else { + console.log( + 'Usage: pnpm start:no-emit -- [users [limit] | user | posts | users-with-posts [limit]]', + ); + process.exit(1); + } + } catch (error) { + console.error('Error:', error); + process.exit(1); + } finally { + await runtime.close(); + } +} + +await main(); diff --git a/examples/prisma-next-demo-sqlite/src/main.ts b/examples/prisma-next-demo-sqlite/src/main.ts new file mode 100644 index 0000000000..d4a2c26e4a --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/main.ts @@ -0,0 +1,186 @@ +/** + * CLI Application Entry Point (Emitted Contract Workflow) + * + * This is a command-line demo application that showcases Prisma Next's query + * capabilities using the standard emitted contract workflow: + * - contract.json (runtime contract data) + * - contract.d.ts (compile-time types) + * + * Run with: pnpm start -- [args] + * + * Available commands: + * - users [limit] List users with optional limit + * - user Get user by ID + * - posts Get posts for a user + * - users-with-posts [limit] Users with nested posts 
(includeMany) + * - users-paginate [cursor] Cursor-based pagination + * - similarity-search Vector similarity search (pgvector) + * - budget-violation Demo budget enforcement error + * + * See also: + * - main-no-emit.ts: Same CLI using inline contract (no emission step) + * - entry.ts: Browser app for visualizing contract.json + */ +import 'dotenv/config'; +import { type as arktype } from 'arktype'; +import { getUserById as getUserByIdKysely } from './kysely/get-user-by-id'; +import { insertUserTransaction as insertUserTransactionKysely } from './kysely/insert-user-transaction'; +import { getRuntime } from './prisma/runtime'; +import { getAllPostsUnbounded } from './queries/get-all-posts-unbounded'; +import { getUserById } from './queries/get-user-by-id'; +import { getUserPosts } from './queries/get-user-posts'; +import { getUsers } from './queries/get-users'; +import { getUsersWithPosts } from './queries/get-users-with-posts'; +import { ormGetUsersBackward, ormGetUsersByIdCursor } from './queries/orm-pagination'; +import { similaritySearch } from './queries/similarity-search'; + +const appConfigSchema = arktype({ + DATABASE_URL: 'string', +}); + +export function loadAppConfig() { + const result = appConfigSchema({ + DATABASE_URL: process.env['DATABASE_URL'], + }); + if (result instanceof arktype.errors) { + const message = result.map((p: { message: string }) => p.message).join('; '); + throw new Error(`Invalid app configuration: ${message}`); + } + const parsed = result as { DATABASE_URL: string }; + return { databaseUrl: parsed.DATABASE_URL }; +} + +const argv = process.argv.slice(2).filter((arg) => arg !== '--'); +const [cmd, ...args] = argv; + +async function main() { + const { databaseUrl } = loadAppConfig(); + const runtime = getRuntime(databaseUrl); + try { + if (cmd === 'users') { + const limit = args[0] ? 
Number.parseInt(args[0], 10) : 10; + const users = await getUsers(runtime, limit); + console.log(JSON.stringify(users, null, 2)); + } else if (cmd === 'user') { + const [userIdStr] = args; + if (!userIdStr) { + console.error('Usage: pnpm start -- user '); + process.exit(1); + } + const userId = Number.parseInt(userIdStr, 10); + const user = await getUserById(userId, runtime); + console.log(JSON.stringify(user, null, 2)); + } else if (cmd === 'posts') { + const [userIdStr] = args; + if (!userIdStr) { + console.error('Usage: pnpm start -- posts '); + process.exit(1); + } + const userId = Number.parseInt(userIdStr, 10); + const posts = await getUserPosts(userId, runtime); + console.log(JSON.stringify(posts, null, 2)); + } else if (cmd === 'users-with-posts') { + const limit = args[0] ? Number.parseInt(args[0], 10) : 10; + const users = await getUsersWithPosts(runtime, limit); + console.log(JSON.stringify(users, null, 2)); + } else if (cmd === 'similarity-search') { + const [queryVectorStr, limitStr] = args; + if (!queryVectorStr) { + console.error('Usage: pnpm start -- similarity-search [limit]'); + console.error(' queryVector: JSON array of numbers, e.g., "[0.1,0.2,0.3]"'); + process.exit(1); + } + let queryVector: number[]; + try { + queryVector = JSON.parse(queryVectorStr) as number[]; + if (!Array.isArray(queryVector) || !queryVector.every((v) => typeof v === 'number')) { + throw new Error('queryVector must be an array of numbers'); + } + } catch (error) { + console.error( + 'Error parsing queryVector:', + error instanceof Error ? error.message : String(error), + ); + console.error('Expected JSON array of numbers, e.g., "[0.1,0.2,0.3]"'); + process.exit(1); + } + const limit = limitStr ? Number.parseInt(limitStr, 10) : 10; + const results = await similaritySearch(queryVector, runtime, limit); + console.log(JSON.stringify(results, null, 2)); + } else if (cmd === 'users-paginate') { + const [cursorStr, limitStr] = args; + const cursor = cursorStr ? 
Number.parseInt(cursorStr, 10) : null; + const limit = limitStr ? Number.parseInt(limitStr, 10) : 10; + const users = await ormGetUsersByIdCursor(cursor, limit, runtime); + console.log(JSON.stringify(users, null, 2)); + } else if (cmd === 'users-paginate-back') { + const [cursorStr, limitStr] = args; + if (!cursorStr) { + console.error('Usage: pnpm start -- users-paginate-back [limit]'); + process.exit(1); + } + const cursor = Number.parseInt(cursorStr, 10); + const limit = limitStr ? Number.parseInt(limitStr, 10) : 10; + const users = await ormGetUsersBackward(cursor, limit, runtime); + console.log(JSON.stringify(users, null, 2)); + } else if (cmd === 'budget-violation') { + console.log('Running unbounded query to demonstrate budget violation...'); + console.log('This query has no LIMIT clause and will trigger BUDGET.ROWS_EXCEEDED error.\n'); + try { + const result = await getAllPostsUnbounded(runtime); + console.log(JSON.stringify(result, null, 2)); + } catch (error) { + console.error('Budget violation caught:'); + if (error instanceof Error) { + const budgetError = error as { code?: string; category?: string; details?: unknown }; + console.error(' Code:', budgetError.code); + console.error(' Category:', budgetError.category); + console.error(' Message:', error.message); + if (budgetError.details) { + console.error(' Details:', JSON.stringify(budgetError.details, null, 2)); + } + } else { + console.error(' Error:', error); + } + throw error; // Re-throw to show the full error stack + } + } else if (cmd === 'user-kysely') { + const [userIdStr] = args; + if (!userIdStr) { + console.error('Usage: pnpm start -- user-kysely '); + process.exit(1); + } + const userId = Number.parseInt(userIdStr, 10); + // use a runtime without plugins to avoid false positive linting errors + const kyselyRuntime = getRuntime(databaseUrl, []); + const user = await getUserByIdKysely(userId, kyselyRuntime); + console.log(JSON.stringify(user, null, 2)); + } else if (cmd === 
'user-transaction-kysely') { + const [userIdStr] = args; + if (!userIdStr) { + console.error('Usage: pnpm start -- user-transaction-kysely '); + process.exit(1); + } + const userId = Number.parseInt(userIdStr, 10); + // use a runtime without plugins to avoid false positive linting errors + const kyselyRuntime = getRuntime(databaseUrl, []); + const newUser = await insertUserTransactionKysely(userId, kyselyRuntime); + console.log('Inserted user:', JSON.stringify(newUser, null, 2)); + } else { + console.log( + 'Usage: pnpm start -- [users [limit] | user | posts | ' + + 'users-with-posts [limit] | users-paginate [cursor] [limit] | ' + + 'users-paginate-back [limit] | similarity-search [limit] | ' + + 'budget-violation | user-kysely | user-transaction-kysely ]', + ); + process.exit(1); + } + } catch (error) { + console.error('Error:', error); + process.exit(1); + } finally { + await runtime.close(); + } +} + +await main(); diff --git a/examples/prisma-next-demo-sqlite/src/prisma-no-emit/query-no-emit.ts b/examples/prisma-next-demo-sqlite/src/prisma-no-emit/query-no-emit.ts new file mode 100644 index 0000000000..d67329fa6b --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/prisma-no-emit/query-no-emit.ts @@ -0,0 +1,38 @@ +import sqliteAdapter from '@prisma-next/adapter-sqlite/runtime'; +import { + createExecutionStack, + instantiateExecutionStack, +} from '@prisma-next/core-execution-plane/stack'; +import sqliteDriver from '@prisma-next/driver-sqlite/runtime'; +import sqlitevectorDescriptor from '@prisma-next/extension-sqlite-vector/runtime'; +import { sql as sqlBuilder } from '@prisma-next/sql-lane'; +import { orm as ormBuilder } from '@prisma-next/sql-orm-lane'; +import { schema as schemaBuilder } from '@prisma-next/sql-relational-core/schema'; +import { createExecutionContext } from '@prisma-next/sql-runtime'; +import sqliteTarget from '@prisma-next/target-sqlite/runtime'; +// Use contract directly from TypeScript - no emit needed! 
+import { contract } from '../../prisma/contract'; + +export const executionStack = createExecutionStack({ + target: sqliteTarget, + adapter: sqliteAdapter, + driver: sqliteDriver, + extensionPacks: [sqlitevectorDescriptor], +}); + +export const executionStackInstance = instantiateExecutionStack(executionStack); +export const executionContext = createExecutionContext({ + contract, + stackInstance: executionStackInstance, +}); + +export const sql = sqlBuilder({ + context: executionContext, +}); + +export const schema = schemaBuilder(executionContext); +export const tables = schema.tables; + +export const orm = ormBuilder({ + context: executionContext, +}); diff --git a/examples/prisma-next-demo-sqlite/src/prisma-no-emit/runtime-no-emit.ts b/examples/prisma-next-demo-sqlite/src/prisma-no-emit/runtime-no-emit.ts new file mode 100644 index 0000000000..7c7f284740 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/prisma-no-emit/runtime-no-emit.ts @@ -0,0 +1,25 @@ +import { budgets, createRuntime, type Runtime } from '@prisma-next/sql-runtime'; +import { executionContext, executionStackInstance } from './query-no-emit'; + +export function getRuntime(databaseUrl: string): Runtime { + return createRuntime({ + stackInstance: executionStackInstance, + contract: executionContext.contract, + context: executionContext, + driverOptions: { + connect: { filename: databaseUrl }, + }, + verify: { + mode: 'onFirstUse', + requireMarker: false, + }, + plugins: [ + budgets({ + maxRows: 10_000, + defaultTableRows: 10_000, + tableRows: { user: 10_000, post: 10_000 }, + maxLatencyMs: 1_000, + }), + ], + }); +} diff --git a/examples/prisma-next-demo-sqlite/src/prisma/contract.d.ts b/examples/prisma-next-demo-sqlite/src/prisma/contract.d.ts new file mode 100644 index 0000000000..d12cb75847 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/prisma/contract.d.ts @@ -0,0 +1,176 @@ +// ⚠️ GENERATED FILE - DO NOT EDIT +// This file is automatically generated by 'prisma-next contract 
emit'. +// To regenerate, run: prisma-next contract emit +import type { CodecTypes as SqliteTypes } from '@prisma-next/adapter-sqlite/codec-types'; +import type { CodecTypes as SqliteVectorTypes } from '@prisma-next/extension-sqlite-vector/codec-types'; +import type { Vector } from '@prisma-next/extension-sqlite-vector/codec-types'; +import type { OperationTypes as SqliteVectorOperationTypes } from '@prisma-next/extension-sqlite-vector/operation-types'; + +import type { CoreHashBase, ProfileHashBase } from '@prisma-next/contract/types'; +import type { + SqlContract, + SqlStorage, + SqlMappings, + ModelDefinition, +} from '@prisma-next/sql-contract/types'; + +export type CoreHash = + CoreHashBase<'sha256:470f2b3ae9a5f47c9ddda58fff0406d5cbac59bb09770c2e00b33c2502897820'>; +export type ProfileHash = + ProfileHashBase<'sha256:e668a0ecff98e7aa08a807ae2cf547b997002d8a6e0d5b7741d2df62116d6de7'>; + +export type CodecTypes = SqliteTypes & SqliteVectorTypes; +export type LaneCodecTypes = CodecTypes; +export type OperationTypes = SqliteVectorOperationTypes; + +export type Contract = SqlContract< + { + readonly tables: { + readonly user: { + columns: { + readonly id: { + readonly nativeType: 'integer'; + readonly codecId: 'sqlite/int@1'; + readonly nullable: false; + }; + readonly email: { + readonly nativeType: 'text'; + readonly codecId: 'sqlite/text@1'; + readonly nullable: false; + }; + readonly createdAt: { + readonly nativeType: 'text'; + readonly codecId: 'sqlite/datetime@1'; + readonly nullable: false; + }; + }; + primaryKey: { readonly columns: readonly ['id'] }; + uniques: readonly []; + indexes: readonly []; + foreignKeys: readonly []; + }; + readonly post: { + columns: { + readonly id: { + readonly nativeType: 'integer'; + readonly codecId: 'sqlite/int@1'; + readonly nullable: false; + }; + readonly title: { + readonly nativeType: 'text'; + readonly codecId: 'sqlite/text@1'; + readonly nullable: false; + }; + readonly userId: { + readonly nativeType: 'integer'; + 
readonly codecId: 'sqlite/int@1'; + readonly nullable: false; + }; + readonly createdAt: { + readonly nativeType: 'text'; + readonly codecId: 'sqlite/datetime@1'; + readonly nullable: false; + }; + readonly embedding: { + readonly nativeType: 'text'; + readonly codecId: 'sqlite/vector@1'; + readonly nullable: true; + }; + }; + primaryKey: { readonly columns: readonly ['id'] }; + uniques: readonly []; + indexes: readonly []; + foreignKeys: readonly [ + { + readonly columns: readonly ['userId']; + readonly references: { readonly table: 'user'; readonly columns: readonly ['id'] }; + readonly name: 'post_userId_fkey'; + }, + ]; + }; + }; + readonly types: Record<string, never>; + }, + { + readonly User: { + storage: { readonly table: 'user' }; + fields: { + readonly id: CodecTypes['sqlite/int@1']['output']; + readonly email: CodecTypes['sqlite/text@1']['output']; + readonly createdAt: CodecTypes['sqlite/datetime@1']['output']; + }; + }; + readonly Post: { + storage: { readonly table: 'post' }; + fields: { + readonly id: CodecTypes['sqlite/int@1']['output']; + readonly title: CodecTypes['sqlite/text@1']['output']; + readonly userId: CodecTypes['sqlite/int@1']['output']; + readonly embedding: CodecTypes['sqlite/vector@1']['output'] | null; + readonly createdAt: CodecTypes['sqlite/datetime@1']['output']; + }; + }; + }, + { + readonly user: { + readonly posts: { + readonly to: 'Post'; + readonly cardinality: '1:N'; + readonly on: { + readonly parentCols: readonly ['id']; + readonly childCols: readonly ['userId']; + }; + }; + }; + readonly post: { + readonly user: { + readonly to: 'User'; + readonly cardinality: 'N:1'; + readonly on: { + readonly parentCols: readonly ['userId']; + readonly childCols: readonly ['id']; + }; + }; + }; + }, + { + modelToTable: { readonly User: 'user'; readonly Post: 'post' }; + tableToModel: { readonly user: 'User'; readonly post: 'Post' }; + fieldToColumn: { + readonly User: { + readonly id: 'id'; + readonly email: 'email'; + readonly createdAt: 'createdAt';
+ }; + readonly Post: { + readonly id: 'id'; + readonly title: 'title'; + readonly userId: 'userId'; + readonly embedding: 'embedding'; + readonly createdAt: 'createdAt'; + }; + }; + columnToField: { + readonly user: { + readonly id: 'id'; + readonly email: 'email'; + readonly createdAt: 'createdAt'; + }; + readonly post: { + readonly id: 'id'; + readonly title: 'title'; + readonly userId: 'userId'; + readonly embedding: 'embedding'; + readonly createdAt: 'createdAt'; + }; + }; + codecTypes: SqliteTypes & SqliteVectorTypes; + operationTypes: SqliteVectorOperationTypes; + }, + CoreHash, + ProfileHash +>; + +export type Tables = Contract['storage']['tables']; +export type Models = Contract['models']; +export type Relations = Contract['relations']; diff --git a/examples/prisma-next-demo-sqlite/src/prisma/contract.json b/examples/prisma-next-demo-sqlite/src/prisma/contract.json new file mode 100644 index 0000000000..c46c561df5 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/prisma/contract.json @@ -0,0 +1,193 @@ +{ + "schemaVersion": "1", + "targetFamily": "sql", + "target": "sqlite", + "coreHash": "sha256:470f2b3ae9a5f47c9ddda58fff0406d5cbac59bb09770c2e00b33c2502897820", + "profileHash": "sha256:e668a0ecff98e7aa08a807ae2cf547b997002d8a6e0d5b7741d2df62116d6de7", + "models": { + "Post": { + "fields": { + "createdAt": { + "column": "createdAt" + }, + "embedding": { + "column": "embedding" + }, + "id": { + "column": "id" + }, + "title": { + "column": "title" + }, + "userId": { + "column": "userId" + } + }, + "relations": {}, + "storage": { + "table": "post" + } + }, + "User": { + "fields": { + "createdAt": { + "column": "createdAt" + }, + "email": { + "column": "email" + }, + "id": { + "column": "id" + } + }, + "relations": {}, + "storage": { + "table": "user" + } + } + }, + "storage": { + "tables": { + "post": { + "columns": { + "createdAt": { + "codecId": "sqlite/datetime@1", + "default": { + "expression": "now()", + "kind": "function" + }, + "nativeType": 
"text", + "nullable": false + }, + "embedding": { + "codecId": "sqlite/vector@1", + "nativeType": "text", + "nullable": true + }, + "id": { + "codecId": "sqlite/int@1", + "default": { + "expression": "autoincrement()", + "kind": "function" + }, + "nativeType": "integer", + "nullable": false + }, + "title": { + "codecId": "sqlite/text@1", + "nativeType": "text" + }, + "userId": { + "codecId": "sqlite/int@1", + "nativeType": "integer" + } + }, + "foreignKeys": [ + { + "columns": [ + "userId" + ], + "name": "post_userId_fkey", + "references": { + "columns": [ + "id" + ], + "table": "user" + } + } + ], + "indexes": [], + "primaryKey": { + "columns": [ + "id" + ] + }, + "uniques": [] + }, + "user": { + "columns": { + "createdAt": { + "codecId": "sqlite/datetime@1", + "default": { + "expression": "now()", + "kind": "function" + }, + "nativeType": "text", + "nullable": false + }, + "email": { + "codecId": "sqlite/text@1", + "nativeType": "text" + }, + "id": { + "codecId": "sqlite/int@1", + "default": { + "expression": "autoincrement()", + "kind": "function" + }, + "nativeType": "integer", + "nullable": false + } + }, + "foreignKeys": [], + "indexes": [], + "primaryKey": { + "columns": [ + "id" + ] + }, + "uniques": [] + } + } + }, + "capabilities": { + "sqlite": { + "defaults.autoincrement": true, + "defaults.now": true, + "jsonAgg": true, + "lateral": true, + "returning": true, + "sqlitevector/cosine": true + } + }, + "extensionPacks": { + "sqlitevector": {} + }, + "meta": {}, + "sources": {}, + "relations": { + "post": { + "user": { + "cardinality": "N:1", + "on": { + "childCols": [ + "id" + ], + "parentCols": [ + "userId" + ] + }, + "to": "User" + } + }, + "user": { + "posts": { + "cardinality": "1:N", + "on": { + "childCols": [ + "userId" + ], + "parentCols": [ + "id" + ] + }, + "to": "Post" + } + } + }, + "_generated": { + "warning": "⚠️ GENERATED FILE - DO NOT EDIT", + "message": "This file is automatically generated by \"prisma-next contract emit\".", + 
"regenerate": "To regenerate, run: prisma-next contract emit" + } +} \ No newline at end of file diff --git a/examples/prisma-next-demo-sqlite/src/prisma/execution-context.ts b/examples/prisma-next-demo-sqlite/src/prisma/execution-context.ts new file mode 100644 index 0000000000..c443e92b7e --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/prisma/execution-context.ts @@ -0,0 +1,27 @@ +import sqliteAdapter from '@prisma-next/adapter-sqlite/runtime'; +import { + createExecutionStack, + instantiateExecutionStack, +} from '@prisma-next/core-execution-plane/stack'; +import sqliteDriver from '@prisma-next/driver-sqlite/runtime'; +import sqlitevectorDescriptor from '@prisma-next/extension-sqlite-vector/runtime'; +import { validateContract } from '@prisma-next/sql-contract-ts/contract'; +import { createExecutionContext } from '@prisma-next/sql-runtime'; +import sqliteTarget from '@prisma-next/target-sqlite/runtime'; +import type { Contract } from './contract.d'; +import contractJson from './contract.json' with { type: 'json' }; + +const contract = validateContract(contractJson); + +export const executionStack = createExecutionStack({ + target: sqliteTarget, + adapter: sqliteAdapter, + driver: sqliteDriver, + extensionPacks: [sqlitevectorDescriptor], +}); + +export const executionStackInstance = instantiateExecutionStack(executionStack); +export const executionContext = createExecutionContext({ + contract, + stackInstance: executionStackInstance, +}); diff --git a/examples/prisma-next-demo-sqlite/src/prisma/query.ts b/examples/prisma-next-demo-sqlite/src/prisma/query.ts new file mode 100644 index 0000000000..2d4fed0db2 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/prisma/query.ts @@ -0,0 +1,9 @@ +import { sql as sqlBuilder } from '@prisma-next/sql-lane'; +import { orm as ormBuilder } from '@prisma-next/sql-orm-lane'; +import { schema as schemaBuilder } from '@prisma-next/sql-relational-core/schema'; +import { executionContext } from './execution-context'; 
+ +export const schema = schemaBuilder(executionContext); +export const tables = schema.tables; +export const sql = sqlBuilder({ context: executionContext }); +export const orm = ormBuilder({ context: executionContext }); diff --git a/examples/prisma-next-demo-sqlite/src/prisma/runtime.ts b/examples/prisma-next-demo-sqlite/src/prisma/runtime.ts new file mode 100644 index 0000000000..1f0a2a6d29 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/prisma/runtime.ts @@ -0,0 +1,28 @@ +import { budgets, createRuntime, type Plugin, type Runtime } from '@prisma-next/sql-runtime'; +import { executionContext, executionStackInstance } from './execution-context'; + +export function getRuntime( + databaseUrl: string, + plugins: Plugin[] = [ + budgets({ + maxRows: 10_000, + defaultTableRows: 10_000, + tableRows: { user: 10_000, post: 10_000 }, + maxLatencyMs: 1_000, + }), + ], +): Runtime { + return createRuntime({ + stackInstance: executionStackInstance, + contract: executionContext.contract, + context: executionContext, + driverOptions: { + connect: { filename: databaseUrl }, + }, + verify: { + mode: 'onFirstUse', + requireMarker: false, + }, + plugins, + }); +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/dml-operations.ts b/examples/prisma-next-demo-sqlite/src/queries/dml-operations.ts new file mode 100644 index 0000000000..1fa3511a7e --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/dml-operations.ts @@ -0,0 +1,73 @@ +import { param } from '@prisma-next/sql-relational-core/param'; +import type { Runtime } from '@prisma-next/sql-runtime'; +import { sql, tables } from '../prisma/query'; + +export async function insertUser(email: string, runtime: Runtime) { + const userTable = tables.user; + const userColumns = userTable.columns; + + const plan = sql + .insert(userTable, { + email: param('email'), + }) + .returning(userColumns.id, userColumns.email) + .build({ + params: { + email, + }, + }); + + const rows: Array<{ id: number; email: string }> 
= []; + for await (const row of runtime.execute(plan)) { + rows.push(row as { id: number; email: string }); + } + + return rows[0]; +} + +export async function updateUser(userId: number, newEmail: string, runtime: Runtime) { + const userTable = tables.user; + const userColumns = userTable.columns; + + const plan = sql + .update(userTable, { + email: param('newEmail'), + }) + .where(userColumns.id.eq(param('userId'))) + .returning(userColumns.id, userColumns.email) + .build({ + params: { + newEmail, + userId, + }, + }); + + const rows: Array<{ id: number; email: string }> = []; + for await (const row of runtime.execute(plan)) { + rows.push(row as { id: number; email: string }); + } + + return rows[0]; +} + +export async function deleteUser(userId: number, runtime: Runtime) { + const userTable = tables.user; + const userColumns = userTable.columns; + + const plan = sql + .delete(userTable) + .where(userColumns.id.eq(param('userId'))) + .returning(userColumns.id, userColumns.email) + .build({ + params: { + userId, + }, + }); + + const rows: Array<{ id: number; email: string }> = []; + for await (const row of runtime.execute(plan)) { + rows.push(row as { id: number; email: string }); + } + + return rows[0]; +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/get-all-posts-unbounded.ts b/examples/prisma-next-demo-sqlite/src/queries/get-all-posts-unbounded.ts new file mode 100644 index 0000000000..b51a34a498 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/get-all-posts-unbounded.ts @@ -0,0 +1,35 @@ +import type { Runtime } from '@prisma-next/sql-runtime'; +import { sql, tables } from '../prisma/query'; +import { collect } from './utils'; + +/** + * WARNING: This query intentionally violates the row budget to demonstrate + * budget enforcement. It selects all posts without a LIMIT clause, which + * will trigger a BUDGET.ROWS_EXCEEDED error when the estimated row count + * exceeds the budget (default: 10,000 rows). 
+ * + * This demonstrates the budget workflow: + * 1. Budget plugin checks the query before execution + * 2. Detects unbounded SELECT (no LIMIT) + * 3. Throws BUDGET.ROWS_EXCEEDED error + * 4. Query execution is blocked + * + * To fix this query, add a .limit() clause or add proper filtering. + */ +export async function getAllPostsUnbounded(runtime: Runtime) { + const postTable = tables.post; + + // This query has no LIMIT, so it will violate the budget + const plan = sql + .from(postTable) + .select({ + id: postTable.columns.id, + title: postTable.columns.title, + userId: postTable.columns.userId, + createdAt: postTable.columns.createdAt, + }) + // Intentionally missing .limit() to trigger budget violation + .build(); + + return collect(runtime.execute(plan)); +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/get-user-by-id-no-emit.ts b/examples/prisma-next-demo-sqlite/src/queries/get-user-by-id-no-emit.ts new file mode 100644 index 0000000000..2cd9955b65 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/get-user-by-id-no-emit.ts @@ -0,0 +1,22 @@ +import { param } from '@prisma-next/sql-relational-core/param'; +import type { Runtime } from '@prisma-next/sql-runtime'; +import { sql, tables } from '../prisma-no-emit/query-no-emit'; +import { collect } from './utils'; + +export async function getUserById(userId: number, runtime: Runtime) { + const userTable = tables.user; + + const plan = sql + .from(userTable) + .where(userTable.columns.id.eq(param('userId'))) + .select({ + id: userTable.columns.id, + email: userTable.columns.email, + createdAt: userTable.columns.createdAt, + }) + .limit(1) + .build({ params: { userId } }); + + const rows = await collect(runtime.execute(plan)); + return rows[0] ?? 
null; +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/get-user-by-id.ts b/examples/prisma-next-demo-sqlite/src/queries/get-user-by-id.ts new file mode 100644 index 0000000000..07af881b84 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/get-user-by-id.ts @@ -0,0 +1,22 @@ +import { param } from '@prisma-next/sql-relational-core/param'; +import type { Runtime } from '@prisma-next/sql-runtime'; +import { sql, tables } from '../prisma/query'; +import { collect } from './utils'; + +export async function getUserById(userId: number, runtime: Runtime) { + const userTable = tables.user; + + const plan = sql + .from(userTable) + .where(userTable.columns.id.eq(param('userId'))) + .select({ + id: userTable.columns.id, + email: userTable.columns.email, + createdAt: userTable.columns.createdAt, + }) + .limit(1) + .build({ params: { userId } }); + + const rows = await collect(runtime.execute(plan)); + return rows[0] ?? null; +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/get-user-posts-no-emit.ts b/examples/prisma-next-demo-sqlite/src/queries/get-user-posts-no-emit.ts new file mode 100644 index 0000000000..b45b732e50 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/get-user-posts-no-emit.ts @@ -0,0 +1,21 @@ +import { param } from '@prisma-next/sql-relational-core/param'; +import type { Runtime } from '@prisma-next/sql-runtime'; +import { sql, tables } from '../prisma-no-emit/query-no-emit'; +import { collect } from './utils'; + +export async function getUserPosts(userId: number, runtime: Runtime) { + const postTable = tables.post; + + const plan = sql + .from(postTable) + .where(postTable.columns.userId.eq(param('userId'))) + .select({ + id: postTable.columns.id, + title: postTable.columns.title, + userId: postTable.columns.userId, + createdAt: postTable.columns.createdAt, + }) + .build({ params: { userId } }); + + return collect(runtime.execute(plan)); +} diff --git 
a/examples/prisma-next-demo-sqlite/src/queries/get-user-posts.test-d.ts b/examples/prisma-next-demo-sqlite/src/queries/get-user-posts.test-d.ts new file mode 100644 index 0000000000..6ebd3a3b8a --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/get-user-posts.test-d.ts @@ -0,0 +1,32 @@ +import { param } from '@prisma-next/sql-relational-core/param'; +import type { ResultType } from '@prisma-next/sql-relational-core/types'; +import { expectTypeOf, test } from 'vitest'; +import { sql, tables } from '../prisma/query'; + +/** + * Type test to verify that ResultType correctly infers number[] | null for the nullable vector column. + * This matches the actual query in get-user-posts.ts. + */ +test('ResultType correctly infers number[] | null for nullable embedding column', () => { + const postTable = tables.post; + + const _plan = sql + .from(postTable) + .where(postTable.columns.userId.eq(param('userId'))) + .select({ + id: postTable.columns.id, + title: postTable.columns.title, + userId: postTable.columns.userId, + createdAt: postTable.columns.createdAt, + embedding: postTable.columns.embedding, + }) + .build({ params: { userId: 1 } }); + + type Row = ResultType<typeof _plan>; + + expectTypeOf<Row['id']>().toEqualTypeOf<number>(); + expectTypeOf<Row['title']>().toEqualTypeOf<string>(); + expectTypeOf<Row['userId']>().toEqualTypeOf<number>(); + expectTypeOf<Row['embedding']>().toEqualTypeOf<number[] | null>(); + expectTypeOf<Row['embedding']>().not.toEqualTypeOf<number[]>(); +}); diff --git a/examples/prisma-next-demo-sqlite/src/queries/get-user-posts.ts b/examples/prisma-next-demo-sqlite/src/queries/get-user-posts.ts new file mode 100644 index 0000000000..e95d4efc94 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/get-user-posts.ts @@ -0,0 +1,27 @@ +import type { ResultType } from '@prisma-next/contract/types'; +import { param } from '@prisma-next/sql-relational-core/param'; +import type { Runtime } from '@prisma-next/sql-runtime'; +import { sql, tables } from '../prisma/query'; +import { collect } from './utils'; + +export async function getUserPosts(userId: number, runtime:
Runtime) { + const postTable = tables.post; + + const plan = sql + .from(postTable) + .where(postTable.columns.userId.eq(param('userId'))) + .select({ + id: postTable.columns.id, + title: postTable.columns.title, + userId: postTable.columns.userId, + createdAt: postTable.columns.createdAt, + embedding: postTable.columns.embedding, + }) + .build({ params: { userId } }); + + type Row = ResultType<typeof plan>; + // @ts-expect-error - This is to test the type inference + type _Test = Row['embedding']; + + return collect(runtime.execute(plan)); +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/get-users-no-emit.ts b/examples/prisma-next-demo-sqlite/src/queries/get-users-no-emit.ts new file mode 100644 index 0000000000..8064f860f1 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/get-users-no-emit.ts @@ -0,0 +1,19 @@ +import type { Runtime } from '@prisma-next/sql-runtime'; +import { sql, tables } from '../prisma-no-emit/query-no-emit'; +import { collect } from './utils'; + +export async function getUsers(runtime: Runtime, limit = 10) { + const userTable = tables.user; + + const plan = sql + .from(userTable) + .select({ + id: userTable.columns.id, + email: userTable.columns.email, + createdAt: userTable.columns.createdAt, + }) + .limit(limit) + .build(); + + return collect(runtime.execute(plan)); +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/get-users-with-posts-no-emit.ts b/examples/prisma-next-demo-sqlite/src/queries/get-users-with-posts-no-emit.ts new file mode 100644 index 0000000000..be46e93e87 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/get-users-with-posts-no-emit.ts @@ -0,0 +1,34 @@ +import type { Runtime } from '@prisma-next/sql-runtime'; +import { sql, tables } from '../prisma-no-emit/query-no-emit'; +import { collect } from './utils'; + +export async function getUsersWithPosts(runtime: Runtime, limit = 10) { + const userTable = tables.user; + const postTable = tables.post; + + const plan = sql + 
.from(userTable) + .includeMany( + postTable, + (on) => on.eqCol(userTable.columns.id, postTable.columns.userId), + (child) => + child + .select({ + id: postTable.columns.id, + title: postTable.columns.title, + createdAt: postTable.columns.createdAt, + }) + .orderBy(postTable.columns.createdAt.desc()), + { alias: 'posts' }, + ) + .select({ + id: userTable.columns.id, + email: userTable.columns.email, + createdAt: userTable.columns.createdAt, + posts: true, + }) + .limit(limit) + .build(); + + return collect(runtime.execute(plan)); +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/get-users-with-posts.ts b/examples/prisma-next-demo-sqlite/src/queries/get-users-with-posts.ts new file mode 100644 index 0000000000..b3b6762f42 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/get-users-with-posts.ts @@ -0,0 +1,34 @@ +import type { Runtime } from '@prisma-next/sql-runtime'; +import { sql, tables } from '../prisma/query'; +import { collect } from './utils'; + +export async function getUsersWithPosts(runtime: Runtime, limit = 10) { + const userTable = tables.user; + const postTable = tables.post; + + const plan = sql + .from(userTable) + .includeMany( + postTable, + (on) => on.eqCol(userTable.columns.id, postTable.columns.userId), + (child) => + child + .select({ + id: postTable.columns.id, + title: postTable.columns.title, + createdAt: postTable.columns.createdAt, + }) + .orderBy(postTable.columns.createdAt.desc()), + { alias: 'posts' }, + ) + .select({ + id: userTable.columns.id, + email: userTable.columns.email, + createdAt: userTable.columns.createdAt, + posts: true, + }) + .limit(limit) + .build(); + + return collect(runtime.execute(plan)); +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/get-users.ts b/examples/prisma-next-demo-sqlite/src/queries/get-users.ts new file mode 100644 index 0000000000..d71cd25b31 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/get-users.ts @@ -0,0 +1,19 @@ +import type { Runtime } 
from '@prisma-next/sql-runtime'; +import { sql, tables } from '../prisma/query'; +import { collect } from './utils'; + +export async function getUsers(runtime: Runtime, limit = 10) { + const userTable = tables.user; + + const plan = sql + .from(userTable) + .select({ + id: userTable.columns.id, + email: userTable.columns.email, + createdAt: userTable.columns.createdAt, + }) + .limit(limit) + .build(); + + return collect(runtime.execute(plan)); +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/orm-get-user-by-id.ts b/examples/prisma-next-demo-sqlite/src/queries/orm-get-user-by-id.ts new file mode 100644 index 0000000000..f903ac6878 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/orm-get-user-by-id.ts @@ -0,0 +1,21 @@ +import { param } from '@prisma-next/sql-relational-core/param'; +import type { Runtime } from '@prisma-next/sql-runtime'; +import { orm } from '../prisma/query'; +import { collect } from './utils'; + +export async function ormGetUserById(userId: number, runtime: Runtime) { + const plan = orm + .user() + .where((u) => u.id.eq(param('userId'))) + .select((u) => ({ + id: u.id, + email: u.email, + createdAt: u.createdAt, + })) + .findFirst({ + params: { userId }, + }); + + const rows = await collect(runtime.execute(plan)); + return rows[0] ?? 
null; +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/orm-get-users.ts b/examples/prisma-next-demo-sqlite/src/queries/orm-get-users.ts new file mode 100644 index 0000000000..5f2ccb1c37 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/orm-get-users.ts @@ -0,0 +1,18 @@ +import type { Runtime } from '@prisma-next/sql-runtime'; +import { orm } from '../prisma/query'; +import { collect } from './utils'; + +export async function ormGetUsers(limit: number, runtime: Runtime) { + const plan = orm + .user() + .select((u) => ({ + id: u.id, + email: u.email, + createdAt: u.createdAt, + })) + .orderBy((u) => u.createdAt.desc()) + .take(limit) + .findMany(); + + return collect(runtime.execute(plan)); +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/orm-includes.ts b/examples/prisma-next-demo-sqlite/src/queries/orm-includes.ts new file mode 100644 index 0000000000..4c21296b7d --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/orm-includes.ts @@ -0,0 +1,37 @@ +import type { ResultType } from '@prisma-next/contract/types'; +import { param } from '@prisma-next/sql-relational-core/param'; +import type { Runtime } from '@prisma-next/sql-runtime'; +import { orm } from '../prisma/query'; +import { collect } from './utils'; + +export async function ormGetUsersWithPosts(limit: number, runtime: Runtime) { + const plan = orm + .user() + .include.posts((child) => + child + .where((m) => m.id.eq(param('postId'))) + .select((m) => ({ + id: m.id, + title: m.title, + createdAt: m.createdAt, + })) + .orderBy((m) => m.createdAt.desc()), + ) + .select((u) => ({ + id: u.id, + email: u.email, + createdAt: u.createdAt, + posts: true, + })) + .take(limit) + .findMany({ + params: { postId: 1 }, + }); + type Row = ResultType<typeof plan>; + // @ts-expect-error - This is to test the type inference + type _Test = Row['posts']; + // @ts-expect-error - This is to test the type inference + type _Post = Row['posts'][0]; + + return collect(runtime.execute(plan)); +}
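The cursor-pagination helpers in orm-pagination.ts below all follow the same driving loop: fetch one page, remember the last row's id, and pass it back as the cursor for the next fetch. The sketch here (not part of the diff) shows that loop in isolation; `fetchPage` and `fetchAll` are hypothetical names, and an in-memory array stands in for the database and for `ormGetUsersByIdCursor`.

```typescript
type User = { id: number; email: string };

// In-memory stand-in for the user table.
const allUsers: User[] = Array.from({ length: 7 }, (_, i) => ({
  id: i + 1,
  email: `user${i + 1}@example.com`,
}));

// Stand-in for ormGetUsersByIdCursor(cursor, pageSize, runtime):
// rows with id greater than the cursor, in id order, capped at pageSize.
async function fetchPage(cursor: number | null, pageSize: number): Promise<User[]> {
  return allUsers
    .filter((u) => cursor === null || u.id > cursor)
    .slice(0, pageSize);
}

// Drive the cursor forward until a short (or empty) page signals the end.
async function fetchAll(pageSize: number): Promise<User[]> {
  const out: User[] = [];
  let cursor: number | null = null;
  for (;;) {
    const page = await fetchPage(cursor, pageSize);
    out.push(...page);
    if (page.length < pageSize) break; // short page: no more rows
    cursor = page[page.length - 1].id;
  }
  return out;
}
```

With a page size of 3 over seven rows, this walks pages [1, 2, 3], [4, 5, 6], [7]; the same loop shape works for the timestamp and backward variants, with only the cursor column and comparison flipped.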
diff --git a/examples/prisma-next-demo-sqlite/src/queries/orm-pagination.ts b/examples/prisma-next-demo-sqlite/src/queries/orm-pagination.ts new file mode 100644 index 0000000000..5766241cf1 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/orm-pagination.ts @@ -0,0 +1,102 @@ +import { param } from '@prisma-next/sql-relational-core/param'; +import type { Runtime } from '@prisma-next/sql-runtime'; +import { orm } from '../prisma/query'; +import { collect } from './utils'; + +/** + * ID-based cursor pagination (forward) + * Uses the ID column as the cursor for stable, efficient pagination + */ +export async function ormGetUsersByIdCursor( + cursor: number | null, + pageSize: number, + runtime: Runtime, +) { + let builder = orm + .user() + .select((u) => ({ + id: u.id, + email: u.email, + createdAt: u.createdAt, + })) + .orderBy((u) => u.id.asc()) + .take(pageSize); + + if (cursor !== null) { + builder = builder.where((u) => u.id.gt(param('cursor'))); + } + + const plan = builder.findMany({ + params: cursor !== null ? { cursor } : {}, + }); + + return collect(runtime.execute(plan)); +} + +/** + * Timestamp-based cursor pagination (forward, most recent first) + * Uses createdAt timestamp for pagination, ordered by most recent first + */ +export async function ormGetUsersByTimestampCursor( + cursor: Date | null, + pageSize: number, + runtime: Runtime, +) { + let builder = orm + .user() + .select((u) => ({ + id: u.id, + email: u.email, + createdAt: u.createdAt, + })) + .orderBy((u) => u.createdAt.desc()) + .take(pageSize); + + if (cursor !== null) { + builder = builder.where((u) => u.createdAt.lt(param('cursor'))); + } + + const plan = builder.findMany({ + params: cursor !== null ? 
{ cursor } : {}, + }); + + return collect(runtime.execute(plan)); +} + +/** + * Backward pagination (previous page) + * Fetches records before the cursor, useful for "previous page" navigation + */ +export async function ormGetUsersBackward(cursor: number, pageSize: number, runtime: Runtime) { + const plan = orm + .user() + .where((u) => u.id.lt(param('cursor'))) + .select((u) => ({ + id: u.id, + email: u.email, + createdAt: u.createdAt, + })) + .orderBy((u) => u.id.desc()) + .take(pageSize) + .findMany({ + params: { cursor }, + }); + + return collect(runtime.execute(plan)); +} + +/** + * Pagination helper: Get first page + * Convenience function for getting the initial page + */ +export async function ormGetUsersFirstPage(pageSize: number, runtime: Runtime) { + return ormGetUsersByIdCursor(null, pageSize, runtime); +} + +/** + * Pagination helper: Get next page + * Convenience function for getting the next page after a cursor + */ +export async function ormGetUsersNextPage(lastId: number, pageSize: number, runtime: Runtime) { + return ormGetUsersByIdCursor(lastId, pageSize, runtime); +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/orm-relation-filters.ts b/examples/prisma-next-demo-sqlite/src/queries/orm-relation-filters.ts new file mode 100644 index 0000000000..337d8fe88c --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/orm-relation-filters.ts @@ -0,0 +1,55 @@ +import { param } from '@prisma-next/sql-relational-core/param'; +import type { Runtime } from '@prisma-next/sql-runtime'; +import { orm } from '../prisma/query'; +import { collect } from './utils'; + +export async function ormGetUsersWithPosts(runtime: Runtime) { + const plan = orm + .user() + .where.related.posts.some((p) => p.where((m) => m.id.eq(param('postId')))) + .select((u) => ({ + id: u.id, + email: u.email, + createdAt: u.createdAt, + })) + .take(100) + .findMany({ + params: { postId: 1 }, + }); + + return collect(runtime.execute(plan)); +} + +export async function 
ormGetUsersWithoutPosts(runtime: Runtime) { + const plan = orm + .user() + .where.related.posts.none((p) => p.where((m) => m.id.eq(param('postId')))) + .select((u) => ({ + id: u.id, + email: u.email, + createdAt: u.createdAt, + })) + .take(100) + .findMany({ + params: { postId: 1 }, + }); + + return collect(runtime.execute(plan)); +} + +export async function ormGetUsersWhereAllPostsMatch(runtime: Runtime) { + const plan = orm + .user() + .where.related.posts.every((p) => p.where((m) => m.userId.eq(param('userId')))) + .select((u) => ({ + id: u.id, + email: u.email, + createdAt: u.createdAt, + })) + .take(100) + .findMany({ + params: { userId: 1 }, + }); + + return collect(runtime.execute(plan)); +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/orm-writes.ts b/examples/prisma-next-demo-sqlite/src/queries/orm-writes.ts new file mode 100644 index 0000000000..7def7bbd97 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/orm-writes.ts @@ -0,0 +1,47 @@ +import { param } from '@prisma-next/sql-relational-core/param'; +import type { Runtime } from '@prisma-next/sql-runtime'; +import { orm } from '../prisma/query'; + +export async function ormCreateUser( + data: { id: number; email: string; createdAt: Date }, + runtime: Runtime, +) { + const plan = orm.user().create({ id: data.id, email: data.email, createdAt: data.createdAt }); + + // Drain the result stream (DML operations don't return rows without RETURNING) + for await (const _row of runtime.execute(plan)) { + // DML operations without RETURNING don't yield rows + } + + // For now, return 1 if no error was thrown (actual row count would require RETURNING or telemetry) + // This is a limitation - we'd need to add RETURNING to get actual affected row count + return 1; +} + +export async function ormUpdateUser(userId: number, newEmail: string, runtime: Runtime) { + const plan = orm + .user() + .update((u) => u.id.eq(param('userId')), { email: newEmail }, { params: { userId } }); + + // Drain the 
result stream (DML operations don't return rows without RETURNING) + for await (const _row of runtime.execute(plan)) { + // DML operations without RETURNING don't yield rows + } + + // For now, return 1 if no error was thrown (actual row count would require RETURNING or telemetry) + // This is a limitation - we'd need to add RETURNING to get actual affected row count + return 1; +} + +export async function ormDeleteUser(userId: number, runtime: Runtime) { + const plan = orm.user().delete((u) => u.id.eq(param('userId')), { params: { userId } }); + + // Drain the result stream (DML operations don't return rows without RETURNING) + for await (const _row of runtime.execute(plan)) { + // DML operations without RETURNING don't yield rows + } + + // For now, return 1 if no error was thrown (actual row count would require RETURNING or telemetry) + // This is a limitation - we'd need to add RETURNING to get actual affected row count + return 1; +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/similarity-search.test-d.ts b/examples/prisma-next-demo-sqlite/src/queries/similarity-search.test-d.ts new file mode 100644 index 0000000000..026056b97c --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/similarity-search.test-d.ts @@ -0,0 +1,32 @@ +import { param } from '@prisma-next/sql-relational-core/param'; +import type { ResultType } from '@prisma-next/sql-relational-core/types'; +import { expectTypeOf, test } from 'vitest'; +import { sql, tables } from '../prisma/query'; + +/** + * Type test to verify that ResultType correctly infers the distance column as number + * when using cosineDistance operation. 
+ */ +test('ResultType correctly infers number for cosineDistance operation result', () => { + const postTable = tables.post; + const queryParam = param('queryVector'); + const distanceExpr = postTable.columns.embedding.cosineDistance(queryParam); + + const _plan = sql + .from(postTable) + .select({ + id: postTable.columns.id, + title: postTable.columns.title, + distance: distanceExpr, + }) + .orderBy(distanceExpr.asc()) + .limit(10) + .build({ params: { queryVector: [1, 2, 3] } }); + + type Row = ResultType<typeof _plan>; + + // Verify that distance is correctly inferred as number + expectTypeOf<Row['id']>().toEqualTypeOf<number>(); + expectTypeOf<Row['title']>().toEqualTypeOf<string>(); + expectTypeOf<Row['distance']>().toEqualTypeOf<number>(); +}); diff --git a/examples/prisma-next-demo-sqlite/src/queries/similarity-search.ts b/examples/prisma-next-demo-sqlite/src/queries/similarity-search.ts new file mode 100644 index 0000000000..529c0326e9 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/similarity-search.ts @@ -0,0 +1,33 @@ +import type { ResultType } from '@prisma-next/contract/types'; +import { param } from '@prisma-next/sql-relational-core/param'; +import type { Runtime } from '@prisma-next/sql-runtime'; +import { sql, tables } from '../prisma/query'; +import { collect } from './utils'; + +/** + * Search for posts by cosine distance to a query vector. + * Returns the top N posts ordered by similarity (closest first).
+ */ +export async function similaritySearch(queryVector: number[], runtime: Runtime, limit = 10) { + const postTable = tables.post; + + const queryParam = param('queryVector'); + const distanceExpr = postTable.columns.embedding.cosineDistance(queryParam); + + const plan = sql + .from(postTable) + .select({ + id: postTable.columns.id, + title: postTable.columns.title, + distance: distanceExpr, + }) + .orderBy(distanceExpr.asc()) + .limit(limit) + .build({ params: { queryVector } }); + + type Row = ResultType<typeof plan>; + // Demonstrates that the distance column type flows through + type _Test = Row['distance']; // This is correctly inferred as number + + return collect(runtime.execute(plan)); +} diff --git a/examples/prisma-next-demo-sqlite/src/queries/utils.ts b/examples/prisma-next-demo-sqlite/src/queries/utils.ts new file mode 100644 index 0000000000..4a83a58670 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/src/queries/utils.ts @@ -0,0 +1,7 @@ +export async function collect<T>(iterator: AsyncIterable<T>): Promise<T[]> { + const rows: T[] = []; + for await (const row of iterator) { + rows.push(row); + } + return rows; +} diff --git a/examples/prisma-next-demo-sqlite/test/control-client.integration.test.ts b/examples/prisma-next-demo-sqlite/test/control-client.integration.test.ts new file mode 100644 index 0000000000..9642ab5d44 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/test/control-client.integration.test.ts @@ -0,0 +1,90 @@ +/** + * Integration test demonstrating the programmatic control client. + * + * This test shows how to use createControlClient for database operations + * instead of manual SQL and the stampMarker script.
+ */ + +import { DatabaseSync } from 'node:sqlite'; +import { validateContract } from '@prisma-next/sql-contract-ts/contract'; +import { describe, expect, it } from 'vitest'; +import type { Contract } from '../src/prisma/contract.d'; +import contractJson from '../src/prisma/contract.json' with { type: 'json' }; +import { createPrismaNextControlClient, initTestDatabase } from './utils/control-client'; +import { withTempSqliteDatabase } from './utils/with-temp-sqlite-db'; + +// Use the emitted JSON contract which has the real computed hashes +const contract = validateContract(contractJson); + +describe('control client integration', () => { + it('initializes database schema from contract', async () => { + await withTempSqliteDatabase(async ({ connectionString, filename }) => { + // Use control client to initialize the database + await initTestDatabase({ connection: connectionString, contractIR: contract }); + + const db = new DatabaseSync(filename); + try { + const rows = db + .prepare("select name from sqlite_master where type = 'table' order by name") + .all() as Array<{ name: string }>; + const names = rows.map((r) => r.name); + expect(names).toContain('user'); + expect(names).toContain('post'); + } finally { + db.close(); + } + }); + }); + + it('verifies database marker after sign', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + // Initialize and sign database + await initTestDatabase({ connection: connectionString, contractIR: contract }); + + // Create a new client to verify + const client = createPrismaNextControlClient({ connection: connectionString }); + try { + const verifyResult = await client.verify({ contractIR: contract }); + + expect(verifyResult).toMatchObject({ + ok: true, + contract: { coreHash: expect.anything() }, + }); + } finally { + await client.close(); + } + }); + }); + + it('schema verify passes after dbInit', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + await 
initTestDatabase({ connection: connectionString, contractIR: contract }); + + const client = createPrismaNextControlClient({ connection: connectionString }); + try { + const schemaResult = await client.schemaVerify({ contractIR: contract }); + + expect(schemaResult.ok).toBe(true); + } finally { + await client.close(); + } + }); + }); + + it('introspects database schema', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + await initTestDatabase({ connection: connectionString, contractIR: contract }); + + const client = createPrismaNextControlClient({ connection: connectionString }); + try { + const schema = await client.introspect(); + + // Schema should be an object with tables + expect(schema).toBeDefined(); + expect(typeof schema).toBe('object'); + } finally { + await client.close(); + } + }); + }); +}); diff --git a/examples/prisma-next-demo-sqlite/test/orm.integration.test.ts b/examples/prisma-next-demo-sqlite/test/orm.integration.test.ts new file mode 100644 index 0000000000..c9b6670f6e --- /dev/null +++ b/examples/prisma-next-demo-sqlite/test/orm.integration.test.ts @@ -0,0 +1,303 @@ +import { sql } from '@prisma-next/sql-lane'; +import { param } from '@prisma-next/sql-relational-core/param'; +import { schema } from '@prisma-next/sql-relational-core/schema'; +import type { Runtime } from '@prisma-next/sql-runtime'; +import { describe, expect, it } from 'vitest'; +import { executionContext } from '../src/prisma/execution-context'; +import { getRuntime } from '../src/prisma/runtime'; +import { initTestDatabase } from './utils/control-client'; +import { withTempSqliteDatabase } from './utils/with-temp-sqlite-db'; + +const { contract } = executionContext; + +/** + * Seeds test data using the runtime and query DSL. 
+ */ +async function seedTestData( + runtime: Runtime, + data: { users?: string[]; posts?: Array<{ title: string; userIndex: number }> }, +): Promise<{ userIds: number[] }> { + const tables = schema(executionContext).tables; + const userTable = tables['user']!; + const postTable = tables['post']!; + + const userIds: number[] = []; + + // Insert users (provide all required columns since contract doesn't have defaults) + if (data.users) { + for (let i = 0; i < data.users.length; i++) { + const email = data.users[i]!; + const id = i + 1; + const createdAt = new Date(); + + const plan = sql({ context: executionContext }) + .insert(userTable, { + id: param('id'), + email: param('email'), + createdAt: param('createdAt'), + }) + .returning(userTable.columns['id']!) + .build({ params: { id, email, createdAt } }); + + for await (const row of runtime.execute(plan)) { + userIds.push((row as { id: number }).id); + } + } + } + + // Insert posts (provide all required columns) + if (data.posts) { + for (let i = 0; i < data.posts.length; i++) { + const post = data.posts[i]!; + const userId = userIds[post.userIndex]; + if (userId === undefined) continue; + + const id = i + 1; + const createdAt = new Date(); + + const plan = sql({ context: executionContext }) + .insert(postTable, { + id: param('id'), + title: param('title'), + userId: param('userId'), + createdAt: param('createdAt'), + }) + .build({ params: { id, title: post.title, userId, createdAt } }); + + for await (const _row of runtime.execute(plan)) { + // consume iterator + } + } + } + + return { userIds }; +} + +describe('ORM integration tests', () => { + it('orm.getUsers returns users with selected fields, respects limit and ordering', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + // Initialize schema using control client + await initTestDatabase({ connection: connectionString, contractIR: contract }); + + const runtime = getRuntime(connectionString); + try { + // Seed data using runtime 
+ await seedTestData(runtime, { + users: ['alice@example.com', 'bob@example.com', 'charlie@example.com'], + }); + + const { ormGetUsers } = await import('../src/queries/orm-get-users'); + const users = await ormGetUsers(2, runtime); + + expect(users).toHaveLength(2); + expect(users[0]).toMatchObject({ + id: expect.any(Number), + email: expect.any(String), + createdAt: expect.anything(), + }); + expect(users[0]).not.toMatchObject({ posts: expect.anything() }); + } finally { + await runtime.close(); + } + }); + }); + + it('orm.getUserById returns single user by ID', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + await initTestDatabase({ connection: connectionString, contractIR: contract }); + const runtime = getRuntime(connectionString); + + try { + await seedTestData(runtime, { users: ['alice@example.com'] }); + + const { ormGetUserById } = await import('../src/queries/orm-get-user-by-id'); + const user = await ormGetUserById(1, runtime); + + expect(user).not.toBeNull(); + expect(user).toMatchObject({ + id: 1, + email: 'alice@example.com', + createdAt: expect.anything(), + }); + } finally { + await runtime.close(); + } + }); + }); + + it('orm relation filters: where.related.posts.some() returns users with at least one post', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + await initTestDatabase({ connection: connectionString, contractIR: contract }); + const runtime = getRuntime(connectionString); + + try { + await seedTestData(runtime, { + users: ['alice@example.com', 'bob@example.com'], + posts: [{ title: 'First Post', userIndex: 0 }], + }); + + const { ormGetUsersWithPosts } = await import('../src/queries/orm-relation-filters'); + const users = await ormGetUsersWithPosts(runtime); + + expect(users.length).toBeGreaterThan(0); + expect(users[0]).toMatchObject({ + id: expect.anything(), + email: expect.anything(), + }); + } finally { + await runtime.close(); + } + }); + }); + + it('orm includes: 
include.posts() returns users with nested posts arrays', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + await initTestDatabase({ connection: connectionString, contractIR: contract }); + const runtime = getRuntime(connectionString); + + try { + await seedTestData(runtime, { + users: ['alice@example.com', 'bob@example.com'], + posts: [ + { title: 'First Post', userIndex: 0 }, + { title: 'Second Post', userIndex: 0 }, + { title: 'Third Post', userIndex: 1 }, + ], + }); + + const { ormGetUsersWithPosts } = await import('../src/queries/orm-includes'); + const users = await ormGetUsersWithPosts(10, runtime); + + expect(users.length).toBeGreaterThan(0); + expect(users[0]).toMatchObject({ + id: expect.anything(), + email: expect.anything(), + posts: expect.any(Array), + }); + } finally { + await runtime.close(); + } + }); + }); + + it('orm writes: create() inserts a user', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + await initTestDatabase({ connection: connectionString, contractIR: contract }); + const runtime = getRuntime(connectionString); + + try { + const { ormCreateUser } = await import('../src/queries/orm-writes'); + const affectedRows = await ormCreateUser( + { id: 1, email: 'alice@example.com', createdAt: new Date() }, + runtime, + ); + + expect(affectedRows).toBe(1); + } finally { + await runtime.close(); + } + }); + }); + + it('orm writes: update() updates a user', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + await initTestDatabase({ connection: connectionString, contractIR: contract }); + const runtime = getRuntime(connectionString); + + try { + await seedTestData(runtime, { users: ['alice@example.com'] }); + + const { ormUpdateUser } = await import('../src/queries/orm-writes'); + const affectedRows = await ormUpdateUser(1, 'alice-updated@example.com', runtime); + + expect(affectedRows).toBe(1); + } finally { + await runtime.close(); + } + }); + }); + + 
it('orm writes: delete() deletes a user', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + await initTestDatabase({ connection: connectionString, contractIR: contract }); + const runtime = getRuntime(connectionString); + + try { + await seedTestData(runtime, { users: ['alice@example.com'] }); + + const { ormDeleteUser } = await import('../src/queries/orm-writes'); + const affectedRows = await ormDeleteUser(1, runtime); + + expect(affectedRows).toBe(1); + } finally { + await runtime.close(); + } + }); + }); + + it('orm pagination: ormGetUsersByIdCursor returns paginated users with gt cursor', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + await initTestDatabase({ connection: connectionString, contractIR: contract }); + const runtime = getRuntime(connectionString); + + try { + const emails = Array.from({ length: 10 }, (_, i) => `user${i + 1}@example.com`); + await seedTestData(runtime, { users: emails }); + + const { ormGetUsersByIdCursor } = await import('../src/queries/orm-pagination'); + + const firstPage = await ormGetUsersByIdCursor(null, 3, runtime); + expect(firstPage).toHaveLength(3); + expect(firstPage.map((u) => u.id)).toEqual([1, 2, 3]); + + const secondPage = await ormGetUsersByIdCursor(3, 3, runtime); + expect(secondPage).toHaveLength(3); + expect(secondPage.map((u) => u.id)).toEqual([4, 5, 6]); + + const thirdPage = await ormGetUsersByIdCursor(6, 3, runtime); + expect(thirdPage).toHaveLength(3); + expect(thirdPage.map((u) => u.id)).toEqual([7, 8, 9]); + + const lastPage = await ormGetUsersByIdCursor(9, 3, runtime); + expect(lastPage).toHaveLength(1); + expect(lastPage.map((u) => u.id)).toEqual([10]); + + const emptyPage = await ormGetUsersByIdCursor(10, 3, runtime); + expect(emptyPage).toHaveLength(0); + } finally { + await runtime.close(); + } + }); + }); + + it('orm pagination: ormGetUsersBackward returns users before cursor with lt operator', async () => { + await 
withTempSqliteDatabase(async ({ connectionString }) => { + await initTestDatabase({ connection: connectionString, contractIR: contract }); + const runtime = getRuntime(connectionString); + + try { + const emails = Array.from({ length: 10 }, (_, i) => `user${i + 1}@example.com`); + await seedTestData(runtime, { users: emails }); + + const { ormGetUsersBackward } = await import('../src/queries/orm-pagination'); + + const page = await ormGetUsersBackward(8, 3, runtime); + expect(page).toHaveLength(3); + expect(page.map((u) => u.id)).toEqual([7, 6, 5]); + + const earlierPage = await ormGetUsersBackward(4, 3, runtime); + expect(earlierPage).toHaveLength(3); + expect(earlierPage.map((u) => u.id)).toEqual([3, 2, 1]); + + const partialPage = await ormGetUsersBackward(2, 3, runtime); + expect(partialPage).toHaveLength(1); + expect(partialPage.map((u) => u.id)).toEqual([1]); + + const emptyPage = await ormGetUsersBackward(1, 3, runtime); + expect(emptyPage).toHaveLength(0); + } finally { + await runtime.close(); + } + }); + }); +}); diff --git a/examples/prisma-next-demo-sqlite/test/runtime.integration.test.ts b/examples/prisma-next-demo-sqlite/test/runtime.integration.test.ts new file mode 100644 index 0000000000..8c4218d361 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/test/runtime.integration.test.ts @@ -0,0 +1,457 @@ +/** biome-ignore-all lint/style/noNonNullAssertion: non-null assertions are fine for tests */ + +import type { IncludeChildBuilder, JoinOnBuilder } from '@prisma-next/sql-lane'; +import { sql } from '@prisma-next/sql-lane'; +import { param } from '@prisma-next/sql-relational-core/param'; +import { schema } from '@prisma-next/sql-relational-core/schema'; +import type { ResultType } from '@prisma-next/sql-relational-core/types'; +import { budgets, createRuntime, type Runtime } from '@prisma-next/sql-runtime'; +import { describe, expect, it } from 'vitest'; +import { executionContext, executionStackInstance } from '../src/prisma/execution-context'; 
+import { getRuntime } from '../src/prisma/runtime'; +import { initTestDatabase } from './utils/control-client'; +import { withTempSqliteDatabase } from './utils/with-temp-sqlite-db'; + +const { contract } = executionContext; + +/** + * Seeds test data using the runtime and query DSL. + * Uses column defaults for id (autoincrement) and createdAt (now). + */ +async function seedTestData( + runtime: Runtime, + data: { + users?: string[]; + posts?: Array<{ title: string; userIndex: number }>; + }, +): Promise<{ userIds: number[] }> { + const tables = schema(executionContext).tables; + const userTable = tables['user']!; + const postTable = tables['post']!; + + const userIds: number[] = []; + + // Insert users (omit id and createdAt - they have defaults) + if (data.users) { + for (let i = 0; i < data.users.length; i++) { + const email = data.users[i]!; + + const plan = sql({ context: executionContext }) + .insert(userTable, { + email: param('email'), + }) + .returning(userTable.columns['id']!) 
+ .build({ params: { email } }); + + type InsertedRow = ResultType<typeof plan>; + for await (const row of runtime.execute(plan)) { + userIds.push((row as InsertedRow)['id']!); + } + } + } + + // Insert posts (omit id and createdAt - they have defaults) + if (data.posts) { + for (let i = 0; i < data.posts.length; i++) { + const post = data.posts[i]!; + const userId = userIds[post.userIndex]; + if (userId === undefined) continue; + + const plan = sql({ context: executionContext }) + .insert(postTable, { + title: param('title'), + userId: param('userId'), + }) + .build({ params: { title: post.title, userId } }); + + for await (const _row of runtime.execute(plan)) { + // consume iterator + } + } + } + + return { userIds }; +} + +describe('runtime execute integration', () => { + it('streams rows and enforces marker verification', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + // Initialize schema and marker using control client + await initTestDatabase({ + connection: connectionString, + contractIR: contract, + }); + + const context = executionContext; + const tables = schema(context).tables; + const userTable = tables['user']!; + const root = sql({ context: executionContext }); + const plan = root + .from(userTable) + .select({ + id: userTable.columns['id']!, + email: userTable.columns['email']!, + }) + .limit(10) + .build(); + + const templatePlan = root.raw.with({ annotations: { limit: 1 } })` + select id, email from "user" + where email = ${'alice@example.com'} + limit ${1} + `; + + const functionPlan = root.raw('select id from "user" where email = $1 limit $2', { + params: ['alice@example.com', 1], + refs: { + tables: ['user'], + columns: [{ table: 'user', column: 'email' }], + }, + annotations: { intent: 'report', limit: 1 }, + }); + + const createRuntimeInstance = () => { + return createRuntime({ + stackInstance: executionStackInstance, + contract, + context, + driverOptions: { + connect: { filename: connectionString }, + }, + verify: {
mode: 'always', requireMarker: true }, + plugins: [ + budgets({ + maxRows: 10_000, + defaultTableRows: 10_000, + tableRows: { user: 10_000, post: 10_000 }, + }), + ], + }); + }; + + // Seed data using a runtime instance + const seedRuntime = createRuntimeInstance(); + try { + await seedTestData(seedRuntime, { + users: ['alice@example.com'], + }); + } finally { + await seedRuntime.close(); + } + + const runtime = createRuntimeInstance(); + try { + type PlanRow = ResultType<typeof plan>; + const rows: PlanRow[] = []; + for await (const row of runtime.execute(plan)) { + rows.push(row as PlanRow); + } + + expect(rows).toHaveLength(1); + expect(rows[0]).toMatchObject({ email: 'alice@example.com' }); + + type TemplatePlanRow = ResultType<typeof templatePlan>; + const templateRows: TemplatePlanRow[] = []; + for await (const row of runtime.execute(templatePlan)) { + templateRows.push(row); + } + expect(templateRows).toHaveLength(1); + + type FunctionPlanRow = ResultType<typeof functionPlan>; + const functionRows: FunctionPlanRow[] = []; + for await (const row of runtime.execute(functionPlan)) { + functionRows.push(row); + } + expect(functionRows).toHaveLength(1); + } finally { + await runtime.close(); + } + + // Test marker mismatch detection - create a new runtime with wrong marker expectation + // Note: We can't easily test this without modifying the marker, so we skip this part + // as it would require low-level database access which we're trying to avoid + }); + }); + + it('infers correct types from query plans', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + await initTestDatabase({ + connection: connectionString, + contractIR: contract, + }); + const runtime = getRuntime(connectionString); + + try { + // Seed data + await seedTestData(runtime, { + users: ['alice@example.com'], + posts: [{ title: 'First Post', userIndex: 0 }], + }); + + const context = executionContext; + const tables = schema(context).tables; + const userTable = tables['user']!; + const postTable = tables['post']!; +
const userPlan = sql({ context: executionContext }) + .from(userTable) + .select({ + id: userTable.columns['id']!, + email: userTable.columns['email']!, + createdAt: userTable.columns['createdAt']!, + }) + .limit(10) + .build(); + + type UserRow = ResultType<typeof userPlan>; + + const postPlan = sql({ context: executionContext }) + .from(postTable) + .where(postTable.columns['userId']!.eq(param('userId'))) + .select({ + id: postTable.columns['id']!, + title: postTable.columns['title']!, + userId: postTable.columns['userId']!, + createdAt: postTable.columns['createdAt']!, + }) + .limit(1) + .build({ params: { userId: 1 } }); + + type PostRow = ResultType<typeof postPlan>; + + const userRows: UserRow[] = []; + for await (const row of runtime.execute(userPlan)) { + userRows.push(row as UserRow); + } + expect(userRows).toHaveLength(1); + expect(userRows[0]).toMatchObject({ email: 'alice@example.com' }); + + const postRows: PostRow[] = []; + for await (const row of runtime.execute(postPlan)) { + postRows.push(row as PostRow); + } + expect(postRows).toHaveLength(1); + expect(postRows[0]).toMatchObject({ + title: 'First Post', + userId: 1, + }); + } finally { + await runtime.close(); + } + }); + }); + + it('enforces row budget on unbounded queries', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + await initTestDatabase({ + connection: connectionString, + contractIR: contract, + }); + + const context = executionContext; + const runtime = createRuntime({ + stackInstance: executionStackInstance, + contract, + context, + driverOptions: { + connect: { filename: connectionString }, + }, + verify: { mode: 'onFirstUse', requireMarker: false }, + plugins: [ + budgets({ + maxRows: 50, + defaultTableRows: 10_000, + tableRows: { user: 10_000, post: 10_000 }, + }), + ], + }); + + try { + // Seed 100 users using a separate runtime without strict budgets + const seedRuntime = getRuntime(connectionString); + try { + const emails = Array.from({ length: 100 }, (_, i) =>
`user${i}@example.com`); + await seedTestData(seedRuntime, { users: emails }); + } finally { + await seedRuntime.close(); + } + + const tables = schema(context).tables; + const userTable = tables['user']!; + const unboundedPlan = sql({ context: executionContext }) + .from(tables['user']!) + .select({ + id: userTable.columns['id']!, + email: userTable.columns['email']!, + }) + .build(); + + await expect(async () => { + for await (const _row of runtime.execute(unboundedPlan)) { + // Should not reach here + } + }).rejects.toMatchObject({ + code: 'BUDGET.ROWS_EXCEEDED', + category: 'BUDGET', + }); + + const boundedPlan = sql({ context: executionContext }) + .from(tables['user']!) + .select({ + id: userTable.columns['id']!, + email: userTable.columns['email']!, + }) + .limit(10) + .build(); + + type BoundedPlanRow = ResultType<typeof boundedPlan>; + const rows: BoundedPlanRow[] = []; + for await (const row of runtime.execute(boundedPlan)) { + rows.push(row); + } + expect(rows.length).toBeLessThanOrEqual(10); + } finally { + await runtime.close(); + } + }); + }); + + it('enforces streaming row budget', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + await initTestDatabase({ + connection: connectionString, + contractIR: contract, + }); + + const context = executionContext; + const runtime = createRuntime({ + stackInstance: executionStackInstance, + contract, + context, + driverOptions: { + connect: { filename: connectionString }, + }, + verify: { mode: 'onFirstUse', requireMarker: false }, + plugins: [ + budgets({ + maxRows: 10, + defaultTableRows: 10_000, + tableRows: { user: 10_000, post: 10_000 }, + }), + ], + }); + + try { + // Seed 50 users using a separate runtime without strict budgets + const seedRuntime = getRuntime(connectionString); + try { + const emails = Array.from({ length: 50 }, (_, i) => `user${i}@example.com`); + await seedTestData(seedRuntime, { users: emails }); + } finally { + await seedRuntime.close(); + } + + const tables =
schema(context).tables; + const userTable = tables['user']!; + const plan = sql({ context: executionContext }) + .from(tables['user']!) + .select({ + id: userTable.columns['id']!, + email: userTable.columns['email']!, + }) + .limit(50) + .build(); + + await expect(async () => { + for await (const _row of runtime.execute(plan)) { + // Will throw on 11th row + } + }).rejects.toMatchObject({ + code: 'BUDGET.ROWS_EXCEEDED', + category: 'BUDGET', + }); + } finally { + await runtime.close(); + } + }); + }); + + it('includeMany returns users with nested posts array', async () => { + await withTempSqliteDatabase(async ({ connectionString }) => { + await initTestDatabase({ + connection: connectionString, + contractIR: contract, + }); + const runtime = getRuntime(connectionString); + + try { + // Seed users and posts + await seedTestData(runtime, { + users: ['alice@example.com', 'bob@example.com'], + posts: [ + { title: 'First Post', userIndex: 0 }, + { title: 'Second Post', userIndex: 0 }, + { title: 'Third Post', userIndex: 1 }, + ], + }); + + const context = executionContext; + const tables = schema(context).tables; + const userTable = tables['user']!; + const postTable = tables['post']!; + + const plan = sql({ context: executionContext }) + .from(userTable) + .includeMany( + postTable, + (on: JoinOnBuilder) => on.eqCol(userTable.columns['id']!, postTable.columns['userId']!), + (child: IncludeChildBuilder) => + child + .select({ + id: postTable.columns['id']!, + title: postTable.columns['title']!, + createdAt: postTable.columns['createdAt']!, + }) + .orderBy(postTable.columns['createdAt']!.desc()), + { alias: 'posts' }, + ) + .select({ + id: userTable.columns['id']!, + email: userTable.columns['email']!, + createdAt: userTable.columns['createdAt']!, + posts: true, + }) + .limit(10) + .build(); + + type Row = ResultType<typeof plan>; + const rows: Row[] = []; + for await (const row of runtime.execute(plan)) { + rows.push(row); + } + + expect(rows).toHaveLength(2); +
expect(rows[0]).toHaveProperty('id'); + expect(rows[0]).toHaveProperty('email'); + expect(rows[0]).toHaveProperty('posts'); + expect(Array.isArray(rows[0]!.posts)).toBe(true); + + const alice = rows.find((r) => r.email === 'alice@example.com'); + expect(alice).toBeDefined(); + expect(alice!.posts).toHaveLength(2); + expect(alice!.posts[0]).toHaveProperty('id'); + expect(alice!.posts[0]).toHaveProperty('title'); + expect(alice!.posts[0]).toHaveProperty('createdAt'); + expect(typeof alice!.posts[0]!.id).toBe('number'); + expect(typeof alice!.posts[0]!.title).toBe('string'); + + const bob = rows.find((r) => r.email === 'bob@example.com'); + expect(bob).toBeDefined(); + expect(bob!.posts).toHaveLength(1); + expect(bob!.posts[0]!.title).toBe('Third Post'); + } finally { + await runtime.close(); + } + }); + }); +}); diff --git a/examples/prisma-next-demo-sqlite/test/runtime.offline.integration.test.ts b/examples/prisma-next-demo-sqlite/test/runtime.offline.integration.test.ts new file mode 100644 index 0000000000..0e3d84ac49 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/test/runtime.offline.integration.test.ts @@ -0,0 +1,32 @@ +import { createRuntime } from '@prisma-next/sql-runtime'; +import { describe, expect, it } from 'vitest'; +import { executionContext, executionStackInstance } from '../src/prisma/execution-context'; +import { sql, tables } from '../src/prisma/query'; + +describe('when no driver is available', () => { + it('can still build query plans', async () => { + const runtime = createRuntime({ + stackInstance: executionStackInstance, + contract: executionContext.contract, + context: executionContext, + verify: { mode: 'onFirstUse', requireMarker: false }, + }); + + try { + const plan = sql.from(tables.user).select({ id: tables.user.columns.id }).limit(1).build(); + + expect(plan).toMatchObject({ + ast: { kind: 'select' }, + meta: { lane: 'dsl' }, + }); + + await expect(async () => { + for await (const _row of runtime.execute(plan)) { + // offline 
runtime should not execute + } + }).rejects.toMatchObject({ code: 'RUNTIME.DRIVER_MISSING' }); + } finally { + await runtime.close(); + } + }); +}); diff --git a/examples/prisma-next-demo-sqlite/test/utils/control-client.ts b/examples/prisma-next-demo-sqlite/test/utils/control-client.ts new file mode 100644 index 0000000000..88c380ad4c --- /dev/null +++ b/examples/prisma-next-demo-sqlite/test/utils/control-client.ts @@ -0,0 +1,65 @@ +/** + * Test utilities using the programmatic control client and runtime. + * + * This demonstrates how to use `createControlClient` for test database setup + * and the runtime for data operations, instead of manual SQL and stampMarker. + */ + +import sqliteAdapter from '@prisma-next/adapter-sqlite/control'; +import { type ControlClient, createControlClient } from '@prisma-next/cli/control-api'; +import type { ContractIR } from '@prisma-next/contract/ir'; +import sqliteDriver from '@prisma-next/driver-sqlite/control'; +import sqlitevector from '@prisma-next/extension-sqlite-vector/control'; +import sql from '@prisma-next/family-sql/control'; +import sqlite from '@prisma-next/target-sqlite/control'; + +export interface TestControlClientOptions { + readonly connection: string; +} + +/** + * Creates a control client configured for the demo app's stack. + * + * The client auto-connects when operations are called because we provide + * a default connection in options. + */ +export function createPrismaNextControlClient(options: TestControlClientOptions): ControlClient { + return createControlClient({ + family: sql, + target: sqlite, + adapter: sqliteAdapter, + driver: sqliteDriver, + extensionPacks: [sqlitevector], + connection: options.connection, + }); +} + +/** + * Initializes a test database with schema and marker from a contract. + * + * This replaces the manual table creation and stampMarker calls. + * dbInit in 'apply' mode creates all tables/indexes and writes the marker. 
+ * + * @example + * ```typescript + * await withTempSqliteDatabase(async ({ connectionString }) => { + * await initTestDatabase({ connection: connectionString, contractIR }); + * // Database is now ready with schema and marker + * }); + * ``` + */ +export async function initTestDatabase(options: { + readonly connection: string; + readonly contractIR: ContractIR; +}): Promise<void> { + const client = createPrismaNextControlClient({ connection: options.connection }); + + try { + const initResult = await client.dbInit({ contractIR: options.contractIR, mode: 'apply' }); + if (!initResult.ok) { + throw new Error(`dbInit failed: ${initResult.failure.summary}`); + } + } finally { + await client.close(); + } +} diff --git a/examples/prisma-next-demo-sqlite/test/utils/with-temp-sqlite-db.ts b/examples/prisma-next-demo-sqlite/test/utils/with-temp-sqlite-db.ts new file mode 100644 index 0000000000..718b77353d --- /dev/null +++ b/examples/prisma-next-demo-sqlite/test/utils/with-temp-sqlite-db.ts @@ -0,0 +1,18 @@ +import { mkdtemp, rm } from 'node:fs/promises'; +import { tmpdir } from 'node:os'; +import { join } from 'node:path'; +import { pathToFileURL } from 'node:url'; + +export async function withTempSqliteDatabase<T>( + fn: (args: { readonly connectionString: string; readonly filename: string }) => Promise<T>, +): Promise<T> { + const dir = await mkdtemp(join(tmpdir(), 'prisma-next-sqlite-')); + const filename = join(dir, 'db.sqlite'); + const connectionString = pathToFileURL(filename).href; + + try { + return await fn({ connectionString, filename }); + } finally { + await rm(dir, { recursive: true, force: true }); + } +} diff --git a/examples/prisma-next-demo-sqlite/tsconfig.json b/examples/prisma-next-demo-sqlite/tsconfig.json new file mode 100644 index 0000000000..ec9ab845c8 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/tsconfig.json @@ -0,0 +1,16 @@ +{ + "extends": ["@prisma-next/tsconfig/base"], + "compilerOptions": { + "outDir": "dist", + "lib": ["ES2022", "DOM"], + "types":
["vite/client"] + }, + "include": [ + "src/**/*.ts", + "test/**/*.ts", + "scripts/**/*.ts", + "prisma/**/*.ts", + "prisma-next.config.ts" + ], + "exclude": ["dist"] +} diff --git a/examples/prisma-next-demo-sqlite/tsup.config.ts b/examples/prisma-next-demo-sqlite/tsup.config.ts new file mode 100644 index 0000000000..9b98bf47db --- /dev/null +++ b/examples/prisma-next-demo-sqlite/tsup.config.ts @@ -0,0 +1,10 @@ +import { defineConfig } from 'tsup'; + +export default defineConfig({ + entry: [], + format: ['esm'], + sourcemap: true, + clean: true, + target: 'es2022', + minify: false, +}); diff --git a/examples/prisma-next-demo-sqlite/vite.config.ts b/examples/prisma-next-demo-sqlite/vite.config.ts new file mode 100644 index 0000000000..d35ea0e764 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/vite.config.ts @@ -0,0 +1,6 @@ +import { prismaVitePlugin } from '@prisma-next/vite-plugin-contract-emit'; +import { defineConfig } from 'vite'; + +export default defineConfig({ + plugins: [prismaVitePlugin()], +}); diff --git a/examples/prisma-next-demo-sqlite/vitest.config.ts b/examples/prisma-next-demo-sqlite/vitest.config.ts new file mode 100644 index 0000000000..5c0ed4e148 --- /dev/null +++ b/examples/prisma-next-demo-sqlite/vitest.config.ts @@ -0,0 +1,13 @@ +import { timeouts } from '@prisma-next/test-utils'; +import { defineConfig } from 'vitest/config'; + +export default defineConfig({ + test: { + environment: 'node', + pool: 'threads', + maxWorkers: 1, + isolate: false, + testTimeout: timeouts.default, + hookTimeout: timeouts.default, + }, +}); diff --git a/packages/1-framework/3-tooling/cli/bin/prisma-next.js b/packages/1-framework/3-tooling/cli/bin/prisma-next.js new file mode 100755 index 0000000000..aeb2f812db --- /dev/null +++ b/packages/1-framework/3-tooling/cli/bin/prisma-next.js @@ -0,0 +1,17 @@ +#!/usr/bin/env node +import { existsSync } from 'node:fs'; +import { dirname, resolve } from 'node:path'; +import { fileURLToPath, pathToFileURL } from 'node:url'; 
+ +const __dirname = dirname(fileURLToPath(import.meta.url)); +const entrypoint = resolve(__dirname, '../dist/cli.js'); + +if (!existsSync(entrypoint)) { + // eslint-disable-next-line no-console + console.error( + '[prisma-next] CLI is not built. Run `pnpm -C packages/1-framework/3-tooling/cli build` (or `pnpm build`).', + ); + process.exit(1); +} + +await import(pathToFileURL(entrypoint).href); diff --git a/packages/1-framework/3-tooling/cli/package.json b/packages/1-framework/3-tooling/cli/package.json index d8c557fc85..ce39a54288 100644 --- a/packages/1-framework/3-tooling/cli/package.json +++ b/packages/1-framework/3-tooling/cli/package.json @@ -4,6 +4,7 @@ "type": "module", "sideEffects": false, "files": [ + "bin", "dist", "src" ], @@ -18,7 +19,7 @@ "clean": "rm -rf dist dist-tsc dist-tsc-prod coverage .tmp-output" }, "bin": { - "prisma-next": "./dist/cli.js" + "prisma-next": "./bin/prisma-next.js" }, "dependencies": { "@prisma-next/contract": "workspace:*", diff --git a/packages/2-sql/3-tooling/family/src/core/control-instance.ts b/packages/2-sql/3-tooling/family/src/core/control-instance.ts index b7f0d24f0d..daafdf8d2b 100644 --- a/packages/2-sql/3-tooling/family/src/core/control-instance.ts +++ b/packages/2-sql/3-tooling/family/src/core/control-instance.ts @@ -21,11 +21,6 @@ import type { SqlContract, SqlStorage } from '@prisma-next/sql-contract/types'; import { sqlTargetFamilyHook } from '@prisma-next/sql-contract-emitter'; import { validateContract } from '@prisma-next/sql-contract-ts/contract'; import type { SqlOperationSignature } from '@prisma-next/sql-operations'; -import { - ensureSchemaStatement, - ensureTableStatement, - writeContractMarker, -} from '@prisma-next/sql-runtime'; import type { SqlSchemaIR, SqlTableIR } from '@prisma-next/sql-schema-ir/types'; import { ifDefined } from '@prisma-next/utils/defined'; import { @@ -645,9 +640,10 @@ export function createSqlFamilyInstance( : contractCoreHash; const contractTarget = contract.target; - // 
Ensure marker schema and table exist - await driver.query(ensureSchemaStatement.sql, ensureSchemaStatement.params); - await driver.query(ensureTableStatement.sql, ensureTableStatement.params); + // Ensure marker tables exist (target-specific per ADR 021). + for (const stmt of ensureMarkerInfrastructureStatements(contractTarget)) { + await driver.query(stmt.sql, stmt.params); + } // Read existing marker const existingMarker = await readMarker(driver); @@ -659,7 +655,7 @@ if (!existingMarker) { // No marker exists - insert new one - const write = writeContractMarker({ + const write = writeMarkerStatements(contractTarget, { coreHash: contractCoreHash, profileHash: contractProfileHash, contractJson: contractIR, @@ -682,7 +678,7 @@ coreHash: existingCoreHash, profileHash: existingProfileHash, }; - const write = writeContractMarker({ + const write = writeMarkerStatements(contractTarget, { coreHash: contractCoreHash, profileHash: contractProfileHash, contractJson: contractIR, @@ -911,3 +907,161 @@ export function createSqlFamilyInstance( }, }; } + +// ============================================================================ +// Marker SQL (ADR 021) +// ============================================================================ + +type SqlStatement = { readonly sql: string; readonly params: readonly unknown[] }; + +type WriteMarkerInput = { + readonly coreHash: string; + readonly profileHash: string; + readonly contractJson?: unknown; + readonly canonicalVersion?: number | null; + readonly appTag?: string | null; + readonly meta?: Record<string, unknown>; +}; + +function ensureMarkerInfrastructureStatements(targetId: string): readonly SqlStatement[] { + if (targetId === 'sqlite') { + return [ + { + sql: `create table if not exists prisma_contract_marker ( + id integer primary key, + core_hash text not null, + profile_hash text not null, + contract_json text, + canonical_version integer, + updated_at text 
not null default (CURRENT_TIMESTAMP), + app_tag text, + meta text not null default '{}' + )`, + params: [], + }, + ]; + } + + // Default: Postgres schema + table. + return [ + { sql: 'create schema if not exists prisma_contract', params: [] }, + { + sql: `create table if not exists prisma_contract.marker ( + id smallint primary key default 1, + core_hash text not null, + profile_hash text not null, + contract_json jsonb, + canonical_version int, + updated_at timestamptz not null default now(), + app_tag text, + meta jsonb not null default '{}' + )`, + params: [], + }, + ]; +} + +function writeMarkerStatements( + targetId: string, + input: WriteMarkerInput, +): { readonly insert: SqlStatement; readonly update: SqlStatement } { + if (targetId === 'sqlite') { + const params: readonly unknown[] = [ + 1, + input.coreHash, + input.profileHash, + jsonParam(input.contractJson), + input.canonicalVersion ?? null, + input.appTag ?? null, + jsonParam(input.meta ?? {}), + ]; + + return { + insert: { + sql: `insert into prisma_contract_marker ( + id, + core_hash, + profile_hash, + contract_json, + canonical_version, + updated_at, + app_tag, + meta + ) values ( + ?1, + ?2, + ?3, + ?4, + ?5, + CURRENT_TIMESTAMP, + ?6, + ?7 + )`, + params, + }, + update: { + sql: `update prisma_contract_marker set + core_hash = ?2, + profile_hash = ?3, + contract_json = ?4, + canonical_version = ?5, + updated_at = CURRENT_TIMESTAMP, + app_tag = ?6, + meta = ?7 + where id = ?1`, + params, + }, + }; + } + + const params: readonly unknown[] = [ + 1, + input.coreHash, + input.profileHash, + jsonParam(input.contractJson), + input.canonicalVersion ?? null, + input.appTag ?? null, + jsonParam(input.meta ?? 
{}), + ]; + + return { + insert: { + sql: `insert into prisma_contract.marker ( + id, + core_hash, + profile_hash, + contract_json, + canonical_version, + updated_at, + app_tag, + meta + ) values ( + $1, + $2, + $3, + $4::jsonb, + $5, + now(), + $6, + $7::jsonb + )`, + params, + }, + update: { + sql: `update prisma_contract.marker set + core_hash = $2, + profile_hash = $3, + contract_json = $4::jsonb, + canonical_version = $5, + updated_at = now(), + app_tag = $6, + meta = $7::jsonb + where id = $1`, + params, + }, + }; +} + +function jsonParam(value: unknown): string { + return JSON.stringify(value ?? null); +} diff --git a/packages/2-sql/3-tooling/family/src/core/verify.ts b/packages/2-sql/3-tooling/family/src/core/verify.ts index fda12f4036..f392307f0c 100644 --- a/packages/2-sql/3-tooling/family/src/core/verify.ts +++ b/packages/2-sql/3-tooling/family/src/core/verify.ts @@ -102,6 +102,30 @@ export function readMarkerSql(): { readonly sql: string; readonly params: readon }; } +export function readMarkerSqlForTarget(targetId: string): { + readonly sql: string; + readonly params: readonly unknown[]; +} { + if (targetId === 'sqlite') { + return { + sql: `select + core_hash, + profile_hash, + contract_json, + canonical_version, + updated_at, + app_tag, + meta + from prisma_contract_marker + where id = ?1`, + params: [1], + }; + } + + // Default: Postgres marker location per ADR 021. + return readMarkerSql(); +} + /** * Reads the contract marker from the database using the provided driver. * Returns the parsed marker record or null if no marker is found. 
@@ -113,7 +137,7 @@ export function readMarkerSql(): { readonly sql: string; readonly params: readon export async function readMarker( driver: ControlDriverInstance<'sql', string>, ): Promise { - const markerStatement = readMarkerSql(); + const markerStatement = readMarkerSqlForTarget(driver.targetId); try { const queryResult = await driver.query<{ @@ -142,7 +166,9 @@ export async function readMarker( // PostgreSQL error code 42P01 = undefined_table if ( error instanceof Error && - (error.message.includes('does not exist') || (error as { code?: string }).code === '42P01') + (error.message.includes('does not exist') || + error.message.includes('no such table') || + (error as { code?: string }).code === '42P01') ) { return null; } diff --git a/packages/2-sql/4-lanes/relational-core/src/ast/codec-types.ts b/packages/2-sql/4-lanes/relational-core/src/ast/codec-types.ts index 896ac027c6..ff1e72c3b5 100644 --- a/packages/2-sql/4-lanes/relational-core/src/ast/codec-types.ts +++ b/packages/2-sql/4-lanes/relational-core/src/ast/codec-types.ts @@ -10,6 +10,9 @@ export interface CodecMeta { readonly postgres?: { readonly nativeType: string; // e.g. 'integer', 'text', 'vector', 'timestamp with time zone' }; + readonly sqlite?: { + readonly nativeType: string; // e.g. 
'integer', 'text', 'real' + }; + }; + }; } diff --git a/packages/2-sql/5-runtime/src/sql-family-adapter.ts b/packages/2-sql/5-runtime/src/sql-family-adapter.ts index 1ff1dbb1cb..604990924f 100644 --- a/packages/2-sql/5-runtime/src/sql-family-adapter.ts +++ b/packages/2-sql/5-runtime/src/sql-family-adapter.ts @@ -8,8 +8,17 @@ import { runtimeError } from '@prisma-next/runtime-executor'; import type { SqlContract, SqlStorage } from '@prisma-next/sql-contract/types'; import { readContractMarker } from './sql-marker'; +type MarkerReaderStatementProvider = { + markerReaderStatement?: () => { readonly sql: string; readonly params: readonly unknown[] }; +}; + class SqlMarkerReader implements MarkerReader { + constructor(private readonly provider?: MarkerReaderStatementProvider) {} + readMarkerStatement(): MarkerStatement { + if (this.provider?.markerReaderStatement) { + return this.provider.markerReaderStatement(); + } return readContractMarker(); } } @@ -20,9 +29,9 @@ export class SqlFamilyAdapter> readonly contract: TContract; readonly markerReader: MarkerReader; - constructor(contract: TContract) { + constructor(contract: TContract, markerProvider?: MarkerReaderStatementProvider) { this.contract = contract; - this.markerReader = new SqlMarkerReader(); + this.markerReader = new SqlMarkerReader(markerProvider); } validatePlan(plan: ExecutionPlan, contract: TContract): void { diff --git a/packages/2-sql/5-runtime/src/sql-runtime.ts b/packages/2-sql/5-runtime/src/sql-runtime.ts index 606cf4056a..66111ff81d 100644 --- a/packages/2-sql/5-runtime/src/sql-runtime.ts +++ b/packages/2-sql/5-runtime/src/sql-runtime.ts @@ -124,7 +124,7 @@ class SqlRuntimeImpl = SqlContract new PostgresQueryCompiler(), }; } + if (config.contract.target === 'sqlite') { + return { + createAdapter: () => new SqliteAdapter(), + createDriver: () => new KyselyPrismaDriver(config), + createIntrospector: (db: Kysely<unknown>) => new SqliteIntrospector(db), + createQueryCompiler: () => new SqliteQueryCompiler(), + 
} throw new Error(`Unsupported database target: ${config.contract.target}`); } diff --git a/packages/3-extensions/sqlite-vector/README.md b/packages/3-extensions/sqlite-vector/README.md new file mode 100644 index 0000000000..7fc25d2828 --- /dev/null +++ b/packages/3-extensions/sqlite-vector/README.md @@ -0,0 +1,20 @@ +# @prisma-next/extension-sqlite-vector + +SQLite vector extension pack for Prisma Next. + +## Overview + +This extension pack adds a `sqlite/vector@1` codec (stored as JSON text) and a `cosineDistance()` operation that lowers to a pure SQL expression (JSON1 + math functions). + +## Notes + +- SQLite does not ship a native vector type in this repo. Vectors are stored as `TEXT` containing JSON arrays. +- The cosine distance lowering requires: + - JSON1 functions (`json_each`, `json_object`, etc.) + - math functions (`sqrt`, etc.) + +## Entrypoints + +- `@prisma-next/extension-sqlite-vector/pack` for contract authoring +- `@prisma-next/extension-sqlite-vector/control` for `prisma-next.config.ts` +- `@prisma-next/extension-sqlite-vector/runtime` for execution stacks diff --git a/packages/3-extensions/sqlite-vector/biome.jsonc b/packages/3-extensions/sqlite-vector/biome.jsonc new file mode 100644 index 0000000000..b8994a7330 --- /dev/null +++ b/packages/3-extensions/sqlite-vector/biome.jsonc @@ -0,0 +1,4 @@ +{ + "$schema": "https://biomejs.dev/schemas/2.3.11/schema.json", + "extends": "//" +} diff --git a/packages/3-extensions/sqlite-vector/package.json b/packages/3-extensions/sqlite-vector/package.json new file mode 100644 index 0000000000..ce72be357e --- /dev/null +++ b/packages/3-extensions/sqlite-vector/package.json @@ -0,0 +1,63 @@ +{ + "name": "@prisma-next/extension-sqlite-vector", + "version": "0.0.1", + "type": "module", + "sideEffects": false, + "scripts": { + "build": "tsup --config tsup.config.ts && tsc --project tsconfig.build.json", + "test": "vitest run", + "test:coverage": "vitest run --coverage", + "typecheck": "tsc --project tsconfig.json 
--noEmit", + "lint": "biome check . --error-on-warnings", + "lint:fix": "biome check --write .", + "lint:fix:unsafe": "biome check --write --unsafe .", + "clean": "rm -rf dist dist-tsc dist-tsc-prod coverage .tmp-output" + }, + "dependencies": { + "@prisma-next/contract": "workspace:*", + "@prisma-next/contract-authoring": "workspace:*", + "@prisma-next/family-sql": "workspace:*", + "@prisma-next/sql-operations": "workspace:*", + "@prisma-next/sql-relational-core": "workspace:*", + "@prisma-next/sql-runtime": "workspace:*" + }, + "devDependencies": { + "@prisma-next/operations": "workspace:*", + "@prisma-next/test-utils": "workspace:*", + "@prisma-next/tsconfig": "workspace:*", + "tsup": "catalog:", + "typescript": "catalog:", + "vitest": "catalog:" + }, + "files": [ + "dist", + "src" + ], + "exports": { + "./package.json": "./package.json", + "./control": { + "types": "./dist/exports/control.d.ts", + "import": "./dist/exports/control.js" + }, + "./runtime": { + "types": "./dist/exports/runtime.d.ts", + "import": "./dist/exports/runtime.js" + }, + "./pack": { + "types": "./dist/exports/pack.d.ts", + "import": "./dist/exports/pack.js" + }, + "./codec-types": { + "types": "./dist/exports/codec-types.d.ts", + "import": "./dist/exports/codec-types.js" + }, + "./column-types": { + "types": "./dist/exports/column-types.d.ts", + "import": "./dist/exports/column-types.js" + }, + "./operation-types": { + "types": "./dist/exports/operation-types.d.ts", + "import": "./dist/exports/operation-types.js" + } + } +} diff --git a/packages/3-extensions/sqlite-vector/src/core/codecs.ts b/packages/3-extensions/sqlite-vector/src/core/codecs.ts new file mode 100644 index 0000000000..63f5030da3 --- /dev/null +++ b/packages/3-extensions/sqlite-vector/src/core/codecs.ts @@ -0,0 +1,53 @@ +/** + * Vector codec implementation for sqlite-vector extension. + * + * Stores vectors as JSON text, e.g. `[1,2,3]`. 
+ */ + +import { codec, defineCodecs } from '@prisma-next/sql-relational-core/ast'; + +const sqliteVectorCodec = codec<'sqlite/vector@1', string, number[]>({ + typeId: 'sqlite/vector@1', + targetTypes: ['text'], + encode: (value: number[]): string => { + if (!Array.isArray(value)) { + throw new Error('Vector value must be an array of numbers'); + } + if (!value.every((v) => typeof v === 'number')) { + throw new Error('Vector value must contain only numbers'); + } + return JSON.stringify(value); + }, + decode: (wire: string): number[] => { + if (typeof wire !== 'string') { + throw new Error('Vector wire value must be a string'); + } + const parsed = JSON.parse(wire) as unknown; + if (!Array.isArray(parsed)) { + throw new Error('Vector wire value must be a JSON array'); + } + if (!parsed.every((v) => typeof v === 'number')) { + throw new Error('Vector wire value must contain only numbers'); + } + return parsed as number[]; + }, + meta: { + db: { + sql: { + sqlite: { + nativeType: 'text', + }, + }, + }, + }, +}); + +// Build codec definitions using the builder DSL +const codecs = defineCodecs().add('vector', sqliteVectorCodec); + +// Export derived structures directly from codecs builder +export const codecDefinitions = codecs.codecDefinitions; +export const dataTypes = codecs.dataTypes; + +// Export types derived from codecs builder +export type CodecTypes = typeof codecs.CodecTypes; diff --git a/packages/3-extensions/sqlite-vector/src/core/constants.ts b/packages/3-extensions/sqlite-vector/src/core/constants.ts new file mode 100644 index 0000000000..b176a1108e --- /dev/null +++ b/packages/3-extensions/sqlite-vector/src/core/constants.ts @@ -0,0 +1,4 @@ +/** + * Codec ID for sqlite-vector's vector type. 
+ */ +export const VECTOR_CODEC_ID = 'sqlite/vector@1' as const; diff --git a/packages/3-extensions/sqlite-vector/src/core/descriptor-meta.ts b/packages/3-extensions/sqlite-vector/src/core/descriptor-meta.ts new file mode 100644 index 0000000000..6ccc52d5a2 --- /dev/null +++ b/packages/3-extensions/sqlite-vector/src/core/descriptor-meta.ts @@ -0,0 +1,89 @@ +import type { ExtensionPackRef } from '@prisma-next/contract/framework-components'; +import type { SqlOperationSignature } from '@prisma-next/sql-operations'; + +const sqliteVectorTypeId = 'sqlite/vector@1' as const; + +const cosineLowering = { + targetFamily: 'sql', + strategy: 'function', + // Pure SQL implementation to avoid requiring a custom SQLite UDF. + // + // This assumes both vectors are JSON arrays (stored as TEXT) and uses JSON1 + math functions. + template: `( + SELECT + CASE + WHEN denom IS NULL OR denom = 0 THEN NULL + ELSE 1.0 - (dot / denom) + END + FROM ( + SELECT + SUM(CAST(a.value AS REAL) * CAST(b.value AS REAL)) AS dot, + (SQRT(SUM(CAST(a.value AS REAL) * CAST(a.value AS REAL))) * + SQRT(SUM(CAST(b.value AS REAL) * CAST(b.value AS REAL)))) AS denom + FROM json_each({{self}}) AS a + JOIN json_each({{arg0}}) AS b + ON a.key = b.key + ) + )`, +} as const; + +/** + * Shared operation definition used by both pack metadata and runtime descriptor. + * Frozen to prevent accidental mutation. 
+ */ +const cosineDistanceOperation = Object.freeze({ + method: 'cosineDistance', + args: [{ kind: 'param' }], + returns: { kind: 'builtin', type: 'number' }, + lowering: cosineLowering, +} as const); + +export const sqliteVectorPackMeta = { + kind: 'extension', + id: 'sqlitevector', + familyId: 'sql', + targetId: 'sqlite', + version: '0.0.1', + capabilities: { + sqlite: { + 'sqlitevector/cosine': true, + }, + }, + types: { + codecTypes: { + import: { + package: '@prisma-next/extension-sqlite-vector/codec-types', + named: 'CodecTypes', + alias: 'SqliteVectorTypes', + }, + typeImports: [ + { + package: '@prisma-next/extension-sqlite-vector/codec-types', + named: 'Vector', + alias: 'Vector', + }, + ], + }, + operationTypes: { + import: { + package: '@prisma-next/extension-sqlite-vector/operation-types', + named: 'OperationTypes', + alias: 'SqliteVectorOperationTypes', + }, + }, + storage: [ + { typeId: sqliteVectorTypeId, familyId: 'sql', targetId: 'sqlite', nativeType: 'text' }, + ], + }, + operations: [ + { + for: sqliteVectorTypeId, + ...cosineDistanceOperation, + }, + ], +} as const satisfies ExtensionPackRef<'sql', 'sqlite'>; + +export const sqliteVectorRuntimeOperation: SqlOperationSignature = { + forTypeId: sqliteVectorTypeId, + ...cosineDistanceOperation, +}; diff --git a/packages/3-extensions/sqlite-vector/src/exports/codec-types.ts b/packages/3-extensions/sqlite-vector/src/exports/codec-types.ts new file mode 100644 index 0000000000..b08b19d2da --- /dev/null +++ b/packages/3-extensions/sqlite-vector/src/exports/codec-types.ts @@ -0,0 +1,7 @@ +/** + * Codec type definitions for sqlite-vector extension. + * + * Re-export from types module for public API. 
+ */ + +export type { CodecTypes, Vector } from '../types/codec-types'; diff --git a/packages/3-extensions/sqlite-vector/src/exports/column-types.ts b/packages/3-extensions/sqlite-vector/src/exports/column-types.ts new file mode 100644 index 0000000000..86eec063f3 --- /dev/null +++ b/packages/3-extensions/sqlite-vector/src/exports/column-types.ts @@ -0,0 +1,19 @@ +/** + * Column type descriptors for sqlite-vector extension. + * + * These descriptors provide both codecId and nativeType for use in contract authoring. + * They are derived from the same source of truth as codec definitions and manifests. + */ + +import type { ColumnTypeDescriptor } from '@prisma-next/contract-authoring'; +import { VECTOR_CODEC_ID } from '../core/constants'; + +/** + * Static vector column descriptor without dimension. + * + * SQLite stores vectors as JSON text. + */ +export const vectorColumn = { + codecId: VECTOR_CODEC_ID, + nativeType: 'text', +} as const satisfies ColumnTypeDescriptor; diff --git a/packages/3-extensions/sqlite-vector/src/exports/control.ts b/packages/3-extensions/sqlite-vector/src/exports/control.ts new file mode 100644 index 0000000000..c4e219fb18 --- /dev/null +++ b/packages/3-extensions/sqlite-vector/src/exports/control.ts @@ -0,0 +1,20 @@ +import type { SqlControlExtensionDescriptor } from '@prisma-next/family-sql/control'; +import { sqliteVectorPackMeta } from '../core/descriptor-meta'; + +/** + * sqlite-vector extension descriptor for CLI config. + * + * Note: SQLite "extensions" in Prisma Next are purely logical packs. This pack + * lowers cosine distance to pure SQL (JSON1 + math functions), so no database-side + * extension install or JS UDF registration is required. 
+ */ +const sqliteVectorExtensionDescriptor: SqlControlExtensionDescriptor<'sqlite'> = { + ...sqliteVectorPackMeta, + create: () => ({ + familyId: 'sql' as const, + targetId: 'sqlite' as const, + }), +}; + +export { sqliteVectorExtensionDescriptor }; +export default sqliteVectorExtensionDescriptor; diff --git a/packages/3-extensions/sqlite-vector/src/exports/operation-types.ts b/packages/3-extensions/sqlite-vector/src/exports/operation-types.ts new file mode 100644 index 0000000000..987071859f --- /dev/null +++ b/packages/3-extensions/sqlite-vector/src/exports/operation-types.ts @@ -0,0 +1,7 @@ +/** + * Operation type definitions for sqlite-vector extension. + * + * Re-export from types module for public API. + */ + +export type { OperationTypes } from '../types/operation-types'; diff --git a/packages/3-extensions/sqlite-vector/src/exports/pack.ts b/packages/3-extensions/sqlite-vector/src/exports/pack.ts new file mode 100644 index 0000000000..3f44a33877 --- /dev/null +++ b/packages/3-extensions/sqlite-vector/src/exports/pack.ts @@ -0,0 +1,5 @@ +import { sqliteVectorPackMeta } from '../core/descriptor-meta'; + +const sqliteVectorPack = sqliteVectorPackMeta; + +export default sqliteVectorPack; diff --git a/packages/3-extensions/sqlite-vector/src/exports/runtime.ts b/packages/3-extensions/sqlite-vector/src/exports/runtime.ts new file mode 100644 index 0000000000..6912b31fcb --- /dev/null +++ b/packages/3-extensions/sqlite-vector/src/exports/runtime.ts @@ -0,0 +1,48 @@ +import type { SqlOperationSignature } from '@prisma-next/sql-operations'; +import type { CodecRegistry } from '@prisma-next/sql-relational-core/ast'; +import { createCodecRegistry } from '@prisma-next/sql-relational-core/ast'; +import type { + SqlRuntimeExtensionDescriptor, + SqlRuntimeExtensionInstance, +} from '@prisma-next/sql-runtime'; +import { codecDefinitions } from '../core/codecs'; +import { sqliteVectorPackMeta, sqliteVectorRuntimeOperation } from '../core/descriptor-meta'; + +/** + * 
sqlite-vector SQL runtime extension instance. + * Provides codecs and operations for vector data type and similarity operations. + */ +class SqliteVectorRuntimeExtensionInstance implements SqlRuntimeExtensionInstance<'sqlite'> { + readonly familyId = 'sql' as const; + readonly targetId = 'sqlite' as const; + + codecs(): CodecRegistry { + const registry = createCodecRegistry(); + // Register all codecs from codecDefinitions + for (const def of Object.values(codecDefinitions)) { + registry.register(def.codec); + } + return registry; + } + + operations(): ReadonlyArray<SqlOperationSignature> { + return [sqliteVectorRuntimeOperation]; + } +} + +/** + * sqlite-vector SQL runtime extension descriptor. + * Provides metadata and factory for creating runtime extension instances. + */ +const sqliteVectorRuntimeDescriptor: SqlRuntimeExtensionDescriptor<'sqlite'> = { + kind: 'extension' as const, + id: sqliteVectorPackMeta.id, + version: sqliteVectorPackMeta.version, + familyId: 'sql' as const, + targetId: 'sqlite' as const, + create(): SqlRuntimeExtensionInstance<'sqlite'> { + return new SqliteVectorRuntimeExtensionInstance(); + }, +}; + +export default sqliteVectorRuntimeDescriptor; diff --git a/packages/3-extensions/sqlite-vector/src/types/codec-types.ts b/packages/3-extensions/sqlite-vector/src/types/codec-types.ts new file mode 100644 index 0000000000..9ac56f6e3c --- /dev/null +++ b/packages/3-extensions/sqlite-vector/src/types/codec-types.ts @@ -0,0 +1,25 @@ +/** + * Codec type definitions for sqlite-vector extension. + * + * This file exports type-only definitions for codec input/output types. + * These types are imported by contract.d.ts files for compile-time type inference. + * + * Runtime codec implementations are provided by the extension's codec registry. + */ + +import type { CodecTypes as CoreCodecTypes } from '../core/codecs'; + +/** + * Type-level branded vector. 
+ * + * The runtime values are plain number arrays, but parameterized column typing can + * carry the dimension at the type level (e.g. Vector<1536>). + */ +export type Vector<N extends number = number> = number[] & { readonly __vectorLength?: N }; + +/** + * Codec types for sqlite-vector. + * + * - Scalar output remains `number[]` (runtime representation). + */ +export type CodecTypes = CoreCodecTypes; diff --git a/packages/3-extensions/sqlite-vector/src/types/operation-types.ts b/packages/3-extensions/sqlite-vector/src/types/operation-types.ts new file mode 100644 index 0000000000..f0bcc1f1b2 --- /dev/null +++ b/packages/3-extensions/sqlite-vector/src/types/operation-types.ts @@ -0,0 +1,31 @@ +/** + * Operation type definitions for sqlite-vector extension. + * + * This file exports type-only definitions for operation method signatures. + * These types are imported by contract.d.ts files for compile-time type inference. + */ + +/** + * Operation types for sqlite-vector extension. + * Maps typeId to operation methods. 
+ */ +export type OperationTypes = { + readonly 'sqlite/vector@1': { + readonly cosineDistance: { + readonly args: readonly [ + { + readonly kind: 'param'; + }, + ]; + readonly returns: { + readonly kind: 'builtin'; + readonly type: 'number'; + }; + readonly lowering: { + readonly targetFamily: 'sql'; + readonly strategy: 'function'; + readonly template: string; + }; + }; + }; +}; diff --git a/packages/3-extensions/sqlite-vector/test/codecs.test.ts b/packages/3-extensions/sqlite-vector/test/codecs.test.ts new file mode 100644 index 0000000000..eda45385cf --- /dev/null +++ b/packages/3-extensions/sqlite-vector/test/codecs.test.ts @@ -0,0 +1,81 @@ +import { describe, expect, it } from 'vitest'; +import { codecDefinitions } from '../src/core/codecs'; + +describe('sqlite-vector codecs', () => { + it('has vector codec registered', () => { + const vectorDef = codecDefinitions.vector; + expect(vectorDef).toBeDefined(); + expect(vectorDef.typeId).toBe('sqlite/vector@1'); + expect(vectorDef.codec.targetTypes).toEqual(['text']); + }); + + it('encodes number array to JSON', () => { + const vectorCodec = codecDefinitions.vector.codec; + + const value = [0.1, 0.2, 0.3, 0.4]; + const encoded = vectorCodec.encode!(value); + expect(encoded).toBe('[0.1,0.2,0.3,0.4]'); + expect(typeof encoded).toBe('string'); + }); + + it('decodes JSON', () => { + const vectorCodec = codecDefinitions.vector.codec; + + const wire = '[0.1,0.2,0.3,0.4]'; + const decoded = vectorCodec.decode(wire); + expect(decoded).toEqual([0.1, 0.2, 0.3, 0.4]); + }); + + it('round-trip encode/decode preserves values', () => { + const vectorCodec = codecDefinitions.vector.codec; + + const original = [0.1, 0.2, 0.3, 0.4, 0.5]; + const encoded = vectorCodec.encode!(original); + expect(typeof encoded).toBe('string'); + expect(encoded).toBe('[0.1,0.2,0.3,0.4,0.5]'); + const decoded = vectorCodec.decode(encoded); + expect(decoded).toEqual(original); + }); + + it('handles empty vector', () => { + const vectorCodec = 
codecDefinitions.vector.codec; + + const original: number[] = []; + const encoded = vectorCodec.encode!(original); + expect(encoded).toBe('[]'); + const decoded = vectorCodec.decode(encoded); + expect(decoded).toEqual([]); + }); + + it('throws error when encoding non-array', () => { + const vectorCodec = codecDefinitions.vector.codec; + + expect(() => { + vectorCodec.encode!('not an array' as unknown as number[]); + }).toThrow('Vector value must be an array of numbers'); + }); + + it('throws error when encoding array with non-numbers', () => { + const vectorCodec = codecDefinitions.vector.codec; + + expect(() => { + vectorCodec.encode!([1, 2, 'three'] as unknown as number[]); + }).toThrow('Vector value must contain only numbers'); + }); + + it('throws error when decoding invalid JSON', () => { + const vectorCodec = codecDefinitions.vector.codec; + + expect(() => { + vectorCodec.decode('not json'); + }).toThrow(); + }); + + it('throws error when decoding non-string', () => { + const vectorCodec = codecDefinitions.vector.codec; + + expect(() => { + vectorCodec.decode(123 as unknown as string); + }).toThrow('Vector wire value must be a string'); + }); +}); diff --git a/packages/3-extensions/sqlite-vector/test/column-types.test.ts b/packages/3-extensions/sqlite-vector/test/column-types.test.ts new file mode 100644 index 0000000000..5bd1bca255 --- /dev/null +++ b/packages/3-extensions/sqlite-vector/test/column-types.test.ts @@ -0,0 +1,15 @@ +import { describe, expect, it } from 'vitest'; +import { vectorColumn } from '../src/exports/column-types'; + +describe('sqlite-vector column-types', () => { + it('vectorColumn has correct codecId and nativeType', () => { + expect(vectorColumn).toMatchObject({ + codecId: 'sqlite/vector@1', + nativeType: 'text', + }); + }); + + it('vectorColumn has no typeParams', () => { + expect(vectorColumn).not.toHaveProperty('typeParams'); + }); +}); diff --git a/packages/3-extensions/sqlite-vector/test/manifest.test.ts 
b/packages/3-extensions/sqlite-vector/test/manifest.test.ts new file mode 100644 index 0000000000..1160b08fc8 --- /dev/null +++ b/packages/3-extensions/sqlite-vector/test/manifest.test.ts @@ -0,0 +1,64 @@ +import { describe, expect, it } from 'vitest'; +import { sqliteVectorExtensionDescriptor } from '../src/exports/control'; + +describe('sqlite-vector descriptor', () => { + it('has correct metadata', () => { + expect(sqliteVectorExtensionDescriptor.id).toBe('sqlitevector'); + expect(sqliteVectorExtensionDescriptor.version).toBe('0.0.1'); + expect(sqliteVectorExtensionDescriptor.familyId).toBe('sql'); + expect(sqliteVectorExtensionDescriptor.targetId).toBe('sqlite'); + const sqliteCapabilities = sqliteVectorExtensionDescriptor.capabilities?.['sqlite'] as + | Record<string, unknown> + | undefined; + expect(sqliteCapabilities?.['sqlitevector/cosine']).toBe(true); + }); + + it('has codec types import', () => { + expect(sqliteVectorExtensionDescriptor.types?.codecTypes?.import).toEqual({ + package: '@prisma-next/extension-sqlite-vector/codec-types', + named: 'CodecTypes', + alias: 'SqliteVectorTypes', + }); + }); + + it('has operation types import', () => { + expect(sqliteVectorExtensionDescriptor.types?.operationTypes?.import).toEqual({ + package: '@prisma-next/extension-sqlite-vector/operation-types', + named: 'OperationTypes', + alias: 'SqliteVectorOperationTypes', + }); + }); + + it('has cosineDistance operation', () => { + const operations = sqliteVectorExtensionDescriptor.operations; + expect(operations).toBeDefined(); + expect(operations?.length).toBeGreaterThan(0); + + const cosineDistanceOp = operations?.find( + (op: { for: string; method: string }) => + op.for === 'sqlite/vector@1' && op.method === 'cosineDistance', + ); + + expect(cosineDistanceOp).toBeDefined(); + expect(cosineDistanceOp?.args).toEqual([{ kind: 'param' }]); + expect(cosineDistanceOp?.returns).toEqual({ kind: 'builtin', type: 'number' }); + expect(cosineDistanceOp?.lowering.targetFamily).toBe('sql'); + 
expect(cosineDistanceOp?.lowering.strategy).toBe('function'); + expect(cosineDistanceOp?.lowering.template).toContain('json_each'); + expect(cosineDistanceOp?.lowering.template).toContain('SQRT'); + }); + + it('codec types are importable', async () => { + // Verify the codec types module can be imported (type-only export) + // Type-only exports don't exist at runtime, so we just verify the import succeeds + await expect(import('../src/exports/codec-types')).resolves.toBeDefined(); + }); + + it('operation types are importable', async () => { + // Verify the operation types module can be imported (type-only export) + // Type-only exports don't exist at runtime, so we just verify the import succeeds + await expect(import('../src/exports/operation-types')).resolves.toBeDefined(); + }); + + // Note: sqlite-vector doesn't currently ship parameterized type renderers. +}); diff --git a/packages/3-extensions/sqlite-vector/test/operations.test.ts b/packages/3-extensions/sqlite-vector/test/operations.test.ts new file mode 100644 index 0000000000..ad94a34c95 --- /dev/null +++ b/packages/3-extensions/sqlite-vector/test/operations.test.ts @@ -0,0 +1,72 @@ +import { createOperationRegistry } from '@prisma-next/operations'; +import { createCodecRegistry } from '@prisma-next/sql-relational-core/ast'; +import { describe, expect, it } from 'vitest'; +import sqliteVectorDescriptor from '../src/exports/runtime'; + +describe('sqlite-vector operations', () => { + it('descriptor has correct metadata', () => { + expect(sqliteVectorDescriptor.kind).toBe('extension'); + expect(sqliteVectorDescriptor.id).toBe('sqlitevector'); + expect(sqliteVectorDescriptor.familyId).toBe('sql'); + expect(sqliteVectorDescriptor.targetId).toBe('sqlite'); + expect(sqliteVectorDescriptor.version).toBe('0.0.1'); + }); + + it('provides codec registry with vector codec', () => { + const extension = sqliteVectorDescriptor.create(); + const codecs = extension.codecs?.(); + expect(codecs).toBeDefined(); + + const 
vectorCodec = codecs?.get('sqlite/vector@1'); + expect(vectorCodec).toBeDefined(); + expect(vectorCodec?.id).toBe('sqlite/vector@1'); + }); + + it('provides operation signatures', () => { + const extension = sqliteVectorDescriptor.create(); + const operations = extension.operations?.(); + expect(operations).toBeDefined(); + expect(operations?.length).toBe(1); + + const cosineDistanceOp = operations?.[0]; + expect(cosineDistanceOp).toBeDefined(); + expect(cosineDistanceOp?.forTypeId).toBe('sqlite/vector@1'); + expect(cosineDistanceOp?.method).toBe('cosineDistance'); + expect(cosineDistanceOp?.args).toEqual([{ kind: 'param' }]); + expect(cosineDistanceOp?.returns).toEqual({ kind: 'builtin', type: 'number' }); + expect(cosineDistanceOp?.lowering.targetFamily).toBe('sql'); + expect(cosineDistanceOp?.lowering.strategy).toBe('function'); + expect(cosineDistanceOp?.lowering.template).toContain('json_each'); + expect(cosineDistanceOp?.lowering.template).toContain('SQRT'); + }); + + it('operations can be registered in operation registry', () => { + const extension = sqliteVectorDescriptor.create(); + const operations = extension.operations?.(); + expect(operations).toBeDefined(); + + const registry = createOperationRegistry(); + for (const op of operations ?? []) { + registry.register(op); + } + + const registeredOps = registry.byType('sqlite/vector@1'); + expect(registeredOps.length).toBe(1); + expect(registeredOps[0]?.method).toBe('cosineDistance'); + }); + + it('codecs can be registered in codec registry', () => { + const extension = sqliteVectorDescriptor.create(); + const codecs = extension.codecs?.(); + expect(codecs).toBeDefined(); + + const registry = createCodecRegistry(); + for (const codec of codecs?.values() ?? 
[]) { + registry.register(codec); + } + + const vectorCodec = registry.get('sqlite/vector@1'); + expect(vectorCodec).toBeDefined(); + expect(vectorCodec?.id).toBe('sqlite/vector@1'); + }); +}); diff --git a/packages/3-extensions/sqlite-vector/tsconfig.build.json b/packages/3-extensions/sqlite-vector/tsconfig.build.json new file mode 100644 index 0000000000..671541c1a3 --- /dev/null +++ b/packages/3-extensions/sqlite-vector/tsconfig.build.json @@ -0,0 +1,12 @@ +{ + "extends": "./tsconfig.json", + "compilerOptions": { + "rootDir": "src", + "outDir": "dist", + "declaration": true, + "declarationMap": true, + "emitDeclarationOnly": true + }, + "include": ["src/**/*.ts"], + "exclude": ["test", "dist"] +} diff --git a/packages/3-extensions/sqlite-vector/tsconfig.json b/packages/3-extensions/sqlite-vector/tsconfig.json new file mode 100644 index 0000000000..7afa587436 --- /dev/null +++ b/packages/3-extensions/sqlite-vector/tsconfig.json @@ -0,0 +1,9 @@ +{ + "extends": ["@prisma-next/tsconfig/base"], + "compilerOptions": { + "rootDir": ".", + "outDir": "dist" + }, + "include": ["src/**/*.ts", "test/**/*.ts"], + "exclude": ["dist"] +} diff --git a/packages/3-extensions/sqlite-vector/tsup.config.ts b/packages/3-extensions/sqlite-vector/tsup.config.ts new file mode 100644 index 0000000000..f3acfeb90a --- /dev/null +++ b/packages/3-extensions/sqlite-vector/tsup.config.ts @@ -0,0 +1,19 @@ +import { defineConfig } from 'tsup'; + +export default defineConfig({ + entry: { + 'exports/control': 'src/exports/control.ts', + 'exports/runtime': 'src/exports/runtime.ts', + 'exports/codec-types': 'src/exports/codec-types.ts', + 'exports/column-types': 'src/exports/column-types.ts', + 'exports/operation-types': 'src/exports/operation-types.ts', + 'exports/pack': 'src/exports/pack.ts', + }, + outDir: 'dist', + format: ['esm'], + sourcemap: true, + dts: false, + clean: true, + target: 'es2022', + minify: false, +}); diff --git a/packages/3-extensions/sqlite-vector/vitest.config.ts 
b/packages/3-extensions/sqlite-vector/vitest.config.ts new file mode 100644 index 0000000000..0d44e4c0fb --- /dev/null +++ b/packages/3-extensions/sqlite-vector/vitest.config.ts @@ -0,0 +1,31 @@ +import { timeouts } from '@prisma-next/test-utils'; +import { defineConfig } from 'vitest/config'; + +export default defineConfig({ + test: { + globals: true, + environment: 'node', + testTimeout: timeouts.default, + hookTimeout: timeouts.default, + coverage: { + provider: 'v8', + reporter: ['text', 'json', 'html'], + include: ['src/**/*.ts'], + exclude: [ + 'dist/**', + 'test/**', + '**/*.test.ts', + '**/*.test-d.ts', + '**/*.config.ts', + '**/exports/**', + '**/types.ts', + ], + thresholds: { + lines: 95, + branches: 90, + functions: 95, + statements: 95, + }, + }, + }, +}); diff --git a/packages/3-targets/3-targets/sqlite/README.md b/packages/3-targets/3-targets/sqlite/README.md new file mode 100644 index 0000000000..1eb91bbe64 --- /dev/null +++ b/packages/3-targets/3-targets/sqlite/README.md @@ -0,0 +1,160 @@ +# @prisma-next/target-sqlite + +SQLite target pack for Prisma Next. + +## Package Classification + +- **Domain**: targets +- **Layer**: targets +- **Plane**: multi-plane (migration, runtime) + +## Purpose + +Provides the SQLite target descriptor (`SqlControlTargetDescriptor`) for CLI config. The target descriptor includes capabilities and type information directly as properties, as well as factories for creating migration planners and runners. 
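The declarative part of that descriptor can be sketched directly from this change's `src/core/descriptor-meta.ts`; the object below mirrors it, with the `migrations` planner/runner factories deliberately omitted:

```typescript
// Declarative descriptor metadata, mirroring src/core/descriptor-meta.ts
// from this change. Only the plain-data fields are shown; the
// migrations.createPlanner/createRunner factories are elided.
const sqliteTargetDescriptorMeta = {
  kind: 'target',
  familyId: 'sql',
  targetId: 'sqlite',
  id: 'sqlite',
  version: '0.0.1',
  capabilities: {},
} as const;

// Because every declarative field lives on the descriptor itself,
// tooling can read it without touching the filesystem.
const { targetId, version } = sqliteTargetDescriptorMeta;
```

This is why no separate manifest file is needed: consumers destructure the fields straight off the imported descriptor.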
+ +## Responsibilities + +- **Target Descriptor Export**: Exports the SQLite `SqlControlTargetDescriptor` for use in CLI configuration files +- **Descriptor-First Design**: All declarative fields (version, capabilities, types, operations) are properties directly on the descriptor, eliminating the need for separate manifest files +- **Multi-Plane Support**: Provides both migration-plane (control) and runtime-plane entry points for the SQLite target +- **Planner Factory**: Implements `migrations.createPlanner()` to create SQLite-specific migration planners +- **Runner Factory**: Implements `migrations.createRunner()` to create SQLite-specific migration runners +- **Schema Verification Normalization**: Normalizes SQLite default expressions (for example, `CURRENT_TIMESTAMP`, quoted literals) when verifying the post-apply schema +- **SQLite-Only Contract Extensions**: Defines SQLite-specific column defaults used by the migration planner +- **Database Dependency Consumption**: The planner extracts database dependencies from the configured framework components (passed as `frameworkComponents`), verifies each dependency against the live schema, and only emits install operations when required. The runner reuses the same metadata for post-apply verification, so there are no hardcoded extension mappings; database dependencies stay component-owned. 
+- **Storage Type Planning**: The planner dispatches storage type hooks for `storage.types` and emits type operations before table creation when supported by the policy + +This package spans multiple planes: +- **Migration plane** (`src/exports/control.ts`): Control plane entry point that exports `SqlControlTargetDescriptor` for config files +- **Runtime plane** (`src/exports/runtime.ts`): Runtime entry point for target-specific runtime code (future) +- **Authoring pack ref** (`src/exports/pack.ts`): Pure data surface for contract builder workflows + +## `db init` + +This package provides the SQLite implementation of the SQL migration planner/runner used by `prisma-next db init`: + +- **Planner** (`src/core/migrations/planner.ts`): produces an additive-only `MigrationPlan` to bring the database schema in line with a destination contract. Extra unrelated schema is tolerated; non-additive mismatches (type/nullability/constraint incompatibilities) surface as structured conflicts. Storage type operations (from codec-owned hooks) are emitted before table operations when `storage.types` are present. +- **Runner** (`src/core/migrations/runner.ts`): executes a plan (serialized by SQLite's single-writer lock), verifies the post-state schema, then writes the contract marker and appends a ledger entry. SQLite has no schema namespaces, so the marker and ledger tables live directly in the database file rather than in a `prisma_contract` schema. + +For the CLI orchestration, see `packages/1-framework/3-tooling/cli/src/commands/db-init.ts`. 
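Each plan operation pairs its DDL with prechecks and postchecks. A rough sketch of the shape the planner in this change builds for a missing table (the check SQL strings here are illustrative stand-ins for the planner's `tableExistsCheck` helper, whose implementation is not shown in this diff):

```typescript
// Sketch of one additive plan operation, shaped like the objects built
// in src/core/migrations/planner.ts. The sqlite_master queries are
// illustrative; the real planner generates check SQL via helpers.
const createUserTableOp = {
  id: 'table.user',
  label: 'Create table user',
  summary: 'Creates table user with required columns and constraints',
  operationClass: 'additive',
  target: { id: 'sqlite', details: { objectType: 'table', name: 'user', table: 'user' } },
  precheck: [
    {
      description: 'ensure table "user" does not exist',
      sql: `SELECT count(*) = 0 FROM sqlite_master WHERE type = 'table' AND name = 'user'`,
    },
  ],
  execute: [
    {
      description: 'create table "user"',
      sql: 'CREATE TABLE "user" ("id" INTEGER NOT NULL, PRIMARY KEY ("id"))',
    },
  ],
  postcheck: [
    {
      description: 'verify table "user" exists',
      sql: `SELECT count(*) = 1 FROM sqlite_master WHERE type = 'table' AND name = 'user'`,
    },
  ],
} as const;
```

The runner can refuse to execute (precheck) or refuse to commit success (postcheck) without any dialect knowledge, since the checks travel inside the plan.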
+ +## Usage + +### Control Plane (CLI) + +```typescript +import sqlite from '@prisma-next/target-sqlite/control'; +import sqlFamilyDescriptor from '@prisma-next/family-sql/control'; +import sqliteAdapter from '@prisma-next/adapter-sqlite/control'; +import sqliteDriver from '@prisma-next/driver-sqlite/control'; + +// sqlite is a SqlControlTargetDescriptor with: +// - kind: 'target' +// - familyId: 'sql' +// - targetId: 'sqlite' +// - id: 'sqlite' +// - version: '0.0.1' +// - capabilities, types, operations (directly on descriptor) +// - migrations.createPlanner(): creates a SQLite migration planner +// - migrations.createRunner(): creates a SQLite migration runner + +// Create family instance with target, adapter, and driver +const family = sqlFamilyDescriptor.create({ + target: sqlite, + adapter: sqliteAdapter, + driver: sqliteDriver, + extensions: [], +}); + +// Include the active framework components so planner/runner can resolve +// component-owned database dependencies (e.g., extension installs). 
+const frameworkComponents = [sqlite, sqliteAdapter]; + +// Create planner and runner from target descriptor +const planner = sqlite.migrations.createPlanner(family); +const runner = sqlite.migrations.createRunner(family); + +// Plan and execute migrations +const planResult = planner.plan({ contract, schema, policy, frameworkComponents }); +if (planResult.kind === 'success') { + const executeResult = await runner.execute({ + plan: planResult.plan, + driver, + destinationContract: contract, + policy, + frameworkComponents, + }); + if (!executeResult.ok) { + // Handle structured failure (e.g., EXECUTION_FAILED, PRECHECK_FAILED) + console.error(executeResult.failure.code, executeResult.failure.summary); + } +} else { + // Handle planner failure (e.g., unsupportedExtension, unsupportedOperation) + console.error(planResult.conflicts); +} +``` + +### Pack refs for TypeScript contract authoring + +```typescript +import sqlitePack from '@prisma-next/target-sqlite/pack'; +import sqliteVector from '@prisma-next/extension-sqlite-vector/pack'; +import { defineContract } from '@prisma-next/sql-contract-ts/contract-builder'; + +export const contract = defineContract() + .target(sqlitePack) + .extensionPacks({ sqliteVector }) + .build(); +``` + +Pack refs are pure JSON-friendly objects that make TypeScript contract authoring work in both emit and no-emit workflows without requiring separate manifest files. + +## Architecture + +This package provides both control and runtime entry points for the SQLite target. All declarative fields (version, capabilities, types, operations) are defined directly on the descriptor, so the published entry points never touch the filesystem. The `./pack` entry point provides a pure pack ref for contract authoring. The runtime entry point will provide target-specific runtime functionality in the future. 
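Because pack refs are pure data, a hypothetical SQLite pack ref can be sketched as an ordinary object that survives JSON serialization. The field names below are illustrative, not the actual `TargetPackRef` type from `@prisma-next/sql-contract`:

```typescript
// Hypothetical pack ref shape; the real TargetPackRef type lives in
// @prisma-next/sql-contract and may differ. The point is that a pack
// ref is pure, JSON-serializable data with no filesystem access.
const sqlitePack = {
  kind: 'target',
  familyId: 'sql',
  targetId: 'sqlite',
  version: '0.0.1',
} as const;

// Pure data: a JSON round-trip preserves it exactly, which is what
// makes both emit and no-emit authoring workflows possible.
const roundTripped = JSON.parse(JSON.stringify(sqlitePack)) as typeof sqlitePack;
```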
+ +## Error Handling + +Both the planner and runner return structured results instead of throwing: + +**Planner** returns `PlannerResult` with either: +- `kind: 'success'` with a `MigrationPlan` +- `kind: 'failure'` with a list of `PlannerConflict` objects (e.g., `unsupportedOperation`, `policyViolation`) + +**Runner** returns `MigrationRunnerResult` (`Result`) with either: +- `ok: true` with operation counts +- `ok: false` with a `MigrationRunnerFailure` containing error code, summary, and metadata + +Runner error codes include: `EXECUTION_FAILED`, `PRECHECK_FAILED`, `POSTCHECK_FAILED`, `SCHEMA_VERIFY_FAILED`, `POLICY_VIOLATION`, `MARKER_ORIGIN_MISMATCH`, `DESTINATION_CONTRACT_MISMATCH`. + +See `@prisma-next/family-sql/control` README for full error code documentation. + +## Dependencies + +- **`@prisma-next/family-sql`**: SQL family types (`SqlControlTargetDescriptor`, `SqlControlFamilyInstance`) +- **`@prisma-next/core-control-plane`**: Control plane types (`ControlTargetInstance`) +- **`@prisma-next/sql-contract`**: Pack types (`TargetPackRef`) +- **`arktype`**: Runtime validation + +**Dependents:** +- CLI configuration files import this package to register the SQLite target + +## Exports + +- `./control`: Control plane entry point for `SqlControlTargetDescriptor` +- `./runtime`: Runtime entry point for target-specific runtime code (future) +- `./pack`: Pure pack ref for `defineContract().target(sqlitePack)` + +## Tests + +This package ships a mix of fast planner unit tests and slower runner integration tests that run against real SQLite database files, so no external database service is required. 
+ +- **Default (`pnpm --filter @prisma-next/target-sqlite test`)**: runs all tests including integration tests +- **Test files**: + - `test/migrations/planner.behavior.test.ts`: Planner unit tests (classification, conflicts, dependency ops) + - `test/migrations/planner.integration.test.ts`: Planner integration tests + - `test/migrations/runner.*.integration.test.ts`: Runner integration tests (basic, errors, idempotency, policy) + +```bash +pnpm --filter @prisma-next/target-sqlite test +``` diff --git a/packages/3-targets/3-targets/sqlite/biome.jsonc b/packages/3-targets/3-targets/sqlite/biome.jsonc new file mode 100644 index 0000000000..b8994a7330 --- /dev/null +++ b/packages/3-targets/3-targets/sqlite/biome.jsonc @@ -0,0 +1,4 @@ +{ + "$schema": "https://biomejs.dev/schemas/2.3.11/schema.json", + "extends": "//" +} diff --git a/packages/3-targets/3-targets/sqlite/package.json b/packages/3-targets/3-targets/sqlite/package.json new file mode 100644 index 0000000000..f9f3408299 --- /dev/null +++ b/packages/3-targets/3-targets/sqlite/package.json @@ -0,0 +1,57 @@ +{ + "name": "@prisma-next/target-sqlite", + "version": "0.0.1", + "type": "module", + "sideEffects": false, + "description": "SQLite target pack for Prisma Next", + "scripts": { + "build": "tsup --config tsup.config.ts && tsc --project tsconfig.build.json", + "test": "vitest run --passWithNoTests", + "test:coverage": "vitest run --coverage --passWithNoTests", + "typecheck": "tsc --project tsconfig.json --noEmit", + "lint": "biome check . 
--error-on-warnings", + "lint:fix": "biome check --write .", + "lint:fix:unsafe": "biome check --write --unsafe .", + "clean": "rm -rf dist dist-tsc dist-tsc-prod coverage .tmp-output" + }, + "dependencies": { + "@prisma-next/cli": "workspace:*", + "@prisma-next/contract": "workspace:*", + "@prisma-next/core-control-plane": "workspace:*", + "@prisma-next/core-execution-plane": "workspace:*", + "@prisma-next/family-sql": "workspace:*", + "@prisma-next/sql-contract": "workspace:*", + "@prisma-next/sql-errors": "workspace:*", + "@prisma-next/sql-schema-ir": "workspace:*", + "@prisma-next/utils": "workspace:*", + "arktype": "^2.0.0" + }, + "devDependencies": { + "@prisma-next/adapter-sqlite": "workspace:*", + "@prisma-next/driver-sqlite": "workspace:*", + "@prisma-next/test-utils": "workspace:*", + "@prisma-next/tsconfig": "workspace:*", + "tsup": "catalog:", + "typescript": "catalog:", + "vitest": "catalog:" + }, + "files": [ + "dist", + "src", + "packs" + ], + "exports": { + "./control": { + "types": "./dist/exports/control.d.ts", + "import": "./dist/exports/control.js" + }, + "./runtime": { + "types": "./dist/exports/runtime.d.ts", + "import": "./dist/exports/runtime.js" + }, + "./pack": { + "types": "./dist/exports/pack.d.ts", + "import": "./dist/exports/pack.js" + } + } +} diff --git a/packages/3-targets/3-targets/sqlite/src/core/descriptor-meta.ts b/packages/3-targets/3-targets/sqlite/src/core/descriptor-meta.ts new file mode 100644 index 0000000000..620f69d3a3 --- /dev/null +++ b/packages/3-targets/3-targets/sqlite/src/core/descriptor-meta.ts @@ -0,0 +1,8 @@ +export const sqliteTargetDescriptorMeta = { + kind: 'target', + familyId: 'sql', + targetId: 'sqlite', + id: 'sqlite', + version: '0.0.1', + capabilities: {}, +} as const; diff --git a/packages/3-targets/3-targets/sqlite/src/core/migrations/planner.ts b/packages/3-targets/3-targets/sqlite/src/core/migrations/planner.ts new file mode 100644 index 0000000000..e69835eb95 --- /dev/null +++ 
b/packages/3-targets/3-targets/sqlite/src/core/migrations/planner.ts @@ -0,0 +1,799 @@ +import { + escapeLiteral, + parseSqliteDefault, + quoteIdentifier, +} from '@prisma-next/adapter-sqlite/control'; +import type { ColumnDefault } from '@prisma-next/contract/types'; +import type { SchemaIssue } from '@prisma-next/core-control-plane/types'; +import type { + CodecControlHooks, + MigrationOperationPolicy, + SqlMigrationPlanner, + SqlMigrationPlannerPlanOptions, + SqlMigrationPlanOperation, + SqlPlannerConflict, +} from '@prisma-next/family-sql/control'; +import { + createMigrationPlan, + extractCodecControlHooks, + plannerFailure, + plannerSuccess, +} from '@prisma-next/family-sql/control'; +import { + isIndexSatisfied, + isUniqueConstraintSatisfied, + verifySqlSchema, +} from '@prisma-next/family-sql/schema-verify'; +import type { + SqlContract, + SqlStorage, + StorageColumn, + StorageTable, +} from '@prisma-next/sql-contract/types'; +import type { SqlSchemaIR } from '@prisma-next/sql-schema-ir/types'; +import { ifDefined } from '@prisma-next/utils/defined'; + +type OperationClass = 'extension' | 'type' | 'table' | 'unique' | 'index' | 'foreignKey'; + +type PlannerFrameworkComponents = SqlMigrationPlannerPlanOptions extends { + readonly frameworkComponents: infer T; +} + ? 
T + : ReadonlyArray<unknown>; + +type PlannerOptionsWithComponents = SqlMigrationPlannerPlanOptions & { + readonly frameworkComponents: PlannerFrameworkComponents; +}; + +type VerifySqlSchemaOptionsWithComponents = Parameters<typeof verifySqlSchema>[0] & { + readonly frameworkComponents: PlannerFrameworkComponents; +}; + +type PlannerDatabaseDependency = { + readonly id: string; + readonly label: string; + readonly install: readonly SqlMigrationPlanOperation[]; + readonly verifyDatabaseDependencyInstalled: (schema: SqlSchemaIR) => readonly SchemaIssue[]; +}; + +export interface SqlitePlanTargetDetails { + readonly objectType: OperationClass; + readonly name: string; + readonly table?: string; +} + +interface PlannerConfig { + /** + * SQLite doesn't have schemas in the Postgres sense, but some shared hooks accept a schemaName + * parameter. We pass a stable identifier (`main`) by default. + */ + readonly defaultSchemaName: string; +} + +const DEFAULT_PLANNER_CONFIG: PlannerConfig = { + defaultSchemaName: 'main', +}; + +export function createSqliteMigrationPlanner( + config: Partial<PlannerConfig> = {}, +): SqlMigrationPlanner { + return new SqliteMigrationPlanner({ + ...DEFAULT_PLANNER_CONFIG, + ...config, + }); +} + +class SqliteMigrationPlanner implements SqlMigrationPlanner { + constructor(private readonly config: PlannerConfig) {} + + plan(options: SqlMigrationPlannerPlanOptions) { + const schemaName = options.schemaName ?? this.config.defaultSchemaName; + const policyResult = this.ensureAdditivePolicy(options.policy); + if (policyResult) { + return policyResult; + } + + const classification = this.classifySchema(options); + if (classification.kind === 'conflict') { + return plannerFailure(classification.conflicts); + } + + // Extract codec control hooks once at entry point for reuse across all operations. 
+ const codecHooks = extractCodecControlHooks(options.frameworkComponents); + + const operations: SqlMigrationPlanOperation[] = []; + + const storageTypePlan = this.buildStorageTypeOperations(options, schemaName, codecHooks); + if (storageTypePlan.conflicts.length > 0) { + return plannerFailure(storageTypePlan.conflicts); + } + + operations.push( + ...this.buildDatabaseDependencyOperations(options), + ...storageTypePlan.operations, + ...this.buildTableOperations(options.contract.storage.tables, options.schema), + ...this.buildColumnOperations(options.contract.storage.tables, options.schema), + ...this.buildUniqueOperations(options.contract.storage.tables, options.schema), + ...this.buildIndexOperations(options.contract.storage.tables, options.schema), + ); + + const plan = createMigrationPlan({ + targetId: 'sqlite', + origin: null, + destination: { + coreHash: options.contract.coreHash, + ...ifDefined('profileHash', options.contract.profileHash), + }, + operations, + }); + + return plannerSuccess(plan); + } + + private ensureAdditivePolicy(policy: MigrationOperationPolicy) { + if (!policy.allowedOperationClasses.includes('additive')) { + return plannerFailure([ + { + kind: 'unsupportedOperation', + summary: 'Init planner requires additive operations be allowed', + why: 'The init planner only emits additive operations. Update the policy to include "additive".', + }, + ]); + } + return null; + } + + /** + * Builds migration operations from component-owned database dependencies. + * These operations install database-side persistence structures declared by components. 
+ */ + private buildDatabaseDependencyOperations( + options: PlannerOptionsWithComponents, + ): readonly SqlMigrationPlanOperation[] { + const dependencies = this.collectDependencies(options); + const operations: SqlMigrationPlanOperation[] = []; + const seenDependencyIds = new Set<string>(); + const seenOperationIds = new Set<string>(); + + for (const dependency of dependencies) { + if (seenDependencyIds.has(dependency.id)) { + continue; + } + seenDependencyIds.add(dependency.id); + + const issues = dependency.verifyDatabaseDependencyInstalled(options.schema); + if (issues.length === 0) { + continue; + } + + for (const installOp of dependency.install) { + if (seenOperationIds.has(installOp.id)) { + continue; + } + seenOperationIds.add(installOp.id); + operations.push(installOp); + } + } + + return operations; + } + + private buildStorageTypeOperations( + options: PlannerOptionsWithComponents, + schemaName: string, + codecHooks: Map<string, CodecControlHooks>, + ): { + readonly operations: readonly SqlMigrationPlanOperation[]; + readonly conflicts: readonly SqlPlannerConflict[]; + } { + const operations: SqlMigrationPlanOperation[] = []; + const conflicts: SqlPlannerConflict[] = []; + const storageTypes = options.contract.storage.types ?? 
{}; + + for (const [typeName, typeInstance] of sortedEntries(storageTypes)) { + const hook = codecHooks.get(typeInstance.codecId); + const planResult = hook?.planTypeOperations?.({ + typeName, + typeInstance, + contract: options.contract, + schema: options.schema, + schemaName, + policy: options.policy, + }); + if (!planResult) { + continue; + } + for (const operation of planResult.operations) { + if (!options.policy.allowedOperationClasses.includes(operation.operationClass)) { + conflicts.push({ + kind: 'missingButNonAdditive', + summary: `Storage type "${typeName}" requires "${operation.operationClass}" operation "${operation.id}"`, + location: { + type: typeName, + }, + }); + continue; + } + operations.push({ + ...operation, + target: { + id: operation.target.id, + details: this.buildTargetDetails('type', typeName, undefined), + }, + }); + } + } + + return { operations, conflicts }; + } + + private collectDependencies( + options: PlannerOptionsWithComponents, + ): ReadonlyArray<PlannerDatabaseDependency> { + const components = options.frameworkComponents; + if (components.length === 0) { + return []; + } + const deps: PlannerDatabaseDependency[] = []; + for (const component of components) { + if (!isSqlDependencyProvider(component)) { + continue; + } + const initDeps = component.databaseDependencies?.init; + if (initDeps && initDeps.length > 0) { + deps.push(...(initDeps as readonly PlannerDatabaseDependency[])); + } + } + return sortDependencies(deps); + } + + private buildTableOperations( + tables: SqlContract['storage']['tables'], + schema: SqlSchemaIR, + ): readonly SqlMigrationPlanOperation[] { + const operations: SqlMigrationPlanOperation[] = []; + for (const [tableName, table] of sortedEntries(tables)) { + if (schema.tables[tableName]) { + continue; + } + operations.push({ + id: `table.${tableName}`, + label: `Create table ${tableName}`, + summary: `Creates table ${tableName} with required columns and constraints`, + operationClass: 'additive', + target: { + id: 'sqlite', + 
details: this.buildTargetDetails('table', tableName, tableName), + }, + precheck: [ + { + description: `ensure table "${tableName}" does not exist`, + sql: tableExistsCheck({ table: tableName, exists: false }), + }, + ], + execute: [ + { + description: `create table "${tableName}"`, + sql: buildCreateTableSql(tableName, table), + }, + ], + postcheck: [ + { + description: `verify table "${tableName}" exists`, + sql: tableExistsCheck({ table: tableName, exists: true }), + }, + ], + }); + } + return operations; + } + + private buildColumnOperations( + tables: SqlContract['storage']['tables'], + schema: SqlSchemaIR, + ): readonly SqlMigrationPlanOperation[] { + const operations: SqlMigrationPlanOperation[] = []; + for (const [tableName, table] of sortedEntries(tables)) { + const schemaTable = schema.tables[tableName]; + if (!schemaTable) { + continue; + } + for (const [columnName, column] of sortedEntries(table.columns)) { + if (schemaTable.columns[columnName]) { + continue; + } + operations.push(this.buildAddColumnOperation(tableName, columnName, column)); + } + } + return operations; + } + + private buildAddColumnOperation( + tableName: string, + columnName: string, + column: StorageColumn, + ): SqlMigrationPlanOperation { + const qualified = quoteIdentifier(tableName); + const notNull = column.nullable === false; + const hasDefault = column.default !== undefined; + // SQLite allows adding NOT NULL columns without default only if table is empty. + const requiresEmptyTable = notNull && !hasDefault; + const precheck = [ + { + description: `ensure column "${columnName}" is missing`, + sql: columnExistsCheck({ table: tableName, column: columnName, exists: false }), + }, + ...(requiresEmptyTable + ? 
[ + { + description: `ensure table "${tableName}" is empty before adding NOT NULL column without default`, + sql: tableIsEmptyCheck(qualified), + }, + ] + : []), + ]; + const execute = [ + { + description: `add column "${columnName}"`, + sql: buildAddColumnSql(qualified, columnName, column), + }, + ]; + const postcheck = [ + { + description: `verify column "${columnName}" exists`, + sql: columnExistsCheck({ table: tableName, column: columnName, exists: true }), + }, + ...(notNull + ? [ + { + description: `verify column "${columnName}" is NOT NULL`, + sql: columnIsNotNullCheck({ table: tableName, column: columnName }), + }, + ] + : []), + ]; + + return { + id: `column.${tableName}.${columnName}`, + label: `Add column ${columnName} to ${tableName}`, + summary: `Adds column ${columnName} to table ${tableName}`, + operationClass: 'additive', + target: { + id: 'sqlite', + details: this.buildTargetDetails('table', tableName, tableName), + }, + precheck, + execute, + postcheck, + }; + } + + private buildUniqueOperations( + tables: SqlContract['storage']['tables'], + schema: SqlSchemaIR, + ): readonly SqlMigrationPlanOperation[] { + const operations: SqlMigrationPlanOperation[] = []; + for (const [tableName, table] of sortedEntries(tables)) { + const schemaTable = schema.tables[tableName]; + for (const unique of table.uniques) { + if (schemaTable && hasUniqueConstraint(schemaTable, unique.columns)) { + continue; + } + const indexName = unique.name ?? 
`${tableName}_${unique.columns.join('_')}_key`; + operations.push({ + id: `unique.${tableName}.${indexName}`, + label: `Create unique index ${indexName} on ${tableName}`, + summary: `Creates unique index ${indexName} on ${tableName} to satisfy unique constraint`, + operationClass: 'additive', + target: { + id: 'sqlite', + details: this.buildTargetDetails('unique', indexName, tableName), + }, + precheck: [ + { + description: `ensure index "${indexName}" is missing`, + sql: indexExistsCheck({ index: indexName, exists: false }), + }, + ], + execute: [ + { + description: `create unique index "${indexName}"`, + sql: `CREATE UNIQUE INDEX ${quoteIdentifier(indexName)} ON ${quoteIdentifier( + tableName, + )} (${unique.columns.map(quoteIdentifier).join(', ')})`, + }, + ], + postcheck: [ + { + description: `verify index "${indexName}" exists`, + sql: indexExistsCheck({ index: indexName, exists: true }), + }, + ], + }); + } + } + return operations; + } + + private buildIndexOperations( + tables: SqlContract['storage']['tables'], + schema: SqlSchemaIR, + ): readonly SqlMigrationPlanOperation[] { + const operations: SqlMigrationPlanOperation[] = []; + for (const [tableName, table] of sortedEntries(tables)) { + const schemaTable = schema.tables[tableName]; + for (const index of table.indexes) { + if (schemaTable && hasIndex(schemaTable, index.columns)) { + continue; + } + const indexName = index.name ?? 
`${tableName}_${index.columns.join('_')}_idx`; + operations.push({ + id: `index.${tableName}.${indexName}`, + label: `Create index ${indexName} on ${tableName}`, + summary: `Creates index ${indexName} on ${tableName}`, + operationClass: 'additive', + target: { + id: 'sqlite', + details: this.buildTargetDetails('index', indexName, tableName), + }, + precheck: [ + { + description: `ensure index "${indexName}" is missing`, + sql: indexExistsCheck({ index: indexName, exists: false }), + }, + ], + execute: [ + { + description: `create index "${indexName}"`, + sql: `CREATE INDEX ${quoteIdentifier(indexName)} ON ${quoteIdentifier(tableName)} (${index.columns + .map(quoteIdentifier) + .join(', ')})`, + }, + ], + postcheck: [ + { + description: `verify index "${indexName}" exists`, + sql: indexExistsCheck({ index: indexName, exists: true }), + }, + ], + }); + } + } + return operations; + } + + private buildTargetDetails( + objectType: OperationClass, + name: string, + table?: string, + ): SqlitePlanTargetDetails { + return { + objectType, + name, + ...ifDefined('table', table), + }; + } + + private classifySchema(options: PlannerOptionsWithComponents): + | { kind: 'ok' } + | { + kind: 'conflict'; + conflicts: SqlPlannerConflict[]; + } { + const verifyOptions: VerifySqlSchemaOptionsWithComponents = { + contract: options.contract, + schema: options.schema, + strict: false, + typeMetadataRegistry: new Map(), + frameworkComponents: options.frameworkComponents, + normalizeDefault: parseSqliteDefault, + }; + const verifyResult = verifySqlSchema(verifyOptions); + + const conflicts = this.extractConflicts(verifyResult.schema.issues); + if (conflicts.length > 0) { + return { kind: 'conflict', conflicts }; + } + return { kind: 'ok' }; + } + + private extractConflicts(issues: readonly SchemaIssue[]): SqlPlannerConflict[] { + const conflicts: SqlPlannerConflict[] = []; + for (const issue of issues) { + if (isAdditiveIssue(issue)) { + continue; + } + const conflict = 
this.convertIssueToConflict(issue); + if (conflict) { + conflicts.push(conflict); + } + } + return conflicts.sort(conflictComparator); + } + + private convertIssueToConflict(issue: SchemaIssue): SqlPlannerConflict | null { + switch (issue.kind) { + case 'type_mismatch': + return this.buildConflict('typeMismatch', issue); + case 'nullability_mismatch': + return this.buildConflict('nullabilityConflict', issue); + case 'primary_key_mismatch': + // SQLite cannot add primary keys to existing tables additively. + return this.buildConflict('indexIncompatible', issue); + case 'foreign_key_mismatch': + // SQLite cannot add foreign keys to existing tables additively. + return this.buildConflict('foreignKeyConflict', issue); + case 'unique_constraint_mismatch': + case 'index_mismatch': + // These are additive (create indexes), so they should have been filtered already. + return this.buildConflict('indexIncompatible', issue); + default: + return null; + } + } + + private buildConflict(kind: SqlPlannerConflict['kind'], issue: SchemaIssue): SqlPlannerConflict { + const location = buildConflictLocation(issue); + const meta = + issue.expected || issue.actual + ? 
Object.freeze({
+              ...ifDefined('expected', issue.expected),
+              ...ifDefined('actual', issue.actual),
+            })
+          : undefined;
+
+    return {
+      kind,
+      summary: issue.message,
+      ...ifDefined('location', location),
+      ...ifDefined('meta', meta),
+    };
+  }
+}
+
+function isSqlDependencyProvider(component: unknown): component is {
+  readonly databaseDependencies?: {
+    readonly init?: readonly PlannerDatabaseDependency[];
+  };
+} {
+  if (typeof component !== 'object' || component === null) {
+    return false;
+  }
+  const record = component as Record<string, unknown>;
+
+  if (Object.hasOwn(record, 'familyId') && record['familyId'] !== 'sql') {
+    return false;
+  }
+
+  if (!Object.hasOwn(record, 'databaseDependencies')) {
+    return false;
+  }
+  const deps = record['databaseDependencies'];
+  return deps === undefined || (typeof deps === 'object' && deps !== null);
+}
+
+function sortDependencies(
+  dependencies: ReadonlyArray<PlannerDatabaseDependency>,
+): ReadonlyArray<PlannerDatabaseDependency> {
+  if (dependencies.length <= 1) {
+    return dependencies;
+  }
+  return [...dependencies].sort((a, b) => a.id.localeCompare(b.id));
+}
+
+function buildCreateTableSql(tableName: string, table: StorageTable): string {
+  const columnDefinitions = Object.entries(table.columns).map(
+    ([columnName, column]: [string, StorageColumn]) => {
+      const parts = [
+        quoteIdentifier(columnName),
+        buildColumnTypeSql(column),
+        buildColumnDefaultSql(column.default),
+        column.nullable ? '' : 'NOT NULL',
+      ].filter(Boolean);
+      return parts.join(' ');
+    },
+  );
+
+  const constraintDefinitions: string[] = [];
+  if (table.primaryKey) {
+    constraintDefinitions.push(
+      `PRIMARY KEY (${table.primaryKey.columns.map(quoteIdentifier).join(', ')})`,
+    );
+  }
+
+  // SQLite cannot add FKs after table creation; include them in CREATE TABLE.
+  for (const foreignKey of table.foreignKeys) {
+    const fkName = foreignKey.name ?? `${tableName}_${foreignKey.columns.join('_')}_fkey`;
+    const constraintPrefix = foreignKey.name ?
`CONSTRAINT ${quoteIdentifier(fkName)} ` : '';
+    constraintDefinitions.push(
+      `${constraintPrefix}FOREIGN KEY (${foreignKey.columns
+        .map(quoteIdentifier)
+        .join(', ')}) REFERENCES ${quoteIdentifier(
+        foreignKey.references.table,
+      )} (${foreignKey.references.columns.map(quoteIdentifier).join(', ')})`,
+    );
+  }
+
+  const allDefinitions = [...columnDefinitions, ...constraintDefinitions];
+  return `CREATE TABLE ${quoteIdentifier(tableName)} (\n  ${allDefinitions.join(',\n  ')}\n)`;
+}
+
+function buildColumnTypeSql(column: StorageColumn): string {
+  return column.nativeType;
+}
+
+function buildColumnDefaultSql(columnDefault: ColumnDefault | undefined): string {
+  if (!columnDefault) {
+    return '';
+  }
+
+  switch (columnDefault.kind) {
+    case 'literal':
+      return `DEFAULT ${columnDefault.expression}`;
+    case 'function': {
+      if (columnDefault.expression === 'autoincrement()') {
+        // SQLite has implicit rowid autoincrement semantics for INTEGER PRIMARY KEY columns.
+        // We treat this as "no explicit default".
+        return '';
+      }
+      if (columnDefault.expression === 'now()') {
+        return 'DEFAULT (CURRENT_TIMESTAMP)';
+      }
+      return `DEFAULT ${columnDefault.expression}`;
+    }
+  }
+}
+
+function sortedEntries<V>(record: Readonly<Record<string, V>>): Array<[string, V]> {
+  return Object.entries(record).sort(([a], [b]) => a.localeCompare(b)) as Array<[string, V]>;
+}
+
+function tableExistsCheck({ table, exists = true }: { table: string; exists?: boolean }): string {
+  const existsClause = exists ? '' : 'NOT ';
+  return `SELECT ${existsClause}EXISTS (
+  SELECT 1
+  FROM sqlite_master
+  WHERE type = 'table'
+    AND name = '${escapeLiteral(table)}'
+)`;
+}
+
+function indexExistsCheck({ index, exists = true }: { index: string; exists?: boolean }): string {
+  const existsClause = exists ?
'' : 'NOT '; + return `SELECT ${existsClause}EXISTS ( + SELECT 1 + FROM sqlite_master + WHERE type = 'index' + AND name = '${escapeLiteral(index)}' +)`; +} + +function columnExistsCheck({ + table, + column, + exists = true, +}: { + table: string; + column: string; + exists?: boolean; +}): string { + const existsClause = exists ? '' : 'NOT '; + return `SELECT ${existsClause}EXISTS ( + SELECT 1 + FROM pragma_table_info('${escapeLiteral(table)}') + WHERE name = '${escapeLiteral(column)}' +)`; +} + +function columnIsNotNullCheck({ table, column }: { table: string; column: string }): string { + // pragma_table_info returns notnull=0 for INTEGER PRIMARY KEY columns, but they are not nullable. + return `SELECT EXISTS ( + SELECT 1 + FROM pragma_table_info('${escapeLiteral(table)}') + WHERE name = '${escapeLiteral(column)}' + AND ("notnull" = 1 OR pk > 0) +)`; +} + +function tableIsEmptyCheck(qualifiedTableName: string): string { + return `SELECT NOT EXISTS (SELECT 1 FROM ${qualifiedTableName} LIMIT 1)`; +} + +function buildAddColumnSql( + qualifiedTableName: string, + columnName: string, + column: StorageColumn, +): string { + const typeSql = buildColumnTypeSql(column); + const defaultSql = buildColumnDefaultSql(column.default); + const parts = [ + `ALTER TABLE ${qualifiedTableName}`, + `ADD COLUMN ${quoteIdentifier(columnName)} ${typeSql}`, + defaultSql, + column.nullable ? 
'' : 'NOT NULL', + ].filter(Boolean); + return parts.join(' '); +} + +function hasUniqueConstraint( + table: SqlSchemaIR['tables'][string], + columns: readonly string[], +): boolean { + return isUniqueConstraintSatisfied(table.uniques, table.indexes, columns); +} + +function hasIndex(table: SqlSchemaIR['tables'][string], columns: readonly string[]): boolean { + return isIndexSatisfied(table.indexes, table.uniques, columns); +} + +function isAdditiveIssue(issue: SchemaIssue): boolean { + switch (issue.kind) { + case 'type_missing': + case 'type_values_mismatch': + case 'missing_table': + case 'missing_column': + case 'extension_missing': + return true; + // SQLite cannot add PKs or FKs to existing tables additively, so these are conflicts. + case 'primary_key_mismatch': + case 'foreign_key_mismatch': + return false; + case 'unique_constraint_mismatch': + case 'index_mismatch': + return true; + default: + return false; + } +} + +function buildConflictLocation(issue: SchemaIssue) { + const location: { + table?: string; + column?: string; + constraint?: string; + } = {}; + if (issue.table) { + location.table = issue.table; + } + if (issue.column) { + location.column = issue.column; + } + if (issue.indexOrConstraint) { + location.constraint = issue.indexOrConstraint; + } + return Object.keys(location).length > 0 ? location : undefined; +} + +function conflictComparator(a: SqlPlannerConflict, b: SqlPlannerConflict): number { + if (a.kind !== b.kind) { + return a.kind < b.kind ? -1 : 1; + } + const aLocation = a.location ?? {}; + const bLocation = b.location ?? 
{}; + const tableCompare = compareStrings(aLocation.table, bLocation.table); + if (tableCompare !== 0) { + return tableCompare; + } + const columnCompare = compareStrings(aLocation.column, bLocation.column); + if (columnCompare !== 0) { + return columnCompare; + } + const constraintCompare = compareStrings(aLocation.constraint, bLocation.constraint); + if (constraintCompare !== 0) { + return constraintCompare; + } + return compareStrings(a.summary, b.summary); +} + +function compareStrings(a?: string, b?: string): number { + if (a === b) { + return 0; + } + if (a === undefined) { + return -1; + } + if (b === undefined) { + return 1; + } + return a < b ? -1 : 1; +} diff --git a/packages/3-targets/3-targets/sqlite/src/core/migrations/runner.ts b/packages/3-targets/3-targets/sqlite/src/core/migrations/runner.ts new file mode 100644 index 0000000000..856f7ba770 --- /dev/null +++ b/packages/3-targets/3-targets/sqlite/src/core/migrations/runner.ts @@ -0,0 +1,501 @@ +import { parseSqliteDefault } from '@prisma-next/adapter-sqlite/control'; +import type { ContractMarkerRecord } from '@prisma-next/contract/types'; +import type { + MigrationOperationPolicy, + SqlControlFamilyInstance, + SqlMigrationPlanContractInfo, + SqlMigrationPlanOperation, + SqlMigrationPlanOperationStep, + SqlMigrationRunner, + SqlMigrationRunnerExecuteOptions, + SqlMigrationRunnerFailure, + SqlMigrationRunnerResult, +} from '@prisma-next/family-sql/control'; +import { runnerFailure, runnerSuccess } from '@prisma-next/family-sql/control'; +import { verifySqlSchema } from '@prisma-next/family-sql/schema-verify'; +import { readMarker } from '@prisma-next/family-sql/verify'; +import { SqlQueryError } from '@prisma-next/sql-errors'; +import type { Result } from '@prisma-next/utils/result'; +import { ok, okVoid } from '@prisma-next/utils/result'; +import type { SqlitePlanTargetDetails } from './planner'; +import { + buildLedgerInsertStatement, + buildWriteMarkerStatements, + ensureLedgerTableStatement, + 
ensureMarkerTableStatement,
+  type SqlStatement,
+} from './statement-builders';
+
+interface ApplyPlanSuccessValue {
+  readonly operationsExecuted: number;
+  readonly executedOperations: readonly SqlMigrationPlanOperation[];
+}
+
+export function createSqliteMigrationRunner(
+  family: SqlControlFamilyInstance,
+): SqlMigrationRunner {
+  return new SqliteMigrationRunner(family);
+}
+
+class SqliteMigrationRunner implements SqlMigrationRunner {
+  constructor(private readonly family: SqlControlFamilyInstance) {}
+
+  async execute(
+    options: SqlMigrationRunnerExecuteOptions,
+  ): Promise<SqlMigrationRunnerResult> {
+    const driver = options.driver;
+
+    // Static checks - fail fast before transaction
+    const destinationCheck = this.ensurePlanMatchesDestinationContract(
+      options.plan.destination,
+      options.destinationContract,
+    );
+    if (!destinationCheck.ok) {
+      return destinationCheck;
+    }
+
+    const policyCheck = this.enforcePolicyCompatibility(options.policy, options.plan.operations);
+    if (!policyCheck.ok) {
+      return policyCheck;
+    }
+
+    await this.beginTransaction(driver);
+    let committed = false;
+
+    try {
+      await this.ensureControlTables(driver);
+      const existingMarker = await readMarker(driver);
+
+      const markerCheck = this.ensureMarkerCompatibility(existingMarker, options.plan);
+      if (!markerCheck.ok) {
+        return markerCheck;
+      }
+
+      const markerAtDestination = this.markerMatchesDestination(existingMarker, options.plan);
+      let applyValue: ApplyPlanSuccessValue;
+
+      if (markerAtDestination) {
+        applyValue = { operationsExecuted: 0, executedOperations: [] };
+      } else {
+        const applyResult = await this.applyPlan(driver, options);
+        if (!applyResult.ok) {
+          return applyResult;
+        }
+        applyValue = applyResult.value;
+      }
+
+      // Verify resulting schema matches contract
+      const schemaIR = await this.family.introspect({
+        driver,
+        contractIR: options.destinationContract,
+      });
+
+      const schemaVerifyResult = verifySqlSchema({
+        contract: options.destinationContract,
+        schema: schemaIR,
+        strict:
options.strictVerification ?? true,
+        context: options.context ?? {},
+        typeMetadataRegistry: this.family.typeMetadataRegistry,
+        frameworkComponents: options.frameworkComponents,
+        normalizeDefault: parseSqliteDefault,
+      });
+      if (!schemaVerifyResult.ok) {
+        return runnerFailure('SCHEMA_VERIFY_FAILED', schemaVerifyResult.summary, {
+          why: 'The resulting database schema does not satisfy the destination contract.',
+          meta: {
+            issues: schemaVerifyResult.schema.issues,
+          },
+        });
+      }
+
+      await this.upsertMarker(driver, options, existingMarker);
+      await this.recordLedgerEntry(driver, options, existingMarker, applyValue.executedOperations);
+
+      await this.commitTransaction(driver);
+      committed = true;
+
+      return runnerSuccess({
+        operationsPlanned: options.plan.operations.length,
+        operationsExecuted: applyValue.operationsExecuted,
+      });
+    } finally {
+      if (!committed) {
+        await this.rollbackTransaction(driver);
+      }
+    }
+  }
+
+  private async applyPlan(
+    driver: SqlMigrationRunnerExecuteOptions['driver'],
+    options: SqlMigrationRunnerExecuteOptions,
+  ): Promise<Result<ApplyPlanSuccessValue, SqlMigrationRunnerFailure>> {
+    const checks = options.executionChecks;
+    const runPrechecks = checks?.prechecks !== false;
+    const runPostchecks = checks?.postchecks !== false;
+    const runIdempotency = checks?.idempotencyChecks !== false;
+
+    let operationsExecuted = 0;
+    const executedOperations: Array<SqlMigrationPlanOperation<SqlitePlanTargetDetails>> = [];
+
+    for (const operation of options.plan.operations) {
+      options.callbacks?.onOperationStart?.(operation);
+      try {
+        if (runPostchecks && runIdempotency) {
+          const postcheckAlreadySatisfied = await this.expectationsAreSatisfied(
+            driver,
+            operation.postcheck,
+          );
+          if (postcheckAlreadySatisfied) {
+            executedOperations.push(operation);
+            continue;
+          }
+        }
+
+        if (runPrechecks) {
+          const precheckResult = await this.runExpectationSteps(
+            driver,
+            operation.precheck,
+            operation,
+            'precheck',
+          );
+          if (!precheckResult.ok) {
+            return precheckResult;
+          }
+        }
+
+        const executeResult = await this.runExecuteSteps(driver,
operation.execute, operation);
+        if (!executeResult.ok) {
+          return executeResult;
+        }
+
+        if (runPostchecks) {
+          const postcheckResult = await this.runExpectationSteps(
+            driver,
+            operation.postcheck,
+            operation,
+            'postcheck',
+          );
+          if (!postcheckResult.ok) {
+            return postcheckResult;
+          }
+        }
+
+        executedOperations.push(operation);
+        operationsExecuted += 1;
+      } finally {
+        options.callbacks?.onOperationComplete?.(operation);
+      }
+    }
+
+    return ok({ operationsExecuted, executedOperations });
+  }
+
+  private async ensureControlTables(
+    driver: SqlMigrationRunnerExecuteOptions['driver'],
+  ): Promise<void> {
+    await this.executeStatement(driver, ensureMarkerTableStatement);
+    await this.executeStatement(driver, ensureLedgerTableStatement);
+  }
+
+  private async runExpectationSteps(
+    driver: SqlMigrationRunnerExecuteOptions['driver'],
+    steps: readonly SqlMigrationPlanOperationStep[],
+    operation: SqlMigrationPlanOperation,
+    phase: 'precheck' | 'postcheck',
+  ): Promise<Result<void, SqlMigrationRunnerFailure>> {
+    for (const step of steps) {
+      const result = await driver.query(step.sql);
+      if (!this.stepResultIsTrue(result.rows)) {
+        const code = phase === 'precheck' ?
'PRECHECK_FAILED' : 'POSTCHECK_FAILED';
+        return runnerFailure(
+          code,
+          `Operation ${operation.id} failed during ${phase}: ${step.description}`,
+          {
+            meta: {
+              operationId: operation.id,
+              phase,
+              stepDescription: step.description,
+            },
+          },
+        );
+      }
+    }
+    return okVoid();
+  }
+
+  private async runExecuteSteps(
+    driver: SqlMigrationRunnerExecuteOptions['driver'],
+    steps: readonly SqlMigrationPlanOperationStep[],
+    operation: SqlMigrationPlanOperation,
+  ): Promise<Result<void, SqlMigrationRunnerFailure>> {
+    for (const step of steps) {
+      try {
+        await driver.query(step.sql);
+      } catch (error: unknown) {
+        if (SqlQueryError.is(error)) {
+          return runnerFailure(
+            'EXECUTION_FAILED',
+            `Operation ${operation.id} failed during execution: ${step.description}`,
+            {
+              why: error.message,
+              meta: {
+                operationId: operation.id,
+                stepDescription: step.description,
+                sql: step.sql,
+                sqlState: error.sqlState,
+                constraint: error.constraint,
+                table: error.table,
+                column: error.column,
+                detail: error.detail,
+              },
+            },
+          );
+        }
+        throw error;
+      }
+    }
+    return okVoid();
+  }
+
+  private stepResultIsTrue(rows: readonly Record<string, unknown>[]): boolean {
+    if (!rows || rows.length === 0) {
+      return false;
+    }
+    const firstRow = rows[0];
+    const firstValue = firstRow ?
Object.values(firstRow)[0] : undefined;
+    if (typeof firstValue === 'boolean') {
+      return firstValue;
+    }
+    if (typeof firstValue === 'number') {
+      return firstValue !== 0;
+    }
+    if (typeof firstValue === 'string') {
+      const lower = firstValue.toLowerCase();
+      if (lower === 'true' || lower === '1') {
+        return true;
+      }
+      if (lower === 'false' || lower === '0') {
+        return false;
+      }
+      return firstValue.length > 0;
+    }
+    return Boolean(firstValue);
+  }
+
+  private async expectationsAreSatisfied(
+    driver: SqlMigrationRunnerExecuteOptions['driver'],
+    steps: readonly SqlMigrationPlanOperationStep[],
+  ): Promise<boolean> {
+    for (const step of steps) {
+      try {
+        const result = await driver.query(step.sql);
+        if (!this.stepResultIsTrue(result.rows)) {
+          return false;
+        }
+      } catch {
+        return false;
+      }
+    }
+    return true;
+  }
+
+  private ensurePlanMatchesDestinationContract(
+    destination: SqlMigrationPlanContractInfo,
+    contract: SqlMigrationRunnerExecuteOptions['destinationContract'],
+  ): Result<void, SqlMigrationRunnerFailure> {
+    if (destination.coreHash !== contract.coreHash) {
+      return runnerFailure(
+        'DESTINATION_CONTRACT_MISMATCH',
+        'Plan destination does not match destination contract core hash',
+        {
+          meta: {
+            planCoreHash: destination.coreHash,
+            contractCoreHash: contract.coreHash,
+          },
+        },
+      );
+    }
+    if (
+      destination.profileHash &&
+      contract.profileHash &&
+      destination.profileHash !== contract.profileHash
+    ) {
+      return runnerFailure(
+        'DESTINATION_CONTRACT_MISMATCH',
+        'Plan destination does not match destination contract profile hash',
+        {
+          meta: {
+            planProfileHash: destination.profileHash,
+            contractProfileHash: contract.profileHash,
+          },
+        },
+      );
+    }
+    return okVoid();
+  }
+
+  private enforcePolicyCompatibility(
+    policy: MigrationOperationPolicy,
+    operations: readonly SqlMigrationPlanOperation[],
+  ): Result<void, SqlMigrationRunnerFailure> {
+    for (const op of operations) {
+      if (!policy.allowedOperationClasses.includes(op.operationClass)) {
+        return runnerFailure('POLICY_VIOLATION', 'Operation class not
allowed by policy', {
+          meta: { operationId: op.id, operationClass: op.operationClass },
+        });
+      }
+    }
+    return okVoid();
+  }
+
+  private ensureMarkerCompatibility(
+    marker: ContractMarkerRecord | null,
+    plan: SqlMigrationRunnerExecuteOptions['plan'],
+  ): Result<void, SqlMigrationRunnerFailure> {
+    const origin = plan.origin ?? null;
+    if (!origin) {
+      if (!marker) {
+        return okVoid();
+      }
+      if (this.markerMatchesDestination(marker, plan)) {
+        return okVoid();
+      }
+      return runnerFailure(
+        'MARKER_ORIGIN_MISMATCH',
+        `Existing contract marker (${marker.coreHash}) does not match plan origin (no marker expected).`,
+        {
+          meta: {
+            markerCoreHash: marker.coreHash,
+            expectedOrigin: null,
+          },
+        },
+      );
+    }
+
+    if (!marker) {
+      return runnerFailure(
+        'MARKER_ORIGIN_MISMATCH',
+        `Missing contract marker: expected origin core hash ${origin.coreHash}.`,
+        {
+          meta: {
+            expectedOriginCoreHash: origin.coreHash,
+          },
+        },
+      );
+    }
+    if (marker.coreHash !== origin.coreHash) {
+      return runnerFailure(
+        'MARKER_ORIGIN_MISMATCH',
+        `Existing contract marker (${marker.coreHash}) does not match plan origin (${origin.coreHash}).`,
+        {
+          meta: {
+            markerCoreHash: marker.coreHash,
+            expectedOriginCoreHash: origin.coreHash,
+          },
+        },
+      );
+    }
+    if (origin.profileHash && marker.profileHash !== origin.profileHash) {
+      return runnerFailure(
+        'MARKER_ORIGIN_MISMATCH',
+        `Existing contract marker profile hash (${marker.profileHash}) does not match plan origin profile hash (${origin.profileHash}).`,
+        {
+          meta: {
+            markerProfileHash: marker.profileHash,
+            expectedOriginProfileHash: origin.profileHash,
+          },
+        },
+      );
+    }
+    return okVoid();
+  }
+
+  private markerMatchesDestination(
+    existingMarker: ContractMarkerRecord | null,
+    plan: { readonly destination: { readonly coreHash: string; readonly profileHash?: string } },
+  ): boolean {
+    if (!existingMarker) {
+      return false;
+    }
+    if (existingMarker.coreHash !== plan.destination.coreHash) {
+      return false;
+    }
+    if (
+      plan.destination.profileHash &&
existingMarker.profileHash !== plan.destination.profileHash
+    ) {
+      return false;
+    }
+    return true;
+  }
+
+  private async upsertMarker(
+    driver: SqlMigrationRunnerExecuteOptions['driver'],
+    options: SqlMigrationRunnerExecuteOptions,
+    existingMarker: ContractMarkerRecord | null,
+  ): Promise<void> {
+    const writeStatements = buildWriteMarkerStatements({
+      coreHash: options.plan.destination.coreHash,
+      profileHash:
+        options.plan.destination.profileHash ??
+        options.destinationContract.profileHash ??
+        options.plan.destination.coreHash,
+      contractJson: options.destinationContract,
+      canonicalVersion: null,
+      meta: {},
+    });
+
+    const statement = existingMarker ? writeStatements.update : writeStatements.insert;
+    await this.executeStatement(driver, statement);
+  }
+
+  private async recordLedgerEntry(
+    driver: SqlMigrationRunnerExecuteOptions['driver'],
+    options: SqlMigrationRunnerExecuteOptions,
+    existingMarker: ContractMarkerRecord | null,
+    executedOperations: readonly SqlMigrationPlanOperation[],
+  ): Promise<void> {
+    const ledgerStatement = buildLedgerInsertStatement({
+      originCoreHash: existingMarker?.coreHash ?? null,
+      originProfileHash: existingMarker?.profileHash ?? null,
+      destinationCoreHash: options.plan.destination.coreHash,
+      destinationProfileHash:
+        options.plan.destination.profileHash ??
+        options.destinationContract.profileHash ??
+        options.plan.destination.coreHash,
+      contractJsonBefore: existingMarker?.contractJson ??
null,
+      contractJsonAfter: options.destinationContract,
+      operations: executedOperations,
+    });
+    await this.executeStatement(driver, ledgerStatement);
+  }
+
+  private async beginTransaction(
+    driver: SqlMigrationRunnerExecuteOptions['driver'],
+  ): Promise<void> {
+    await driver.query('BEGIN IMMEDIATE');
+  }
+
+  private async commitTransaction(
+    driver: SqlMigrationRunnerExecuteOptions['driver'],
+  ): Promise<void> {
+    await driver.query('COMMIT');
+  }
+
+  private async rollbackTransaction(
+    driver: SqlMigrationRunnerExecuteOptions['driver'],
+  ): Promise<void> {
+    await driver.query('ROLLBACK');
+  }
+
+  private async executeStatement(
+    driver: SqlMigrationRunnerExecuteOptions['driver'],
+    statement: SqlStatement,
+  ): Promise<void> {
+    if (statement.params.length > 0) {
+      await driver.query(statement.sql, statement.params);
+      return;
+    }
+    await driver.query(statement.sql);
+  }
+}
diff --git a/packages/3-targets/3-targets/sqlite/src/core/migrations/statement-builders.ts b/packages/3-targets/3-targets/sqlite/src/core/migrations/statement-builders.ts
new file mode 100644
index 0000000000..9bd5df3e1e
--- /dev/null
+++ b/packages/3-targets/3-targets/sqlite/src/core/migrations/statement-builders.ts
@@ -0,0 +1,145 @@
+export interface SqlStatement {
+  readonly sql: string;
+  readonly params: readonly unknown[];
+}
+
+/**
+ * Marker table per ADR 021 (SQLite uses a flat table name, no schemas).
+ */
+export const ensureMarkerTableStatement: SqlStatement = {
+  sql: `create table if not exists prisma_contract_marker (
+    id integer primary key,
+    core_hash text not null,
+    profile_hash text not null,
+    contract_json text,
+    canonical_version integer,
+    updated_at text not null default (CURRENT_TIMESTAMP),
+    app_tag text,
+    meta text not null default '{}'
+  )`,
+  params: [],
+};
+
+/**
+ * Minimal ledger table for audit/debug (SQLite flavor).
+ */
+export const ensureLedgerTableStatement: SqlStatement = {
+  sql: `create table if not exists prisma_contract_ledger (
+    id integer primary key autoincrement,
+    created_at text not null default (CURRENT_TIMESTAMP),
+    origin_core_hash text,
+    origin_profile_hash text,
+    destination_core_hash text not null,
+    destination_profile_hash text,
+    contract_json_before text,
+    contract_json_after text,
+    operations text not null
+  )`,
+  params: [],
+};
+
+export interface WriteMarkerInput {
+  readonly coreHash: string;
+  readonly profileHash: string;
+  readonly contractJson?: unknown;
+  readonly canonicalVersion?: number | null;
+  readonly appTag?: string | null;
+  readonly meta?: Record<string, unknown>;
+}
+
+export function buildWriteMarkerStatements(input: WriteMarkerInput): {
+  readonly insert: SqlStatement;
+  readonly update: SqlStatement;
+} {
+  const params: readonly unknown[] = [
+    1,
+    input.coreHash,
+    input.profileHash,
+    jsonParam(input.contractJson),
+    input.canonicalVersion ?? null,
+    input.appTag ?? null,
+    jsonParam(input.meta ??
{}), + ]; + + return { + insert: { + sql: `insert into prisma_contract_marker ( + id, + core_hash, + profile_hash, + contract_json, + canonical_version, + updated_at, + app_tag, + meta + ) values ( + ?1, + ?2, + ?3, + ?4, + ?5, + CURRENT_TIMESTAMP, + ?6, + ?7 + )`, + params, + }, + update: { + sql: `update prisma_contract_marker set + core_hash = ?2, + profile_hash = ?3, + contract_json = ?4, + canonical_version = ?5, + updated_at = CURRENT_TIMESTAMP, + app_tag = ?6, + meta = ?7 + where id = ?1`, + params, + }, + }; +} + +export interface LedgerInsertInput { + readonly originCoreHash?: string | null; + readonly originProfileHash?: string | null; + readonly destinationCoreHash: string; + readonly destinationProfileHash?: string | null; + readonly contractJsonBefore?: unknown; + readonly contractJsonAfter?: unknown; + readonly operations: unknown; +} + +export function buildLedgerInsertStatement(input: LedgerInsertInput): SqlStatement { + return { + sql: `insert into prisma_contract_ledger ( + origin_core_hash, + origin_profile_hash, + destination_core_hash, + destination_profile_hash, + contract_json_before, + contract_json_after, + operations + ) values ( + ?1, + ?2, + ?3, + ?4, + ?5, + ?6, + ?7 + )`, + params: [ + input.originCoreHash ?? null, + input.originProfileHash ?? null, + input.destinationCoreHash, + input.destinationProfileHash ?? null, + jsonParam(input.contractJsonBefore), + jsonParam(input.contractJsonAfter), + jsonParam(input.operations), + ], + }; +} + +function jsonParam(value: unknown): string { + return JSON.stringify(value ?? 
null); +} diff --git a/packages/3-targets/3-targets/sqlite/src/core/types.ts b/packages/3-targets/3-targets/sqlite/src/core/types.ts new file mode 100644 index 0000000000..5ea40846f4 --- /dev/null +++ b/packages/3-targets/3-targets/sqlite/src/core/types.ts @@ -0,0 +1,3 @@ +import type { ColumnDefault } from '@prisma-next/contract/types'; + +export type SqliteColumnDefault = ColumnDefault; diff --git a/packages/3-targets/3-targets/sqlite/src/exports/control.ts b/packages/3-targets/3-targets/sqlite/src/exports/control.ts new file mode 100644 index 0000000000..bf623d0820 --- /dev/null +++ b/packages/3-targets/3-targets/sqlite/src/exports/control.ts @@ -0,0 +1,55 @@ +import type { + ControlTargetInstance, + MigrationPlanner, + MigrationRunner, +} from '@prisma-next/core-control-plane/types'; +import type { + SqlControlFamilyInstance, + SqlControlTargetDescriptor, +} from '@prisma-next/family-sql/control'; +import { sqliteTargetDescriptorMeta } from '../core/descriptor-meta'; +import type { SqlitePlanTargetDetails } from '../core/migrations/planner'; +import { createSqliteMigrationPlanner } from '../core/migrations/planner'; +import { createSqliteMigrationRunner } from '../core/migrations/runner'; + +/** + * SQLite target descriptor for CLI config. + */ +const sqliteTargetDescriptor: SqlControlTargetDescriptor<'sqlite', SqlitePlanTargetDetails> = { + ...sqliteTargetDescriptorMeta, + /** + * Migrations capability for CLI to access planner/runner via core types. + * The SQL-specific planner/runner types are compatible with the generic + * MigrationPlanner/MigrationRunner interfaces at runtime. 
+ */ + migrations: { + createPlanner(_family: SqlControlFamilyInstance) { + return createSqliteMigrationPlanner() as MigrationPlanner<'sql', 'sqlite'>; + }, + createRunner(family) { + return createSqliteMigrationRunner(family) as MigrationRunner<'sql', 'sqlite'>; + }, + }, + create(): ControlTargetInstance<'sql', 'sqlite'> { + return { + familyId: 'sql', + targetId: 'sqlite', + }; + }, + /** + * Direct method for SQL-specific usage. + * @deprecated Use migrations.createPlanner() for CLI compatibility. + */ + createPlanner(_family: SqlControlFamilyInstance) { + return createSqliteMigrationPlanner(); + }, + /** + * Direct method for SQL-specific usage. + * @deprecated Use migrations.createRunner() for CLI compatibility. + */ + createRunner(family) { + return createSqliteMigrationRunner(family); + }, +}; + +export default sqliteTargetDescriptor; diff --git a/packages/3-targets/3-targets/sqlite/src/exports/pack.ts b/packages/3-targets/3-targets/sqlite/src/exports/pack.ts new file mode 100644 index 0000000000..7c53548d0b --- /dev/null +++ b/packages/3-targets/3-targets/sqlite/src/exports/pack.ts @@ -0,0 +1,6 @@ +import type { TargetPackRef } from '@prisma-next/contract/framework-components'; +import { sqliteTargetDescriptorMeta } from '../core/descriptor-meta'; + +const sqlitePack: TargetPackRef<'sql', 'sqlite'> = sqliteTargetDescriptorMeta; + +export default sqlitePack; diff --git a/packages/3-targets/3-targets/sqlite/src/exports/runtime.ts b/packages/3-targets/3-targets/sqlite/src/exports/runtime.ts new file mode 100644 index 0000000000..601a5f7c9b --- /dev/null +++ b/packages/3-targets/3-targets/sqlite/src/exports/runtime.ts @@ -0,0 +1,29 @@ +import type { + RuntimeTargetDescriptor, + RuntimeTargetInstance, +} from '@prisma-next/core-execution-plane/types'; +import { sqliteTargetDescriptorMeta } from '../core/descriptor-meta'; + +/** + * SQLite runtime target instance interface. 
+ */ +export interface SqliteRuntimeTargetInstance extends RuntimeTargetInstance<'sql', 'sqlite'> {} + +/** + * SQLite target descriptor for runtime plane. + */ +const sqliteRuntimeTargetDescriptor: RuntimeTargetDescriptor< + 'sql', + 'sqlite', + SqliteRuntimeTargetInstance +> = { + ...sqliteTargetDescriptorMeta, + create(): SqliteRuntimeTargetInstance { + return { + familyId: 'sql', + targetId: 'sqlite', + }; + }, +}; + +export default sqliteRuntimeTargetDescriptor; diff --git a/packages/3-targets/3-targets/sqlite/tsconfig.build.json b/packages/3-targets/3-targets/sqlite/tsconfig.build.json new file mode 100644 index 0000000000..671541c1a3 --- /dev/null +++ b/packages/3-targets/3-targets/sqlite/tsconfig.build.json @@ -0,0 +1,12 @@ +{ + "extends": "./tsconfig.json", + "compilerOptions": { + "rootDir": "src", + "outDir": "dist", + "declaration": true, + "declarationMap": true, + "emitDeclarationOnly": true + }, + "include": ["src/**/*.ts"], + "exclude": ["test", "dist"] +} diff --git a/packages/3-targets/3-targets/sqlite/tsconfig.json b/packages/3-targets/3-targets/sqlite/tsconfig.json new file mode 100644 index 0000000000..0ede0e1950 --- /dev/null +++ b/packages/3-targets/3-targets/sqlite/tsconfig.json @@ -0,0 +1,9 @@ +{ + "extends": ["@prisma-next/tsconfig/base"], + "compilerOptions": { + "rootDir": ".", + "outDir": "./dist" + }, + "include": ["src/**/*", "test/**/*"], + "exclude": ["node_modules", "dist"] +} diff --git a/packages/3-targets/3-targets/sqlite/tsup.config.ts b/packages/3-targets/3-targets/sqlite/tsup.config.ts new file mode 100644 index 0000000000..0ed9dc1dbf --- /dev/null +++ b/packages/3-targets/3-targets/sqlite/tsup.config.ts @@ -0,0 +1,16 @@ +import { defineConfig } from 'tsup'; + +export default defineConfig({ + entry: { + 'exports/control': 'src/exports/control.ts', + 'exports/runtime': 'src/exports/runtime.ts', + 'exports/pack': 'src/exports/pack.ts', + }, + outDir: 'dist', + format: ['esm'], + sourcemap: true, + dts: false, + clean: false, + 
target: 'es2022', + minify: false, +}); diff --git a/packages/3-targets/3-targets/sqlite/vitest.config.ts b/packages/3-targets/3-targets/sqlite/vitest.config.ts new file mode 100644 index 0000000000..9f4d0c3501 --- /dev/null +++ b/packages/3-targets/3-targets/sqlite/vitest.config.ts @@ -0,0 +1,29 @@ +import { defineConfig } from 'vitest/config'; + +export default defineConfig({ + test: { + environment: 'node', + coverage: { + provider: 'v8', + include: ['src/**/*.{ts,tsx,js,jsx}'], + exclude: [ + 'dist/**', + 'test/**', + '**/*.test.ts', + '**/*.test-d.ts', + '**/*.spec.ts', + '**/*.spec.tsx', + '**/*.d.ts', + '**/*.config.ts', + '**/exports/**', + ], + reporter: ['text', 'html'], + thresholds: { + lines: 82, + branches: 67, + functions: 94, + statements: 82, + }, + }, + }, +}); diff --git a/packages/3-targets/6-adapters/postgres/src/core/adapter.ts b/packages/3-targets/6-adapters/postgres/src/core/adapter.ts index 00e5e3ed13..f07a5158b4 100644 --- a/packages/3-targets/6-adapters/postgres/src/core/adapter.ts +++ b/packages/3-targets/6-adapters/postgres/src/core/adapter.ts @@ -250,9 +250,10 @@ function renderOperation(expr: OperationExpr, contract?: PostgresContract): stri }); let result = expr.lowering.template; - result = result.replace(/\$\{self\}/g, self); + // Support both runtime `${self}` templates and manifest-safe `{{self}}` templates. + result = result.replace(/\$\{self\}|\{\{self\}\}/g, self); for (let i = 0; i < args.length; i++) { - result = result.replace(new RegExp(`\\$\\{arg${i}\\}`, 'g'), args[i] ?? ''); + result = result.replace(new RegExp(`\\$\\{arg${i}\\}|\\{\\{arg${i}\\}\\}`, 'g'), args[i] ?? 
''); } if (expr.lowering.strategy === 'function') { diff --git a/packages/3-targets/6-adapters/sqlite/README.md b/packages/3-targets/6-adapters/sqlite/README.md new file mode 100644 index 0000000000..35bdb909d3 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/README.md @@ -0,0 +1,256 @@ +# @prisma-next/adapter-sqlite + +SQLite adapter for Prisma Next. + +## Package Classification + +- **Domain**: targets +- **Layer**: adapters +- **Plane**: multi-plane (shared, migration, runtime) + +## Overview + +The SQLite adapter implements the adapter SPI for SQLite databases. It provides SQL lowering, capability discovery, codec definitions, control-plane introspection with default normalization, and error mapping for SQLite-specific behavior. It also exports both control-plane and runtime-plane adapter descriptors for config wiring. + +## Purpose + +Provide a SQLite-specific adapter implementation, codecs, and capabilities. Enable SQLite dialect support in Prisma Next through the adapter SPI. + +## Responsibilities + +- **Adapter Implementation**: Implement `Adapter` SPI for SQLite + - Lower SQL ASTs to SQLite dialect SQL + - Render `includeMany` as correlated subqueries with `json_group_array`/`json_object` for nested array includes + - Advertise SQLite capabilities (`jsonAgg`, `returning`, `json1`) + - Map SQLite errors to `RuntimeError` envelope +- **Codec Definitions**: Define SQLite codecs for type conversion + - Wire format to JavaScript type decoding + - JavaScript type to wire format encoding +- **Control-Plane Introspection**: Introspect schemas via `sqlite_master` and PRAGMA queries, and normalize raw column defaults +- **Codec Types**: Export TypeScript types for SQLite codecs +- **Descriptors**: Provide adapter descriptors declaring capabilities and codec type imports + +**Non-goals:** +- Transport/pooling management (drivers) +- Query compilation (sql-query) +- Runtime execution (runtime) + +## Architecture + +This package spans multiple planes: + +- **Shared plane** 
(`src/core/**`): Core adapter implementation, codecs, and types that can be imported by both migration and runtime planes +- **Migration plane** (`src/exports/control.ts`): Control-plane entry point that exports the adapter descriptor for config files +- **Runtime plane** (`src/exports/runtime.ts`): Runtime-plane entry point that exports the runtime adapter descriptor + +```mermaid +flowchart TD + subgraph "Runtime" + RT[Runtime] + PLAN[Plan] + end + + subgraph "SQLite Adapter" + ADAPTER[Adapter] + LOWERER[Lowerer] + CODECS[Codecs] + CAPS[Capabilities] + end + + subgraph "SQLite Driver" + DRIVER[Driver] + SQLITE[(SQLite)] + end + + subgraph "Descriptors" + CONTROL[Control Descriptor] + RUNTIME_DESC[Runtime Descriptor] + CODECTYPES[Codec Types] + end + + RT --> PLAN + PLAN --> ADAPTER + ADAPTER --> LOWERER + ADAPTER --> CODECS + ADAPTER --> CAPS + ADAPTER --> DRIVER + DRIVER --> SQLITE + CONTROL --> RT + RUNTIME_DESC --> RT + CODECTYPES --> RT + CODECS --> CODECTYPES +``` + +## Components + +### Core (`src/core/`) + +**Adapter (`adapter.ts`)** +- Main adapter implementation +- Lowers SQL ASTs to SQLite SQL +- Renders joins with ON conditions (RIGHT/FULL OUTER joins require SQLite 3.39+) +- Renders `includeMany` as correlated subqueries with `json_group_array(json_object(...))` for nested array includes +- Renders DML operations (INSERT, UPDATE, DELETE) with RETURNING clauses +- Advertises SQLite capabilities (`jsonAgg`, `returning`, `json1`) +- Maps SQLite errors to `RuntimeError` + +**Codecs (`codecs.ts`)** +- SQLite codec definitions +- Type conversion between wire format and JavaScript +- Supports SQLite scalars: `text`, `int`, `real`, `datetime`, `bool` (stored natively as `text`, `integer`, `real`, `text`, `integer`) + +**Control Adapter (`control-adapter.ts`)** +- Control-plane introspection via `sqlite_master` plus `pragma table_info`, `pragma index_list`, `pragma index_info`, and `pragma foreign_key_list` + +**Default Normalizer (`default-normalizer.ts`)** +- Parses raw SQLite default expressions into normalized `ColumnDefault` values for semantic comparison against contract defaults + +**Types (`types.ts`)** +- SQLite-specific types and utilities +- Re-exports SQL contract types + +### Exports (`src/exports/`) + +**Control Entry Point (`control.ts`)** +- Exports the control-plane adapter descriptor for CLI config +- Used by `prisma-next.config.ts` to declare the 
adapter + +**Runtime Entry Point (`runtime.ts`)** +- Exports the runtime-plane adapter descriptor + +**Adapter Export (`adapter.ts`)** +- Re-exports `createSqliteAdapter` from core + +**Codec Types Export (`codec-types.ts`)** +- Exports TypeScript type definitions for SQLite codecs +- Used in `contract.d.ts` generation + +**Types Export (`types.ts`)** +- Re-exports SQLite-specific types + +**Column Types Export (`column-types.ts`)** +- Exports column descriptors for built-in SQLite types + +## Dependencies + +- **`@prisma-next/sql-contract`**: SQL contract types +- **`@prisma-next/sql-relational-core`**: SQL AST types and codec registry +- **`@prisma-next/cli`**: CLI config types and extension pack manifest types + +## Related Subsystems + +- **[Adapters & Targets](../../../../docs/architecture%20docs/subsystems/5.%20Adapters%20&%20Targets.md)**: Detailed adapter specification +- **[Ecosystem Extensions & Packs](../../../../docs/architecture%20docs/subsystems/6.%20Ecosystem%20Extensions%20&%20Packs.md)**: Extension pack model + +## Related ADRs + +- [ADR 005 - Thin Core Fat Targets](../../../../docs/architecture%20docs/adrs/ADR%20005%20-%20Thin%20Core%20Fat%20Targets.md) +- [ADR 016 - Adapter SPI for Lowering](../../../../docs/architecture%20docs/adrs/ADR%20016%20-%20Adapter%20SPI%20for%20Lowering.md) +- [ADR 030 - Result decoding & codecs registry](../../../../docs/architecture%20docs/adrs/ADR%20030%20-%20Result%20decoding%20&%20codecs%20registry.md) +- [ADR 065 - Adapter capability schema & negotiation v1](../../../../docs/architecture%20docs/adrs/ADR%20065%20-%20Adapter%20capability%20schema%20&%20negotiation%20v1.md) +- [ADR 068 - Error mapping to RuntimeError](../../../../docs/architecture%20docs/adrs/ADR%20068%20-%20Error%20mapping%20to%20RuntimeError.md) +- [ADR 112 - Target Extension Packs](../../../../docs/architecture%20docs/adrs/ADR%20112%20-%20Target%20Extension%20Packs.md) +- [ADR 114 
- Extension codecs & branded types](../../../../docs/architecture%20docs/adrs/ADR%20114%20-%20Extension%20codecs%20&%20branded%20types.md) + +## Usage + +### Runtime + +```typescript +import { createSqliteAdapter } from '@prisma-next/adapter-sqlite/adapter'; +import { createRuntime } from '@prisma-next/sql-runtime'; + +const runtime = createRuntime({ + contract, + adapter: createSqliteAdapter(), + driver: sqliteDriver, +}); +``` + +### CLI Config + +```typescript +import sqliteAdapter from '@prisma-next/adapter-sqlite/control'; + +export default defineConfig({ + family: sql, + target: sqlite, + adapter: sqliteAdapter, + extensions: [], +}); +``` + +## Capabilities + +The adapter declares the following SQLite capabilities: + +- **`orderBy: true`** - Supports ORDER BY clauses +- **`limit: true`** - Supports LIMIT clauses +- **`lateral: true`** - Legacy gate for `includeMany`; SQLite has no LATERAL joins, so the adapter implements `includeMany` via correlated subqueries instead +- **`jsonAgg: true`** - Supports JSON aggregation (`json_group_array`) for `includeMany` +- **`returning: true`** - Supports RETURNING clauses for DML operations (requires SQLite 3.35.0+) +- **`json1: true`** - JSON1 functions (`json_object`, `json_group_array`) are available +- **`sql.enums: false`** - Contract-defined enum storage types are not supported + +**Important**: Capabilities must be declared in **both** places: + +1. **Adapter descriptor** (`src/exports/control.ts` and `src/exports/runtime.ts`): Capabilities are read during emission and included in the contract +2. **Runtime adapter code** (`src/core/adapter.ts`): The `defaultCapabilities` constant is used at runtime via `adapter.profile.capabilities` + +The capabilities on the descriptor must match the capabilities in code. If they don't match, emitted contracts and runtime capability checks will diverge. + +See `docs/reference/capabilities.md` and `docs/architecture docs/subsystems/5. Adapters & Targets.md` for details. 
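The match requirement can be enforced mechanically with a unit test. A minimal sketch (the capability literals mirror this adapter's `defaultCapabilities`; the `sameCapabilities` helper is illustrative, not part of the package):

```typescript
// Capabilities as declared in runtime adapter code (src/core/adapter.ts).
const runtimeCapabilities = {
  sqlite: { orderBy: true, limit: true, lateral: true, jsonAgg: true, returning: true, json1: true },
  sql: { enums: false },
};

// Capabilities as declared on the descriptor (src/exports/control.ts and runtime.ts).
const descriptorCapabilities = {
  sqlite: { orderBy: true, limit: true, lateral: true, jsonAgg: true, returning: true, json1: true },
  sql: { enums: false },
};

// Structural equality; key order is irrelevant, so sort keys before comparing.
function sameCapabilities(a: unknown, b: unknown): boolean {
  if (typeof a !== 'object' || a === null || typeof b !== 'object' || b === null) {
    return a === b;
  }
  const aKeys = Object.keys(a as object).sort();
  const bKeys = Object.keys(b as object).sort();
  if (aKeys.length !== bKeys.length || aKeys.some((key, i) => key !== bKeys[i])) {
    return false;
  }
  return aKeys.every((key) =>
    sameCapabilities((a as Record<string, unknown>)[key], (b as Record<string, unknown>)[key]),
  );
}
```

Running a check like this in CI turns descriptor/runtime drift into a test failure instead of a contract-vs-runtime mismatch discovered later.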
+ +## includeMany Support + +The adapter supports `includeMany` for nested array includes using correlated subqueries with SQLite's JSON1 functions (`json_group_array`, `json_object`): + +**Lowering Strategy:** +- Renders each `includeMany` as a correlated scalar subquery in the projection: an inner SELECT projects the child columns with aliases, then `json_group_array(json_object(...))` aggregates the rows into a JSON array +- The ON condition from the include is moved into the WHERE clause of the inner SELECT, combined with any child `where` predicate +- `ORDER BY` and `LIMIT` on the include are applied inside the inner SELECT, so aggregation order stays deterministic +- Wraps the aggregate in `coalesce(..., '[]')` so `decodeRow()` always sees a JSON array, even when there are no child rows + +**Capabilities Required:** +- `jsonAgg: true` - Enables `json_group_array` support +- `json1: true` - Enables JSON1 function support + +**Example SQL Output:** +```sql +SELECT "user"."id" AS "id", + coalesce((SELECT json_group_array(json_object('id', sub."id", 'title', sub."title")) + FROM (SELECT "post"."id" AS "id", "post"."title" AS "title" + FROM "post" + WHERE "user"."id" = "post"."userId") sub), '[]') AS "posts" +FROM "user" +``` + +## DML Operations with RETURNING + +The adapter supports RETURNING clauses for DML operations (INSERT, UPDATE, DELETE), allowing you to return affected rows: + +**Lowering Strategy:** +- Renders `RETURNING` clause after INSERT, UPDATE, or DELETE statements +- Returns specified columns from affected rows +- Supports returning multiple columns + +**Capability Required:** +- `returning: true` - Enables RETURNING clause support (SQLite 3.35.0+) + +**Example SQL Output:** +```sql +-- INSERT with RETURNING +INSERT INTO "user" ("email", "createdAt") VALUES (?1, ?2) RETURNING "user"."id", "user"."email" + +-- UPDATE with RETURNING +UPDATE "user" SET "email" = ?1 WHERE "user"."id" = ?2 RETURNING "user"."id", "user"."email" + +-- DELETE with RETURNING +DELETE FROM "user" WHERE "user"."id" = ?1 RETURNING "user"."id", "user"."email" +``` + +**Note:** MySQL does not 
support RETURNING clauses. A future MySQL adapter would declare `returning: false` and either reject plans with RETURNING or provide an alternative implementation. + +## Exports + +- `./adapter`: Adapter implementation (`createSqliteAdapter`) +- `./codec-types`: SQLite codec types (`CodecTypes`, `dataTypes`) +- `./types`: SQLite-specific types +- `./column-types`: Column descriptors for built-in SQLite types +- `./control`: Control-plane entry point (adapter descriptor) +- `./runtime`: Runtime-plane entry point (runtime adapter descriptor) + diff --git a/packages/3-targets/6-adapters/sqlite/biome.jsonc b/packages/3-targets/6-adapters/sqlite/biome.jsonc new file mode 100644 index 0000000000..b8994a7330 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/biome.jsonc @@ -0,0 +1,4 @@ +{ + "$schema": "https://biomejs.dev/schemas/2.3.11/schema.json", + "extends": "//" +} diff --git a/packages/3-targets/6-adapters/sqlite/package.json b/packages/3-targets/6-adapters/sqlite/package.json new file mode 100644 index 0000000000..ec4fd7afb9 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/package.json @@ -0,0 +1,69 @@ +{ + "name": "@prisma-next/adapter-sqlite", + "version": "0.0.1", + "type": "module", + "sideEffects": false, + "files": [ + "dist", + "src" + ], + "scripts": { + "build": "tsup --config tsup.config.ts && tsc --project tsconfig.build.json", + "test": "vitest run", + "test:coverage": "vitest run --coverage", + "typecheck": "tsc --project tsconfig.json --noEmit", + "lint": "biome check . 
--error-on-warnings", + "lint:fix": "biome check --write .", + "lint:fix:unsafe": "biome check --write --unsafe .", + "clean": "rm -rf dist dist-tsc dist-tsc-prod coverage .tmp-output" + }, + "dependencies": { + "@prisma-next/cli": "workspace:*", + "@prisma-next/contract": "workspace:*", + "@prisma-next/contract-authoring": "workspace:*", + "@prisma-next/core-control-plane": "workspace:*", + "@prisma-next/core-execution-plane": "workspace:*", + "@prisma-next/family-sql": "workspace:*", + "@prisma-next/sql-contract": "workspace:*", + "@prisma-next/sql-contract-ts": "workspace:*", + "@prisma-next/sql-operations": "workspace:*", + "@prisma-next/sql-relational-core": "workspace:*", + "@prisma-next/sql-schema-ir": "workspace:*", + "@prisma-next/utils": "workspace:*", + "arktype": "^2.0.0" + }, + "devDependencies": { + "@prisma-next/test-utils": "workspace:*", + "@prisma-next/tsconfig": "workspace:*", + "tsup": "catalog:", + "typescript": "catalog:", + "vitest": "catalog:" + }, + "exports": { + "./package.json": "./package.json", + "./adapter": { + "types": "./dist/exports/adapter.d.ts", + "import": "./dist/exports/adapter.js" + }, + "./types": { + "types": "./dist/exports/types.d.ts", + "import": "./dist/exports/types.js" + }, + "./codec-types": { + "types": "./dist/exports/codec-types.d.ts", + "import": "./dist/exports/codec-types.js" + }, + "./column-types": { + "types": "./dist/exports/column-types.d.ts", + "import": "./dist/exports/column-types.js" + }, + "./control": { + "types": "./dist/exports/control.d.ts", + "import": "./dist/exports/control.js" + }, + "./runtime": { + "types": "./dist/exports/runtime.d.ts", + "import": "./dist/exports/runtime.js" + } + } +} diff --git a/packages/3-targets/6-adapters/sqlite/src/core/adapter.ts b/packages/3-targets/6-adapters/sqlite/src/core/adapter.ts new file mode 100644 index 0000000000..e1565d41ec --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/src/core/adapter.ts @@ -0,0 +1,392 @@ +import type { + Adapter, + 
AdapterProfile, + BinaryExpr, + ColumnRef, + DeleteAst, + IncludeAst, + IncludeRef, + InsertAst, + JoinAst, + LiteralExpr, + LowererContext, + NullCheckExpr, + OperationExpr, + ParamRef, + QueryAst, + SelectAst, + UpdateAst, + WhereExpr, +} from '@prisma-next/sql-relational-core/ast'; +import { createCodecRegistry, isOperationExpr } from '@prisma-next/sql-relational-core/ast'; +import { codecDefinitions } from './codecs'; +import type { SqliteAdapterOptions, SqliteContract, SqliteLoweredStatement } from './types'; + +const defaultCapabilities = Object.freeze({ + sqlite: { + orderBy: true, + limit: true, + // Used today to gate includeMany(). SQLite implements includeMany via correlated subqueries. + lateral: true, + jsonAgg: true, + returning: true, + json1: true, + }, + sql: { + enums: false, + }, +}); + +class SqliteAdapterImpl implements Adapter { + readonly familyId = 'sql' as const; + readonly targetId = 'sqlite' as const; + + readonly profile: AdapterProfile<'sqlite'>; + private readonly codecRegistry = (() => { + const registry = createCodecRegistry(); + for (const definition of Object.values(codecDefinitions)) { + registry.register(definition.codec); + } + return registry; + })(); + + constructor(options?: SqliteAdapterOptions) { + this.profile = Object.freeze({ + id: options?.profileId ?? 'sqlite/default@1', + target: 'sqlite', + capabilities: defaultCapabilities, + codecs: () => this.codecRegistry, + }); + } + + lower(ast: QueryAst, context: LowererContext) { + let sql: string; + const params = context.params ? 
[...context.params] : []; + + if (ast.kind === 'select') { + sql = renderSelect(ast, context.contract); + } else if (ast.kind === 'insert') { + sql = renderInsert(ast); + } else if (ast.kind === 'update') { + sql = renderUpdate(ast); + } else if (ast.kind === 'delete') { + sql = renderDelete(ast); + } else { + throw new Error(`Unsupported AST kind: ${(ast as { kind: string }).kind}`); + } + + return Object.freeze({ + profileId: this.profile.id, + body: Object.freeze({ sql, params }), + }); + } + + /** + * Adapter-owned marker reader statement (ADR 021). + * + * The SQL family runtime should prefer this over hardcoded Postgres marker SQL. + */ + markerReaderStatement(): { readonly sql: string; readonly params: readonly unknown[] } { + return { + sql: `select + core_hash, + profile_hash, + contract_json, + canonical_version, + updated_at, + app_tag, + meta + from prisma_contract_marker + where id = ?1`, + params: [1], + }; + } +} + +function renderSelect(ast: SelectAst, contract?: SqliteContract): string { + const selectClause = `SELECT ${renderProjection(ast, contract)}`; + const fromClause = `FROM ${quoteIdentifier(ast.from.name)}`; + + const joinsClause = ast.joins?.length + ? ast.joins.map((join) => renderJoin(join, contract)).join(' ') + : ''; + + const whereClause = ast.where ? ` WHERE ${renderWhere(ast.where, contract)}` : ''; + const orderClause = ast.orderBy?.length + ? ` ORDER BY ${ast.orderBy + .map((order) => { + const expr = renderExpr(order.expr as ColumnRef | OperationExpr, contract); + return `${expr} ${order.dir.toUpperCase()}`; + }) + .join(', ')}` + : ''; + const limitClause = typeof ast.limit === 'number' ? ` LIMIT ${ast.limit}` : ''; + + const clauses = [joinsClause].filter(Boolean).join(' '); + return `${selectClause} ${fromClause}${clauses ? 
` ${clauses}` : ''}${whereClause}${orderClause}${limitClause}`.trim(); +} + +function renderProjection(ast: SelectAst, contract?: SqliteContract): string { + const includesByAlias = new Map(); + for (const include of ast.includes ?? []) { + includesByAlias.set(include.alias, include); + } + + return ast.project + .map((item) => { + const expr = item.expr as ColumnRef | IncludeRef | OperationExpr | LiteralExpr; + + if (expr.kind === 'includeRef') { + const include = includesByAlias.get(expr.alias); + if (!include) { + throw new Error(`Missing include definition for alias '${expr.alias}'`); + } + + const includeExpr = renderIncludeProjection(include, contract); + return `${includeExpr} AS ${quoteIdentifier(item.alias)}`; + } + + if (expr.kind === 'operation') { + const operation = renderOperation(expr, contract); + return `${operation} AS ${quoteIdentifier(item.alias)}`; + } + + if (expr.kind === 'literal') { + const literal = renderLiteral(expr); + return `${literal} AS ${quoteIdentifier(item.alias)}`; + } + + const column = renderColumn(expr as ColumnRef); + return `${column} AS ${quoteIdentifier(item.alias)}`; + }) + .join(', '); +} + +function renderIncludeProjection(include: IncludeAst, contract?: SqliteContract): string { + const child = include.child; + const childTable = quoteIdentifier(child.table.name); + + // Build WHERE: ON predicate + optional child where + const onCondition = renderJoinOn(child.on); + let whereClause = ` WHERE ${onCondition}`; + if (child.where) { + whereClause += ` AND ${renderWhere(child.where, contract)}`; + } + + const innerColumns = child.project + .map( + (item) => + `${renderExpr(item.expr as ColumnRef | OperationExpr, contract)} AS ${quoteIdentifier(item.alias)}`, + ) + .join(', '); + + const innerOrderBy = child.orderBy?.length + ? 
` ORDER BY ${child.orderBy + .map((order) => { + const expr = renderExpr(order.expr as ColumnRef | OperationExpr, contract); + return `${expr} ${order.dir.toUpperCase()}`; + }) + .join(', ')}` + : ''; + + const innerLimit = typeof child.limit === 'number' ? ` LIMIT ${child.limit}` : ''; + + const innerSelect = `SELECT ${innerColumns} FROM ${childTable}${whereClause}${innerOrderBy}${innerLimit}`; + + const jsonObjectArgs = child.project + .map((item) => `'${item.alias}', sub.${quoteIdentifier(item.alias)}`) + .join(', '); + + // Always wrap to make ORDER BY/LIMIT deterministic for aggregation. + const aggregate = `SELECT json_group_array(json_object(${jsonObjectArgs})) FROM (${innerSelect}) sub`; + + // Ensure decodeRow() sees a JSON array even when there are no children. + return `coalesce((${aggregate}), '[]')`; +} + +function renderWhere(expr: WhereExpr, contract?: SqliteContract): string { + if (expr.kind === 'exists') { + const notKeyword = expr.not ? 'NOT ' : ''; + const subquery = renderSelect(expr.subquery, contract); + return `${notKeyword}EXISTS (${subquery})`; + } + if (expr.kind === 'nullCheck') { + return renderNullCheck(expr, contract); + } + return renderBinary(expr, contract); +} + +function renderNullCheck(expr: NullCheckExpr, contract?: SqliteContract): string { + const rendered = renderExpr(expr.expr as ColumnRef | OperationExpr, contract); + const renderedExpr = isOperationExpr(expr.expr) ? `(${rendered})` : rendered; + return expr.isNull ? `${renderedExpr} IS NULL` : `${renderedExpr} IS NOT NULL`; +} + +function renderBinary(expr: BinaryExpr, contract?: SqliteContract): string { + const leftExpr = expr.left as ColumnRef | OperationExpr; + const left = renderExpr(leftExpr, contract); + const rightExpr = expr.right as ParamRef | ColumnRef; + const right = + rightExpr.kind === 'col' ? renderColumn(rightExpr) : renderParam(rightExpr as ParamRef); + const leftRendered = isOperationExpr(leftExpr) ? 
`(${left})` : left; + + const operatorMap: Record<string, string> = { + eq: '=', + neq: '!=', + gt: '>', + lt: '<', + gte: '>=', + lte: '<=', + }; + + return `${leftRendered} ${operatorMap[expr.op]} ${right}`; +} + +function renderColumn(ref: ColumnRef): string { + return `${quoteIdentifier(ref.table)}.${quoteIdentifier(ref.column)}`; +} + +function renderExpr(expr: ColumnRef | OperationExpr, contract?: SqliteContract): string { + if (isOperationExpr(expr)) { + return renderOperation(expr, contract); + } + return renderColumn(expr); +} + +function renderParam(ref: ParamRef): string { + // Use numeric placeholders for stable ordering: ?1, ?2, ... + return `?${ref.index}`; +} + +function renderLiteral(expr: LiteralExpr): string { + if (typeof expr.value === 'string') { + return `'${expr.value.replace(/'/g, "''")}'`; + } + if (typeof expr.value === 'number' || typeof expr.value === 'boolean') { + return String(expr.value); + } + if (expr.value === null) { + return 'NULL'; + } + if (Array.isArray(expr.value)) { + // SQLite doesn't have ARRAY literals; fall back to JSON. + return `'${JSON.stringify(expr.value).replace(/'/g, "''")}'`; + } + return `'${JSON.stringify(expr.value).replace(/'/g, "''")}'`; +} + +function renderOperation(expr: OperationExpr, contract?: SqliteContract): string { + void contract; + const self = renderExpr(expr.self as ColumnRef | OperationExpr, contract); + const args = expr.args.map((arg) => { + if (arg.kind === 'col') { + return renderColumn(arg); + } + if (arg.kind === 'param') { + return renderParam(arg); + } + if (arg.kind === 'literal') { + return renderLiteral(arg); + } + if (arg.kind === 'operation') { + return renderOperation(arg, contract); + } + const _exhaustive: never = arg; + throw new Error(`Unsupported argument kind: ${(_exhaustive as { kind: string }).kind}`); + }); + + let result = expr.lowering.template; + // Support both runtime `${self}` templates and manifest-safe `{{self}}` templates. 
+ result = result.replace(/\$\{self\}|\{\{self\}\}/g, self); + for (let i = 0; i < args.length; i++) { + result = result.replace(new RegExp(`\\$\\{arg${i}\\}|\\{\\{arg${i}\\}\\}`, 'g'), args[i] ?? ''); + } + + return result; +} + +function renderJoin(join: JoinAst, contract?: SqliteContract): string { + void contract; + const joinType = join.joinType.toUpperCase(); + const table = quoteIdentifier(join.table.name); + const onClause = renderJoinOn(join.on); + return `${joinType} JOIN ${table} ON ${onClause}`; +} + +function renderJoinOn(on: JoinAst['on']): string { + if (on.kind === 'eqCol') { + const left = renderColumn(on.left); + const right = renderColumn(on.right); + return `${left} = ${right}`; + } + throw new Error(`Unsupported join ON expression kind: ${on.kind}`); +} + +function renderInsert(ast: InsertAst): string { + const table = quoteIdentifier(ast.table.name); + const columns = Object.keys(ast.values).map((col) => quoteIdentifier(col)); + const values = Object.values(ast.values).map((val) => { + if (val.kind === 'param') { + return renderParam(val); + } + if (val.kind === 'col') { + return `${quoteIdentifier(val.table)}.${quoteIdentifier(val.column)}`; + } + throw new Error(`Unsupported value kind in INSERT: ${(val as { kind: string }).kind}`); + }); + + const insertClause = `INSERT INTO ${table} (${columns.join(', ')}) VALUES (${values.join(', ')})`; + const returningClause = ast.returning?.length + ? 
` RETURNING ${ast.returning + .map((col) => `${quoteIdentifier(col.table)}.${quoteIdentifier(col.column)}`) + .join(', ')}` + : ''; + + return `${insertClause}${returningClause}`; +} + +function renderUpdate(ast: UpdateAst): string { + const table = quoteIdentifier(ast.table.name); + const setClauses = Object.entries(ast.set).map(([col, val]) => { + const column = quoteIdentifier(col); + let value: string; + if (val.kind === 'param') { + value = renderParam(val); + } else if (val.kind === 'col') { + value = `${quoteIdentifier(val.table)}.${quoteIdentifier(val.column)}`; + } else { + throw new Error(`Unsupported value kind in UPDATE: ${(val as { kind: string }).kind}`); + } + return `${column} = ${value}`; + }); + + const whereClause = ` WHERE ${renderWhere(ast.where, undefined)}`; + const returningClause = ast.returning?.length + ? ` RETURNING ${ast.returning + .map((col) => `${quoteIdentifier(col.table)}.${quoteIdentifier(col.column)}`) + .join(', ')}` + : ''; + + return `UPDATE ${table} SET ${setClauses.join(', ')}${whereClause}${returningClause}`; +} + +function renderDelete(ast: DeleteAst, contract?: SqliteContract): string { + void contract; + const table = quoteIdentifier(ast.table.name); + const whereClause = ` WHERE ${renderWhere(ast.where, contract)}`; + const returningClause = ast.returning?.length + ? 
` RETURNING ${ast.returning + .map((col) => `${quoteIdentifier(col.table)}.${quoteIdentifier(col.column)}`) + .join(', ')}` + : ''; + + return `DELETE FROM ${table}${whereClause}${returningClause}`; +} + +function quoteIdentifier(identifier: string): string { + return `"${identifier.replace(/"/g, '""')}"`; +} + +export function createSqliteAdapter(options?: SqliteAdapterOptions) { + return Object.freeze(new SqliteAdapterImpl(options)); +} diff --git a/packages/3-targets/6-adapters/sqlite/src/core/codecs.ts b/packages/3-targets/6-adapters/sqlite/src/core/codecs.ts new file mode 100644 index 0000000000..532f6a23bc --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/src/core/codecs.ts @@ -0,0 +1,104 @@ +/** + * Unified codec definitions for SQLite adapter. + * + * Single source of truth for: + * - Scalar names + * - Type IDs + * - Codec implementations (runtime) + * - Type information (compile-time) + */ + +import { codec, defineCodecs } from '@prisma-next/sql-relational-core/ast'; + +const sqliteTextCodec = codec<'sqlite/text@1', string, string>({ + typeId: 'sqlite/text@1', + targetTypes: ['text'], + encode: (value) => value, + decode: (wire) => String(wire), + meta: { + db: { + sql: { + sqlite: { + nativeType: 'text', + }, + }, + }, + }, +}); + +const sqliteIntCodec = codec<'sqlite/int@1', number, number>({ + typeId: 'sqlite/int@1', + targetTypes: ['int'], + encode: (value) => value, + decode: (wire) => (typeof wire === 'number' ? wire : Number(wire)), + meta: { + db: { + sql: { + sqlite: { + nativeType: 'integer', + }, + }, + }, + }, +}); + +const sqliteRealCodec = codec<'sqlite/real@1', number, number>({ + typeId: 'sqlite/real@1', + targetTypes: ['real'], + encode: (value) => value, + decode: (wire) => (typeof wire === 'number' ? 
wire : Number(wire)), + meta: { + db: { + sql: { + sqlite: { + nativeType: 'real', + }, + }, + }, + }, +}); + +const sqliteDatetimeCodec = codec<'sqlite/datetime@1', string, string | Date>({ + typeId: 'sqlite/datetime@1', + targetTypes: ['datetime'], + encode: (value) => (value instanceof Date ? value.toISOString() : String(value)), + decode: (wire) => String(wire), + meta: { + db: { + sql: { + sqlite: { + // SQLite doesn't enforce types; store datetimes as ISO-ish TEXT. + nativeType: 'text', + }, + }, + }, + }, +}); + +const sqliteBoolCodec = codec<'sqlite/bool@1', number, boolean>({ + typeId: 'sqlite/bool@1', + targetTypes: ['bool'], + encode: (value) => (value ? 1 : 0), + decode: (wire) => Boolean(wire), + meta: { + db: { + sql: { + sqlite: { + nativeType: 'integer', + }, + }, + }, + }, +}); + +const codecs = defineCodecs() + .add('text', sqliteTextCodec) + .add('int', sqliteIntCodec) + .add('real', sqliteRealCodec) + .add('datetime', sqliteDatetimeCodec) + .add('bool', sqliteBoolCodec); + +export const codecDefinitions = codecs.codecDefinitions; +export const dataTypes = codecs.dataTypes; + +export type CodecTypes = typeof codecs.CodecTypes; diff --git a/packages/3-targets/6-adapters/sqlite/src/core/control-adapter.ts b/packages/3-targets/6-adapters/sqlite/src/core/control-adapter.ts new file mode 100644 index 0000000000..787d7aa18e --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/src/core/control-adapter.ts @@ -0,0 +1,234 @@ +import type { ControlDriverInstance } from '@prisma-next/core-control-plane/types'; +import type { SqlControlAdapter } from '@prisma-next/family-sql/control-adapter'; +import type { + PrimaryKey, + SqlColumnIR, + SqlForeignKeyIR, + SqlIndexIR, + SqlSchemaIR, + SqlTableIR, + SqlUniqueIR, +} from '@prisma-next/sql-schema-ir/types'; +import { ifDefined } from '@prisma-next/utils/defined'; +import { parseSqliteDefault } from './default-normalizer'; + +type SqliteTableRow = { name: string }; + +type PragmaTableInfoRow = { + cid: 
number; + name: string; + type: string; + notnull: number; + dflt_value: string | null; + pk: number; +}; + +type PragmaIndexListRow = { + seq: number; + name: string; + unique: number; + origin: string; + partial: number; +}; + +type PragmaIndexInfoRow = { + seqno: number; + cid: number; + name: string; +}; + +type PragmaForeignKeyRow = { + id: number; + seq: number; + table: string; + from: string; + to: string; + on_update: string; + on_delete: string; + match: string; +}; + +/** + * SQLite control-plane adapter for operations like introspection. + */ +export class SqliteControlAdapter implements SqlControlAdapter<'sqlite'> { + readonly familyId = 'sql' as const; + readonly targetId = 'sqlite' as const; + /** + * @deprecated Use targetId instead + */ + readonly target = 'sqlite' as const; + + readonly normalizeDefault = parseSqliteDefault; + + async introspect( + driver: ControlDriverInstance<'sql', 'sqlite'>, + _contractIR?: unknown, + _schema?: string, + ): Promise<SqlSchemaIR> { + const tablesResult = await driver.query<SqliteTableRow>( + `select name + from sqlite_master + where type = 'table' + and name not like 'sqlite_%' + and name not like 'prisma_contract_%' + order by name`, + ); + + const tables: Record<string, SqlTableIR> = {}; + + for (const tableRow of tablesResult.rows) { + const tableName = tableRow.name; + + const columnsResult = await driver.query<PragmaTableInfoRow>( + `pragma table_info(${escapeSqliteStringLiteral(tableName)})`, + ); + + const columns: Record<string, SqlColumnIR> = {}; + + // Primary key columns: pk is 1..N for composite keys. + const pkColumns = columnsResult.rows + .filter((r) => r.pk > 0) + .sort((a, b) => a.pk - b.pk) + .map((r) => r.name); + + const primaryKey: PrimaryKey | undefined = + pkColumns.length > 0 ? { columns: pkColumns } : undefined; + + for (const colRow of columnsResult.rows) { + // Normalize to lower-case for consistent comparisons with contract nativeType, + // which is always emitted in lower-case (e.g., "text", "integer"). 
+ const nativeType = + colRow.type && colRow.type.length > 0 ? colRow.type.trim().toLowerCase() : 'text'; + + // SQLite reports `notnull = 0` for PRIMARY KEY columns even though they are not nullable. + // It also omits any explicit default for INTEGER PRIMARY KEY columns even though they have + // implicit autoincrement semantics. We encode these semantics into the IR so schema + // verification can compare against contract defaults/nullability. + let defaultValue = colRow.dflt_value ?? undefined; + if ( + defaultValue === undefined && + pkColumns.length === 1 && + pkColumns[0] === colRow.name && + /int/i.test(nativeType) + ) { + defaultValue = 'autoincrement()'; + } + + columns[colRow.name] = { + name: colRow.name, + nativeType, + nullable: colRow.notnull === 0 && colRow.pk === 0, + ...ifDefined('default', defaultValue), + }; + } + + const foreignKeys = await introspectForeignKeys(driver, tableName); + const { uniques, indexes } = await introspectIndexes(driver, tableName); + + tables[tableName] = { + name: tableName, + columns, + ...(primaryKey ? 
{ primaryKey } : {}), + foreignKeys, + uniques, + indexes, + }; + } + + return { + tables, + extensions: [], + }; + } +} + +async function introspectForeignKeys( + driver: ControlDriverInstance<'sql', 'sqlite'>, + tableName: string, +): Promise<readonly SqlForeignKeyIR[]> { + const rows = await driver.query<PragmaForeignKeyRow>( + `pragma foreign_key_list(${escapeSqliteStringLiteral(tableName)})`, + ); + + const byId = new Map< + number, + { + referencedTable: string; + columns: string[]; + referencedColumns: string[]; + } + >(); + + for (const row of rows.rows) { + const existing = byId.get(row.id); + if (existing) { + existing.columns.push(row.from); + existing.referencedColumns.push(row.to); + continue; + } + byId.set(row.id, { + referencedTable: row.table, + columns: [row.from], + referencedColumns: [row.to], + }); + } + + return Array.from(byId.values()).map((fk) => ({ + columns: Object.freeze([...fk.columns]) as readonly string[], + referencedTable: fk.referencedTable, + referencedColumns: Object.freeze([...fk.referencedColumns]) as readonly string[], + })); +} + +async function introspectIndexes( + driver: ControlDriverInstance<'sql', 'sqlite'>, + tableName: string, +): Promise<{ readonly uniques: readonly SqlUniqueIR[]; readonly indexes: readonly SqlIndexIR[] }> { + const list = await driver.query<PragmaIndexListRow>( + `pragma index_list(${escapeSqliteStringLiteral(tableName)})`, + ); + + const uniques: SqlUniqueIR[] = []; + const indexes: SqlIndexIR[] = []; + + for (const idx of list.rows) { + // Skip PK indexes (covered by pragma_table_info pk metadata) + if (idx.origin === 'pk') { + continue; + } + + const info = await driver.query<PragmaIndexInfoRow>( + `pragma index_info(${escapeSqliteStringLiteral(idx.name)})`, + ); + + const columns = info.rows + .slice() + .sort((a, b) => a.seqno - b.seqno) + .map((r) => r.name); + + if (columns.length === 0) { + continue; + } + + if (idx.unique === 1) { + uniques.push({ + columns, + name: idx.name, + }); + continue; + } + + indexes.push({ + columns, + name: idx.name, + unique: false, + }); + } + + 
return { uniques, indexes }; +} + +function escapeSqliteStringLiteral(value: string): string { + return `'${value.replace(/'/g, "''")}'`; +} diff --git a/packages/3-targets/6-adapters/sqlite/src/core/default-normalizer.ts b/packages/3-targets/6-adapters/sqlite/src/core/default-normalizer.ts new file mode 100644 index 0000000000..6333972f03 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/src/core/default-normalizer.ts @@ -0,0 +1,42 @@ +import type { ColumnDefault } from '@prisma-next/contract/types'; + +const CURRENT_TIMESTAMP_PATTERN = /^current_timestamp$/i; +const DATETIME_NOW_PATTERN = /^datetime\s*\(\s*'now'\s*\)$/i; +const TRUE_PATTERN = /^true$/i; +const FALSE_PATTERN = /^false$/i; +const NUMERIC_PATTERN = /^-?\d+(\.\d+)?$/; +const STRING_LITERAL_PATTERN = /^'((?:[^']|'')*)'$/; + +/** + * Parses a raw SQLite column default expression into a normalized ColumnDefault. + * + * SQLite stores defaults as text fragments (e.g., CURRENT_TIMESTAMP, 0, 'hello'). + * Normalization enables semantic comparison against contract defaults. + */ +export function parseSqliteDefault(rawDefault: string): ColumnDefault | undefined { + const trimmed = rawDefault.trim(); + + // now(): CURRENT_TIMESTAMP or datetime('now') + if (CURRENT_TIMESTAMP_PATTERN.test(trimmed) || DATETIME_NOW_PATTERN.test(trimmed)) { + return { kind: 'function', expression: 'now()' }; + } + + if (TRUE_PATTERN.test(trimmed)) { + return { kind: 'literal', expression: 'true' }; + } + if (FALSE_PATTERN.test(trimmed)) { + return { kind: 'literal', expression: 'false' }; + } + + if (NUMERIC_PATTERN.test(trimmed)) { + return { kind: 'literal', expression: trimmed }; + } + + const stringMatch = trimmed.match(STRING_LITERAL_PATTERN); + if (stringMatch?.[1] !== undefined) { + return { kind: 'literal', expression: trimmed }; + } + + // Unknown default; preserve raw expression. 
+ return { kind: 'function', expression: trimmed }; +} diff --git a/packages/3-targets/6-adapters/sqlite/src/core/descriptor-meta.ts b/packages/3-targets/6-adapters/sqlite/src/core/descriptor-meta.ts new file mode 100644 index 0000000000..a117b0dc10 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/src/core/descriptor-meta.ts @@ -0,0 +1,45 @@ +export const sqliteAdapterDescriptorMeta = { + kind: 'adapter', + familyId: 'sql', + targetId: 'sqlite', + id: 'sqlite', + version: '0.0.1', + capabilities: { + // Contract capability requirements are declared under contract.capabilities[contract.target], + // so this must match 'sqlite' for lane gating. + sqlite: { + orderBy: true, + limit: true, + // Used today to gate includeMany() in lanes. SQLite does not support LATERAL, but the + // adapter implements includeMany via correlated subqueries. This key is legacy/misnamed. + lateral: true, + // JSON aggregation requires JSON1. In most modern builds this is available. + jsonAgg: true, + // SQLite supports RETURNING since 3.35.0. We assume a modern SQLite for the demo. + returning: true, + // SQLite-specific feature flags (doc-level only today; not enforced in code yet). 
+ json1: true, + }, + sql: { + enums: false, + }, + }, + types: { + codecTypes: { + import: { + package: '@prisma-next/adapter-sqlite/codec-types', + named: 'CodecTypes', + alias: 'SqliteTypes', + }, + parameterized: {}, + controlPlaneHooks: {}, + }, + storage: [ + { typeId: 'sqlite/text@1', familyId: 'sql', targetId: 'sqlite', nativeType: 'text' }, + { typeId: 'sqlite/int@1', familyId: 'sql', targetId: 'sqlite', nativeType: 'integer' }, + { typeId: 'sqlite/real@1', familyId: 'sql', targetId: 'sqlite', nativeType: 'real' }, + { typeId: 'sqlite/datetime@1', familyId: 'sql', targetId: 'sqlite', nativeType: 'text' }, + { typeId: 'sqlite/bool@1', familyId: 'sql', targetId: 'sqlite', nativeType: 'integer' }, + ], + }, +} as const; diff --git a/packages/3-targets/6-adapters/sqlite/src/core/sql-utils.ts b/packages/3-targets/6-adapters/sqlite/src/core/sql-utils.ts new file mode 100644 index 0000000000..37205f6b62 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/src/core/sql-utils.ts @@ -0,0 +1,87 @@ +/** + * Shared SQL utility functions for the SQLite adapter. + * + * These functions handle safe SQL identifier and literal escaping + * with security validations to prevent injection and encoding issues. + */ + +/** + * Error thrown when an invalid SQL identifier or literal is detected. + * Boundary layers map this to structured envelopes. + */ +export class SqlEscapeError extends Error { + constructor( + message: string, + public readonly value: string, + public readonly kind: 'identifier' | 'literal', + ) { + super(message); + this.name = 'SqlEscapeError'; + } +} + +/** + * Practical identifier length limit used for diagnostics. + * + * SQLite supports long identifiers; this is primarily a guardrail for accidental abuse and + * to keep parity with other targets that truncate around 63. + */ +const MAX_IDENTIFIER_LENGTH = 63; + +/** + * Validates and quotes a SQLite identifier (table, column names). 
+ * + * Security validations: + * - Rejects null bytes which could cause truncation or unexpected behavior + * - Rejects empty identifiers + * - Warns on very long identifiers (diagnostic only) + * + * @throws {SqlEscapeError} If the identifier contains null bytes or is empty + */ +export function quoteIdentifier(identifier: string): string { + if (identifier.length === 0) { + throw new SqlEscapeError('Identifier cannot be empty', identifier, 'identifier'); + } + if (identifier.includes('\0')) { + throw new SqlEscapeError( + 'Identifier cannot contain null bytes', + identifier.replace(/\0/g, '\\0'), + 'identifier', + ); + } + // Diagnostic-only warning for unusually long identifiers. + if (identifier.length > MAX_IDENTIFIER_LENGTH) { + console.warn( + `Identifier "${identifier.slice(0, 20)}..." exceeds ${MAX_IDENTIFIER_LENGTH} characters`, + ); + } + return `"${identifier.replace(/"/g, '""')}"`; +} + +/** + * Escapes a string literal for safe use in SQL statements. + * + * Security validations: + * - Rejects null bytes which could cause truncation or unexpected behavior + * + * @throws {SqlEscapeError} If the value contains null bytes + */ +export function escapeLiteral(value: string): string { + if (value.includes('\0')) { + throw new SqlEscapeError( + 'Literal value cannot contain null bytes', + value.replace(/\0/g, '\\0'), + 'literal', + ); + } + return value.replace(/'/g, "''"); +} + +/** + * Builds a qualified name (db.table) with proper quoting. + * + * In SQLite this is primarily used for attached databases, not schemas. 
+ */ +export function qualifyName(schemaName: string, objectName: string): string { + return `${quoteIdentifier(schemaName)}.${quoteIdentifier(objectName)}`; +} diff --git a/packages/3-targets/6-adapters/sqlite/src/core/types.ts b/packages/3-targets/6-adapters/sqlite/src/core/types.ts new file mode 100644 index 0000000000..d7caab5f05 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/src/core/types.ts @@ -0,0 +1,53 @@ +import type { + SqlContract, + SqlStorage, + StorageColumn, + StorageTable, +} from '@prisma-next/sql-contract/types'; +import type { + BinaryExpr, + ColumnRef, + DeleteAst, + Direction, + InsertAst, + JoinAst, + LiteralExpr, + LoweredStatement, + OperationExpr, + ParamRef, + QueryAst, + SelectAst, + UpdateAst, +} from '@prisma-next/sql-relational-core/ast'; + +export interface SqliteAdapterOptions { + readonly profileId?: string; +} + +export type SqliteContract = SqlContract & { readonly target: 'sqlite' }; + +export type Expr = ColumnRef | ParamRef; + +export interface OrderClause { + readonly expr: ColumnRef; + readonly dir: Direction; +} + +export type SqliteLoweredStatement = LoweredStatement; + +export type { + BinaryExpr, + ColumnRef, + DeleteAst, + Direction, + InsertAst, + JoinAst, + LiteralExpr, + OperationExpr, + ParamRef, + QueryAst, + SelectAst, + StorageColumn, + StorageTable, + UpdateAst, +}; diff --git a/packages/3-targets/6-adapters/sqlite/src/exports/adapter.ts b/packages/3-targets/6-adapters/sqlite/src/exports/adapter.ts new file mode 100644 index 0000000000..b2f62e2117 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/src/exports/adapter.ts @@ -0,0 +1 @@ +export { createSqliteAdapter } from '../core/adapter'; diff --git a/packages/3-targets/6-adapters/sqlite/src/exports/codec-types.ts b/packages/3-targets/6-adapters/sqlite/src/exports/codec-types.ts new file mode 100644 index 0000000000..b6b79fec8e --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/src/exports/codec-types.ts @@ -0,0 +1,11 @@ +/** + * Codec type 
definitions for SQLite adapter. + * + * This file exports type-only definitions for codec input/output types. + * These types are imported by contract.d.ts files for compile-time type inference. + * + * Runtime codec implementations are provided by the adapter's codec registry. + */ + +export type { CodecTypes } from '../core/codecs'; +export { dataTypes } from '../core/codecs'; diff --git a/packages/3-targets/6-adapters/sqlite/src/exports/column-types.ts b/packages/3-targets/6-adapters/sqlite/src/exports/column-types.ts new file mode 100644 index 0000000000..d86c2e312d --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/src/exports/column-types.ts @@ -0,0 +1,33 @@ +/** + * Column type descriptors for SQLite adapter. + * + * These descriptors provide both codecId and nativeType for use in contract authoring. + */ + +import type { ColumnTypeDescriptor } from '@prisma-next/contract-authoring'; + +export const textColumn: ColumnTypeDescriptor = { + codecId: 'sqlite/text@1', + nativeType: 'text', +} as const; + +export const intColumn: ColumnTypeDescriptor = { + codecId: 'sqlite/int@1', + nativeType: 'integer', +} as const; + +export const realColumn: ColumnTypeDescriptor = { + codecId: 'sqlite/real@1', + nativeType: 'real', +} as const; + +// Store datetimes as TEXT (ISO string) for deterministic JS decode. 
+export const datetimeColumn: ColumnTypeDescriptor = { + codecId: 'sqlite/datetime@1', + nativeType: 'text', +} as const; + +export const boolColumn: ColumnTypeDescriptor = { + codecId: 'sqlite/bool@1', + nativeType: 'integer', +} as const; diff --git a/packages/3-targets/6-adapters/sqlite/src/exports/control.ts b/packages/3-targets/6-adapters/sqlite/src/exports/control.ts new file mode 100644 index 0000000000..faee5086db --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/src/exports/control.ts @@ -0,0 +1,24 @@ +import type { ControlAdapterDescriptor } from '@prisma-next/core-control-plane/types'; +import type { SqlControlAdapter } from '@prisma-next/family-sql/control-adapter'; +import { SqliteControlAdapter } from '../core/control-adapter'; +import { parseSqliteDefault } from '../core/default-normalizer'; +import { sqliteAdapterDescriptorMeta } from '../core/descriptor-meta'; +import { escapeLiteral, qualifyName, quoteIdentifier, SqlEscapeError } from '../core/sql-utils'; + +/** + * SQLite adapter descriptor for CLI config. 
+ */ +const sqliteAdapterDescriptor: ControlAdapterDescriptor< + 'sql', + 'sqlite', + SqlControlAdapter<'sqlite'> +> = { + ...sqliteAdapterDescriptorMeta, + create(): SqlControlAdapter<'sqlite'> { + return new SqliteControlAdapter(); + }, +}; + +export default sqliteAdapterDescriptor; + +export { escapeLiteral, parseSqliteDefault, qualifyName, quoteIdentifier, SqlEscapeError }; diff --git a/packages/3-targets/6-adapters/sqlite/src/exports/runtime.ts b/packages/3-targets/6-adapters/sqlite/src/exports/runtime.ts new file mode 100644 index 0000000000..da32efec46 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/src/exports/runtime.ts @@ -0,0 +1,29 @@ +import type { + RuntimeAdapterDescriptor, + RuntimeAdapterInstance, +} from '@prisma-next/core-execution-plane/types'; +import type { Adapter, QueryAst } from '@prisma-next/sql-relational-core/ast'; +import { createSqliteAdapter } from '../core/adapter'; +import { sqliteAdapterDescriptorMeta } from '../core/descriptor-meta'; +import type { SqliteContract, SqliteLoweredStatement } from '../core/types'; + +/** + * SQL runtime adapter interface for SQLite. + * Extends RuntimeAdapterInstance with SQL-specific adapter methods. + */ +export interface SqlRuntimeAdapter + extends RuntimeAdapterInstance<'sql', 'sqlite'>, + Adapter {} + +/** + * SQLite adapter descriptor for runtime plane. 
+ */ +const sqliteRuntimeAdapterDescriptor: RuntimeAdapterDescriptor<'sql', 'sqlite', SqlRuntimeAdapter> = + { + ...sqliteAdapterDescriptorMeta, + create(): SqlRuntimeAdapter { + return createSqliteAdapter(); + }, + }; + +export default sqliteRuntimeAdapterDescriptor; diff --git a/packages/3-targets/6-adapters/sqlite/src/exports/types.ts b/packages/3-targets/6-adapters/sqlite/src/exports/types.ts new file mode 100644 index 0000000000..56736ebef7 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/src/exports/types.ts @@ -0,0 +1,14 @@ +export type { + BinaryExpr, + ColumnRef, + Direction, + Expr, + OrderClause, + ParamRef, + SelectAst, + SqliteAdapterOptions, + SqliteContract, + SqliteLoweredStatement, + StorageColumn, + StorageTable, +} from '../core/types'; diff --git a/packages/3-targets/6-adapters/sqlite/test/adapter.lowering.test.ts b/packages/3-targets/6-adapters/sqlite/test/adapter.lowering.test.ts new file mode 100644 index 0000000000..3b3287e8ab --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/test/adapter.lowering.test.ts @@ -0,0 +1,133 @@ +import { validateContract } from '@prisma-next/sql-contract-ts/contract'; +import type { SelectAst } from '@prisma-next/sql-relational-core/ast'; +import { describe, expect, it } from 'vitest'; + +import { createSqliteAdapter } from '../src/core/adapter'; +import type { SqliteContract } from '../src/core/types'; + +const contract = Object.freeze( + validateContract({ + target: 'sqlite', + targetFamily: 'sql' as const, + coreHash: 'sha256:test-core', + profileHash: 'sha256:test-profile', + storage: { + tables: { + user: { + columns: { + id: { codecId: 'sqlite/int@1', nativeType: 'integer', nullable: false }, + email: { codecId: 'sqlite/text@1', nativeType: 'text', nullable: false }, + createdAt: { codecId: 'sqlite/datetime@1', nativeType: 'text', nullable: false }, + }, + primaryKey: { columns: ['id'] }, + uniques: [], + indexes: [], + foreignKeys: [], + }, + post: { + columns: { + id: { codecId: 
'sqlite/int@1', nativeType: 'integer', nullable: false }, + title: { codecId: 'sqlite/text@1', nativeType: 'text', nullable: false }, + userId: { codecId: 'sqlite/int@1', nativeType: 'integer', nullable: false }, + createdAt: { codecId: 'sqlite/datetime@1', nativeType: 'text', nullable: false }, + }, + primaryKey: { columns: ['id'] }, + uniques: [], + indexes: [], + foreignKeys: [{ columns: ['userId'], references: { table: 'user', columns: ['id'] } }], + }, + }, + }, + models: {}, + relations: {}, + mappings: {}, + capabilities: {}, + extensionPacks: {}, + meta: {}, + sources: {}, + }), +); + +describe('createSqliteAdapter', () => { + it('lowers select AST into canonical SQL with numeric params', () => { + const adapter = createSqliteAdapter(); + + const ast = { + kind: 'select', + from: { kind: 'table', name: 'user' }, + project: [ + { alias: 'id', expr: { kind: 'col', table: 'user', column: 'id' } }, + { alias: 'email', expr: { kind: 'col', table: 'user', column: 'email' } }, + ], + where: { + kind: 'bin', + op: 'eq', + left: { kind: 'col', table: 'user', column: 'id' }, + right: { kind: 'param', index: 1, name: 'userId' }, + }, + orderBy: [ + { + expr: { kind: 'col', table: 'user', column: 'createdAt' }, + dir: 'desc', + }, + ], + limit: 5, + } as const; + + const lowered = adapter.lower(ast, { contract, params: [42] }); + + expect(lowered.body).toEqual({ + sql: 'SELECT "user"."id" AS "id", "user"."email" AS "email" FROM "user" WHERE "user"."id" = ?1 ORDER BY "user"."createdAt" DESC LIMIT 5', + params: [42], + }); + }); + + it('renders includeMany using correlated subquery + JSON1', () => { + const adapter = createSqliteAdapter(); + + const ast: SelectAst = { + kind: 'select', + from: { kind: 'table', name: 'user' }, + includes: [ + { + kind: 'includeMany', + alias: 'posts', + child: { + table: { kind: 'table', name: 'post' }, + on: { + kind: 'eqCol', + left: { kind: 'col', table: 'user', column: 'id' }, + right: { kind: 'col', table: 'post', column: 'userId' }, 
},
+            orderBy: [
+              {
+                expr: { kind: 'col', table: 'post', column: 'createdAt' },
+                dir: 'desc',
+              },
+            ],
+            limit: 10,
+            project: [
+              { alias: 'id', expr: { kind: 'col', table: 'post', column: 'id' } },
+              { alias: 'title', expr: { kind: 'col', table: 'post', column: 'title' } },
+            ],
+          },
+        },
+      ],
+      project: [
+        { alias: 'id', expr: { kind: 'col', table: 'user', column: 'id' } },
+        { alias: 'posts', expr: { kind: 'includeRef', alias: 'posts' } },
+      ],
+    };
+
+    const result = adapter.lower(ast, { contract, params: [] });
+
+    expect(result.body.sql).toContain('coalesce((');
+    expect(result.body.sql).toContain('json_group_array');
+    expect(result.body.sql).toContain('json_object');
+    expect(result.body.sql).toContain('FROM "post"');
+    expect(result.body.sql).toContain('WHERE "user"."id" = "post"."userId"');
+    expect(result.body.sql).toContain('ORDER BY "post"."createdAt" DESC');
+    expect(result.body.sql).toContain('LIMIT 10');
+    expect(result.body.sql).toContain('AS "posts"');
+  });
+});
diff --git a/packages/3-targets/6-adapters/sqlite/test/control-adapter.introspect.test.ts b/packages/3-targets/6-adapters/sqlite/test/control-adapter.introspect.test.ts
new file mode 100644
index 0000000000..b7f3ec9253
--- /dev/null
+++ b/packages/3-targets/6-adapters/sqlite/test/control-adapter.introspect.test.ts
@@ -0,0 +1,59 @@
+import { DatabaseSync } from 'node:sqlite';
+import type { ControlDriverInstance } from '@prisma-next/core-control-plane/types';
+import { describe, expect, it } from 'vitest';
+
+import { SqliteControlAdapter } from '../src/core/control-adapter';
+
+function createTestDriver(db: DatabaseSync): ControlDriverInstance<'sql', 'sqlite'> {
+  return {
+    familyId: 'sql',
+    targetId: 'sqlite',
+    // @deprecated
+    target: 'sqlite',
+    async query<Row extends Record<string, unknown>>(
+      sql: string,
+      _params?: readonly unknown[],
+    ): Promise<{ readonly rows: Row[] }> {
+      const stmt = db.prepare(sql);
+      const returnsRows = stmt.columns().length > 0;
+      if (!returnsRows) {
+        stmt.run();
+        return {
rows: [] };
+      }
+      return { rows: stmt.all() as Row[] };
+    },
+    async close(): Promise<void> {
+      db.close();
+    },
+  };
+}
+
+describe('SqliteControlAdapter', () => {
+  it('introspects primary keys as NOT NULL and encodes implicit autoincrement defaults', async () => {
+    const db = new DatabaseSync(':memory:');
+    db.exec(`
+      create table user (
+        id integer primary key,
+        email text not null,
+        createdAt text not null default (CURRENT_TIMESTAMP)
+      );
+    `);
+
+    const driver = createTestDriver(db);
+    const adapter = new SqliteControlAdapter();
+    const schema = await adapter.introspect(driver);
+
+    const user = schema.tables['user'];
+    expect(user).toBeDefined();
+    expect(user?.primaryKey?.columns).toEqual(['id']);
+    expect(user?.columns['id']).toMatchObject({
+      nativeType: 'integer',
+      nullable: false,
+      default: 'autoincrement()',
+    });
+    expect(user?.columns['createdAt']).toMatchObject({
+      nullable: false,
+      default: 'CURRENT_TIMESTAMP',
+    });
+  });
+});
diff --git a/packages/3-targets/6-adapters/sqlite/test/dml.lowering.test.ts b/packages/3-targets/6-adapters/sqlite/test/dml.lowering.test.ts
new file mode 100644
index 0000000000..9259e33dce
--- /dev/null
+++ b/packages/3-targets/6-adapters/sqlite/test/dml.lowering.test.ts
@@ -0,0 +1,113 @@
+import { validateContract } from '@prisma-next/sql-contract-ts/contract';
+import type { DeleteAst, InsertAst, UpdateAst } from '@prisma-next/sql-relational-core/ast';
+import { describe, expect, it } from 'vitest';
+
+import { createSqliteAdapter } from '../src/core/adapter';
+import type { SqliteContract } from '../src/core/types';
+
+const contract = Object.freeze(
+  validateContract({
+    target: 'sqlite',
+    targetFamily: 'sql' as const,
+    coreHash: 'sha256:test-core',
+    profileHash: 'sha256:test-profile',
+    storage: {
+      tables: {
+        user: {
+          columns: {
+            id: { codecId: 'sqlite/int@1', nativeType: 'integer', nullable: false },
+            email: { codecId: 'sqlite/text@1', nativeType: 'text', nullable: false },
+          },
+          primaryKey: { columns: 
['id'] }, + uniques: [], + indexes: [], + foreignKeys: [], + }, + }, + }, + models: {}, + relations: {}, + mappings: {}, + capabilities: {}, + extensionPacks: {}, + meta: {}, + sources: {}, + }), +); + +describe('SQLite DML lowering', () => { + it('lowers INSERT ... RETURNING', () => { + const adapter = createSqliteAdapter(); + + const ast: InsertAst = { + kind: 'insert', + table: { kind: 'table', name: 'user' }, + values: { + email: { kind: 'param', index: 1, name: 'email' }, + }, + returning: [ + { kind: 'col', table: 'user', column: 'id' }, + { kind: 'col', table: 'user', column: 'email' }, + ], + }; + + const lowered = adapter.lower(ast, { contract, params: ['a@b.com'] }); + expect(lowered.body.sql).toBe( + 'INSERT INTO "user" ("email") VALUES (?1) RETURNING "user"."id", "user"."email"', + ); + expect(lowered.body.params).toEqual(['a@b.com']); + }); + + it('lowers UPDATE ... RETURNING', () => { + const adapter = createSqliteAdapter(); + + const ast: UpdateAst = { + kind: 'update', + table: { kind: 'table', name: 'user' }, + set: { + email: { kind: 'param', index: 1, name: 'email' }, + }, + where: { + kind: 'bin', + op: 'eq', + left: { kind: 'col', table: 'user', column: 'id' }, + right: { kind: 'param', index: 2, name: 'userId' }, + }, + returning: [ + { kind: 'col', table: 'user', column: 'id' }, + { kind: 'col', table: 'user', column: 'email' }, + ], + }; + + const lowered = adapter.lower(ast, { contract, params: ['new@b.com', 1] }); + expect(lowered.body.sql).toBe( + 'UPDATE "user" SET "email" = ?1 WHERE "user"."id" = ?2 RETURNING "user"."id", "user"."email"', + ); + expect(lowered.body.params).toEqual(['new@b.com', 1]); + }); + + it('lowers DELETE ... 
RETURNING', () => { + const adapter = createSqliteAdapter(); + + const ast: DeleteAst = { + kind: 'delete', + table: { kind: 'table', name: 'user' }, + where: { + kind: 'bin', + op: 'eq', + left: { kind: 'col', table: 'user', column: 'id' }, + right: { kind: 'param', index: 1, name: 'userId' }, + }, + returning: [ + { kind: 'col', table: 'user', column: 'id' }, + { kind: 'col', table: 'user', column: 'email' }, + ], + }; + + const lowered = adapter.lower(ast, { contract, params: [1] }); + expect(lowered.body.sql).toBe( + 'DELETE FROM "user" WHERE "user"."id" = ?1 RETURNING "user"."id", "user"."email"', + ); + expect(lowered.body.params).toEqual([1]); + }); +}); diff --git a/packages/3-targets/6-adapters/sqlite/tsconfig.build.json b/packages/3-targets/6-adapters/sqlite/tsconfig.build.json new file mode 100644 index 0000000000..671541c1a3 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/tsconfig.build.json @@ -0,0 +1,12 @@ +{ + "extends": "./tsconfig.json", + "compilerOptions": { + "rootDir": "src", + "outDir": "dist", + "declaration": true, + "declarationMap": true, + "emitDeclarationOnly": true + }, + "include": ["src/**/*.ts"], + "exclude": ["test", "dist"] +} diff --git a/packages/3-targets/6-adapters/sqlite/tsconfig.json b/packages/3-targets/6-adapters/sqlite/tsconfig.json new file mode 100644 index 0000000000..7afa587436 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/tsconfig.json @@ -0,0 +1,9 @@ +{ + "extends": ["@prisma-next/tsconfig/base"], + "compilerOptions": { + "rootDir": ".", + "outDir": "dist" + }, + "include": ["src/**/*.ts", "test/**/*.ts"], + "exclude": ["dist"] +} diff --git a/packages/3-targets/6-adapters/sqlite/tsup.config.ts b/packages/3-targets/6-adapters/sqlite/tsup.config.ts new file mode 100644 index 0000000000..078c165997 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/tsup.config.ts @@ -0,0 +1,19 @@ +import { defineConfig } from 'tsup'; + +export default defineConfig({ + entry: { + 'exports/adapter': 
'src/exports/adapter.ts', + 'exports/types': 'src/exports/types.ts', + 'exports/codec-types': 'src/exports/codec-types.ts', + 'exports/column-types': 'src/exports/column-types.ts', + 'exports/control': 'src/exports/control.ts', + 'exports/runtime': 'src/exports/runtime.ts', + }, + outDir: 'dist', + format: ['esm'], + sourcemap: true, + dts: false, + clean: true, + target: 'es2022', + minify: false, +}); diff --git a/packages/3-targets/6-adapters/sqlite/vitest.config.ts b/packages/3-targets/6-adapters/sqlite/vitest.config.ts new file mode 100644 index 0000000000..ae09da78f5 --- /dev/null +++ b/packages/3-targets/6-adapters/sqlite/vitest.config.ts @@ -0,0 +1,32 @@ +import { timeouts } from '@prisma-next/test-utils'; +import { defineConfig } from 'vitest/config'; + +export default defineConfig({ + test: { + globals: true, + environment: 'node', + testTimeout: timeouts.default, + hookTimeout: timeouts.default, + coverage: { + provider: 'v8', + reporter: ['text', 'json', 'html'], + include: ['src/**/*.ts'], + exclude: [ + 'dist/**', + 'test/**', + '**/*.test.ts', + '**/*.test-d.ts', + '**/*.config.ts', + '**/exports/**', + '**/types.ts', + 'src/core/descriptor-meta.ts', + ], + thresholds: { + lines: 92, + branches: 87, + functions: 95, + statements: 92, + }, + }, + }, +}); diff --git a/packages/3-targets/7-drivers/sqlite/README.md b/packages/3-targets/7-drivers/sqlite/README.md new file mode 100644 index 0000000000..a128a03cef --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/README.md @@ -0,0 +1,101 @@ +# @prisma-next/driver-sqlite + +SQLite driver for Prisma Next. + +## Package Classification + +- **Domain**: targets +- **Layer**: drivers +- **Plane**: multi-plane (migration, runtime) + +## Overview + +The SQLite driver provides transport and connection management for SQLite databases (file-backed). It implements the `SqlDriver` interface for executing SQL statements, explaining queries, and managing connections. 
+
+Drivers are dialect-agnostic: they own pooling, connection management, and the transport protocol (TCP, HTTP, etc.), but contain no dialect-specific logic. All dialect behavior lives in adapters.
+
+This package spans multiple planes:
+- **Migration plane** (`src/exports/control.ts`): Control plane entry point for driver descriptors
+- **Runtime plane** (`src/exports/runtime.ts`): Runtime entry point for driver implementation
+
+## Purpose
+
+Provide SQLite transport and connection management. Execute SQL statements and manage connections without dialect-specific logic.
+
+## Responsibilities
+
+- **Connection Management**: Acquire and release database connections
+- **Statement Execution**: Execute SQL statements with parameters
+- **Query Explanation**: Execute EXPLAIN queries for query analysis
+- **Connection Pooling**: Manage connection pools (when applicable)
+- **Transport**: Open and manage SQLite database handles (file-backed)
+
+**Non-goals:**
+- Dialect-specific SQL lowering (adapters)
+- Query compilation (sql-query)
+- Runtime execution (runtime)
+
+## Architecture
+
+```mermaid
+flowchart TD
+  subgraph "Runtime"
+    RT[Runtime]
+    ADAPTER[Adapter]
+  end
+
+  subgraph "SQLite Driver"
+    DRIVER[Driver]
+    DBFILE[(DB File)]
+  end
+
+  RT --> ADAPTER
+  ADAPTER --> DRIVER
+  DRIVER --> DBFILE
+  DRIVER --> RT
+```
+
+## Components
+
+### Driver (`sqlite-driver.ts`)
+- Main driver implementation
+- Implements `SqlDriver` interface
+- Manages connections and executes statements
+- Manages a SQLite database handle
+
+## Dependencies
+
+- **`@prisma-next/sql-contract`**: SQL contract types (via `@prisma-next/sql-contract/types`)
+
+## Related Subsystems
+
+- **[Adapters & Targets](../../docs/architecture%20docs/subsystems/5.%20Adapters%20&%20Targets.md)**: Driver specification
+
+## Related ADRs
+
+- [ADR 005 - Thin Core Fat Targets](../../docs/architecture%20docs/adrs/ADR%20005%20-%20Thin%20Core%20Fat%20Targets.md)
+- [ADR 016 - Adapter SPI for 
Lowering](../../docs/architecture%20docs/adrs/ADR%20016%20-%20Adapter%20SPI%20for%20Lowering.md) + +## Usage + +```typescript +import { createSqliteDriver } from '@prisma-next/driver-sqlite/runtime'; +import { createRuntime } from '@prisma-next/sql-runtime'; + +const driver = createSqliteDriver({ connectionString: process.env.DATABASE_URL }); + +const runtime = createRuntime({ + contract, + adapter: sqliteAdapter, + driver, +}); +``` + +## Exports + +- `./runtime`: Runtime entry point for driver implementation + - `createSqliteDriver({ connectionString | filename, ... })`: Convenience creator + - `createSqliteDriverFromOptions(options)`: Create driver from a `SqliteDriverOptions` object + - Types: `SqliteDriverOptions`, `CreateSqliteDriverOptions` +- `./control`: Control plane entry point for driver descriptors + - Default export: `DriverDescriptor` for use in `prisma-next.config.ts` diff --git a/packages/3-targets/7-drivers/sqlite/biome.jsonc b/packages/3-targets/7-drivers/sqlite/biome.jsonc new file mode 100644 index 0000000000..b8994a7330 --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/biome.jsonc @@ -0,0 +1,4 @@ +{ + "$schema": "https://biomejs.dev/schemas/2.3.11/schema.json", + "extends": "//" +} diff --git a/packages/3-targets/7-drivers/sqlite/package.json b/packages/3-targets/7-drivers/sqlite/package.json new file mode 100644 index 0000000000..02d1f3fd07 --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/package.json @@ -0,0 +1,48 @@ +{ + "name": "@prisma-next/driver-sqlite", + "version": "0.0.1", + "type": "module", + "sideEffects": false, + "scripts": { + "build": "tsup --config tsup.config.ts && tsc --project tsconfig.build.json", + "test": "vitest run", + "test:coverage": "vitest run --coverage", + "typecheck": "tsc --project tsconfig.json --noEmit", + "lint": "biome check . 
--error-on-warnings", + "lint:fix": "biome check --write .", + "lint:fix:unsafe": "biome check --write --unsafe .", + "clean": "rm -rf dist dist-tsc dist-tsc-prod coverage .tmp-output" + }, + "dependencies": { + "@prisma-next/contract": "workspace:*", + "@prisma-next/core-control-plane": "workspace:*", + "@prisma-next/core-execution-plane": "workspace:*", + "@prisma-next/sql-contract": "workspace:*", + "@prisma-next/sql-errors": "workspace:*", + "@prisma-next/sql-operations": "workspace:*", + "@prisma-next/sql-relational-core": "workspace:*", + "@prisma-next/utils": "workspace:*", + "arktype": "^2.0.0" + }, + "devDependencies": { + "@prisma-next/test-utils": "workspace:*", + "@prisma-next/tsconfig": "workspace:*", + "tsup": "catalog:", + "typescript": "catalog:", + "vitest": "catalog:" + }, + "files": [ + "dist", + "src" + ], + "exports": { + "./control": { + "types": "./dist/exports/control.d.ts", + "import": "./dist/exports/control.js" + }, + "./runtime": { + "types": "./dist/exports/runtime.d.ts", + "import": "./dist/exports/runtime.js" + } + } +} diff --git a/packages/3-targets/7-drivers/sqlite/src/bun-sqlite.ts b/packages/3-targets/7-drivers/sqlite/src/bun-sqlite.ts new file mode 100644 index 0000000000..970968382f --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/src/bun-sqlite.ts @@ -0,0 +1,53 @@ +import { createRequire } from 'node:module'; + +// Keep this in a helper module so we have a single place to dynamically load bun:sqlite. +// +// This package is built for Node (tsup/esbuild) and must not have a static dependency on +// `bun:sqlite` (which Node cannot resolve). We therefore load it via `require()` with a +// runtime-built specifier, similar to the `node:sqlite` workaround in node-sqlite.ts. 
+
+type BunSqliteModule = {
+  readonly Database: new (filename: string, options?: Record<string, unknown>) => BunDatabase;
+};
+
+export type BunStatement = {
+  readonly columnNames: readonly string[];
+  all: (...params: unknown[]) => unknown[];
+  iterate: (...params: unknown[]) => Iterable<unknown>;
+  run: (...params: unknown[]) => { readonly changes: number; readonly lastInsertRowid: number };
+  get: (...params: unknown[]) => unknown;
+};
+
+export type BunDatabase = {
+  prepare: (sql: string) => BunStatement;
+  exec: (sql: string) => void;
+  close: () => void;
+};
+
+const require = createRequire(import.meta.url);
+
+function loadBunSqlite(): BunSqliteModule {
+  // Avoid a literal `bun:sqlite` string to reduce the chance of build-time rewriting.
+  const bun = String.fromCharCode(98, 117, 110); // "bun"
+  const sqlite = String.fromCharCode(115, 113, 108, 105, 116, 101); // "sqlite"
+  return require(`${bun}:${sqlite}`) as BunSqliteModule;
+}
+
+export type BunDatabaseOptions = {
+  readonly readonly?: boolean;
+  readonly create?: boolean;
+  readonly readwrite?: boolean;
+};
+
+export function createBunDatabase(filename: string, options?: BunDatabaseOptions): BunDatabase {
+  const { Database } = loadBunSqlite();
+  // Bun requires specifying a mode when passing options. Default to readwrite.
+  const open = options
+    ? Object.freeze({
+        readwrite: options.readwrite ?? !options.readonly,
+        readonly: options.readonly ?? false,
+        create: options.create ?? true,
+      })
+    : undefined;
+  return (open ? 
new Database(filename, open) : new Database(filename)) as BunDatabase; +} diff --git a/packages/3-targets/7-drivers/sqlite/src/core/descriptor-meta.ts b/packages/3-targets/7-drivers/sqlite/src/core/descriptor-meta.ts new file mode 100644 index 0000000000..37440a5cf3 --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/src/core/descriptor-meta.ts @@ -0,0 +1,8 @@ +export const sqliteDriverDescriptorMeta = { + kind: 'driver', + familyId: 'sql', + targetId: 'sqlite', + id: 'sqlite', + version: '0.0.1', + capabilities: {}, +} as const; diff --git a/packages/3-targets/7-drivers/sqlite/src/exports/control.ts b/packages/3-targets/7-drivers/sqlite/src/exports/control.ts new file mode 100644 index 0000000000..ba62d52288 --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/src/exports/control.ts @@ -0,0 +1,202 @@ +import { errorRuntime } from '@prisma-next/core-control-plane/errors'; +import type { + ControlDriverDescriptor, + ControlDriverInstance, +} from '@prisma-next/core-control-plane/types'; +import { SqlQueryError } from '@prisma-next/sql-errors'; +import { ifDefined } from '@prisma-next/utils/defined'; +import { redactDatabaseUrl } from '@prisma-next/utils/redact-db-url'; +import type { BunDatabase } from '../bun-sqlite'; +import { createBunDatabase } from '../bun-sqlite'; +import { sqliteDriverDescriptorMeta } from '../core/descriptor-meta'; +import type { DatabaseSync } from '../node-sqlite'; +import { createDatabaseSync } from '../node-sqlite'; +import { normalizeSqliteError } from '../normalize-error'; +import { resolveSqliteFilename } from '../resolve-filename'; + +type SqliteEngine = 'node' | 'bun'; +type SqliteDatabase = DatabaseSync | BunDatabase; + +function isBunRuntime(): boolean { + const bun = (globalThis as { Bun?: unknown }).Bun; + return typeof bun === 'object' && bun !== null; +} + +function parseVersion(version: string): readonly [number, number, number] { + const [major, minor, patch] = version.split('.', 3).map((s) => Number.parseInt(s ?? 
'0', 10)); + return Object.freeze([major || 0, minor || 0, patch || 0]); +} + +function isVersionGte( + a: readonly [number, number, number], + b: readonly [number, number, number], +): boolean { + const [amaj, amin, apat] = a; + const [bmaj, bmin, bpat] = b; + + if (amaj !== bmaj) { + return amaj > bmaj; + } + if (amin !== bmin) { + return amin > bmin; + } + return apat >= bpat; +} + +function ensureMinimumSqlite(db: SqliteDatabase): void { + try { + const row = db.prepare('select sqlite_version() as v').get() as + | { readonly v?: unknown } + | undefined; + const version = typeof row?.v === 'string' ? row.v : String(row?.v ?? ''); + const parsed = parseVersion(version); + const required = Object.freeze([3, 38, 0] as const); + if (!isVersionGte(parsed, required)) { + throw new Error(`SQLite ${required.join('.')}+ is required (detected ${version})`); + } + + // JSON1 is required for includeMany lowering + sqlite-vector. + db.prepare("select json_object('a', 1) as j").get(); + db.prepare( + 'select json_group_array(value) as a from (select 1 as value union all select 2 as value)', + ).get(); + } catch (error) { + throw normalizeSqliteError(error); + } +} + +function toNumericBindings(params?: readonly unknown[]): Record<string, unknown> | undefined { + if (!params || params.length === 0) { + return undefined; + } + const bindings: Record<string, unknown> = {}; + for (let i = 0; i < params.length; i++) { + bindings[String(i + 1)] = params[i]; + } + return bindings; +} + +function normalizeSqlitePlaceholders(sql: string): string { + // Prisma Next raw lane emits $1, $2, ... placeholders. SQLite uses ?1, ?2, ...
+ return sql.replace(/\$(\d+)/g, '?$1'); +} + +type StatementRunResult = { + readonly changes: number | bigint; + readonly lastInsertRowid?: number | bigint; +}; + +type StatementLike = { + all: (...params: unknown[]) => unknown[]; + run: (...params: unknown[]) => StatementRunResult; +} & ({ columns: () => readonly unknown[] } | { readonly columnNames: readonly string[] }); + +function statementReturnsRows(stmt: StatementLike): boolean { + return 'columns' in stmt ? stmt.columns().length > 0 : stmt.columnNames.length > 0; +} + +/** + * SQLite control driver instance for control-plane operations. + * Implements ControlDriverInstance<'sql', 'sqlite'> for database queries. + */ +export class SqliteControlDriver implements ControlDriverInstance<'sql', 'sqlite'> { + readonly familyId = 'sql' as const; + readonly targetId = 'sqlite' as const; + /** + * @deprecated Use targetId instead + */ + readonly target = 'sqlite' as const; + + constructor( + private readonly engine: SqliteEngine, + private readonly db: SqliteDatabase, + ) {} + + async query<Row extends Record<string, unknown>>( + sql: string, + params?: readonly unknown[], + ): Promise<{ readonly rows: Row[] }> { + try { + const normalizedSql = normalizeSqlitePlaceholders(sql); + const stmt = this.db.prepare(normalizedSql) as StatementLike; + const usesNumericPlaceholders = /\?\d/.test(normalizedSql); + const bindings = + this.engine === 'node' && usesNumericPlaceholders ? toNumericBindings(params) : undefined; + const returnsRows = statementReturnsRows(stmt); + + if (!returnsRows) { + if (this.engine === 'node' && usesNumericPlaceholders) { + bindings ? stmt.run(bindings) : stmt.run(); + return { rows: [] }; + } + + if (params && params.length > 0) { + stmt.run(...params); + return { rows: [] }; + } + + stmt.run(); + return { rows: [] }; + } + + const rows = ( + this.engine === 'node' && usesNumericPlaceholders + ? bindings + ? stmt.all(bindings) + : stmt.all() + : params && params.length > 0 + ?
stmt.all(...params) + : stmt.all() + ) as Row[]; + return { rows }; + } catch (error) { + throw normalizeSqliteError(error); + } + } + + async close(): Promise<void> { + this.db.close(); + } +} + +/** + * SQLite driver descriptor for CLI config. + */ +const sqliteDriverDescriptor: ControlDriverDescriptor<'sql', 'sqlite', SqliteControlDriver> = { + ...sqliteDriverDescriptorMeta, + async create(url: string): Promise<SqliteControlDriver> { + const filename = resolveSqliteFilename(url); + + try { + const engine: SqliteEngine = isBunRuntime() ? 'bun' : 'node'; + const db: SqliteDatabase = + engine === 'node' ? createDatabaseSync(filename) : createBunDatabase(filename); + // Default safety: enforce FK constraints unless explicitly disabled by app. + db.exec('PRAGMA foreign_keys = ON'); + ensureMinimumSqlite(db); + return new SqliteControlDriver(engine, db); + } catch (error) { + const normalized = normalizeSqliteError(error); + const redacted = redactDatabaseUrl(url); + + const codeFromSqlState = SqlQueryError.is(normalized) ? normalized.sqlState : undefined; + const code = + codeFromSqlState ?? + ('cause' in normalized && normalized.cause + ? ((normalized.cause as { code?: unknown }).code as string | undefined) + : undefined); + + throw errorRuntime('Database connection failed', { + why: normalized.message, + fix: 'Verify the sqlite file path (prefer a file: URL), ensure the directory exists, and confirm file permissions', + meta: { + ...ifDefined('code', code), + ...redacted, + ...(!Object.keys(redacted).length ?
{ filename } : {}), + }, + }); + } + }, +}; + +export default sqliteDriverDescriptor; diff --git a/packages/3-targets/7-drivers/sqlite/src/exports/runtime.ts b/packages/3-targets/7-drivers/sqlite/src/exports/runtime.ts new file mode 100644 index 0000000000..ee9a250df2 --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/src/exports/runtime.ts @@ -0,0 +1,30 @@ +import type { + RuntimeDriverDescriptor, + RuntimeDriverInstance, +} from '@prisma-next/core-execution-plane/types'; +import type { SqlDriver } from '@prisma-next/sql-relational-core/ast'; +import { sqliteDriverDescriptorMeta } from '../core/descriptor-meta'; +import type { SqliteDriverOptions } from '../sqlite-driver'; +import { createSqliteDriverFromOptions } from '../sqlite-driver'; + +/** + * SQLite runtime driver instance interface. + * SqlDriver provides SQL-specific methods (execute, explain, close). + * RuntimeDriverInstance provides target identification (familyId, targetId). + */ +export type SqliteRuntimeDriver = RuntimeDriverInstance<'sql', 'sqlite'> & SqlDriver; + +/** + * SQLite driver descriptor for runtime plane. 
+ */ +const sqliteRuntimeDriverDescriptor: RuntimeDriverDescriptor<'sql', 'sqlite', SqliteRuntimeDriver> = + { + ...sqliteDriverDescriptorMeta, + create(options: SqliteDriverOptions): SqliteRuntimeDriver { + return createSqliteDriverFromOptions(options) as SqliteRuntimeDriver; + }, + }; + +export default sqliteRuntimeDriverDescriptor; +export type { CreateSqliteDriverOptions, SqliteDriverOptions } from '../sqlite-driver'; +export { createSqliteDriver, createSqliteDriverFromOptions } from '../sqlite-driver'; diff --git a/packages/3-targets/7-drivers/sqlite/src/node-sqlite.ts b/packages/3-targets/7-drivers/sqlite/src/node-sqlite.ts new file mode 100644 index 0000000000..fb9dd26dcb --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/src/node-sqlite.ts @@ -0,0 +1,31 @@ +import type { PathLike } from 'node:fs'; +import { createRequire } from 'node:module'; + +// Keep this in a helper module so we have a single place to work around bundler behavior. +// +// esbuild (used by tsup) strips the `node:` prefix from builtin imports. For `node:sqlite` +// that produces `sqlite`, which Node does NOT treat as a builtin module specifier. +// +// To avoid that rewrite, we load the module via `require()` with a runtime-built specifier. + +type NodeSqliteModule = typeof import('node:sqlite'); + +export type DatabaseSync = import('node:sqlite').DatabaseSync; +export type DatabaseSyncOptions = import('node:sqlite').DatabaseSyncOptions; + +const require = createRequire(import.meta.url); + +function loadNodeSqlite(): NodeSqliteModule { + // Avoid a literal `node:sqlite` string to prevent build-time rewriting. + const name = String.fromCharCode(115, 113, 108, 105, 116, 101); // "sqlite" + return require(`node:${name}`) as NodeSqliteModule; +} + +export function createDatabaseSync(path: PathLike, options?: DatabaseSyncOptions): DatabaseSync { + const { DatabaseSync: DatabaseSyncCtor } = loadNodeSqlite(); + // node:sqlite is strict about the constructor arity. 
Passing `undefined` as the 2nd + // argument throws; omit it when unset. + return ( + options === undefined ? new DatabaseSyncCtor(path) : new DatabaseSyncCtor(path, options) + ) as DatabaseSync; +} diff --git a/packages/3-targets/7-drivers/sqlite/src/normalize-error.ts b/packages/3-targets/7-drivers/sqlite/src/normalize-error.ts new file mode 100644 index 0000000000..5cc64473f5 --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/src/normalize-error.ts @@ -0,0 +1,87 @@ +import { SqlConnectionError, SqlQueryError } from '@prisma-next/sql-errors'; + +interface NodeSqliteError extends Error { + readonly code: string; + readonly errcode: number; + readonly errstr?: string; +} + +interface BunSqliteError extends Error { + readonly errno: number; + readonly byteOffset?: number; +} + +function isNodeSqliteError(error: unknown): error is NodeSqliteError { + if (!(error instanceof Error)) { + return false; + } + const record = error as Record<string, unknown>; + return ( + typeof record.code === 'string' && + record.code === 'ERR_SQLITE_ERROR' && + typeof record.errcode === 'number' + ); +} + +function isBunSqliteError(error: unknown): error is BunSqliteError { + if (!(error instanceof Error)) { + return false; + } + if (error.name !== 'SQLiteError') { + return false; + } + const record = error as Record<string, unknown>; + return typeof record.errno === 'number'; +} + +function isConnectionErrcode(errcode: number): boolean { + // SQLite primary error codes, see sqlite3.h: + // 14: SQLITE_CANTOPEN + // 26: SQLITE_NOTADB + // 10: SQLITE_IOERR + // 23: SQLITE_AUTH + // 13: SQLITE_FULL + // 8: SQLITE_READONLY + return ( + errcode === 14 || + errcode === 26 || + errcode === 10 || + errcode === 23 || + errcode === 13 || + errcode === 8 + ); +} + +function isTransientErrcode(errcode: number): boolean { + // 5: SQLITE_BUSY, 6: SQLITE_LOCKED, 10: SQLITE_IOERR + // Note: node:sqlite returns extended errcodes as well (e.g. 2067 for UNIQUE constraint).
+ return errcode === 5 || errcode === 6 || errcode === 10; +} + +export function normalizeSqliteError(error: unknown): SqlQueryError | SqlConnectionError | Error { + if (!(error instanceof Error)) { + return new Error(String(error)); + } + + const errcode = isNodeSqliteError(error) + ? error.errcode + : isBunSqliteError(error) + ? error.errno + : undefined; + if (errcode === undefined) { + return error; + } + const sqlState = `SQLITE_${errcode}`; + + if (isConnectionErrcode(errcode)) { + return new SqlConnectionError(error.message, { + cause: error, + transient: isTransientErrcode(errcode), + }); + } + + return new SqlQueryError(error.message, { + cause: error, + sqlState, + }); +} diff --git a/packages/3-targets/7-drivers/sqlite/src/resolve-filename.ts b/packages/3-targets/7-drivers/sqlite/src/resolve-filename.ts new file mode 100644 index 0000000000..32b7806915 --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/src/resolve-filename.ts @@ -0,0 +1,32 @@ +import { resolve as resolvePath } from 'node:path'; +import { fileURLToPath } from 'node:url'; + +/** + * Resolve SQLite "connection strings" to a filename. + * + * Supports: + * - Absolute/relative paths (returned as-is / resolved by callers when desired) + * - Standard file URLs: file:///absolute/path.db + * - Prisma-style file URLs: file:./dev.db (resolved relative to process.cwd()) + */ +export function resolveSqliteFilename(urlOrPath: string): string { + if (!urlOrPath.startsWith('file:')) { + return urlOrPath; + } + + // Standard file URLs (file:///...) should be handled by the URL parser. + if (urlOrPath.startsWith('file://')) { + return fileURLToPath(new URL(urlOrPath)); + } + + // Prisma-style: file:./dev.db or file:../dev.db. The URL constructor incorrectly + // normalizes these to file:///dev.db, so we treat them as "file:" + path. + const rest = urlOrPath.slice('file:'.length); + const pathPart = rest.split('?', 1)[0] ?? 
rest; + if (pathPart === ':memory:') { + return ':memory:'; + } + + // Resolve relative paths from the current working directory. + return resolvePath(process.cwd(), decodeURIComponent(pathPart)); +} diff --git a/packages/3-targets/7-drivers/sqlite/src/sqlite-driver.ts b/packages/3-targets/7-drivers/sqlite/src/sqlite-driver.ts new file mode 100644 index 0000000000..b30094310d --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/src/sqlite-driver.ts @@ -0,0 +1,537 @@ +import { existsSync } from 'node:fs'; +import type { + SqlConnection, + SqlDriver, + SqlExecuteRequest, + SqlExplainResult, + SqlQueryable, + SqlQueryResult, + SqlTransaction, +} from '@prisma-next/sql-relational-core/ast'; +import type { BunDatabase } from './bun-sqlite'; +import { createBunDatabase } from './bun-sqlite'; +import type { DatabaseSync } from './node-sqlite'; +import { createDatabaseSync } from './node-sqlite'; +import { normalizeSqliteError } from './normalize-error'; +import { resolveSqliteFilename } from './resolve-filename'; + +export type SqliteEngine = 'node' | 'bun'; +export type SqliteEngineMode = SqliteEngine | 'auto'; + +export type SqliteConnectOptions = { + readonly readonly?: boolean; + readonly fileMustExist?: boolean; + readonly timeoutMs?: number; +}; + +type SqliteUserFunction = { bivarianceHack(...args: unknown[]): unknown }['bivarianceHack']; + +export interface SqliteDriverOptions { + /** + * Driver backend implementation. + * + * - `auto` (default): use bun:sqlite when running under Bun, otherwise node:sqlite. + * - `node`: force node:sqlite (Node runtime only) + * - `bun`: force bun:sqlite (Bun runtime only) + */ + readonly engine?: SqliteEngineMode | undefined; + readonly connect: + | { + readonly filename: string; + readonly options?: SqliteConnectOptions | undefined; + } + | { + /** + * Alias for filename that matches other drivers/configs. + * Supports `file:` URLs (including Prisma-style `file:./dev.db`). 
+ */ + readonly connectionString: string; + readonly options?: SqliteConnectOptions | undefined; + } + | { readonly database: DatabaseSync }; + /** + * Pragmas to apply when opening the connection (e.g., foreign_keys, journal_mode). + * + * Note: Keep this minimal; policy/production tuning belongs in app config. + */ + readonly pragmas?: Record<string, string | number | boolean | null | undefined> | undefined; + /** + * Optional user-defined functions to register on the connection. + * + * This is intentionally generic: callers can use it for extension packs or app-specific helpers. + */ + readonly functions?: Record<string, SqliteUserFunction> | undefined; +} + +export interface CreateSqliteDriverOptions { + /** + * Back-compat convenience option. Prefer `connectionString`. + * + * Accepts either a filesystem path or a `file:` connection string. + */ + readonly filename?: string; + /** + * Accepts either a filesystem path or a `file:` connection string. + */ + readonly connectionString?: string; + readonly engine?: SqliteDriverOptions['engine']; + readonly options?: SqliteConnectOptions | undefined; + readonly pragmas?: SqliteDriverOptions['pragmas']; + readonly functions?: SqliteDriverOptions['functions']; +} + +type SqliteDatabase = DatabaseSync | BunDatabase; +type ConnectionOptions = { + readonly owned: boolean; + readonly engine: SqliteEngine; + readonly db: SqliteDatabase; +}; + +function normalizeSqlitePlaceholders(sql: string): string { + // Prisma Next raw lane emits $1, $2, ... placeholders. SQLite uses ?1, ?2, ... + return sql.replace(/\$(\d+)/g, '?$1'); +} + +function isBunRuntime(): boolean { + const bun = (globalThis as { Bun?: unknown }).Bun; + return typeof bun === 'object' && bun !== null; +} + +function resolveEngine(mode: SqliteEngineMode | undefined): SqliteEngine { + if (mode && mode !== 'auto') { + if (mode === 'bun' && !isBunRuntime()) { + throw new Error('SqliteDriverOptions.engine is "bun" but this runtime is not Bun'); + } + return mode; + } + + return isBunRuntime() ?
'bun' : 'node'; +} + +function inferEngineFromDatabase(db: SqliteDatabase): SqliteEngine { + // node:sqlite exposes db.function(). bun:sqlite does not. + const record = db as { readonly function?: unknown }; + return typeof record.function === 'function' ? 'node' : 'bun'; +} + +function parseVersion(version: string): readonly [number, number, number] { + const [major, minor, patch] = version.split('.', 3).map((s) => Number.parseInt(s ?? '0', 10)); + return Object.freeze([major || 0, minor || 0, patch || 0]); +} + +function isVersionGte( + a: readonly [number, number, number], + b: readonly [number, number, number], +): boolean { + const [amaj, amin, apat] = a; + const [bmaj, bmin, bpat] = b; + + if (amaj !== bmaj) { + return amaj > bmaj; + } + if (amin !== bmin) { + return amin > bmin; + } + return apat >= bpat; +} + +function ensureMinimumSqlite(db: SqliteDatabase): void { + try { + const row = db.prepare('select sqlite_version() as v').get() as + | { readonly v?: unknown } + | undefined; + const version = typeof row?.v === 'string' ? row.v : String(row?.v ?? ''); + const parsed = parseVersion(version); + const required = Object.freeze([3, 38, 0] as const); + if (!isVersionGte(parsed, required)) { + throw new Error(`SQLite ${required.join('.')}+ is required (detected ${version})`); + } + + // JSON1 is required for includeMany lowering + sqlite-vector. 
+ db.prepare("select json_object('a', 1) as j").get(); + db.prepare( + 'select json_group_array(value) as a from (select 1 as value union all select 2 as value)', + ).get(); + } catch (error) { + throw normalizeSqliteError(error); + } +} + +type StatementRunResult = { + readonly changes: number | bigint; + readonly lastInsertRowid?: number | bigint; +}; + +type StatementLike = { + all: (...params: unknown[]) => unknown[]; + iterate: (...params: unknown[]) => Iterable<unknown>; + run: (...params: unknown[]) => StatementRunResult; + get: (...params: unknown[]) => unknown; +} & ({ columns: () => readonly unknown[] } | { readonly columnNames: readonly string[] }); + +function statementReturnsRows(stmt: StatementLike): boolean { + return 'columns' in stmt ? stmt.columns().length > 0 : stmt.columnNames.length > 0; +} + +abstract class SqliteQueryable implements SqlQueryable { + protected abstract getDb(): SqliteDatabase; + protected abstract getEngine(): SqliteEngine; + + async *execute<Row extends Record<string, unknown>>(request: SqlExecuteRequest): AsyncIterable<Row> { + const db = this.getDb(); + const engine = this.getEngine(); + const sql = normalizeSqlitePlaceholders(request.sql); + const usesNumericPlaceholders = /\?\d/.test(sql); + const params = request.params; + const bindings = + engine === 'node' && usesNumericPlaceholders ? toNumericBindings(params) : undefined; + + try { + const stmt = db.prepare(sql) as StatementLike; + const returnsRows = statementReturnsRows(stmt); + + if (returnsRows) { + const iterator = + engine === 'node' && usesNumericPlaceholders + ? bindings + ? stmt.iterate(bindings) + : stmt.iterate() + : params && params.length > 0 + ? stmt.iterate(...params) + : stmt.iterate(); + for (const row of iterator) { + yield row as Row; + } + return; + } + + if (engine === 'node' && usesNumericPlaceholders) { + bindings ?
stmt.run(bindings) : stmt.run(); + return; + } + + if (params && params.length > 0) { + stmt.run(...params); + return; + } + + stmt.run(); + } catch (error) { + throw normalizeSqliteError(error); + } + } + + async explain(requestOrSql: SqlExecuteRequest): Promise<SqlExplainResult>; + async explain(sql: string, params?: readonly unknown[]): Promise<SqlExplainResult>; + async explain( + requestOrSql: SqlExecuteRequest | string, + params?: readonly unknown[], + ): Promise<SqlExplainResult> { + const request: SqlExecuteRequest = + typeof requestOrSql === 'string' + ? params + ? { sql: requestOrSql, params } + : { sql: requestOrSql } + : requestOrSql; + + const db = this.getDb(); + const engine = this.getEngine(); + const sql = normalizeSqlitePlaceholders(request.sql); + const usesNumericPlaceholders = /\?\d/.test(sql); + const requestParams = request.params; + const bindings = + engine === 'node' && usesNumericPlaceholders ? toNumericBindings(requestParams) : undefined; + + try { + const stmt = db.prepare(`EXPLAIN QUERY PLAN ${sql}`) as StatementLike; + const rows = ( + engine === 'node' && usesNumericPlaceholders + ? bindings + ? stmt.all(bindings) + : stmt.all() + : requestParams && requestParams.length > 0 + ? stmt.all(...requestParams) + : stmt.all() + ) as Array<Record<string, unknown>>; + return { rows }; + } catch (error) { + throw normalizeSqliteError(error); + } + } + + async query<Row extends Record<string, unknown>>( + sql: string, + params?: readonly unknown[], + ): Promise<SqlQueryResult<Row>> { + const db = this.getDb(); + const engine = this.getEngine(); + const normalizedSql = normalizeSqlitePlaceholders(sql); + const usesNumericPlaceholders = /\?\d/.test(normalizedSql); + const bindings = + engine === 'node' && usesNumericPlaceholders ? toNumericBindings(params) : undefined; + + try { + const stmt = db.prepare(normalizedSql) as StatementLike; + const returnsRows = statementReturnsRows(stmt); + + if (returnsRows) { + const rows = ( + engine === 'node' && usesNumericPlaceholders + ? bindings + ? stmt.all(bindings) + : stmt.all() + : params && params.length > 0 + ?
stmt.all(...params) + : stmt.all() + ) as Row[]; + return { rows, rowCount: rows.length }; + } + + const result = + engine === 'node' && usesNumericPlaceholders + ? bindings + ? stmt.run(bindings) + : stmt.run() + : params && params.length > 0 + ? stmt.run(...params) + : stmt.run(); + return { + rows: [], + rowCount: typeof result.changes === 'bigint' ? Number(result.changes) : result.changes, + lastInsertRowid: + typeof result.lastInsertRowid === 'bigint' + ? Number(result.lastInsertRowid) + : result.lastInsertRowid, + }; + } catch (error) { + throw normalizeSqliteError(error); + } + } +} + +class SqliteConnectionImpl extends SqliteQueryable implements SqlConnection { + constructor( + private readonly engine: SqliteEngine, + private readonly db: SqliteDatabase, + ) { + super(); + } + + protected getDb(): SqliteDatabase { + return this.db; + } + + protected getEngine(): SqliteEngine { + return this.engine; + } + + async beginTransaction(): Promise<SqlTransaction> { + try { + // Use IMMEDIATE to acquire a write lock early (single writer). + this.db.exec('BEGIN IMMEDIATE'); + return new SqliteTransactionImpl(this.engine, this.db); + } catch (error) { + throw normalizeSqliteError(error); + } + } + + async release(): Promise<void> { + // Single-connection driver; no-op.
+ } +} + +class SqliteTransactionImpl extends SqliteQueryable implements SqlTransaction { + constructor( + private readonly engine: SqliteEngine, + private readonly db: SqliteDatabase, + ) { + super(); + } + + protected getDb(): SqliteDatabase { + return this.db; + } + + protected getEngine(): SqliteEngine { + return this.engine; + } + + async commit(): Promise<void> { + try { + this.db.exec('COMMIT'); + } catch (error) { + throw normalizeSqliteError(error); + } + } + + async rollback(): Promise<void> { + try { + this.db.exec('ROLLBACK'); + } catch (error) { + throw normalizeSqliteError(error); + } + } +} + +class SqliteDriverImpl extends SqliteQueryable implements SqlDriver { + constructor(private readonly conn: ConnectionOptions) { + super(); + } + + protected getDb(): SqliteDatabase { + return this.conn.db; + } + + protected getEngine(): SqliteEngine { + return this.conn.engine; + } + + async connect(): Promise<void> { + // No-op: connection is established at construction time. + } + + async acquireConnection(): Promise<SqlConnection> { + return new SqliteConnectionImpl(this.conn.engine, this.conn.db); + } + + async close(): Promise<void> { + if (this.conn.owned) { + this.conn.db.close(); + } + } +} + +function applyPragmas(db: SqliteDatabase, pragmas: SqliteDriverOptions['pragmas']): void { + if (!pragmas) { + return; + } + + for (const [key, value] of Object.entries(pragmas)) { + if (value === undefined) { + continue; + } + + const normalized = + value === null + ? 'NULL' + : typeof value === 'boolean' + ? value + ? 'ON' + : 'OFF' + : typeof value === 'number' + ?
String(value) + : `'${String(value).replace(/'/g, "''")}'`; + + db.exec(`PRAGMA ${key} = ${normalized}`); + } +} + +function registerFunctions( + engine: SqliteEngine, + db: SqliteDatabase, + functions: SqliteDriverOptions['functions'], +): void { + if (!functions) { + return; + } + + if (engine !== 'node') { + throw new Error('SQLite driver functions are only supported on the node:sqlite backend'); + } + + for (const [name, fn] of Object.entries(functions)) { + (db as DatabaseSync).function(name, fn); + } +} + +function toNumericBindings(params?: readonly unknown[]): Record<string, unknown> | undefined { + if (!params || params.length === 0) { + return undefined; + } + const bindings: Record<string, unknown> = {}; + for (let i = 0; i < params.length; i++) { + bindings[String(i + 1)] = params[i]; + } + return bindings; +} + +export function createSqliteDriverFromOptions(options: SqliteDriverOptions): SqlDriver { + if ('database' in options.connect) { + const inferred = inferEngineFromDatabase(options.connect.database); + if (options.engine && options.engine !== 'auto' && options.engine !== inferred) { + throw new Error( + `SqliteDriverOptions.engine is "${options.engine}" but the provided database handle looks like "${inferred}"`, + ); + } + const engine = inferred; + try { + applyPragmas(options.connect.database, options.pragmas); + registerFunctions(engine, options.connect.database, options.functions); + ensureMinimumSqlite(options.connect.database); + return Object.freeze( + new SqliteDriverImpl({ owned: false, engine, db: options.connect.database }), + ); + } catch (error) { + throw normalizeSqliteError(error); + } + } + + const engine = resolveEngine(options.engine); + const connectionString = + 'connectionString' in options.connect + ?
options.connect.connectionString + : options.connect.filename; + const filename = resolveSqliteFilename(connectionString); + if (options.connect.options?.fileMustExist && filename !== ':memory:' && !existsSync(filename)) { + // Mirror sqlite open errors as closely as we can for nicer envelopes. + const error = new Error(`SQLite database file does not exist: ${filename}`); + (error as { code?: string }).code = 'ERR_SQLITE_ERROR'; + (error as { errcode?: number }).errcode = 14; // SQLITE_CANTOPEN + throw normalizeSqliteError(error); + } + + try { + const db = + engine === 'node' + ? createDatabaseSync(filename, { + readOnly: options.connect.options?.readonly ?? false, + timeout: options.connect.options?.timeoutMs, + }) + : createBunDatabase(filename, { + readonly: options.connect.options?.readonly ?? false, + readwrite: !(options.connect.options?.readonly ?? false), + create: !(options.connect.options?.fileMustExist ?? false), + }); + + // Default safety: enforce FK constraints unless explicitly disabled. + applyPragmas( + db, + Object.freeze({ + foreign_keys: 'ON', + ...(options.connect.options?.timeoutMs + ? { busy_timeout: options.connect.options.timeoutMs } + : {}), + ...options.pragmas, + }), + ); + registerFunctions(engine, db, options.functions); + ensureMinimumSqlite(db); + + return Object.freeze(new SqliteDriverImpl({ owned: true, engine, db })); + } catch (error) { + throw normalizeSqliteError(error); + } +} + +export function createSqliteDriver(options: CreateSqliteDriverOptions): SqlDriver { + const connectionString = options.connectionString ??
options.filename; + if (!connectionString) { + throw new Error('createSqliteDriver requires connectionString or filename'); + } + return createSqliteDriverFromOptions({ + engine: options.engine, + connect: { connectionString, options: options.options }, + pragmas: options.pragmas, + functions: options.functions, + }); +} diff --git a/packages/3-targets/7-drivers/sqlite/test/driver.basic.test.ts b/packages/3-targets/7-drivers/sqlite/test/driver.basic.test.ts new file mode 100644 index 0000000000..e20e8dc9cf --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/test/driver.basic.test.ts @@ -0,0 +1,95 @@ +import { SqlConnectionError, SqlQueryError } from '@prisma-next/sql-errors'; +import { describe, expect, it } from 'vitest'; +import { createSqliteDriverFromOptions } from '../src/sqlite-driver'; + +describe('@prisma-next/driver-sqlite', () => { + it('queries and binds numeric params', async () => { + const driver = createSqliteDriverFromOptions({ + connect: { filename: ':memory:' }, + }); + + const result = await driver.query<{ v: number }>('select ?1 as v', [42]); + expect(result.rows).toEqual([{ v: 42 }]); + + await driver.close(); + }); + + it('streams rows via execute()', async () => { + const driver = createSqliteDriverFromOptions({ + connect: { filename: ':memory:' }, + }); + + await driver.query('create table items(id integer primary key, name text not null)'); + await driver.query('insert into items(name) values (?1), (?2)', ['a', 'b']); + + const rows: Array<{ id: number; name: string }> = []; + for await (const row of driver.execute<{ id: number; name: string }>({ + sql: 'select id, name from items order by id asc', + })) { + rows.push(row); + } + + expect(rows).toEqual([ + { id: 1, name: 'a' }, + { id: 2, name: 'b' }, + ]); + + await driver.close(); + }); + + it('supports transactions (commit + rollback)', async () => { + const driver = createSqliteDriverFromOptions({ + connect: { filename: ':memory:' }, + }); + + await driver.query('create table 
items(id integer primary key, name text not null)'); + + // Rollback + const conn1 = await driver.acquireConnection(); + const tx1 = await conn1.beginTransaction(); + await tx1.query('insert into items(name) values (?1)', ['a']); + await tx1.rollback(); + await conn1.release(); + + const countAfterRollback = await driver.query<{ c: number }>('select count(*) as c from items'); + expect(countAfterRollback.rows[0]?.c).toBe(0); + + // Commit + const conn2 = await driver.acquireConnection(); + const tx2 = await conn2.beginTransaction(); + await tx2.query('insert into items(name) values (?1)', ['b']); + await tx2.commit(); + await conn2.release(); + + const countAfterCommit = await driver.query<{ c: number }>('select count(*) as c from items'); + expect(countAfterCommit.rows[0]?.c).toBe(1); + + await driver.close(); + }); + + it('normalizes connection errors', async () => { + expect(() => + createSqliteDriverFromOptions({ + connect: { filename: '/this/does/not/exist/sqlite.db' }, + }), + ).toThrowError(SqlConnectionError); + }); + + it('normalizes query errors', async () => { + const driver = createSqliteDriverFromOptions({ + connect: { filename: ':memory:' }, + }); + + await driver.query('create table items(id integer primary key, name text unique)'); + await driver.query('insert into items(name) values (?1)', ['a']); + + try { + await driver.query('insert into items(name) values (?1)', ['a']); + throw new Error('expected unique constraint error'); + } catch (error) { + expect(error).toBeInstanceOf(SqlQueryError); + } finally { + await driver.close(); + } + }); +}); diff --git a/packages/3-targets/7-drivers/sqlite/tsconfig.build.json b/packages/3-targets/7-drivers/sqlite/tsconfig.build.json new file mode 100644 index 0000000000..671541c1a3 --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/tsconfig.build.json @@ -0,0 +1,12 @@ +{ + "extends": "./tsconfig.json", + "compilerOptions": { + "rootDir": "src", + "outDir": "dist", + "declaration": true, + "declarationMap": 
true, + "emitDeclarationOnly": true + }, + "include": ["src/**/*.ts"], + "exclude": ["test", "dist"] +} diff --git a/packages/3-targets/7-drivers/sqlite/tsconfig.json b/packages/3-targets/7-drivers/sqlite/tsconfig.json new file mode 100644 index 0000000000..7afa587436 --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/tsconfig.json @@ -0,0 +1,9 @@ +{ + "extends": ["@prisma-next/tsconfig/base"], + "compilerOptions": { + "rootDir": ".", + "outDir": "dist" + }, + "include": ["src/**/*.ts", "test/**/*.ts"], + "exclude": ["dist"] +} diff --git a/packages/3-targets/7-drivers/sqlite/tsup.config.ts b/packages/3-targets/7-drivers/sqlite/tsup.config.ts new file mode 100644 index 0000000000..7ed6b29770 --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/tsup.config.ts @@ -0,0 +1,28 @@ +import { defineConfig } from 'tsup'; + +export default defineConfig({ + entry: { + 'exports/control': 'src/exports/control.ts', + 'exports/runtime': 'src/exports/runtime.ts', + }, + outDir: 'dist', + format: ['esm'], + sourcemap: true, + dts: false, + clean: true, + target: 'es2022', + minify: false, + esbuildPlugins: [ + { + // esbuild strips the `node:` prefix from builtin imports. For `node:sqlite` that + // produces `sqlite`, which is not a valid builtin module specifier in Node. 
+ name: 'keep-node-sqlite', + setup(build) { + build.onResolve({ filter: /^sqlite$/ }, () => ({ + path: 'node:sqlite', + external: true, + })); + }, + }, + ], +}); diff --git a/packages/3-targets/7-drivers/sqlite/vitest.config.ts b/packages/3-targets/7-drivers/sqlite/vitest.config.ts new file mode 100644 index 0000000000..cf1e80a584 --- /dev/null +++ b/packages/3-targets/7-drivers/sqlite/vitest.config.ts @@ -0,0 +1,30 @@ +import { timeouts } from '@prisma-next/test-utils'; +import { defineConfig } from 'vitest/config'; + +export default defineConfig({ + test: { + globals: true, + environment: 'node', + testTimeout: timeouts.default, + hookTimeout: timeouts.default, + coverage: { + provider: 'v8', + reporter: ['text', 'json', 'html'], + include: ['src/**/*.ts'], + exclude: [ + 'dist/**', + 'test/**', + '**/*.test.ts', + '**/*.test-d.ts', + '**/*.config.ts', + '**/exports/**', + ], + thresholds: { + lines: 94, + branches: 89, + functions: 100, + statements: 94, + }, + }, + }, +}); diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index a753b9432f..904d2337db 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -184,6 +184,100 @@ importers: specifier: 'catalog:' version: 4.0.17(@types/node@24.10.4)(jiti@2.6.1)(tsx@4.20.6)(yaml@2.8.1) + examples/prisma-next-demo-sqlite: + dependencies: + '@prisma-next/adapter-sqlite': + specifier: workspace:* + version: link:../../packages/3-targets/6-adapters/sqlite + '@prisma-next/contract': + specifier: workspace:* + version: link:../../packages/1-framework/1-core/shared/contract + '@prisma-next/core-execution-plane': + specifier: workspace:* + version: link:../../packages/1-framework/1-core/runtime/execution-plane + '@prisma-next/driver-sqlite': + specifier: workspace:* + version: link:../../packages/3-targets/7-drivers/sqlite + '@prisma-next/extension-sqlite-vector': + specifier: workspace:* + version: link:../../packages/3-extensions/sqlite-vector + '@prisma-next/family-sql': + specifier: workspace:* + version: 
link:../../packages/2-sql/3-tooling/family + '@prisma-next/integration-kysely': + specifier: workspace:* + version: link:../../packages/3-extensions/integration-kysely + '@prisma-next/sql-contract': + specifier: workspace:* + version: link:../../packages/2-sql/1-core/contract + '@prisma-next/sql-contract-ts': + specifier: workspace:* + version: link:../../packages/2-sql/2-authoring/contract-ts + '@prisma-next/sql-lane': + specifier: workspace:* + version: link:../../packages/2-sql/4-lanes/sql-lane + '@prisma-next/sql-orm-lane': + specifier: workspace:* + version: link:../../packages/2-sql/4-lanes/orm-lane + '@prisma-next/sql-relational-core': + specifier: workspace:* + version: link:../../packages/2-sql/4-lanes/relational-core + '@prisma-next/sql-runtime': + specifier: workspace:* + version: link:../../packages/2-sql/5-runtime + '@prisma-next/target-sqlite': + specifier: workspace:* + version: link:../../packages/3-targets/3-targets/sqlite + arktype: + specifier: ^2.1.29 + version: 2.1.29 + dotenv: + specifier: ^16.4.5 + version: 16.6.1 + kysely: + specifier: 'catalog:' + version: 0.28.10 + devDependencies: + '@prisma-next/cli': + specifier: workspace:* + version: link:../../packages/1-framework/3-tooling/cli + '@prisma-next/core-control-plane': + specifier: workspace:* + version: link:../../packages/1-framework/1-core/migration/control-plane + '@prisma-next/emitter': + specifier: workspace:* + version: link:../../packages/1-framework/3-tooling/emitter + '@prisma-next/sql-contract-emitter': + specifier: workspace:* + version: link:../../packages/2-sql/3-tooling/emitter + '@prisma-next/test-utils': + specifier: workspace:* + version: link:../../test/utils + '@prisma-next/tsconfig': + specifier: workspace:* + version: link:../../packages/0-config/tsconfig + '@prisma-next/vite-plugin-contract-emit': + specifier: workspace:* + version: link:../../packages/1-framework/3-tooling/vite-plugin-contract-emit + '@types/node': + specifier: 'catalog:' + version: 24.10.4 + tsup: 
+ specifier: 'catalog:' + version: 8.5.1(jiti@2.6.1)(postcss@8.5.6)(tsx@4.20.6)(typescript@5.9.3)(yaml@2.8.1) + tsx: + specifier: ^4.19.2 + version: 4.20.6 + typescript: + specifier: 'catalog:' + version: 5.9.3 + vite: + specifier: 'catalog:' + version: 7.3.1(@types/node@24.10.4)(jiti@2.6.1)(tsx@4.20.6)(yaml@2.8.1) + vitest: + specifier: 'catalog:' + version: 4.0.17(@types/node@24.10.4)(jiti@2.6.1)(tsx@4.20.6)(yaml@2.8.1) + examples/prisma-orm-demo: dependencies: '@prisma-next/adapter-postgres': @@ -1340,6 +1434,46 @@ importers: specifier: 'catalog:' version: 4.0.17(@types/node@24.10.4)(jiti@2.6.1)(tsx@4.20.6)(yaml@2.8.1) + packages/3-extensions/sqlite-vector: + dependencies: + '@prisma-next/contract': + specifier: workspace:* + version: link:../../1-framework/1-core/shared/contract + '@prisma-next/contract-authoring': + specifier: workspace:* + version: link:../../1-framework/2-authoring/contract + '@prisma-next/family-sql': + specifier: workspace:* + version: link:../../2-sql/3-tooling/family + '@prisma-next/sql-operations': + specifier: workspace:* + version: link:../../2-sql/1-core/operations + '@prisma-next/sql-relational-core': + specifier: workspace:* + version: link:../../2-sql/4-lanes/relational-core + '@prisma-next/sql-runtime': + specifier: workspace:* + version: link:../../2-sql/5-runtime + devDependencies: + '@prisma-next/operations': + specifier: workspace:* + version: link:../../1-framework/1-core/shared/operations + '@prisma-next/test-utils': + specifier: workspace:* + version: link:../../../test/utils + '@prisma-next/tsconfig': + specifier: workspace:* + version: link:../../0-config/tsconfig + tsup: + specifier: 'catalog:' + version: 8.5.1(jiti@2.6.1)(postcss@8.5.6)(tsx@4.20.6)(typescript@5.9.3)(yaml@2.8.1) + typescript: + specifier: 'catalog:' + version: 5.9.3 + vitest: + specifier: 'catalog:' + version: 4.0.17(@types/node@24.10.4)(jiti@2.6.1)(tsx@4.20.6)(yaml@2.8.1) + packages/3-targets/3-targets/postgres: dependencies: '@prisma-next/cli': @@ 
-1395,6 +1529,61 @@ importers: specifier: 'catalog:' version: 4.0.17(@types/node@24.10.4)(jiti@2.6.1)(tsx@4.20.6)(yaml@2.8.1) + packages/3-targets/3-targets/sqlite: + dependencies: + '@prisma-next/cli': + specifier: workspace:* + version: link:../../../1-framework/3-tooling/cli + '@prisma-next/contract': + specifier: workspace:* + version: link:../../../1-framework/1-core/shared/contract + '@prisma-next/core-control-plane': + specifier: workspace:* + version: link:../../../1-framework/1-core/migration/control-plane + '@prisma-next/core-execution-plane': + specifier: workspace:* + version: link:../../../1-framework/1-core/runtime/execution-plane + '@prisma-next/family-sql': + specifier: workspace:* + version: link:../../../2-sql/3-tooling/family + '@prisma-next/sql-contract': + specifier: workspace:* + version: link:../../../2-sql/1-core/contract + '@prisma-next/sql-errors': + specifier: workspace:* + version: link:../../../2-sql/1-core/errors + '@prisma-next/sql-schema-ir': + specifier: workspace:* + version: link:../../../2-sql/1-core/schema-ir + '@prisma-next/utils': + specifier: workspace:* + version: link:../../../1-framework/1-core/shared/utils + arktype: + specifier: ^2.0.0 + version: 2.1.29 + devDependencies: + '@prisma-next/adapter-sqlite': + specifier: workspace:* + version: link:../../6-adapters/sqlite + '@prisma-next/driver-sqlite': + specifier: workspace:* + version: link:../../7-drivers/sqlite + '@prisma-next/test-utils': + specifier: workspace:* + version: link:../../../../test/utils + '@prisma-next/tsconfig': + specifier: workspace:* + version: link:../../../0-config/tsconfig + tsup: + specifier: 'catalog:' + version: 8.5.1(jiti@2.6.1)(postcss@8.5.6)(tsx@4.20.6)(typescript@5.9.3)(yaml@2.8.1) + typescript: + specifier: 'catalog:' + version: 5.9.3 + vitest: + specifier: 'catalog:' + version: 4.0.17(@types/node@24.10.4)(jiti@2.6.1)(tsx@4.20.6)(yaml@2.8.1) + packages/3-targets/6-adapters/postgres: dependencies: '@prisma-next/cli': @@ -1453,6 +1642,64 @@ 
importers: specifier: 'catalog:' version: 4.0.17(@types/node@24.10.4)(jiti@2.6.1)(tsx@4.20.6)(yaml@2.8.1) + packages/3-targets/6-adapters/sqlite: + dependencies: + '@prisma-next/cli': + specifier: workspace:* + version: link:../../../1-framework/3-tooling/cli + '@prisma-next/contract': + specifier: workspace:* + version: link:../../../1-framework/1-core/shared/contract + '@prisma-next/contract-authoring': + specifier: workspace:* + version: link:../../../1-framework/2-authoring/contract + '@prisma-next/core-control-plane': + specifier: workspace:* + version: link:../../../1-framework/1-core/migration/control-plane + '@prisma-next/core-execution-plane': + specifier: workspace:* + version: link:../../../1-framework/1-core/runtime/execution-plane + '@prisma-next/family-sql': + specifier: workspace:* + version: link:../../../2-sql/3-tooling/family + '@prisma-next/sql-contract': + specifier: workspace:* + version: link:../../../2-sql/1-core/contract + '@prisma-next/sql-contract-ts': + specifier: workspace:* + version: link:../../../2-sql/2-authoring/contract-ts + '@prisma-next/sql-operations': + specifier: workspace:* + version: link:../../../2-sql/1-core/operations + '@prisma-next/sql-relational-core': + specifier: workspace:* + version: link:../../../2-sql/4-lanes/relational-core + '@prisma-next/sql-schema-ir': + specifier: workspace:* + version: link:../../../2-sql/1-core/schema-ir + '@prisma-next/utils': + specifier: workspace:* + version: link:../../../1-framework/1-core/shared/utils + arktype: + specifier: ^2.0.0 + version: 2.1.29 + devDependencies: + '@prisma-next/test-utils': + specifier: workspace:* + version: link:../../../../test/utils + '@prisma-next/tsconfig': + specifier: workspace:* + version: link:../../../0-config/tsconfig + tsup: + specifier: 'catalog:' + version: 8.5.1(jiti@2.6.1)(postcss@8.5.6)(tsx@4.20.6)(typescript@5.9.3)(yaml@2.8.1) + typescript: + specifier: 'catalog:' + version: 5.9.3 + vitest: + specifier: 'catalog:' + version: 
4.0.17(@types/node@24.10.4)(jiti@2.6.1)(tsx@4.20.6)(yaml@2.8.1) + packages/3-targets/7-drivers/postgres: dependencies: '@prisma-next/contract': @@ -1514,6 +1761,52 @@ importers: specifier: 'catalog:' version: 4.0.17(@types/node@24.10.4)(jiti@2.6.1)(tsx@4.20.6)(yaml@2.8.1) + packages/3-targets/7-drivers/sqlite: + dependencies: + '@prisma-next/contract': + specifier: workspace:* + version: link:../../../1-framework/1-core/shared/contract + '@prisma-next/core-control-plane': + specifier: workspace:* + version: link:../../../1-framework/1-core/migration/control-plane + '@prisma-next/core-execution-plane': + specifier: workspace:* + version: link:../../../1-framework/1-core/runtime/execution-plane + '@prisma-next/sql-contract': + specifier: workspace:* + version: link:../../../2-sql/1-core/contract + '@prisma-next/sql-errors': + specifier: workspace:* + version: link:../../../2-sql/1-core/errors + '@prisma-next/sql-operations': + specifier: workspace:* + version: link:../../../2-sql/1-core/operations + '@prisma-next/sql-relational-core': + specifier: workspace:* + version: link:../../../2-sql/4-lanes/relational-core + '@prisma-next/utils': + specifier: workspace:* + version: link:../../../1-framework/1-core/shared/utils + arktype: + specifier: ^2.0.0 + version: 2.1.29 + devDependencies: + '@prisma-next/test-utils': + specifier: workspace:* + version: link:../../../../test/utils + '@prisma-next/tsconfig': + specifier: workspace:* + version: link:../../../0-config/tsconfig + tsup: + specifier: 'catalog:' + version: 8.5.1(jiti@2.6.1)(postcss@8.5.6)(tsx@4.20.6)(typescript@5.9.3)(yaml@2.8.1) + typescript: + specifier: 'catalog:' + version: 5.9.3 + vitest: + specifier: 'catalog:' + version: 4.0.17(@types/node@24.10.4)(jiti@2.6.1)(tsx@4.20.6)(yaml@2.8.1) + test/e2e/framework: dependencies: '@prisma-next/adapter-postgres': diff --git a/tsconfig.base.json b/tsconfig.base.json index 0d84e954d4..3d38206eff 100644 --- a/tsconfig.base.json +++ b/tsconfig.base.json @@ -38,9 +38,14 @@ 
{ "path": "./packages/3-extensions/compat-prisma" }, { "path": "./packages/3-extensions/pgvector" }, + { "path": "./packages/3-extensions/sqlite-vector" }, { "path": "./packages/3-targets/3-targets/postgres" }, { "path": "./packages/3-targets/6-adapters/postgres" }, - { "path": "./packages/3-targets/7-drivers/postgres" } + { "path": "./packages/3-targets/7-drivers/postgres" }, + + { "path": "./packages/3-targets/3-targets/sqlite" }, + { "path": "./packages/3-targets/6-adapters/sqlite" }, + { "path": "./packages/3-targets/7-drivers/sqlite" } ] }