diff --git a/docs/architecture docs/adrs/ADR 187 - MongoDB schema representation for migration diffing.md b/docs/architecture docs/adrs/ADR 187 - MongoDB schema representation for migration diffing.md index 74e4a22d0b..9018015123 100644 --- a/docs/architecture docs/adrs/ADR 187 - MongoDB schema representation for migration diffing.md +++ b/docs/architecture docs/adrs/ADR 187 - MongoDB schema representation for migration diffing.md @@ -76,7 +76,7 @@ MongoDB has a small set of server-side objects that migrations manage. Each one | `MongoSchemaCollectionOptions` | Capped, timeseries, collation, etc. | `{ capped: true, size: 1048576 }` | -Currently only `MongoSchemaCollection` and `MongoSchemaIndex` are implemented. Validators and collection options are defined in the visitor interface (so adding them later produces a compile error in all consumers) but not yet built. +All four node types are implemented. `MongoSchemaValidator` holds `$jsonSchema`, `validationLevel`, and `validationAction`. `MongoSchemaCollectionOptions` holds capped, timeseries, collation, clusteredIndex, and changeStreamPreAndPostImages. ## Decision @@ -94,17 +94,19 @@ interface MongoSchemaIR { An empty IR (for a new project with no prior contract) is `{ collections: {} }`. -A collection groups its indexes (and, in the future, its validator and options): +A collection groups its indexes, validator, and options: ```ts class MongoSchemaCollection extends MongoSchemaNode { readonly kind = 'collection' as const; readonly name: string; readonly indexes: ReadonlyArray; + readonly validator?: MongoSchemaValidator; + readonly options?: MongoSchemaCollectionOptions; } ``` -An index — the most important node — is defined by its keys and options: +An index is defined by its keys and options: ```ts class MongoSchemaIndex extends MongoSchemaNode { @@ -114,6 +116,11 @@ class MongoSchemaIndex extends MongoSchemaNode { readonly sparse?: boolean; readonly expireAfterSeconds?: number; readonly partialFilterExpression?: Record; + readonly wildcardProjection?: Record; + readonly collation?: Record; + readonly weights?: Record; + readonly default_language?: string; + readonly language_override?: string; } ``` @@ -133,12 +140,12 @@ Each node extends `MongoSchemaNode` and implements `accept(visitor: MongoSche interface MongoSchemaVisitor { collection(node: MongoSchemaCollection): R; index(node: MongoSchemaIndex): R; - validator(node: unknown): R; - collectionOptions(node: unknown): R; + validator(node: MongoSchemaValidator): R; + collectionOptions(node: MongoSchemaCollectionOptions): R; } ``` -Adding a new node type (e.g. `MongoSchemaValidator`) requires adding a method to this interface. Every existing visitor implementation gets a compile error until it handles the new case. This is the same exhaustiveness guarantee used by the DDL command visitors and filter expression visitors elsewhere in the codebase. +Adding a new node type requires adding a method to this interface. Every existing visitor implementation gets a compile error until it handles the new case. This is the same exhaustiveness guarantee used by the DDL command visitors and filter expression visitors elsewhere in the codebase. ### Structural identity for indexes @@ -168,4 +175,4 @@ We considered a generic `DocumentSchemaIR` shared across all document databases ### Define only the nodes needed today -We considered defining nodes only for indexes and adding collection/validator/options nodes later. We chose to define the full visitor interface up front (with `unknown` parameter types for unimplemented nodes) so that future additions produce compile errors in existing code. The node classes themselves are added incrementally — only `MongoSchemaCollection` and `MongoSchemaIndex` exist today. +We considered defining nodes only for indexes and adding collection/validator/options nodes later. We chose to define the full visitor interface up front so that future additions produce compile errors in existing code. All four node types (`MongoSchemaCollection`, `MongoSchemaIndex`, `MongoSchemaValidator`, `MongoSchemaCollectionOptions`) are now implemented. diff --git a/docs/architecture docs/adrs/ADR 188 - MongoDB migration operation model.md b/docs/architecture docs/adrs/ADR 188 - MongoDB migration operation model.md index 53c6aad483..2bdf5b4283 100644 --- a/docs/architecture docs/adrs/ADR 188 - MongoDB migration operation model.md +++ b/docs/architecture docs/adrs/ADR 188 - MongoDB migration operation model.md @@ -98,7 +98,7 @@ interface MongoMigrationStep { } ``` -The current command vocabulary is `CreateIndexCommand` and `DropIndexCommand`. Future additions (e.g. `CreateCollectionCommand`, `DropCollectionCommand`, `CollModCommand`) follow the same `MongoAstNode` pattern: frozen, `kind`-discriminated, `accept(visitor)` for dispatch. Adding a new command means one new class and one new case in the command executor — not a new operation type. +The command vocabulary is `CreateIndexCommand`, `DropIndexCommand`, `CreateCollectionCommand`, `DropCollectionCommand`, and `CollModCommand`. All follow the same `MongoAstNode` pattern: frozen, `kind`-discriminated, `accept(visitor)` for dispatch. Adding a new command means one new class and one new case in the command executor — not a new operation type. ### Checks diff --git a/docs/architecture docs/adrs/ADR 189 - Structural index matching for MongoDB migrations.md b/docs/architecture docs/adrs/ADR 189 - Structural index matching for MongoDB migrations.md index 3c30ed12e7..14e9dbde63 100644 --- a/docs/architecture docs/adrs/ADR 189 - Structural index matching for MongoDB migrations.md +++ b/docs/architecture docs/adrs/ADR 189 - Structural index matching for MongoDB migrations.md @@ -35,7 +35,12 @@ function buildIndexLookupKey(index: MongoSchemaIndex): string { index.unique ? 'unique' : '', index.sparse ? 'sparse' : '', index.expireAfterSeconds != null ? `ttl:${index.expireAfterSeconds}` : '', - index.partialFilterExpression ? `pfe:${JSON.stringify(index.partialFilterExpression)}` : '', + index.partialFilterExpression ? `pfe:${canonicalize(index.partialFilterExpression)}` : '', + index.wildcardProjection ? `wp:${canonicalize(index.wildcardProjection)}` : '', + index.collation ? `col:${canonicalize(index.collation)}` : '', + index.weights ? `wt:${canonicalize(index.weights)}` : '', + index.default_language ? `dl:${index.default_language}` : '', + index.language_override ? `lo:${index.language_override}` : '', ] .filter(Boolean) .join(';'); @@ -43,6 +48,8 @@ function buildIndexLookupKey(index: MongoSchemaIndex): string { } ``` +Object-valued options (`partialFilterExpression`, `wildcardProjection`, `collation`, `weights`) use `canonicalize()` — a key-order-independent serialization — so that `{ locale: 'en', strength: 2 }` and `{ strength: 2, locale: 'en' }` produce the same lookup key. + Two indexes that produce the same lookup key are the same index. For example: @@ -72,6 +79,11 @@ Each component is included because it changes the index's behavior at the databa - **`sparse`**. A sparse index omits documents missing the indexed field. - **`expireAfterSeconds`**. A TTL index with a 24-hour expiry is different from one with a 7-day expiry. - **`partialFilterExpression`**. A partial index scoped to `{ status: "active" }` is different from one scoped to `{ status: "archived" }`. +- **`wildcardProjection`**. A wildcard index on `{ name: 1, email: 1 }` differs from `{ name: 1 }`. +- **`collation`**. Per-index collation changes sort and comparison behavior. +- **`weights`**. Text index weights change relevance scoring. +- **`default_language`**. Changes how text indexes tokenize and stem words. +- **`language_override`**. Changes the per-document field used to determine language. ### What the lookup key excludes diff --git a/examples/mongo-demo/src/contract.json b/examples/mongo-demo/src/contract.json index a3fbacf27e..aabf6d706f 100644 --- a/examples/mongo-demo/src/contract.json +++ b/examples/mongo-demo/src/contract.json @@ -220,8 +220,8 @@ "users": { "indexes": [ { - "fields": { "email": 1 }, - "options": { "unique": true } + "keys": [{ "field": "email", "direction": 1 }], + "unique": true } ] } diff --git a/packages/1-framework/3-tooling/cli/src/control-api/operations/extract-mongo-statements.ts b/packages/1-framework/3-tooling/cli/src/control-api/operations/extract-mongo-statements.ts deleted file mode 100644 index 68d5eaec3f..0000000000 --- a/packages/1-framework/3-tooling/cli/src/control-api/operations/extract-mongo-statements.ts +++ /dev/null @@ -1,86 +0,0 @@ -import type { MigrationPlanOperation } from '@prisma-next/framework-components/control'; - -interface MongoIndexKey { - readonly field: string; - readonly direction: number | string; -} - -interface MongoCommand { - readonly kind: string; - readonly collection?: string; - readonly keys?: ReadonlyArray; - readonly unique?: boolean; - readonly sparse?: boolean; - readonly expireAfterSeconds?: number; - readonly name?: string; -} - -interface MongoExecuteStep { - readonly command: MongoCommand; -} - -function formatKeySpec(keys: ReadonlyArray): string { - const entries = keys.map((k) => `${JSON.stringify(k.field)}: ${JSON.stringify(k.direction)}`); - return `{ ${entries.join(', ')} }`; -} - -function formatCreateIndexOptions(cmd: MongoCommand): string | undefined { - const parts: string[] = []; - if (cmd.unique) parts.push('unique: true'); - if (cmd.sparse) parts.push('sparse: true'); - if (cmd.expireAfterSeconds !== undefined) - parts.push(`expireAfterSeconds: ${cmd.expireAfterSeconds}`); - if (cmd.name) parts.push(`name: ${JSON.stringify(cmd.name)}`); - if (parts.length === 0) return undefined; - return `{ ${parts.join(', ')} }`; -} - -function formatCommand(cmd: MongoCommand): string | undefined { - switch (cmd.kind) { - case 'createIndex': { - if (!cmd.keys || !cmd.collection) return undefined; - const keySpec = formatKeySpec(cmd.keys); - const opts = formatCreateIndexOptions(cmd); - return opts - ? `db.${cmd.collection}.createIndex(${keySpec}, ${opts})` - : `db.${cmd.collection}.createIndex(${keySpec})`; - } - case 'dropIndex': - if (!cmd.collection || !cmd.name) return undefined; - return `db.${cmd.collection}.dropIndex(${JSON.stringify(cmd.name)})`; - default: - return undefined; - } -} - -function hasMongoExecuteSteps( - operation: MigrationPlanOperation, -): operation is MigrationPlanOperation & { readonly execute: readonly MongoExecuteStep[] } { - const candidate = operation as unknown as Record; - if (!('execute' in candidate) || !Array.isArray(candidate['execute'])) { - return false; - } - return candidate['execute'].every( - (step: unknown) => - typeof step === 'object' && - step !== null && - 'command' in step && - typeof (step as Record)['command'] === 'object', - ); -} - -export function extractMongoStatements(operations: readonly MigrationPlanOperation[]): string[] { - const statements: string[] = []; - for (const operation of operations) { - if (!hasMongoExecuteSteps(operation)) { - continue; - } - for (const step of operation.execute) { - const formatted = formatCommand(step.command); - if (formatted) { - statements.push(formatted); - } - } - } - return statements; -} diff --git a/packages/1-framework/3-tooling/cli/src/control-api/operations/extract-operation-statements.ts b/packages/1-framework/3-tooling/cli/src/control-api/operations/extract-operation-statements.ts index 8128b5659e..eab1de6d19 100644 --- a/packages/1-framework/3-tooling/cli/src/control-api/operations/extract-operation-statements.ts +++ b/packages/1-framework/3-tooling/cli/src/control-api/operations/extract-operation-statements.ts @@ -1,5 +1,4 @@ import type { MigrationPlanOperation } from '@prisma-next/framework-components/control'; -import { extractMongoStatements } from './extract-mongo-statements'; import { extractSqlDdl } from './extract-sql-ddl'; export function extractOperationStatements( @@ -9,8 +8,6 @@ export function extractOperationStatements( switch (familyId) { case 'sql': return extractSqlDdl(operations); - case 'mongo': - return extractMongoStatements(operations); default: return undefined; } diff --git a/packages/1-framework/3-tooling/cli/test/extract-operation-statements.test.ts b/packages/1-framework/3-tooling/cli/test/extract-operation-statements.test.ts index 228869250b..760738f42e 100644 --- a/packages/1-framework/3-tooling/cli/test/extract-operation-statements.test.ts +++ b/packages/1-framework/3-tooling/cli/test/extract-operation-statements.test.ts @@ -23,58 +23,9 @@ describe('extractOperationStatements', () => { expect(result).toContain('CREATE TABLE t (id int)'); }); - it('delegates to Mongo extractor for mongo family', () => { - const ops: MigrationPlanOperation[] = [ - { - id: 'op1', - label: 'test', - operationClass: 'additive', - execute: [ - { - description: 'create index', - command: { - kind: 'createIndex', - collection: 'users', - keys: [{ field: 'email', direction: 1 }], - }, - }, - ], - } as unknown as MigrationPlanOperation, - ]; - const result = extractOperationStatements('mongo', ops); - expect(result).toBeDefined(); - expect(result).toHaveLength(1); - expect(result![0]).toContain('db.users.createIndex'); - expect(result![0]).toContain('"email"'); - }); - - it('extracts mongo dropIndex statement', () => { - const ops: MigrationPlanOperation[] = [ - { - id: 'op1', - label: 'test', - operationClass: 'destructive', - execute: [ - { - description: 'drop index', - command: { - kind: 'dropIndex', - collection: 'users', - name: 'email_1', - }, - }, - ], - } as unknown as MigrationPlanOperation, - ]; - const result = extractOperationStatements('mongo', ops); - expect(result).toEqual(['db.users.dropIndex("email_1")']); - }); - - it('returns empty array for mongo family with no execute steps', () => { - const ops: MigrationPlanOperation[] = [ - { id: 'op1', label: 'test', operationClass: 'additive' } as unknown as MigrationPlanOperation, - ]; + it('returns undefined for mongo family', () => { + const ops: MigrationPlanOperation[] = []; const result = extractOperationStatements('mongo', ops); - expect(result).toEqual([]); + expect(result).toBeUndefined(); }); }); diff --git a/packages/2-mongo-family/1-foundation/mongo-contract/src/contract-schema.ts b/packages/2-mongo-family/1-foundation/mongo-contract/src/contract-schema.ts index 5de5ded267..86d6fdecbc 100644 --- a/packages/2-mongo-family/1-foundation/mongo-contract/src/contract-schema.ts +++ b/packages/2-mongo-family/1-foundation/mongo-contract/src/contract-schema.ts @@ -240,10 +240,65 @@ const IndexSchema = type({ 'options?': IndexOptionsSchema, }); +const MongoIndexKeySchema = type({ + '+': 'reject', + field: 'string', + direction: '1 | -1 | "text" | "2dsphere" | "2d" | "hashed"', +}); + +const MongoStorageIndexSchema = type({ + '+': 'reject', + keys: MongoIndexKeySchema.array().atLeastLength(1), + 'unique?': 'boolean', + 'sparse?': 'boolean', + 'expireAfterSeconds?': 'number', + 'partialFilterExpression?': 'Record', + 'wildcardProjection?': 'Record', + 'collation?': 'Record', + 'weights?': 'Record', + 'default_language?': 'string', + 'language_override?': 'string', +}); + +const MongoStorageValidatorSchema = type({ + '+': 'reject', + jsonSchema: 'Record', + validationLevel: "'strict' | 'moderate'", + validationAction: "'error' | 'warn'", +}); + +const CappedOptionsSchema = type({ + '+': 'reject', + size: 'number', + 'max?': 'number', +}); + +const TimeseriesOptionsSchema = type({ + '+': 'reject', + timeField: 'string', + 'metaField?': 'string', + 'granularity?': "'seconds' | 'minutes' | 'hours'", +}); + +const ClusteredIndexSchema = type({ + '+': 'reject', + 'name?': 'string', +}); + +const MongoCollectionOptionsSchema = type({ + '+': 'reject', + 'capped?': CappedOptionsSchema, + 'timeseries?': TimeseriesOptionsSchema, + 'collation?': 'Record', + 'changeStreamPreAndPostImages?': ChangeStreamPreAndPostImagesSchema, + 'clusteredIndex?': ClusteredIndexSchema, +}); + const StorageCollectionSchema = type({ '+': 'reject', - 'indexes?': IndexSchema.array(), - 'options?': CollectionOptionsSchema, + 'indexes?': MongoStorageIndexSchema.array(), + 'validator?': MongoStorageValidatorSchema, + 'options?': MongoCollectionOptionsSchema, }); export const MongoContractSchema = type({ @@ -269,3 +324,15 @@ export const MongoContractSchema = type({ '[string]': type({ '+': 'reject', fields: type({ '[string]': FieldSchema }) }), }), }); + +export { + CollationSchema, + CollectionOptionsSchema, + IndexFieldsSchema, + IndexOptionsSchema, + IndexSchema, + MongoIndexKeySchema, + MongoStorageIndexSchema, + NumberRecordSchema, + WildcardProjectionSchema, +}; diff --git a/packages/2-mongo-family/1-foundation/mongo-contract/src/contract-types.ts b/packages/2-mongo-family/1-foundation/mongo-contract/src/contract-types.ts index c904cb0496..d0349b5d0c 100644 --- a/packages/2-mongo-family/1-foundation/mongo-contract/src/contract-types.ts +++ b/packages/2-mongo-family/1-foundation/mongo-contract/src/contract-types.ts @@ -117,11 +117,35 @@ export interface MongoStorageIndex { readonly sparse?: boolean; readonly expireAfterSeconds?: number; readonly partialFilterExpression?: Record; + readonly wildcardProjection?: Record; + readonly collation?: Record; + readonly weights?: Record; + readonly default_language?: string; + readonly language_override?: string; +} + +export interface MongoStorageValidator { + readonly jsonSchema: Record; + readonly validationLevel: 'strict' | 'moderate'; + readonly validationAction: 'error' | 'warn'; +} + +export interface MongoStorageCollectionOptions { + readonly capped?: { size: number; max?: number }; + readonly timeseries?: { + timeField: string; + metaField?: string; + granularity?: 'seconds' | 'minutes' | 'hours'; + }; + readonly collation?: Record; + readonly changeStreamPreAndPostImages?: { enabled: boolean }; + readonly clusteredIndex?: { name?: string }; } export interface MongoStorageCollection { - readonly indexes?: readonly MongoIndex[]; - readonly options?: MongoCollectionOptions; + readonly indexes?: ReadonlyArray; + readonly validator?: MongoStorageValidator; + readonly options?: MongoStorageCollectionOptions; } export type MongoStorage = StorageBase & { diff --git a/packages/2-mongo-family/1-foundation/mongo-contract/src/exports/index.ts b/packages/2-mongo-family/1-foundation/mongo-contract/src/exports/index.ts index 3e31530179..fe937bdc67 100644 --- a/packages/2-mongo-family/1-foundation/mongo-contract/src/exports/index.ts +++ b/packages/2-mongo-family/1-foundation/mongo-contract/src/exports/index.ts @@ -28,7 +28,9 @@ export type { MongoModelStorage, MongoStorage, MongoStorageCollection, + MongoStorageCollectionOptions, MongoStorageIndex, + MongoStorageValidator, MongoTimeSeriesCollectionOptions, MongoTimeSeriesGranularity, MongoTypeMaps, diff --git a/packages/2-mongo-family/1-foundation/mongo-contract/test/validate.test.ts b/packages/2-mongo-family/1-foundation/mongo-contract/test/validate.test.ts index bd2e245f67..8466ed759f 100644 --- a/packages/2-mongo-family/1-foundation/mongo-contract/test/validate.test.ts +++ b/packages/2-mongo-family/1-foundation/mongo-contract/test/validate.test.ts @@ -77,15 +77,10 @@ describe('validateMongoContract()', () => { items: { indexes: [ { - fields: { _id: 1 }, - options: { - unique: true, - hidden: true, - name: 'item_id_idx', - collation: { locale: 'en', strength: 2 }, - }, + keys: [{ field: '_id', direction: 1 }], + unique: true, }, - { fields: { name: 'text' } }, + { keys: [{ field: 'name', direction: 'text' }] }, ], }, }, @@ -106,26 +101,21 @@ describe('validateMongoContract()', () => { expect(result.contract.storage.collections['items']).toEqual({ indexes: [ { - fields: { _id: 1 }, - options: { - unique: true, - hidden: true, - name: 'item_id_idx', - collation: { locale: 'en', strength: 2 }, - }, + keys: [{ field: '_id', direction: 1 }], + unique: true, }, - { fields: { name: 'text' } }, + { keys: [{ field: 'name', direction: 'text' }] }, ], }); }); - it('rejects empty index field maps', () => { + it('rejects empty index keys array', () => { const json = { ...makeValidContractJson(), storage: { collections: { items: { - indexes: [{ fields: {} }], + indexes: [{ keys: [] }], }, }, }, @@ -141,9 +131,7 @@ describe('validateMongoContract()', () => { collections: { items: { options: { - capped: true, - size: 4096, - expireAfterSeconds: 3600, + capped: { size: 4096, max: 100 }, collation: { locale: 'en', strength: 2 }, changeStreamPreAndPostImages: { enabled: true }, timeseries: { @@ -152,8 +140,6 @@ describe('validateMongoContract()', () => { }, clusteredIndex: { name: '_id_', - key: { _id: 1 }, - unique: true, }, }, }, @@ -165,9 +151,7 @@ describe('validateMongoContract()', () => { expect(result.contract.storage.collections['items']).toEqual({ options: { - capped: true, - size: 4096, - expireAfterSeconds: 3600, + capped: { size: 4096, max: 100 }, collation: { locale: 'en', strength: 2 }, changeStreamPreAndPostImages: { enabled: true }, timeseries: { @@ -176,14 +160,12 @@ describe('validateMongoContract()', () => { }, clusteredIndex: { name: '_id_', - key: { _id: 1 }, - unique: true, }, }, }); }); - it('accepts record-shaped index and collection option maps', () => { + it('accepts record-shaped index partialFilterExpression', () => { const json = { ...makeValidContractJson(), storage: { @@ -191,32 +173,13 @@ describe('validateMongoContract()', () => { items: { indexes: [ { - fields: { name: 'text' }, - options: { - partialFilterExpression: { - archived: false, - $or: [{ status: 'active' }, { tags: ['priority', 'searchable'] }], - }, - weights: { name: 10 }, + keys: [{ field: 'name', direction: 'text' }], + partialFilterExpression: { + archived: false, + $or: [{ status: 'active' }, { tags: ['priority', 'searchable'] }], }, }, ], - options: { - storageEngine: { - wiredTiger: { - configString: 'block_compressor=zstd', - nested: [{ compression: 'zstd' }, 1, true, null], - }, - }, - indexOptionDefaults: { - storageEngine: { - wiredTiger: { - configString: 'prefix_compression=true', - nested: [{ prefixCompression: true }], - }, - }, - }, - }, }, }, }, @@ -236,139 +199,24 @@ describe('validateMongoContract()', () => { expect(result.contract.storage.collections['items']).toEqual({ indexes: [ { - fields: { name: 'text' }, - options: { - partialFilterExpression: { - archived: false, - $or: [{ status: 'active' }, { tags: ['priority', 'searchable'] }], - }, - weights: { name: 10 }, + keys: [{ field: 'name', direction: 'text' }], + partialFilterExpression: { + archived: false, + $or: [{ status: 'active' }, { tags: ['priority', 'searchable'] }], }, }, ], - options: { - storageEngine: { - wiredTiger: { - configString: 'block_compressor=zstd', - nested: [{ compression: 'zstd' }, 1, true, null], - }, - }, - indexOptionDefaults: { - storageEngine: { - wiredTiger: { - configString: 'prefix_compression=true', - nested: [{ prefixCompression: true }], - }, - }, - }, - }, }); }); - it('rejects non-JSON values in record-shaped index and collection option maps', () => { - const json = { - ...makeValidContractJson(), - storage: { - collections: { - items: { - indexes: [ - { - fields: { name: 'text' }, - options: { - partialFilterExpression: { - $or: [{ status: 'active' }, { updatedAt: 1n }], - }, - }, - }, - ], - options: { - storageEngine: { - wiredTiger: { configString: 'block_compressor=zstd' }, - }, - }, - }, - }, - }, - models: { - Item: { - fields: { - _id: { type: { kind: 'scalar', codecId: 'mongo/objectId@1' }, nullable: false }, - name: { type: { kind: 'scalar', codecId: 'mongo/string@1' }, nullable: false }, - }, - storage: { collection: 'items' }, - }, - }, - }; - - expect(() => validateMongoContract(json)).toThrow(); - }); - - it('rejects cyclic record-shaped index and collection option maps without overflowing the stack', () => { - const cyclicPartialFilterExpression: Record = { - archived: false, - }; - cyclicPartialFilterExpression['self'] = cyclicPartialFilterExpression; - - const cyclicStorageEngineEntries: unknown[] = []; - cyclicStorageEngineEntries.push(cyclicStorageEngineEntries); - - const json = { - ...makeValidContractJson(), - storage: { - collections: { - items: { - indexes: [ - { - fields: { name: 'text' }, - options: { - partialFilterExpression: cyclicPartialFilterExpression, - }, - }, - ], - options: { - storageEngine: { - wiredTiger: { - nested: cyclicStorageEngineEntries, - }, - }, - }, - }, - }, - }, - models: { - Item: { - fields: { - _id: { type: { kind: 'scalar', codecId: 'mongo/objectId@1' }, nullable: false }, - name: { type: { kind: 'scalar', codecId: 'mongo/string@1' }, nullable: false }, - }, - storage: { collection: 'items' }, - }, - }, - }; - - let error: unknown; - - try { - validateMongoContract(json); - } catch (caught) { - error = caught; - } - - expect(error).toBeDefined(); - expect(error).not.toBeInstanceOf(RangeError); - }); - - it('rejects empty clustered index keys', () => { + it('rejects unknown collection option properties', () => { const json = { ...makeValidContractJson(), storage: { collections: { items: { options: { - clusteredIndex: { - key: {}, - unique: true, - }, + unsupported: true, }, }, }, @@ -384,7 +232,7 @@ describe('validateMongoContract()', () => { storage: { collections: { items: { - indexes: [{ fields: { _id: 1 }, options: { unsupported: true } }], + indexes: [{ keys: [{ field: '_id', direction: 1 }], unsupported: true }], }, }, }, @@ -483,17 +331,23 @@ describe('validateMongoContract()', () => { const json = makeValidContractJson(); json.storage.collections.items = { indexes: [ - { fields: { name: 1 } }, - { fields: { email: 1 }, options: { unique: true } }, + { keys: [{ field: 'name', direction: 1 }] }, + { keys: [{ field: 'email', direction: 1 }], unique: true }, + { + keys: [{ field: 'createdAt', direction: -1 }], + sparse: true, + expireAfterSeconds: 3600, + }, { - fields: { createdAt: -1 }, - options: { sparse: true, expireAfterSeconds: 3600 }, + keys: [ + { field: 'a', direction: 1 }, + { field: 'b', direction: -1 }, + ], }, - { fields: { a: 1, b: -1 } }, - { fields: { description: 'text' } }, - { fields: { location: '2dsphere' } }, - { fields: { coords: '2d' } }, - { fields: { hash: 'hashed' } }, + { keys: [{ field: 'description', direction: 'text' }] }, + { keys: [{ field: 'location', direction: '2dsphere' }] }, + { keys: [{ field: 'coords', direction: '2d' }] }, + { keys: [{ field: 'hash', direction: 'hashed' }] }, ], } as typeof json.storage.collections.items; const result = validateMongoContract(json); @@ -505,8 +359,8 @@ describe('validateMongoContract()', () => { json.storage.collections.items = { indexes: [ { - fields: { status: 1 }, - options: { partialFilterExpression: { status: { $eq: 'active' } } }, + keys: [{ field: 'status', direction: 1 }], + partialFilterExpression: { status: { $eq: 'active' } }, }, ], } as typeof json.storage.collections.items; @@ -514,26 +368,26 @@ describe('validateMongoContract()', () => { expect(result.contract).toBeDefined(); }); - it('rejects index with empty fields', () => { + it('rejects index with empty keys array', () => { const json = makeValidContractJson(); json.storage.collections.items = { - indexes: [{ fields: {} }], + indexes: [{ keys: [] }], } as typeof json.storage.collections.items; expect(() => validateMongoContract(json)).toThrow(); }); - it('rejects index field with invalid direction', () => { + it('rejects index key with invalid direction', () => { const json = makeValidContractJson(); json.storage.collections.items = { - indexes: [{ fields: { name: 'invalid' } }], + indexes: [{ keys: [{ field: 'name', direction: 'invalid' }] }], } as typeof json.storage.collections.items; expect(() => validateMongoContract(json)).toThrow(); }); - it('rejects index missing fields', () => { + it('rejects index key missing field', () => { const json = makeValidContractJson(); json.storage.collections.items = { - indexes: [{}], + indexes: [{ keys: [{ direction: 1 }] }], } as typeof json.storage.collections.items; expect(() => validateMongoContract(json)).toThrow(); }); @@ -541,7 +395,7 @@ describe('validateMongoContract()', () => { it('rejects index with extra properties', () => { const json = makeValidContractJson(); json.storage.collections.items = { - indexes: [{ fields: { name: 1 }, extra: true }], + indexes: [{ keys: [{ field: 'name', direction: 1 }], extra: true }], } as typeof json.storage.collections.items; expect(() => validateMongoContract(json)).toThrow(); }); @@ -549,11 +403,245 @@ describe('validateMongoContract()', () => { it('rejects collection with extra properties', () => { const json = makeValidContractJson(); json.storage.collections.items = { - indexes: [{ fields: { name: 1 } }], + indexes: [{ keys: [{ field: 'name', direction: 1 }] }], extra: true, } as typeof json.storage.collections.items; expect(() => validateMongoContract(json)).toThrow(); }); + + it('accepts index with wildcardProjection', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + indexes: [ + { + keys: [{ field: '$**', direction: 1 }], + wildcardProjection: { name: 1, email: 1 }, + }, + ], + } as typeof json.storage.collections.items; + const result = validateMongoContract(json); + expect(result.contract).toBeDefined(); + }); + + it('accepts index with collation', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + indexes: [ + { + keys: [{ field: 'name', direction: 1 }], + collation: { locale: 'en', strength: 2 }, + }, + ], + } as typeof json.storage.collections.items; + const result = validateMongoContract(json); + expect(result.contract).toBeDefined(); + }); + + it('accepts index with text options (weights, default_language, language_override)', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + indexes: [ + { + keys: [{ field: 'bio', direction: 'text' }], + weights: { bio: 10 }, + default_language: 'english', + language_override: 'lang', + }, + ], + } as typeof json.storage.collections.items; + const result = validateMongoContract(json); + expect(result.contract).toBeDefined(); + }); + }); + + describe('storage validator validation', () => { + it('accepts collection with valid validator', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + validator: { + jsonSchema: { bsonType: 'object', properties: { name: { bsonType: 'string' } } }, + validationLevel: 'strict', + validationAction: 'error', + }, + } as typeof json.storage.collections.items; + const result = validateMongoContract(json); + expect(result.contract).toBeDefined(); + }); + + it('accepts collection with validator and indexes', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + indexes: [{ keys: [{ field: 'name', direction: 1 }] }], + validator: { + jsonSchema: { bsonType: 'object' }, + validationLevel: 'moderate', + validationAction: 'warn', + }, + } as typeof json.storage.collections.items; + const result = validateMongoContract(json); + expect(result.contract).toBeDefined(); + }); + + it('rejects validator with invalid validationLevel', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + validator: { + jsonSchema: {}, + validationLevel: 'invalid', + validationAction: 'error', + }, + } as typeof json.storage.collections.items; + expect(() => validateMongoContract(json)).toThrow(); + }); + + it('rejects validator with invalid validationAction', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + validator: { + jsonSchema: {}, + validationLevel: 'strict', + validationAction: 'invalid', + }, + } as typeof json.storage.collections.items; + expect(() => validateMongoContract(json)).toThrow(); + }); + + it('rejects validator missing jsonSchema', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + validator: { + validationLevel: 'strict', + validationAction: 'error', + }, + } as typeof json.storage.collections.items; + expect(() => validateMongoContract(json)).toThrow(); + }); + + it('rejects validator missing validationLevel', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + validator: { + jsonSchema: { bsonType: 'object' }, + validationAction: 'error', + }, + } as typeof json.storage.collections.items; + expect(() => validateMongoContract(json)).toThrow(); + }); + + it('rejects validator missing validationAction', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + validator: { + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + }, + } as typeof json.storage.collections.items; + expect(() => validateMongoContract(json)).toThrow(); + }); + }); + + describe('storage collection options validation', () => { + it('accepts collection with capped option', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + options: { capped: { size: 1048576 } }, + } as typeof json.storage.collections.items; + const result = validateMongoContract(json); + expect(result.contract).toBeDefined(); + }); + + it('accepts collection with capped option including max', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + options: { capped: { size: 1048576, max: 1000 } }, + } as typeof json.storage.collections.items; + const result = validateMongoContract(json); + expect(result.contract).toBeDefined(); + }); + + it('accepts collection with timeseries option', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + options: { + timeseries: { timeField: 'timestamp', metaField: 'meta', granularity: 'hours' }, + }, + } as typeof json.storage.collections.items; + const result = validateMongoContract(json); + expect(result.contract).toBeDefined(); + }); + + it('accepts collection with collation option', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + options: { collation: { locale: 'en', strength: 2 } }, + } as typeof json.storage.collections.items; + const result = validateMongoContract(json); + expect(result.contract).toBeDefined(); + }); + + it('accepts collection with changeStreamPreAndPostImages', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + options: { changeStreamPreAndPostImages: { enabled: true } }, + } as typeof json.storage.collections.items; + const result = validateMongoContract(json); + expect(result.contract).toBeDefined(); + }); + + it('accepts collection with clusteredIndex', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + options: { clusteredIndex: { name: 'myCluster' } }, + } as typeof json.storage.collections.items; + const result = validateMongoContract(json); + expect(result.contract).toBeDefined(); + }); + + it('rejects capped option without required size', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + options: { capped: { max: 100 } }, + } as typeof json.storage.collections.items; + expect(() => validateMongoContract(json)).toThrow(); + }); + + it('rejects invalid wildcardProjection values', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + indexes: [ + { + keys: [{ field: '$**', direction: 1 }], + wildcardProjection: { name: 2 }, + }, + ], + } as typeof json.storage.collections.items; + expect(() => validateMongoContract(json)).toThrow(); + }); + + it('accepts collection with no validator or options (backward compat)', () => { + const json = makeValidContractJson(); + json.storage.collections.items = {}; + const result = validateMongoContract(json); + expect(result.contract).toBeDefined(); + }); + + it('accepts collection with all options combined', () => { + const json = makeValidContractJson(); + json.storage.collections.items = { + indexes: [{ keys: [{ field: 'name', direction: 1 }] }], + validator: { + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }, + options: { + changeStreamPreAndPostImages: { enabled: true }, + collation: { locale: 'en' }, + }, + } as typeof json.storage.collections.items; + const result = validateMongoContract(json); + expect(result.contract).toBeDefined(); + }); }); describe('happy path', () => { diff --git a/packages/2-mongo-family/2-authoring/contract-psl/README.md b/packages/2-mongo-family/2-authoring/contract-psl/README.md index bbe8985c69..f36930a0eb 100644 --- a/packages/2-mongo-family/2-authoring/contract-psl/README.md +++ b/packages/2-mongo-family/2-authoring/contract-psl/README.md @@ -9,6 +9,11 @@ PSL-to-Mongo contract interpreter for Prisma Next. Transforms Prisma Schema Lang - **Contract provider**: `mongoContract()` (exported from `./provider`) integrates with the CLI's `prisma-next contract emit` command, reading a `.prisma` schema file and producing a `ContractConfig` - **Diagnostics**: Emits structured diagnostics for unsupported field types (`PSL_UNSUPPORTED_FIELD_TYPE`), missing `@id` fields (`PSL_MISSING_ID_FIELD`), orphaned backrelations (`PSL_ORPHANED_BACKRELATION`), and ambiguous backrelations (`PSL_AMBIGUOUS_BACKRELATION`) +## Known limitations + +- **Per-index `collation`**: PSL authoring does not support the `collation` index option. Users requiring per-index collation must use the TypeScript contract builder (`@prisma-next/mongo-contract-ts`). +- **`partialFilterExpression` / `wildcardProjection`**: These object-valued index options are not supported in PSL and require the TypeScript contract builder. + ## Dependencies - **Depends on**: diff --git a/packages/2-mongo-family/2-authoring/contract-psl/package.json b/packages/2-mongo-family/2-authoring/contract-psl/package.json index ff10f809f5..00f674d377 100644 --- a/packages/2-mongo-family/2-authoring/contract-psl/package.json +++ b/packages/2-mongo-family/2-authoring/contract-psl/package.json @@ -16,6 +16,7 @@ "dependencies": { "@prisma-next/config": "workspace:*", "@prisma-next/contract": "workspace:*", + "@prisma-next/mongo-contract": "workspace:*", "@prisma-next/psl-parser": "workspace:*", "@prisma-next/utils": "workspace:*", "pathe": "^2.0.3" diff --git a/packages/2-mongo-family/2-authoring/contract-psl/src/derive-json-schema.ts b/packages/2-mongo-family/2-authoring/contract-psl/src/derive-json-schema.ts new file mode 100644 index 0000000000..29ffe4b4d6 --- /dev/null +++ b/packages/2-mongo-family/2-authoring/contract-psl/src/derive-json-schema.ts @@ -0,0 +1,80 @@ +import type { ContractField, ContractValueObject } from '@prisma-next/contract/types'; +import type { MongoStorageValidator } from '@prisma-next/mongo-contract'; + +const CODEC_TO_BSON_TYPE: Record = { + 'mongo/string@1': 'string', + 'mongo/int32@1': 'int', + 'mongo/bool@1': 'bool', + 'mongo/date@1': 'date', + 'mongo/objectId@1': 'objectId', +}; + +function fieldToBsonSchema( + field: ContractField, + valueObjects: Record | undefined, +): Record | undefined { + if (field.type.kind === 'scalar') { + const bsonType = CODEC_TO_BSON_TYPE[field.type.codecId]; + if (!bsonType) return undefined; + + if ('many' in field && field.many) { + return { bsonType: 'array', items: { bsonType } }; + } + + if (field.nullable) { + return { bsonType: ['null', bsonType] }; + } + + return { bsonType }; + } + + if (field.type.kind === 'valueObject') { + const vo = valueObjects?.[field.type.name]; + if (!vo) return undefined; + const voSchema = deriveObjectSchema(vo.fields, valueObjects); + if ('many' in field && field.many) { + return { bsonType: 'array', items: voSchema }; + } + return voSchema; + } + + return undefined; +} + +function deriveObjectSchema( + fields: Record, + valueObjects: Record | undefined, +): Record { + const properties: Record = {}; + const required: string[] = []; + + for (const [fieldName, field] of Object.entries(fields)) { + const schema = fieldToBsonSchema(field, valueObjects); + if (schema) { + properties[fieldName] = schema; + if (!field.nullable) { + required.push(fieldName); + } + } + } + + const result: Record = { + bsonType: 'object', + properties, + }; + if (required.length > 0) { + result['required'] = required.sort(); + } + return result; +} + +export function deriveJsonSchema( + fields: Record, + valueObjects?: Record, +): MongoStorageValidator { + return { + jsonSchema: deriveObjectSchema(fields, valueObjects), + validationLevel: 'strict', + validationAction: 'error', + }; +} diff --git a/packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts b/packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts index 6e9563a1b8..2379386455 100644 --- a/packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts +++ b/packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts @@ -9,13 +9,17 @@ import type { ContractReferenceRelation, ContractValueObject, } from '@prisma-next/contract/types'; +import type { MongoIndexKeyDirection, MongoStorageIndex } from '@prisma-next/mongo-contract'; import type { ParsePslDocumentResult, PslField, PslModel } from '@prisma-next/psl-parser'; import { notOk, ok, type Result } from '@prisma-next/utils/result'; +import { deriveJsonSchema } from './derive-json-schema'; import { getAttribute, getMapName, + getNamedArgument, getPositionalArgument, lowerFirst, + parseIndexFieldList, parseQuotedStringLiteral, parseRelationAttribute, } from './psl-helpers'; @@ -271,6 +275,319 @@ function resolvePolymorphism(input: { return { models: patched, roots, diagnostics }; } +function parseIndexDirection(raw: string | undefined): MongoIndexKeyDirection { + if (!raw) return 1; + const stripped = raw.replace(/^["']/, '').replace(/["']$/, ''); + const num = Number(stripped); + if (num === 1 || num === -1) return num; + if (['text', '2dsphere', '2d', 'hashed'].includes(stripped)) + return stripped as MongoIndexKeyDirection; + return 1; +} + +function parseNumericArg(raw: string | undefined): number | undefined { + if (!raw) return undefined; + const n = Number(raw); + return Number.isFinite(n) ? n : undefined; +} + +function parseBooleanArg(raw: string | undefined): boolean | undefined { + if (raw === 'true') return true; + if (raw === 'false') return false; + return undefined; +} + +function parseJsonArg(raw: string | undefined): Record | undefined { + if (!raw) return undefined; + const stripped = raw.replace(/^["']/, '').replace(/["']$/, '').replace(/\\"/g, '"'); + try { + const parsed = JSON.parse(stripped); + if (typeof parsed === 'object' && parsed !== null && !Array.isArray(parsed)) { + return parsed as Record; + } + } catch { + // not valid JSON + } + return undefined; +} + +function parseCollation( + attr: import('@prisma-next/psl-parser').PslAttribute, +): Record | null | undefined { + const locale = stripQuotesHelper(getNamedArgument(attr, 'collationLocale')); + if (!locale) { + const hasAnyCollationArg = + getNamedArgument(attr, 'collationStrength') != null || + getNamedArgument(attr, 'collationCaseLevel') != null || + getNamedArgument(attr, 'collationCaseFirst') != null || + getNamedArgument(attr, 'collationNumericOrdering') != null || + getNamedArgument(attr, 'collationAlternate') != null || + getNamedArgument(attr, 'collationMaxVariable') != null || + getNamedArgument(attr, 'collationBackwards') != null || + getNamedArgument(attr, 'collationNormalization') != null; + return hasAnyCollationArg ? null : undefined; + } + + const collation: Record = { locale }; + const strength = parseNumericArg(getNamedArgument(attr, 'collationStrength')); + if (strength != null) collation['strength'] = strength; + const caseLevel = parseBooleanArg(getNamedArgument(attr, 'collationCaseLevel')); + if (caseLevel != null) collation['caseLevel'] = caseLevel; + const caseFirst = stripQuotesHelper(getNamedArgument(attr, 'collationCaseFirst')); + if (caseFirst != null) collation['caseFirst'] = caseFirst; + const numericOrdering = parseBooleanArg(getNamedArgument(attr, 'collationNumericOrdering')); + if (numericOrdering != null) collation['numericOrdering'] = numericOrdering; + const alternate = stripQuotesHelper(getNamedArgument(attr, 'collationAlternate')); + if (alternate != null) collation['alternate'] = alternate; + const maxVariable = stripQuotesHelper(getNamedArgument(attr, 'collationMaxVariable')); + if (maxVariable != null) collation['maxVariable'] = maxVariable; + const backwards = parseBooleanArg(getNamedArgument(attr, 'collationBackwards')); + if (backwards != null) collation['backwards'] = backwards; + const normalization = parseBooleanArg(getNamedArgument(attr, 'collationNormalization')); + if (normalization != null) collation['normalization'] = normalization; + return collation; +} + +function stripQuotesHelper(raw: string | undefined): string | undefined { + if (!raw) return undefined; + return raw.replace(/^["']/, '').replace(/["']$/, ''); +} + +function parseProjectionList( + raw: string | undefined, + value: 0 | 1, +): Record | undefined { + if (!raw) return undefined; + const stripped = raw.replace(/^["']/, '').replace(/["']$/, ''); + const inner = stripped.replace(/^\[/, '').replace(/\]$/, '').trim(); + if (inner.length === 0) return undefined; + const fields = inner + .split(',') + .map((s) => s.trim()) + .filter((s) => s.length > 0); + const result: Record = {}; + for (const f of fields) { + result[f] = value; + } + return result; +} + +function collectIndexes( + pslModel: PslModel, + fieldMappings: FieldMappings, + modelNames: ReadonlySet, + sourceId: string, + diagnostics: ContractSourceDiagnostic[], +): MongoStorageIndex[] { + const indexes: MongoStorageIndex[] = []; + let textIndexCount = 0; + + for (const field of pslModel.fields) { + if (modelNames.has(field.typeName)) continue; + const uniqueAttr = getAttribute(field.attributes, 'unique'); + if (!uniqueAttr) continue; + const mappedName = fieldMappings.pslNameToMapped.get(field.name) ?? field.name; + indexes.push({ + keys: [{ field: mappedName, direction: 1 }], + unique: true, + }); + } + + for (const attr of pslModel.attributes) { + const isIndex = attr.name === 'index'; + const isUnique = attr.name === 'unique'; + const isTextIndex = attr.name === 'textIndex'; + if (!isIndex && !isUnique && !isTextIndex) continue; + + const fieldsArg = getPositionalArgument(attr, 0); + if (!fieldsArg) continue; + const parsedFields = parseIndexFieldList(fieldsArg); + if (parsedFields.length === 0) continue; + + const hasWildcard = parsedFields.some((f) => f.isWildcard); + const wildcardCount = parsedFields.filter((f) => f.isWildcard).length; + + if (wildcardCount > 1) { + diagnostics.push({ + code: 'PSL_INVALID_INDEX', + message: 'An index can contain at most one wildcard() field', + sourceId, + span: attr.span, + }); + continue; + } + + if (isUnique && hasWildcard) { + diagnostics.push({ + code: 'PSL_INVALID_INDEX', + message: 'Unique indexes cannot use wildcard() fields', + sourceId, + span: attr.span, + }); + continue; + } + + if (isTextIndex) { + textIndexCount++; + if (textIndexCount > 1) { + diagnostics.push({ + code: 'PSL_INVALID_INDEX', + message: `Only one @@textIndex is allowed per collection (model "${pslModel.name}")`, + sourceId, + span: attr.span, + }); + continue; + } + + if (hasWildcard) { + diagnostics.push({ + code: 'PSL_INVALID_INDEX', + message: + 'wildcard() fields cannot be combined with type: hashed/2dsphere/2d or @@textIndex', + sourceId, + span: attr.span, + }); + continue; + } + } + + const typeArg = getNamedArgument(attr, 'type'); + const defaultDirection: MongoIndexKeyDirection = isTextIndex + ? 'text' + : parseIndexDirection(typeArg); + + if ( + hasWildcard && + typeof defaultDirection === 'string' && + ['hashed', '2dsphere', '2d'].includes(defaultDirection) + ) { + diagnostics.push({ + code: 'PSL_INVALID_INDEX', + message: `wildcard() fields cannot be combined with type: ${defaultDirection}`, + sourceId, + span: attr.span, + }); + continue; + } + + if (defaultDirection === 'hashed' && parsedFields.length > 1) { + diagnostics.push({ + code: 'PSL_INVALID_INDEX', + message: 'Hashed indexes must have exactly one field', + sourceId, + span: attr.span, + }); + continue; + } + + const keys = parsedFields.map((pf) => { + const mappedName = pf.isWildcard + ? pf.name.replace(/^(.+)\.\$\*\*$/, (_, prefix: string) => { + const mapped = fieldMappings.pslNameToMapped.get(prefix); + return mapped ? `${mapped}.$**` : `${prefix}.$**`; + }) + : (fieldMappings.pslNameToMapped.get(pf.name) ?? pf.name); + const direction: MongoIndexKeyDirection = + pf.direction != null ? (pf.direction as MongoIndexKeyDirection) : defaultDirection; + return { field: mappedName, direction }; + }); + + const unique = isUnique ? true : undefined; + const sparse = isTextIndex ? undefined : parseBooleanArg(getNamedArgument(attr, 'sparse')); + const expireAfterSeconds = isTextIndex + ? undefined + : parseNumericArg(getNamedArgument(attr, 'expireAfterSeconds')); + + if (hasWildcard && expireAfterSeconds != null) { + diagnostics.push({ + code: 'PSL_INVALID_INDEX', + message: 'expireAfterSeconds cannot be combined with wildcard() fields', + sourceId, + span: attr.span, + }); + continue; + } + + const partialFilterExpression = parseJsonArg(getNamedArgument(attr, 'filter')); + + const includeArg = getNamedArgument(attr, 'include'); + const excludeArg = getNamedArgument(attr, 'exclude'); + + if (includeArg != null && excludeArg != null) { + diagnostics.push({ + code: 'PSL_INVALID_INDEX', + message: 'Cannot specify both include and exclude on the same index', + sourceId, + span: attr.span, + }); + continue; + } + + if ((includeArg != null || excludeArg != null) && !hasWildcard) { + diagnostics.push({ + code: 'PSL_INVALID_INDEX', + message: + 'include/exclude options are only valid when the index contains a wildcard() field', + sourceId, + span: attr.span, + }); + continue; + } + + const wildcardProjection = + includeArg != null + ? parseProjectionList(includeArg, 1) + : excludeArg != null + ? parseProjectionList(excludeArg, 0) + : undefined; + + const collation = parseCollation(attr); + if (collation === null) { + diagnostics.push({ + code: 'PSL_INVALID_INDEX', + message: 'collationLocale is required when using collation options', + sourceId, + span: attr.span, + }); + continue; + } + + const rawWeights = parseJsonArg(getNamedArgument(attr, 'weights')); + let weights: Record | undefined; + if (rawWeights) { + weights = {}; + for (const [k, v] of Object.entries(rawWeights)) { + if (typeof v === 'number') weights[k] = v; + } + } + + const rawDefaultLang = isTextIndex + ? getNamedArgument(attr, 'language') + : getNamedArgument(attr, 'default_language'); + const default_language = stripQuotesHelper(rawDefaultLang); + + const rawLangOverride = getNamedArgument(attr, 'languageOverride'); + const language_override = stripQuotesHelper(rawLangOverride); + + const index: MongoStorageIndex = { + keys, + ...(unique != null && { unique }), + ...(sparse != null && { sparse }), + ...(expireAfterSeconds != null && { expireAfterSeconds }), + ...(partialFilterExpression != null && { partialFilterExpression }), + ...(wildcardProjection != null && { wildcardProjection }), + ...(collation != null && { collation }), + ...(weights != null && { weights }), + ...(default_language != null && { default_language }), + ...(language_override != null && { language_override }), + }; + + indexes.push(index); + } + + return indexes; +} + function isRelationField(field: PslField, modelNames: ReadonlySet): boolean { return modelNames.has(field.typeName); } @@ -420,7 +737,8 @@ export function interpretPslDocumentToMongoContract( } models[pslModel.name] = { fields, relations, storage: { collection: collectionName } }; - collections[collectionName] = {}; + const modelIndexes = collectIndexes(pslModel, fieldMappings, modelNames, sourceId, diagnostics); + collections[collectionName] = modelIndexes.length > 0 ? { indexes: modelIndexes } : {}; roots[collectionName] = pslModel.name; } @@ -442,6 +760,15 @@ export function interpretPslDocumentToMongoContract( valueObjects[compositeType.name] = { fields }; } + for (const [, modelEntry] of Object.entries(models)) { + const collectionName = modelEntry.storage.collection; + const coll = collections[collectionName]; + if (coll) { + const validator = deriveJsonSchema(modelEntry.fields, valueObjects); + coll['validator'] = validator; + } + } + const fkRelationsByPair = new Map(); for (const fk of allFkRelations) { const key = fkRelationPairKey(fk.declaringModel, fk.targetModel); diff --git a/packages/2-mongo-family/2-authoring/contract-psl/src/psl-helpers.ts b/packages/2-mongo-family/2-authoring/contract-psl/src/psl-helpers.ts index d493256eaf..53e1f1cb5e 100644 --- a/packages/2-mongo-family/2-authoring/contract-psl/src/psl-helpers.ts +++ b/packages/2-mongo-family/2-authoring/contract-psl/src/psl-helpers.ts @@ -3,6 +3,69 @@ import { getPositionalArgument, parseQuotedStringLiteral } from '@prisma-next/ps export { getPositionalArgument, parseQuotedStringLiteral }; +export function getNamedArgument(attr: PslAttribute, name: string): string | undefined { + const arg = attr.args.find((a) => a.kind === 'named' && a.name === name); + return arg?.value; +} + +export function parseFieldList(value: string): readonly string[] { + const inner = value.replace(/^\[/, '').replace(/\]$/, '').trim(); + if (inner.length === 0) return []; + return splitTopLevel(inner).map((s) => s.trim()); +} + +export interface ParsedIndexField { + readonly name: string; + readonly isWildcard: boolean; + readonly direction?: number; +} + +export function parseIndexFieldList(value: string): readonly ParsedIndexField[] { + const segments = parseFieldList(value); + return segments.map(parseIndexFieldSegment); +} + +function parseIndexFieldSegment(segment: string): ParsedIndexField { + const wildcardMatch = segment.match(/^wildcard\(\s*(.*?)\s*\)$/); + if (wildcardMatch) { + const scope = wildcardMatch[1] ?? ''; + return { + name: scope.length > 0 ? `${scope}.$**` : '$**', + isWildcard: true, + }; + } + + const modifierMatch = segment.match(/^(\w+)\(\s*sort:\s*(\w+)\s*\)$/); + if (modifierMatch) { + const fieldName = modifierMatch[1] ?? segment; + const sortValue = modifierMatch[2]; + return { + name: fieldName, + isWildcard: false, + direction: sortValue === 'Desc' ? -1 : 1, + }; + } + + return { name: segment, isWildcard: false }; +} + +function splitTopLevel(input: string): string[] { + const parts: string[] = []; + let depth = 0; + let start = 0; + for (let i = 0; i < input.length; i++) { + const ch = input[i]; + if (ch === '(' || ch === '[' || ch === '{') depth++; + else if (ch === ')' || ch === ']' || ch === '}') depth = Math.max(0, depth - 1); + else if (ch === ',' && depth === 0) { + parts.push(input.slice(start, i)); + start = i + 1; + } + } + parts.push(input.slice(start)); + return parts; +} + export function lowerFirst(value: string): string { if (value.length === 0) return value; return value[0]?.toLowerCase() + value.slice(1); @@ -61,12 +124,6 @@ export function parseRelationAttribute( }; } -function parseFieldList(value: string): readonly string[] { - const inner = value.replace(/^\[/, '').replace(/\]$/, '').trim(); - if (inner.length === 0) return []; - return inner.split(',').map((s) => s.trim()); -} - function stripQuotes(value: string): string { if (value.startsWith('"') && value.endsWith('"')) { return value.slice(1, -1); diff --git a/packages/2-mongo-family/2-authoring/contract-psl/test/derive-json-schema.test.ts b/packages/2-mongo-family/2-authoring/contract-psl/test/derive-json-schema.test.ts new file mode 100644 index 0000000000..02c6cfd081 --- /dev/null +++ b/packages/2-mongo-family/2-authoring/contract-psl/test/derive-json-schema.test.ts @@ -0,0 +1,227 @@ +import type { ContractField, ContractValueObject } from '@prisma-next/contract/types'; +import { describe, expect, it } from 'vitest'; +import { deriveJsonSchema } from '../src/derive-json-schema'; + +function scalarField(codecId: string, nullable = false): ContractField { + return { type: { kind: 'scalar', codecId }, nullable }; +} + +function arrayField(codecId: string, nullable = false): ContractField { + return { type: { kind: 'scalar', codecId }, nullable, many: true }; +} + +function voField(name: string, nullable = false): ContractField { + return { type: { kind: 'valueObject', name }, nullable }; +} + +function voArrayField(name: string, nullable = false): ContractField { + return { type: { kind: 'valueObject', name }, nullable, many: true }; +} + +describe('deriveJsonSchema', () => { + it('maps String, Int, Boolean, DateTime, ObjectId to correct BSON types', () => { + const result = deriveJsonSchema({ + name: scalarField('mongo/string@1'), + age: scalarField('mongo/int32@1'), + active: scalarField('mongo/bool@1'), + created: scalarField('mongo/date@1'), + _id: scalarField('mongo/objectId@1'), + }); + + expect(result.jsonSchema).toEqual({ + bsonType: 'object', + required: ['_id', 'active', 'age', 'created', 'name'], + properties: { + name: { bsonType: 'string' }, + age: { bsonType: 'int' }, + active: { bsonType: 'bool' }, + created: { bsonType: 'date' }, + _id: { bsonType: 'objectId' }, + }, + }); + expect(result.validationLevel).toBe('strict'); + expect(result.validationAction).toBe('error'); + }); + + it('handles nullable field with bsonType array including null', () => { + const result = deriveJsonSchema({ + email: scalarField('mongo/string@1', true), + }); + + expect(result.jsonSchema).toEqual({ + bsonType: 'object', + properties: { + email: { bsonType: ['null', 'string'] }, + }, + }); + }); + + it('handles array field (many: true)', () => { + const result = deriveJsonSchema({ + tags: arrayField('mongo/string@1'), + }); + + expect(result.jsonSchema).toEqual({ + bsonType: 'object', + required: ['tags'], + properties: { + tags: { bsonType: 'array', items: { bsonType: 'string' } }, + }, + }); + }); + + it('handles nullable array field', () => { + const result = deriveJsonSchema({ + tags: arrayField('mongo/string@1', true), + }); + + expect(result.jsonSchema).toEqual({ + bsonType: 'object', + properties: { + tags: { bsonType: 'array', items: { bsonType: 'string' } }, + }, + }); + }); + + it('handles value object field as nested object', () => { + const valueObjects: Record = { + Address: { + fields: { + street: scalarField('mongo/string@1'), + city: scalarField('mongo/string@1'), + zip: scalarField('mongo/string@1', true), + }, + }, + }; + + const result = deriveJsonSchema({ address: voField('Address') }, valueObjects); + + expect(result.jsonSchema).toEqual({ + bsonType: 'object', + required: ['address'], + properties: { + address: { + bsonType: 'object', + required: ['city', 'street'], + properties: { + street: { bsonType: 'string' }, + city: { bsonType: 'string' }, + zip: { bsonType: ['null', 'string'] }, + }, + }, + }, + }); + }); + + it('handles value object array field', () => { + const valueObjects: Record = { + Tag: { + fields: { + label: scalarField('mongo/string@1'), + }, + }, + }; + + const result = deriveJsonSchema({ tags: voArrayField('Tag') }, valueObjects); + + expect(result.jsonSchema).toEqual({ + bsonType: 'object', + required: ['tags'], + properties: { + tags: { + bsonType: 'array', + items: { + bsonType: 'object', + required: ['label'], + properties: { + label: { bsonType: 'string' }, + }, + }, + }, + }, + }); + }); + + it('derives minimal schema from empty model', () => { + const result = deriveJsonSchema({}); + + expect(result.jsonSchema).toEqual({ + bsonType: 'object', + properties: {}, + }); + }); + + it('handles mixed nullable and non-nullable fields', () => { + const result = deriveJsonSchema({ + name: scalarField('mongo/string@1'), + bio: scalarField('mongo/string@1', true), + age: scalarField('mongo/int32@1'), + }); + + expect(result.jsonSchema).toEqual({ + bsonType: 'object', + required: ['age', 'name'], + properties: { + name: { bsonType: 'string' }, + bio: { bsonType: ['null', 'string'] }, + age: { bsonType: 'int' }, + }, + }); + }); + + it('skips fields with unknown codec IDs', () => { + const result = deriveJsonSchema({ + name: scalarField('mongo/string@1'), + custom: scalarField('custom/unknown@1'), + }); + + expect(result.jsonSchema).toEqual({ + bsonType: 'object', + required: ['name'], + properties: { + name: { bsonType: 'string' }, + }, + }); + }); + + it('handles nested value objects (recursive)', () => { + const valueObjects: Record = { + Geo: { + fields: { + lat: scalarField('mongo/int32@1'), + lng: scalarField('mongo/int32@1'), + }, + }, + Address: { + fields: { + city: scalarField('mongo/string@1'), + geo: voField('Geo'), + }, + }, + }; + + const result = deriveJsonSchema({ address: voField('Address') }, valueObjects); + + expect(result.jsonSchema).toEqual({ + bsonType: 'object', + required: ['address'], + properties: { + address: { + bsonType: 'object', + required: ['city', 'geo'], + properties: { + city: { bsonType: 'string' }, + geo: { + bsonType: 'object', + required: ['lat', 'lng'], + properties: { + lat: { bsonType: 'int' }, + lng: { bsonType: 'int' }, + }, + }, + }, + }, + }, + }); + }); +}); diff --git a/packages/2-mongo-family/2-authoring/contract-psl/test/interpreter.test.ts b/packages/2-mongo-family/2-authoring/contract-psl/test/interpreter.test.ts index 5eec3b8fad..4ac5c96efe 100644 --- a/packages/2-mongo-family/2-authoring/contract-psl/test/interpreter.test.ts +++ b/packages/2-mongo-family/2-authoring/contract-psl/test/interpreter.test.ts @@ -39,6 +39,20 @@ function interpretOk( return result.value; } +function getIndexes( + ir: unknown, + collectionName: string, +): ReadonlyArray> | undefined { + const contract = ir as Record; + const storage = contract['storage'] as unknown as Record< + string, + Record> + >; + return storage['collections']?.[collectionName]?.['indexes'] as + | ReadonlyArray> + | undefined; +} + describe('interpretPslDocumentToMongoContract', () => { describe('scalar type mapping', () => { it('maps standard PSL types to Mongo codec IDs', () => { @@ -739,8 +753,39 @@ describe('interpretPslDocumentToMongoContract', () => { storage: { storageHash: expect.stringMatching(/^sha256:/), collections: { - users: {}, - posts: {}, + users: { + validator: { + jsonSchema: { + bsonType: 'object', + required: ['_id', 'email', 'name'], + properties: { + _id: { bsonType: 'objectId' }, + name: { bsonType: 'string' }, + email: { bsonType: 'string' }, + bio: { bsonType: ['null', 'string'] }, + }, + }, + validationLevel: 'strict', + validationAction: 'error', + }, + }, + posts: { + validator: { + jsonSchema: { + bsonType: 'object', + required: ['_id', 'authorId', 'content', 'createdAt', 'title'], + properties: { + _id: { bsonType: 'objectId' }, + title: { bsonType: 'string' }, + content: { bsonType: 'string' }, + authorId: { bsonType: 'objectId' }, + createdAt: { bsonType: 'date' }, + }, + }, + validationLevel: 'strict', + validationAction: 'error', + }, + }, }, }, extensionPacks: {}, @@ -749,4 +794,669 @@ describe('interpretPslDocumentToMongoContract', () => { }); }); }); + + describe('index authoring', () => { + it('creates ascending index from @@index', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + email String + @@index([email]) + } + `); + const storage = ir.storage as unknown as Record< + string, + Record> + >; + const indexes = storage['collections']?.['user']?.['indexes'] as + | ReadonlyArray> + | undefined; + expect(indexes).toHaveLength(1); + expect(indexes![0]!['keys']).toEqual([{ field: 'email', direction: 1 }]); + }); + + it('creates unique index from @@unique', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + email String + @@unique([email]) + } + `); + const storage = ir.storage as unknown as Record< + string, + Record> + >; + const indexes = storage['collections']?.['user']?.['indexes'] as + | ReadonlyArray> + | undefined; + expect(indexes).toHaveLength(1); + expect(indexes![0]!['unique']).toBe(true); + }); + + it('creates compound index', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + email String + name String + @@index([email, name]) + } + `); + const storage = ir.storage as unknown as Record< + string, + Record> + >; + const indexes = storage['collections']?.['user']?.['indexes'] as + | ReadonlyArray> + | undefined; + expect(indexes).toHaveLength(1); + expect(indexes![0]!['keys']).toEqual([ + { field: 'email', direction: 1 }, + { field: 'name', direction: 1 }, + ]); + }); + + it('creates field-level @unique index', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + email String @unique + } + `); + const storage = ir.storage as unknown as Record< + string, + Record> + >; + const indexes = storage['collections']?.['user']?.['indexes'] as + | ReadonlyArray> + | undefined; + expect(indexes).toHaveLength(1); + expect(indexes![0]!['unique']).toBe(true); + expect(indexes![0]!['keys']).toEqual([{ field: 'email', direction: 1 }]); + }); + + it('creates index with sparse and TTL options', () => { + const ir = interpretOk(` + model Session { + id ObjectId @id @map("_id") + expiresAt DateTime + @@index([expiresAt], sparse: true, expireAfterSeconds: 3600) + } + `); + const storage = ir.storage as unknown as Record< + string, + Record> + >; + const indexes = storage['collections']?.['session']?.['indexes'] as + | ReadonlyArray> + | undefined; + expect(indexes).toHaveLength(1); + expect(indexes![0]!['sparse']).toBe(true); + expect(indexes![0]!['expireAfterSeconds']).toBe(3600); + }); + + it('respects @map on indexed fields', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + email String @map("email_address") + @@index([email]) + } + `); + const storage = ir.storage as unknown as Record< + string, + Record> + >; + const indexes = storage['collections']?.['user']?.['indexes'] as + | ReadonlyArray> + | undefined; + expect(indexes![0]!['keys']).toEqual([{ field: 'email_address', direction: 1 }]); + }); + + it('creates no indexes when none declared', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + } + `); + const storage = ir.storage as unknown as Record< + string, + Record> + >; + const userColl = storage['collections']?.['user']; + expect(userColl?.['indexes']).toBeUndefined(); + }); + + it('creates wildcard index from wildcard()', () => { + const ir = interpretOk(` + model Events { + id ObjectId @id @map("_id") + metadata String + @@index([wildcard()]) + } + `); + const indexes = getIndexes(ir, 'events'); + expect(indexes).toHaveLength(1); + expect(indexes![0]!['keys']).toEqual([{ field: '$**', direction: 1 }]); + }); + + it('creates scoped wildcard index from wildcard(field)', () => { + const ir = interpretOk(` + model Events { + id ObjectId @id @map("_id") + metadata String + @@index([wildcard(metadata)]) + } + `); + const indexes = getIndexes(ir, 'events'); + expect(indexes).toHaveLength(1); + expect(indexes![0]!['keys']).toEqual([{ field: 'metadata.$**', direction: 1 }]); + }); + + it('creates compound wildcard index', () => { + const ir = interpretOk(` + model Events { + id ObjectId @id @map("_id") + tenantId String + metadata String + @@index([tenantId, wildcard(metadata)]) + } + `); + const indexes = getIndexes(ir, 'events'); + expect(indexes).toHaveLength(1); + expect(indexes![0]!['keys']).toEqual([ + { field: 'tenantId', direction: 1 }, + { field: 'metadata.$**', direction: 1 }, + ]); + }); + + it('applies @map to scoped wildcard field', () => { + const ir = interpretOk(` + model Events { + id ObjectId @id @map("_id") + meta String @map("metadata") + @@index([wildcard(meta)]) + } + `); + const indexes = getIndexes(ir, 'events'); + expect(indexes![0]!['keys']).toEqual([{ field: 'metadata.$**', direction: 1 }]); + }); + + it('creates descending index from sort: Desc', () => { + const ir = interpretOk(` + model Events { + id ObjectId @id @map("_id") + createdAt DateTime + @@index([createdAt(sort: Desc)]) + } + `); + const indexes = getIndexes(ir, 'events'); + expect(indexes![0]!['keys']).toEqual([{ field: 'createdAt', direction: -1 }]); + }); + + it('creates mixed-direction compound index', () => { + const ir = interpretOk(` + model Events { + id ObjectId @id @map("_id") + status String + createdAt DateTime + @@index([status, createdAt(sort: Desc)]) + } + `); + const indexes = getIndexes(ir, 'events'); + expect(indexes![0]!['keys']).toEqual([ + { field: 'status', direction: 1 }, + { field: 'createdAt', direction: -1 }, + ]); + }); + + it('creates hashed index from type: "hashed"', () => { + const ir = interpretOk(` + model Events { + id ObjectId @id @map("_id") + tenantId String + @@index([tenantId], type: "hashed") + } + `); + const indexes = getIndexes(ir, 'events'); + expect(indexes![0]!['keys']).toEqual([{ field: 'tenantId', direction: 'hashed' }]); + }); + + it('creates 2dsphere index', () => { + const ir = interpretOk(` + model Places { + id ObjectId @id @map("_id") + location String + @@index([location], type: "2dsphere") + } + `); + const indexes = getIndexes(ir, 'places'); + expect(indexes![0]!['keys']).toEqual([{ field: 'location', direction: '2dsphere' }]); + }); + + it('parses filter as partialFilterExpression', () => { + const ir = interpretOk(` + model Events { + id ObjectId @id @map("_id") + status String + @@index([status], filter: "{\\"status\\": \\"active\\"}") + } + `); + const indexes = getIndexes(ir, 'events'); + expect(indexes![0]!['partialFilterExpression']).toEqual({ status: 'active' }); + }); + + it('parses collation from named scalar arguments', () => { + const ir = interpretOk(` + model Events { + id ObjectId @id @map("_id") + status String + @@index([status], collationLocale: "fr", collationStrength: 2) + } + `); + const indexes = getIndexes(ir, 'events'); + expect(indexes![0]!['collation']).toEqual({ locale: 'fr', strength: 2 }); + }); + + it('parses all collation arguments', () => { + const ir = interpretOk(` + model Events { + id ObjectId @id @map("_id") + status String + @@index([status], collationLocale: "en", collationStrength: 2, collationCaseLevel: true, collationCaseFirst: "upper", collationNumericOrdering: true, collationAlternate: "shifted", collationMaxVariable: "punct", collationBackwards: false, collationNormalization: true) + } + `); + const indexes = getIndexes(ir, 'events'); + expect(indexes![0]!['collation']).toEqual({ + locale: 'en', + strength: 2, + caseLevel: true, + caseFirst: 'upper', + numericOrdering: true, + alternate: 'shifted', + maxVariable: 'punct', + backwards: false, + normalization: true, + }); + }); + + it('parses include as wildcardProjection with 1 values', () => { + const ir = interpretOk(` + model Events { + id ObjectId @id @map("_id") + metadata String + tags String + @@index([wildcard()], include: "[metadata, tags]") + } + `); + const indexes = getIndexes(ir, 'events'); + expect(indexes![0]!['wildcardProjection']).toEqual({ metadata: 1, tags: 1 }); + }); + + it('parses exclude as wildcardProjection with 0 values', () => { + const ir = interpretOk(` + model Events { + id ObjectId @id @map("_id") + internal String + @@index([wildcard()], exclude: "[internal, _class]") + } + `); + const indexes = getIndexes(ir, 'events'); + expect(indexes![0]!['wildcardProjection']).toEqual({ internal: 0, _class: 0 }); + }); + + it('creates @@textIndex with direction text', () => { + const ir = interpretOk(` + model Article { + id ObjectId @id @map("_id") + title String + body String + @@textIndex([title, body]) + } + `); + const indexes = getIndexes(ir, 'article'); + expect(indexes).toHaveLength(1); + expect(indexes![0]!['keys']).toEqual([ + { field: 'title', direction: 'text' }, + { field: 'body', direction: 'text' }, + ]); + }); + + it('creates @@textIndex with weights and language options', () => { + const ir = interpretOk(` + model Article { + id ObjectId @id @map("_id") + title String + body String + @@textIndex([title, body], weights: "{\\"title\\": 10, \\"body\\": 5}", language: "english", languageOverride: "idioma") + } + `); + const indexes = getIndexes(ir, 'article'); + expect(indexes![0]!['weights']).toEqual({ title: 10, body: 5 }); + expect(indexes![0]!['default_language']).toBe('english'); + expect(indexes![0]!['language_override']).toBe('idioma'); + }); + + it('creates @@unique with collation', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + email String + @@unique([email], collationLocale: "en", collationStrength: 2) + } + `); + const indexes = getIndexes(ir, 'user'); + expect(indexes![0]!['unique']).toBe(true); + expect(indexes![0]!['collation']).toEqual({ locale: 'en', strength: 2 }); + }); + + it('creates @@unique with filter', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + email String + @@unique([email], filter: "{\\"active\\": true}") + } + `); + const indexes = getIndexes(ir, 'user'); + expect(indexes![0]!['unique']).toBe(true); + expect(indexes![0]!['partialFilterExpression']).toEqual({ active: true }); + }); + }); + + describe('index validation', () => { + it('rejects multiple wildcard() fields in one index', () => { + const result = interpret(` + model Events { + id ObjectId @id @map("_id") + metadata String + tags String + @@index([wildcard(metadata), wildcard(tags)]) + } + `); + expect(result.ok).toBe(false); + if (!result.ok) { + expect( + result.failure.diagnostics.some((d) => d.message.includes('at most one wildcard()')), + ).toBe(true); + } + }); + + it('rejects wildcard() in @@unique', () => { + const result = interpret(` + model Events { + id ObjectId @id @map("_id") + metadata String + @@unique([wildcard(metadata)]) + } + `); + expect(result.ok).toBe(false); + if (!result.ok) { + expect( + result.failure.diagnostics.some((d) => + d.message.includes('Unique indexes cannot use wildcard()'), + ), + ).toBe(true); + } + }); + + it('rejects include and exclude on the same index', () => { + const result = interpret(` + model Events { + id ObjectId @id @map("_id") + metadata String + @@index([wildcard()], include: "[metadata]", exclude: "[_class]") + } + `); + expect(result.ok).toBe(false); + if (!result.ok) { + expect( + result.failure.diagnostics.some((d) => + d.message.includes('Cannot specify both include and exclude'), + ), + ).toBe(true); + } + }); + + it('rejects include/exclude without wildcard', () => { + const result = interpret(` + model Events { + id ObjectId @id @map("_id") + status String + @@index([status], include: "[status]") + } + `); + expect(result.ok).toBe(false); + if (!result.ok) { + expect( + result.failure.diagnostics.some((d) => + d.message.includes('only valid when the index contains a wildcard()'), + ), + ).toBe(true); + } + }); + + it('rejects TTL with wildcard', () => { + const result = interpret(` + model Events { + id ObjectId @id @map("_id") + metadata String + @@index([wildcard()], expireAfterSeconds: 3600) + } + `); + expect(result.ok).toBe(false); + if (!result.ok) { + expect( + result.failure.diagnostics.some((d) => + d.message.includes('expireAfterSeconds cannot be combined with wildcard()'), + ), + ).toBe(true); + } + }); + + it('rejects wildcard with hashed type', () => { + const result = interpret(` + model Events { + id ObjectId @id @map("_id") + metadata String + @@index([wildcard()], type: "hashed") + } + `); + expect(result.ok).toBe(false); + if (!result.ok) { + expect( + result.failure.diagnostics.some((d) => + d.message.includes('wildcard() fields cannot be combined with type'), + ), + ).toBe(true); + } + }); + + it('rejects multiple @@textIndex on same collection', () => { + const result = interpret(` + model Article { + id ObjectId @id @map("_id") + title String + body String + @@textIndex([title]) + @@textIndex([body]) + } + `); + expect(result.ok).toBe(false); + if (!result.ok) { + expect( + result.failure.diagnostics.some((d) => + d.message.includes('Only one @@textIndex is allowed'), + ), + ).toBe(true); + } + }); + + it('rejects hashed index with multiple fields', () => { + const result = interpret(` + model Events { + id ObjectId @id @map("_id") + tenantId String + status String + @@index([tenantId, status], type: "hashed") + } + `); + expect(result.ok).toBe(false); + if (!result.ok) { + expect( + result.failure.diagnostics.some((d) => + d.message.includes('Hashed indexes must have exactly one field'), + ), + ).toBe(true); + } + }); + + it('rejects collation options without collationLocale', () => { + const result = interpret(` + model Events { + id ObjectId @id @map("_id") + status String + @@index([status], collationStrength: 2) + } + `); + expect(result.ok).toBe(false); + if (!result.ok) { + expect( + result.failure.diagnostics.some((d) => d.message.includes('collationLocale is required')), + ).toBe(true); + } + }); + }); + + describe('validator derivation', () => { + function getValidator(ir: unknown, collectionName: string) { + const contract = ir as Record; + const storage = contract['storage'] as unknown as Record< + string, + Record> + >; + return storage['collections']?.[collectionName]?.['validator'] as + | Record + | undefined; + } + + it('derives $jsonSchema from model fields', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + name String + age Int + } + `); + const validator = getValidator(ir, 'user'); + expect(validator).toBeDefined(); + expect(validator!['validationLevel']).toBe('strict'); + expect(validator!['validationAction']).toBe('error'); + const schema = validator!['jsonSchema'] as Record; + expect(schema['bsonType']).toBe('object'); + const props = schema['properties'] as Record>; + expect(props['_id']).toEqual({ bsonType: 'objectId' }); + expect(props['name']).toEqual({ bsonType: 'string' }); + expect(props['age']).toEqual({ bsonType: 'int' }); + }); + + it('handles nullable fields with bsonType array', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + bio String? + } + `); + const validator = getValidator(ir, 'user'); + const schema = validator!['jsonSchema'] as Record; + const props = schema['properties'] as Record>; + expect(props['bio']).toEqual({ bsonType: ['null', 'string'] }); + }); + + it('handles array fields', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + tags String[] + } + `); + const validator = getValidator(ir, 'user'); + const schema = validator!['jsonSchema'] as Record; + const props = schema['properties'] as Record>; + expect(props['tags']).toEqual({ bsonType: 'array', items: { bsonType: 'string' } }); + }); + + it('uses @map names in jsonSchema properties', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + firstName String @map("first_name") + } + `); + const validator = getValidator(ir, 'user'); + const schema = validator!['jsonSchema'] as Record; + const props = schema['properties'] as Record>; + expect(props['first_name']).toEqual({ bsonType: 'string' }); + expect(props['firstName']).toBeUndefined(); + }); + + it('includes non-nullable fields in required array', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + name String + bio String? + } + `); + const validator = getValidator(ir, 'user'); + const schema = validator!['jsonSchema'] as Record; + const required = schema['required'] as string[]; + expect(required).toContain('_id'); + expect(required).toContain('name'); + expect(required).not.toContain('bio'); + }); + + it('includes validator alongside indexes', () => { + const ir = interpretOk(` + model User { + id ObjectId @id @map("_id") + email String + @@index([email]) + } + `); + const storage = ir['storage'] as unknown as Record< + string, + Record> + >; + const userColl = storage['collections']?.['user']; + expect(userColl?.['indexes']).toBeDefined(); + expect(userColl?.['validator']).toBeDefined(); + }); + + it('handles value object fields as nested objects', () => { + const ir = interpretOk(` + type Address { + street String + city String + } + + model User { + id ObjectId @id @map("_id") + address Address + } + `); + const validator = getValidator(ir, 'user'); + const schema = validator!['jsonSchema'] as Record; + const props = schema['properties'] as Record>; + expect(props['address']).toEqual({ + bsonType: 'object', + required: ['city', 'street'], + properties: { + street: { bsonType: 'string' }, + city: { bsonType: 'string' }, + }, + }); + }); + }); }); diff --git a/packages/2-mongo-family/2-authoring/contract-ts/src/contract-builder.ts b/packages/2-mongo-family/2-authoring/contract-ts/src/contract-builder.ts index ae06c5621c..4d39d7db9a 100644 --- a/packages/2-mongo-family/2-authoring/contract-ts/src/contract-builder.ts +++ b/packages/2-mongo-family/2-authoring/contract-ts/src/contract-builder.ts @@ -22,6 +22,8 @@ import { type MongoIndexOptions, type MongoStorage, type MongoStorageCollection, + type MongoStorageCollectionOptions, + type MongoStorageIndex, type MongoTypeMaps, validateMongoContract, } from '@prisma-next/mongo-contract'; @@ -1192,6 +1194,35 @@ function stableStringify(value: unknown): string { return JSON.stringify(value); } +function toStorageIndex(index: MongoIndex): MongoStorageIndex { + const keys = Object.entries(index.fields).map(([field, direction]) => ({ + field, + direction, + })); + const result: Record = { keys }; + if (index.options) { + for (const [key, value] of Object.entries(index.options)) { + if (value !== undefined) { + result[key] = value; + } + } + } + return result as unknown as MongoStorageIndex; +} + +function toStorageCollectionOptions(opts: MongoCollectionOptions): MongoStorageCollectionOptions { + const result: Record = {}; + if (opts.capped) { + result['capped'] = { size: opts.size ?? 0, ...(opts.max != null ? { max: opts.max } : {}) }; + } + if (opts.timeseries) result['timeseries'] = opts.timeseries; + if (opts.collation) result['collation'] = opts.collation; + if (opts.changeStreamPreAndPostImages) + result['changeStreamPreAndPostImages'] = opts.changeStreamPreAndPostImages; + if (opts.clusteredIndex) result['clusteredIndex'] = { name: opts.clusteredIndex.name }; + return result as unknown as MongoStorageCollectionOptions; +} + function buildCollections( models: Record | undefined, ): Record { @@ -1236,19 +1267,22 @@ function buildCollections( declaredIndexOwners.set(collectionIndexKey, modelBuilder.__name); } + const storageIndexes = (modelBuilder.__indexes ?? []).map(toStorageIndex); + const storageOptions = modelBuilder.__collectionOptions + ? toStorageCollectionOptions(modelBuilder.__collectionOptions) + : undefined; + collections[modelBuilder.__collection] = - modelBuilder.__indexes && modelBuilder.__indexes.length > 0 + storageIndexes.length > 0 ? { ...existingCollection, - indexes: [...existingIndexes, ...modelBuilder.__indexes], - ...(modelBuilder.__collectionOptions - ? { options: modelBuilder.__collectionOptions } - : {}), + indexes: [...existingIndexes, ...storageIndexes], + ...(storageOptions ? { options: storageOptions } : {}), } - : modelBuilder.__collectionOptions + : storageOptions ? { ...existingCollection, - options: modelBuilder.__collectionOptions, + options: storageOptions, } : existingCollection; } diff --git a/packages/2-mongo-family/2-authoring/contract-ts/test/contract-builder.dsl.test.ts b/packages/2-mongo-family/2-authoring/contract-ts/test/contract-builder.dsl.test.ts index 82e108b095..f82de0ce69 100644 --- a/packages/2-mongo-family/2-authoring/contract-ts/test/contract-builder.dsl.test.ts +++ b/packages/2-mongo-family/2-authoring/contract-ts/test/contract-builder.dsl.test.ts @@ -190,9 +190,9 @@ describe('mongo contract builder', () => { expect(contract.storage.collections).toEqual({ users: { indexes: [ - { fields: { email: 1 }, options: { unique: true } }, - { fields: { createdAt: 1 }, options: { expireAfterSeconds: 3600 } }, - { fields: { location: '2dsphere' } }, + { keys: [{ field: 'email', direction: 1 }], unique: true }, + { keys: [{ field: 'createdAt', direction: 1 }], expireAfterSeconds: 3600 }, + { keys: [{ field: 'location', direction: '2dsphere' }] }, ], }, }); @@ -254,8 +254,8 @@ describe('mongo contract builder', () => { expect(contract.storage.collections).toEqual({ tasks: { indexes: [ - { fields: { title: 1 }, options: { unique: true } }, - { fields: { expiresAt: 1 }, options: { expireAfterSeconds: 3600 } }, + { keys: [{ field: 'title', direction: 1 }], unique: true }, + { keys: [{ field: 'expiresAt', direction: 1 }], expireAfterSeconds: 3600 }, ], }, }); diff --git a/packages/2-mongo-family/3-tooling/mongo-schema-ir/README.md b/packages/2-mongo-family/3-tooling/mongo-schema-ir/README.md index 972b54b343..883e109da8 100644 --- a/packages/2-mongo-family/3-tooling/mongo-schema-ir/README.md +++ b/packages/2-mongo-family/3-tooling/mongo-schema-ir/README.md @@ -9,8 +9,9 @@ This package defines the in-memory representation of MongoDB collection schemas ## Responsibilities - **Schema AST nodes**: `MongoSchemaCollection`, `MongoSchemaIndex`, `MongoSchemaValidator`, `MongoSchemaCollectionOptions` — frozen, visitable AST nodes representing MongoDB schema elements. -- **Index equivalence**: `indexesEquivalent()` compares two `MongoSchemaIndex` nodes field-by-field (keys, direction, unique, sparse, TTL, partial filter). Used by the planner to decide create/drop operations. -- **Deep equality**: `deepEqual()` provides order-sensitive structural comparison for MongoDB values (objects compare key order, matching BSON semantics). +- **Index equivalence**: `indexesEquivalent()` compares two `MongoSchemaIndex` nodes field-by-field (keys, direction, unique, sparse, TTL, partial filter, wildcardProjection, collation, weights, default_language, language_override). Used by the planner to decide create/drop operations. +- **Deep equality**: `deepEqual()` provides key-order-sensitive structural comparison for MongoDB values. For key-order-independent comparison, use `canonicalize()`. +- **Canonical serialization**: `canonicalize()` produces a key-order-independent string representation of values. Used by the planner for index lookup keys. - **Visitor pattern**: `MongoSchemaVisitor` enables extensible traversal without modifying AST nodes. ## Dependencies @@ -19,8 +20,7 @@ This package defines the in-memory representation of MongoDB collection schemas **Dependents:** -- `@prisma-next/adapter-mongo` — uses the schema IR for contract-to-schema conversion, migration planning, and filter evaluation. -- `@prisma-next/mongo-emitter` — produces schema IR during contract emission. +- `@prisma-next/adapter-mongo` — uses the schema IR via `contractToMongoSchemaIR()` for contract-to-schema conversion, migration planning, and filter evaluation. ## Usage diff --git a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/canonicalize.ts b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/canonicalize.ts new file mode 100644 index 0000000000..3d8518976a --- /dev/null +++ b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/canonicalize.ts @@ -0,0 +1,10 @@ +export function canonicalize(obj: unknown): string { + if (obj === null) return 'null'; + if (obj === undefined) return 'undefined'; + if (typeof obj !== 'object') return JSON.stringify(obj); + if (Array.isArray(obj)) return `[${obj.map(canonicalize).join(',')}]`; + const record = obj as Record; + const sorted = Object.keys(record).sort(); + const entries = sorted.map((k) => `${JSON.stringify(k)}:${canonicalize(record[k])}`); + return `{${entries.join(',')}}`; +} diff --git a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/exports/index.ts b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/exports/index.ts index a1ae5b6b30..28d689d899 100644 --- a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/exports/index.ts +++ b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/exports/index.ts @@ -1,9 +1,14 @@ +export { canonicalize } from '../canonicalize'; export { deepEqual, indexesEquivalent } from '../index-equivalence'; -export type { MongoSchemaCollectionOptions } from '../schema-collection'; +export type { MongoSchemaCollectionCtorOptions } from '../schema-collection'; export { MongoSchemaCollection } from '../schema-collection'; +export type { MongoSchemaCollectionOptionsInput } from '../schema-collection-options'; +export { MongoSchemaCollectionOptions } from '../schema-collection-options'; export type { MongoSchemaIndexOptions } from '../schema-index'; export { MongoSchemaIndex } from '../schema-index'; export type { MongoSchemaIR } from '../schema-ir'; export { MongoSchemaNode } from '../schema-node'; +export type { MongoSchemaValidatorOptions } from '../schema-validator'; +export { MongoSchemaValidator } from '../schema-validator'; export type { AnyMongoSchemaNode } from '../types'; export type { MongoSchemaVisitor } from '../visitor'; diff --git a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/index-equivalence.ts b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/index-equivalence.ts index b076cd1f19..f193ae8924 100644 --- a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/index-equivalence.ts +++ b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/index-equivalence.ts @@ -1,5 +1,10 @@ +import { canonicalize } from './canonicalize'; import type { MongoSchemaIndex } from './schema-index'; +/** + * Key-order-sensitive structural comparison. For key-order-independent + * comparison (e.g. lookup key construction), use {@link canonicalize}. + */ export function deepEqual(a: unknown, b: unknown): boolean { if (a === b) return true; if (a === null || b === null) return false; @@ -44,5 +49,12 @@ export function indexesEquivalent(a: MongoSchemaIndex, b: MongoSchemaIndex): boo if (a.unique !== b.unique) return false; if (a.sparse !== b.sparse) return false; if (a.expireAfterSeconds !== b.expireAfterSeconds) return false; - return deepEqual(a.partialFilterExpression, b.partialFilterExpression); + if (canonicalize(a.partialFilterExpression) !== canonicalize(b.partialFilterExpression)) + return false; + if (canonicalize(a.wildcardProjection) !== canonicalize(b.wildcardProjection)) return false; + if (canonicalize(a.collation) !== canonicalize(b.collation)) return false; + if (canonicalize(a.weights) !== canonicalize(b.weights)) return false; + if (a.default_language !== b.default_language) return false; + if (a.language_override !== b.language_override) return false; + return true; } diff --git a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-collection-options.ts b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-collection-options.ts new file mode 100644 index 0000000000..8886d95fe9 --- /dev/null +++ b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-collection-options.ts @@ -0,0 +1,39 @@ +import { MongoSchemaNode } from './schema-node'; +import type { MongoSchemaVisitor } from './visitor'; + +export interface MongoSchemaCollectionOptionsInput { + readonly capped?: { size: number; max?: number }; + readonly timeseries?: { + timeField: string; + metaField?: string; + granularity?: 'seconds' | 'minutes' | 'hours'; + }; + readonly collation?: Record; + readonly changeStreamPreAndPostImages?: { enabled: boolean }; + readonly clusteredIndex?: { name?: string }; +} + +export class MongoSchemaCollectionOptions extends MongoSchemaNode { + readonly kind = 'collectionOptions' as const; + readonly capped?: { size: number; max?: number } | undefined; + readonly timeseries?: + | { timeField: string; metaField?: string; granularity?: 'seconds' | 'minutes' | 'hours' } + | undefined; + readonly collation?: Record | undefined; + readonly changeStreamPreAndPostImages?: { enabled: boolean } | undefined; + readonly clusteredIndex?: { name?: string } | undefined; + + constructor(options: MongoSchemaCollectionOptionsInput) { + super(); + this.capped = options.capped; + this.timeseries = options.timeseries; + this.collation = options.collation; + this.changeStreamPreAndPostImages = options.changeStreamPreAndPostImages; + this.clusteredIndex = options.clusteredIndex; + this.freeze(); + } + + accept(visitor: MongoSchemaVisitor): R { + return visitor.collectionOptions(this); + } +} diff --git a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-collection.ts b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-collection.ts index 02e9c789b9..5c9d108460 100644 --- a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-collection.ts +++ b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-collection.ts @@ -1,21 +1,29 @@ +import type { MongoSchemaCollectionOptions } from './schema-collection-options'; import type { MongoSchemaIndex } from './schema-index'; import { MongoSchemaNode } from './schema-node'; +import type { MongoSchemaValidator } from './schema-validator'; import type { MongoSchemaVisitor } from './visitor'; -export interface MongoSchemaCollectionOptions { +export interface MongoSchemaCollectionCtorOptions { readonly name: string; readonly indexes?: ReadonlyArray; + readonly validator?: MongoSchemaValidator; + readonly options?: MongoSchemaCollectionOptions; } export class MongoSchemaCollection extends MongoSchemaNode { readonly kind = 'collection' as const; readonly name: string; readonly indexes: ReadonlyArray; + readonly validator?: MongoSchemaValidator | undefined; + readonly options?: MongoSchemaCollectionOptions | undefined; - constructor(options: MongoSchemaCollectionOptions) { + constructor(options: MongoSchemaCollectionCtorOptions) { super(); this.name = options.name; this.indexes = options.indexes ?? []; + this.validator = options.validator; + this.options = options.options; this.freeze(); } diff --git a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-index.ts b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-index.ts index 199632bc5b..7ee88ce276 100644 --- a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-index.ts +++ b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-index.ts @@ -8,6 +8,11 @@ export interface MongoSchemaIndexOptions { readonly sparse?: boolean | undefined; readonly expireAfterSeconds?: number | undefined; readonly partialFilterExpression?: Record | undefined; + readonly wildcardProjection?: Record | undefined; + readonly collation?: Record | undefined; + readonly weights?: Record | undefined; + readonly default_language?: string | undefined; + readonly language_override?: string | undefined; } export class MongoSchemaIndex extends MongoSchemaNode { @@ -17,6 +22,11 @@ export class MongoSchemaIndex extends MongoSchemaNode { readonly sparse?: boolean | undefined; readonly expireAfterSeconds?: number | undefined; readonly partialFilterExpression?: Record | undefined; + readonly wildcardProjection?: Record | undefined; + readonly collation?: Record | undefined; + readonly weights?: Record | undefined; + readonly default_language?: string | undefined; + readonly language_override?: string | undefined; constructor(options: MongoSchemaIndexOptions) { super(); @@ -25,6 +35,11 @@ export class MongoSchemaIndex extends MongoSchemaNode { this.sparse = options.sparse; this.expireAfterSeconds = options.expireAfterSeconds; this.partialFilterExpression = options.partialFilterExpression; + this.wildcardProjection = options.wildcardProjection; + this.collation = options.collation; + this.weights = options.weights; + this.default_language = options.default_language; + this.language_override = options.language_override; this.freeze(); } diff --git a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-validator.ts b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-validator.ts new file mode 100644 index 0000000000..5d2a3da875 --- /dev/null +++ b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-validator.ts @@ -0,0 +1,27 @@ +import { MongoSchemaNode } from './schema-node'; +import type { MongoSchemaVisitor } from './visitor'; + +export interface MongoSchemaValidatorOptions { + readonly jsonSchema: Record; + readonly validationLevel: 'strict' | 'moderate'; + readonly validationAction: 'error' | 'warn'; +} + +export class MongoSchemaValidator extends MongoSchemaNode { + readonly kind = 'validator' as const; + readonly jsonSchema: Record; + readonly validationLevel: 'strict' | 'moderate'; + readonly validationAction: 'error' | 'warn'; + + constructor(options: MongoSchemaValidatorOptions) { + super(); + this.jsonSchema = options.jsonSchema; + this.validationLevel = options.validationLevel; + this.validationAction = options.validationAction; + this.freeze(); + } + + accept(visitor: MongoSchemaVisitor): R { + return visitor.validator(this); + } +} diff --git a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/types.ts b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/types.ts index b18f6973b1..e28c1a0415 100644 --- a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/types.ts +++ b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/types.ts @@ -1,4 +1,10 @@ import type { MongoSchemaCollection } from './schema-collection'; +import type { MongoSchemaCollectionOptions } from './schema-collection-options'; import type { MongoSchemaIndex } from './schema-index'; +import type { MongoSchemaValidator } from './schema-validator'; -export type AnyMongoSchemaNode = MongoSchemaCollection | MongoSchemaIndex; +export type AnyMongoSchemaNode = + | MongoSchemaCollection + | MongoSchemaCollectionOptions + | MongoSchemaIndex + | MongoSchemaValidator; diff --git a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/visitor.ts b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/visitor.ts index 54a514a123..27833f7fc9 100644 --- a/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/visitor.ts +++ b/packages/2-mongo-family/3-tooling/mongo-schema-ir/src/visitor.ts @@ -1,9 +1,11 @@ import type { MongoSchemaCollection } from './schema-collection'; +import type { MongoSchemaCollectionOptions } from './schema-collection-options'; import type { MongoSchemaIndex } from './schema-index'; +import type { MongoSchemaValidator } from './schema-validator'; export interface MongoSchemaVisitor { collection(node: MongoSchemaCollection): R; index(node: MongoSchemaIndex): R; - validator(node: unknown): R; // M2: MongoSchemaValidator - collectionOptions(node: unknown): R; // M2: MongoSchemaCollectionOptions + validator(node: MongoSchemaValidator): R; + collectionOptions(node: MongoSchemaCollectionOptions): R; } diff --git a/packages/2-mongo-family/3-tooling/mongo-schema-ir/test/schema-ir.test.ts b/packages/2-mongo-family/3-tooling/mongo-schema-ir/test/schema-ir.test.ts index 9ab2e79853..99cf805050 100644 --- a/packages/2-mongo-family/3-tooling/mongo-schema-ir/test/schema-ir.test.ts +++ b/packages/2-mongo-family/3-tooling/mongo-schema-ir/test/schema-ir.test.ts @@ -1,7 +1,10 @@ import { describe, expect, it } from 'vitest'; +import { canonicalize } from '../src/canonicalize'; import { indexesEquivalent } from '../src/index-equivalence'; import { MongoSchemaCollection } from '../src/schema-collection'; +import { MongoSchemaCollectionOptions } from '../src/schema-collection-options'; import { MongoSchemaIndex } from '../src/schema-index'; +import { MongoSchemaValidator } from '../src/schema-validator'; import type { MongoSchemaVisitor } from '../src/visitor'; describe('MongoSchemaIndex', () => { @@ -31,6 +34,28 @@ describe('MongoSchemaIndex', () => { expect(index.partialFilterExpression).toEqual({ active: { $eq: true } }); }); + it('constructs with M2 index options', () => { + const index = new MongoSchemaIndex({ + keys: [{ field: 'bio', direction: 'text' }], + weights: { bio: 10 }, + default_language: 'english', + language_override: 'lang', + collation: { locale: 'en', strength: 2 }, + }); + expect(index.weights).toEqual({ bio: 10 }); + expect(index.default_language).toBe('english'); + expect(index.language_override).toBe('lang'); + expect(index.collation).toEqual({ locale: 'en', strength: 2 }); + }); + + it('constructs with wildcardProjection', () => { + const index = new MongoSchemaIndex({ + keys: [{ field: '$**', direction: 1 }], + wildcardProjection: { name: 1, email: 1 }, + }); + expect(index.wildcardProjection).toEqual({ name: 1, email: 1 }); + }); + it('is frozen after construction', () => { const index = new MongoSchemaIndex({ keys: [{ field: 'email', direction: 1 }], @@ -90,6 +115,116 @@ describe('MongoSchemaCollection', () => { }); }); +describe('MongoSchemaValidator', () => { + it('constructs with required fields', () => { + const v = new MongoSchemaValidator({ + jsonSchema: { bsonType: 'object', properties: { name: { bsonType: 'string' } } }, + validationLevel: 'strict', + validationAction: 'error', + }); + expect(v.kind).toBe('validator'); + expect(v.jsonSchema).toEqual({ + bsonType: 'object', + properties: { name: { bsonType: 'string' } }, + }); + expect(v.validationLevel).toBe('strict'); + expect(v.validationAction).toBe('error'); + }); + + it('is frozen after construction', () => { + const v = new MongoSchemaValidator({ + jsonSchema: {}, + validationLevel: 'moderate', + validationAction: 'warn', + }); + expect(Object.isFrozen(v)).toBe(true); + }); + + it('dispatches via visitor', () => { + const v = new MongoSchemaValidator({ + jsonSchema: {}, + validationLevel: 'strict', + validationAction: 'error', + }); + const visitor: MongoSchemaVisitor = { + collection: () => 'collection', + index: () => 'index', + validator: (node) => `validator:${node.validationLevel}`, + collectionOptions: () => 'collectionOptions', + }; + expect(v.accept(visitor)).toBe('validator:strict'); + }); +}); + +describe('MongoSchemaCollectionOptions', () => { + it('constructs with no options', () => { + const opts = new MongoSchemaCollectionOptions({}); + expect(opts.kind).toBe('collectionOptions'); + expect(opts.capped).toBeUndefined(); + expect(opts.timeseries).toBeUndefined(); + expect(opts.collation).toBeUndefined(); + expect(opts.changeStreamPreAndPostImages).toBeUndefined(); + expect(opts.clusteredIndex).toBeUndefined(); + }); + + it('constructs with all options', () => { + const opts = new MongoSchemaCollectionOptions({ + capped: { size: 1048576, max: 1000 }, + timeseries: { timeField: 'ts', metaField: 'meta', granularity: 'hours' }, + collation: { locale: 'en' }, + changeStreamPreAndPostImages: { enabled: true }, + clusteredIndex: { name: 'myCluster' }, + }); + expect(opts.capped).toEqual({ size: 1048576, max: 1000 }); + expect(opts.timeseries).toEqual({ timeField: 'ts', metaField: 'meta', granularity: 'hours' }); + expect(opts.collation).toEqual({ locale: 'en' }); + expect(opts.changeStreamPreAndPostImages).toEqual({ enabled: true }); + expect(opts.clusteredIndex).toEqual({ name: 'myCluster' }); + }); + + it('is frozen after construction', () => { + const opts = new MongoSchemaCollectionOptions({}); + expect(Object.isFrozen(opts)).toBe(true); + }); + + it('dispatches via visitor', () => { + const opts = new MongoSchemaCollectionOptions({ capped: { size: 100 } }); + const visitor: MongoSchemaVisitor = { + collection: () => 'collection', + index: () => 'index', + validator: () => 'validator', + collectionOptions: () => 'collectionOptions', + }; + expect(opts.accept(visitor)).toBe('collectionOptions'); + }); +}); + +describe('MongoSchemaCollection with validator and options', () => { + it('constructs with validator and options', () => { + const validator = new MongoSchemaValidator({ + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }); + const options = new MongoSchemaCollectionOptions({ + capped: { size: 1048576 }, + }); + const coll = new MongoSchemaCollection({ + name: 'users', + validator, + options, + }); + expect(coll.validator).toBe(validator); + expect(coll.options).toBe(options); + }); + + it('defaults validator and options to undefined', () => { + const coll = new MongoSchemaCollection({ name: 'users' }); + expect(coll.validator).toBeUndefined(); + expect(coll.options).toBeUndefined(); + }); +}); + describe('indexesEquivalent', () => { it('returns true for identical indexes', () => { const a = new MongoSchemaIndex({ @@ -220,4 +355,142 @@ describe('indexesEquivalent', () => { }); expect(indexesEquivalent(a, b)).toBe(true); }); + + it('returns false for different wildcardProjection', () => { + const a = new MongoSchemaIndex({ + keys: [{ field: '$**', direction: 1 }], + wildcardProjection: { name: 1 }, + }); + const b = new MongoSchemaIndex({ + keys: [{ field: '$**', direction: 1 }], + wildcardProjection: { email: 1 }, + }); + expect(indexesEquivalent(a, b)).toBe(false); + }); + + it('returns true for same wildcardProjection', () => { + const a = new MongoSchemaIndex({ + keys: [{ field: '$**', direction: 1 }], + wildcardProjection: { name: 1, email: 1 }, + }); + const b = new MongoSchemaIndex({ + keys: [{ field: '$**', direction: 1 }], + wildcardProjection: { name: 1, email: 1 }, + }); + expect(indexesEquivalent(a, b)).toBe(true); + }); + + it('returns false for different collation', () => { + const a = new MongoSchemaIndex({ + keys: [{ field: 'name', direction: 1 }], + collation: { locale: 'en', strength: 2 }, + }); + const b = new MongoSchemaIndex({ + keys: [{ field: 'name', direction: 1 }], + collation: { locale: 'fr', strength: 2 }, + }); + expect(indexesEquivalent(a, b)).toBe(false); + }); + + it('returns true for same collation', () => { + const a = new MongoSchemaIndex({ + keys: [{ field: 'name', direction: 1 }], + collation: { locale: 'en', strength: 2 }, + }); + const b = new MongoSchemaIndex({ + keys: [{ field: 'name', direction: 1 }], + collation: { locale: 'en', strength: 2 }, + }); + expect(indexesEquivalent(a, b)).toBe(true); + }); + + it('returns false for different weights', () => { + const a = new MongoSchemaIndex({ + keys: [{ field: 'bio', direction: 'text' }], + weights: { bio: 10 }, + }); + const b = new MongoSchemaIndex({ + keys: [{ field: 'bio', direction: 'text' }], + weights: { bio: 5 }, + }); + expect(indexesEquivalent(a, b)).toBe(false); + }); + + it('returns false for different default_language', () => { + const a = new MongoSchemaIndex({ + keys: [{ field: 'bio', direction: 'text' }], + default_language: 'english', + }); + const b = new MongoSchemaIndex({ + keys: [{ field: 'bio', direction: 'text' }], + default_language: 'french', + }); + expect(indexesEquivalent(a, b)).toBe(false); + }); + + it('returns false for different language_override', () => { + const a = new MongoSchemaIndex({ + keys: [{ field: 'bio', direction: 'text' }], + language_override: 'lang', + }); + const b = new MongoSchemaIndex({ + keys: [{ field: 'bio', direction: 'text' }], + language_override: 'language', + }); + expect(indexesEquivalent(a, b)).toBe(false); + }); + + it('treats object-valued options with different key order as equivalent', () => { + const a = new MongoSchemaIndex({ + keys: [{ field: 'status', direction: 1 }], + partialFilterExpression: { status: 'active', age: { $gte: 18 } }, + collation: { locale: 'en', strength: 2 }, + weights: { title: 10, body: 5 }, + wildcardProjection: { name: 1, email: 1 }, + }); + const b = new MongoSchemaIndex({ + keys: [{ field: 'status', direction: 1 }], + partialFilterExpression: { age: { $gte: 18 }, status: 'active' }, + collation: { strength: 2, locale: 'en' }, + weights: { body: 5, title: 10 }, + wildcardProjection: { email: 1, name: 1 }, + }); + expect(indexesEquivalent(a, b)).toBe(true); + }); +}); + +describe('canonicalize', () => { + it('produces same string for objects with different key order', () => { + expect(canonicalize({ b: 1, a: 2 })).toBe(canonicalize({ a: 2, b: 1 })); + }); + + it('handles nested objects with different key order', () => { + expect(canonicalize({ outer: { b: 1, a: 2 } })).toBe(canonicalize({ outer: { a: 2, b: 1 } })); + }); + + it('preserves array order', () => { + expect(canonicalize([1, 2, 3])).not.toBe(canonicalize([3, 2, 1])); + }); + + it('handles primitives', () => { + expect(canonicalize(42)).toBe('42'); + expect(canonicalize('hello')).toBe('"hello"'); + expect(canonicalize(true)).toBe('true'); + }); + + it('handles null and undefined', () => { + expect(canonicalize(null)).toBe('null'); + expect(canonicalize(undefined)).toBe('undefined'); + }); + + it('handles empty objects and arrays', () => { + expect(canonicalize({})).toBe('{}'); + expect(canonicalize([])).toBe('[]'); + }); + + it('produces deterministic output for complex nested structures', () => { + const a = { z: [1, { y: 2, x: 3 }], a: { c: 1, b: 2 } }; + const b = { a: { b: 2, c: 1 }, z: [1, { x: 3, y: 2 }] }; + expect(canonicalize(a)).toBe(canonicalize(b)); + }); }); diff --git a/packages/2-mongo-family/4-query/query-ast/src/ddl-commands.ts b/packages/2-mongo-family/4-query/query-ast/src/ddl-commands.ts index 0678734bca..03024cee82 100644 --- a/packages/2-mongo-family/4-query/query-ast/src/ddl-commands.ts +++ b/packages/2-mongo-family/4-query/query-ast/src/ddl-commands.ts @@ -8,6 +8,11 @@ export interface CreateIndexOptions { readonly expireAfterSeconds?: number | undefined; readonly partialFilterExpression?: Record | undefined; readonly name?: string | undefined; + readonly wildcardProjection?: Record | undefined; + readonly collation?: Record | undefined; + readonly weights?: Record | undefined; + readonly default_language?: string | undefined; + readonly language_override?: string | undefined; } export class CreateIndexCommand extends MongoAstNode { @@ -19,6 +24,11 @@ export class CreateIndexCommand extends MongoAstNode { readonly expireAfterSeconds: number | undefined; readonly partialFilterExpression: Record | undefined; readonly name: string | undefined; + readonly wildcardProjection: Record | undefined; + readonly collation: Record | undefined; + readonly weights: Record | undefined; + readonly default_language: string | undefined; + readonly language_override: string | undefined; constructor( collection: string, @@ -33,6 +43,11 @@ export class CreateIndexCommand extends MongoAstNode { this.expireAfterSeconds = options?.expireAfterSeconds; this.partialFilterExpression = options?.partialFilterExpression; this.name = options?.name; + this.wildcardProjection = options?.wildcardProjection; + this.collation = options?.collation; + this.weights = options?.weights; + this.default_language = options?.default_language; + this.language_override = options?.language_override; this.freeze(); } @@ -58,4 +73,126 @@ export class DropIndexCommand extends MongoAstNode { } } -export type AnyMongoDdlCommand = CreateIndexCommand | DropIndexCommand; +export interface CreateCollectionOptions { + readonly validator?: Record | undefined; + readonly validationLevel?: 'strict' | 'moderate' | undefined; + readonly validationAction?: 'error' | 'warn' | undefined; + readonly capped?: boolean | undefined; + readonly size?: number | undefined; + readonly max?: number | undefined; + readonly timeseries?: + | { + timeField: string; + metaField?: string; + granularity?: 'seconds' | 'minutes' | 'hours'; + } + | undefined; + readonly collation?: Record | undefined; + readonly changeStreamPreAndPostImages?: { enabled: boolean } | undefined; + readonly clusteredIndex?: + | { + key: Record; + unique: boolean; + name?: string; + } + | undefined; +} + +export class CreateCollectionCommand extends MongoAstNode { + readonly kind = 'createCollection' as const; + readonly collection: string; + readonly validator: Record | undefined; + readonly validationLevel: 'strict' | 'moderate' | undefined; + readonly validationAction: 'error' | 'warn' | undefined; + readonly capped: boolean | undefined; + readonly size: number | undefined; + readonly max: number | undefined; + readonly timeseries: + | { + timeField: string; + metaField?: string; + granularity?: 'seconds' | 'minutes' | 'hours'; + } + | undefined; + readonly collation: Record | undefined; + readonly changeStreamPreAndPostImages: { enabled: boolean } | undefined; + readonly clusteredIndex: + | { + key: Record; + unique: boolean; + name?: string; + } + | undefined; + + constructor(collection: string, options?: CreateCollectionOptions) { + super(); + this.collection = collection; + this.validator = options?.validator; + this.validationLevel = options?.validationLevel; + this.validationAction = options?.validationAction; + this.capped = options?.capped; + this.size = options?.size; + this.max = options?.max; + this.timeseries = options?.timeseries; + this.collation = options?.collation; + this.changeStreamPreAndPostImages = options?.changeStreamPreAndPostImages; + this.clusteredIndex = options?.clusteredIndex; + this.freeze(); + } + + accept(visitor: MongoDdlCommandVisitor): R { + return visitor.createCollection(this); + } +} + +export class DropCollectionCommand extends MongoAstNode { + readonly kind = 'dropCollection' as const; + readonly collection: string; + + constructor(collection: string) { + super(); + this.collection = collection; + this.freeze(); + } + + accept(visitor: MongoDdlCommandVisitor): R { + return visitor.dropCollection(this); + } +} + +export interface CollModOptions { + readonly validator?: Record | undefined; + readonly validationLevel?: 'strict' | 'moderate' | undefined; + readonly validationAction?: 'error' | 'warn' | undefined; + readonly changeStreamPreAndPostImages?: { enabled: boolean } | undefined; +} + +export class CollModCommand extends MongoAstNode { + readonly kind = 'collMod' as const; + readonly collection: string; + readonly validator: Record | undefined; + readonly validationLevel: 'strict' | 'moderate' | undefined; + readonly validationAction: 'error' | 'warn' | undefined; + readonly changeStreamPreAndPostImages: { enabled: boolean } | undefined; + + constructor(collection: string, options: CollModOptions) { + super(); + this.collection = collection; + this.validator = options.validator; + this.validationLevel = options.validationLevel; + this.validationAction = options.validationAction; + this.changeStreamPreAndPostImages = options.changeStreamPreAndPostImages; + this.freeze(); + } + + accept(visitor: MongoDdlCommandVisitor): R { + return visitor.collMod(this); + } +} + +export type AnyMongoDdlCommand = + | CreateIndexCommand + | DropIndexCommand + | CreateCollectionCommand + | DropCollectionCommand + | CollModCommand; diff --git a/packages/2-mongo-family/4-query/query-ast/src/ddl-visitors.ts b/packages/2-mongo-family/4-query/query-ast/src/ddl-visitors.ts index 25b91f89db..a53730f93f 100644 --- a/packages/2-mongo-family/4-query/query-ast/src/ddl-visitors.ts +++ b/packages/2-mongo-family/4-query/query-ast/src/ddl-visitors.ts @@ -1,9 +1,18 @@ -import type { CreateIndexCommand, DropIndexCommand } from './ddl-commands'; +import type { + CollModCommand, + CreateCollectionCommand, + CreateIndexCommand, + DropCollectionCommand, + DropIndexCommand, +} from './ddl-commands'; import type { ListCollectionsCommand, ListIndexesCommand } from './inspection-commands'; export interface MongoDdlCommandVisitor { createIndex(command: CreateIndexCommand): R; dropIndex(command: DropIndexCommand): R; + createCollection(command: CreateCollectionCommand): R; + dropCollection(command: DropCollectionCommand): R; + collMod(command: CollModCommand): R; } export interface MongoInspectionCommandVisitor { diff --git a/packages/2-mongo-family/4-query/query-ast/src/exports/control.ts b/packages/2-mongo-family/4-query/query-ast/src/exports/control.ts index 178e86c32d..b188c0ec74 100644 --- a/packages/2-mongo-family/4-query/query-ast/src/exports/control.ts +++ b/packages/2-mongo-family/4-query/query-ast/src/exports/control.ts @@ -1,6 +1,17 @@ export type { MongoIndexKey, MongoIndexKeyDirection } from '@prisma-next/mongo-contract'; -export type { AnyMongoDdlCommand, CreateIndexOptions } from '../ddl-commands'; -export { CreateIndexCommand, DropIndexCommand } from '../ddl-commands'; +export type { + AnyMongoDdlCommand, + CollModOptions, + CreateCollectionOptions, + CreateIndexOptions, +} from '../ddl-commands'; +export { + CollModCommand, + CreateCollectionCommand, + CreateIndexCommand, + DropCollectionCommand, + DropIndexCommand, +} from '../ddl-commands'; export type { MongoDdlCommandVisitor, MongoInspectionCommandVisitor } from '../ddl-visitors'; export type { MongoFilterExpr } from '../filter-expressions'; export { diff --git a/packages/2-mongo-family/4-query/query-ast/test/ddl-commands.test.ts b/packages/2-mongo-family/4-query/query-ast/test/ddl-commands.test.ts index 6187a886d5..9caa82f5e8 100644 --- a/packages/2-mongo-family/4-query/query-ast/test/ddl-commands.test.ts +++ b/packages/2-mongo-family/4-query/query-ast/test/ddl-commands.test.ts @@ -1,8 +1,27 @@ import { describe, expect, it } from 'vitest'; -import { CreateIndexCommand, DropIndexCommand } from '../src/ddl-commands'; +import { + CollModCommand, + CreateCollectionCommand, + CreateIndexCommand, + DropCollectionCommand, + DropIndexCommand, +} from '../src/ddl-commands'; import type { MongoDdlCommandVisitor, MongoInspectionCommandVisitor } from '../src/ddl-visitors'; import { ListCollectionsCommand, ListIndexesCommand } from '../src/inspection-commands'; +function makeDdlVisitor( + overrides?: Partial>, +): MongoDdlCommandVisitor { + return { + createIndex: () => 'createIndex', + dropIndex: () => 'dropIndex', + createCollection: () => 'createCollection', + dropCollection: () => 'dropCollection', + collMod: () => 'collMod', + ...overrides, + }; +} + describe('CreateIndexCommand', () => { it('constructs with required fields', () => { const cmd = new CreateIndexCommand('users', [{ field: 'email', direction: 1 }]); @@ -27,6 +46,26 @@ describe('CreateIndexCommand', () => { expect(cmd.name).toBe('email_1'); }); + it('constructs with M2 index options', () => { + const cmd = new CreateIndexCommand('users', [{ field: 'bio', direction: 'text' }], { + weights: { bio: 10 }, + default_language: 'english', + language_override: 'lang', + collation: { locale: 'en', strength: 2 }, + }); + expect(cmd.weights).toEqual({ bio: 10 }); + expect(cmd.default_language).toBe('english'); + expect(cmd.language_override).toBe('lang'); + expect(cmd.collation).toEqual({ locale: 'en', strength: 2 }); + }); + + it('constructs with wildcardProjection', () => { + const cmd = new CreateIndexCommand('users', [{ field: '$**', direction: 1 }], { + wildcardProjection: { name: 1, email: 1 }, + }); + expect(cmd.wildcardProjection).toEqual({ name: 1, email: 1 }); + }); + it('is frozen', () => { const cmd = new CreateIndexCommand('users', [{ field: 'email', direction: 1 }]); expect(() => { @@ -36,10 +75,9 @@ describe('CreateIndexCommand', () => { it('dispatches via DDL visitor', () => { const cmd = new CreateIndexCommand('users', [{ field: 'email', direction: 1 }]); - const visitor: MongoDdlCommandVisitor = { + const visitor = makeDdlVisitor({ createIndex: (c) => `create:${c.collection}`, - dropIndex: () => 'drop', - }; + }); expect(cmd.accept(visitor)).toBe('create:users'); }); }); @@ -61,14 +99,137 @@ describe('DropIndexCommand', () => { it('dispatches via DDL visitor', () => { const cmd = new DropIndexCommand('users', 'email_1'); - const visitor: MongoDdlCommandVisitor = { - createIndex: () => 'create', + const visitor = makeDdlVisitor({ dropIndex: (c) => `drop:${c.name}`, - }; + }); expect(cmd.accept(visitor)).toBe('drop:email_1'); }); }); +describe('CreateCollectionCommand', () => { + it('constructs with collection name only', () => { + const cmd = new CreateCollectionCommand('events'); + expect(cmd.kind).toBe('createCollection'); + expect(cmd.collection).toBe('events'); + expect(cmd.capped).toBeUndefined(); + expect(cmd.validator).toBeUndefined(); + }); + + it('constructs with capped options', () => { + const cmd = new CreateCollectionCommand('events', { + capped: true, + size: 1048576, + max: 1000, + }); + expect(cmd.capped).toBe(true); + expect(cmd.size).toBe(1048576); + expect(cmd.max).toBe(1000); + }); + + it('constructs with validator', () => { + const cmd = new CreateCollectionCommand('events', { + validator: { $jsonSchema: { bsonType: 'object' } }, + validationLevel: 'strict', + validationAction: 'error', + }); + expect(cmd.validator).toEqual({ $jsonSchema: { bsonType: 'object' } }); + expect(cmd.validationLevel).toBe('strict'); + expect(cmd.validationAction).toBe('error'); + }); + + it('constructs with timeseries', () => { + const cmd = new CreateCollectionCommand('metrics', { + timeseries: { timeField: 'ts', metaField: 'meta', granularity: 'hours' }, + }); + expect(cmd.timeseries).toEqual({ timeField: 'ts', metaField: 'meta', granularity: 'hours' }); + }); + + it('constructs with clusteredIndex', () => { + const cmd = new CreateCollectionCommand('items', { + clusteredIndex: { key: { _id: 1 }, unique: true, name: 'myCluster' }, + }); + expect(cmd.clusteredIndex).toEqual({ key: { _id: 1 }, unique: true, name: 'myCluster' }); + }); + + it('constructs with collation and changeStreamPreAndPostImages', () => { + const cmd = new CreateCollectionCommand('items', { + collation: { locale: 'en' }, + changeStreamPreAndPostImages: { enabled: true }, + }); + expect(cmd.collation).toEqual({ locale: 'en' }); + expect(cmd.changeStreamPreAndPostImages).toEqual({ enabled: true }); + }); + + it('is frozen', () => { + const cmd = new CreateCollectionCommand('events'); + expect(Object.isFrozen(cmd)).toBe(true); + }); + + it('dispatches via DDL visitor', () => { + const cmd = new CreateCollectionCommand('events'); + const visitor = makeDdlVisitor({ + createCollection: (c) => `create:${c.collection}`, + }); + expect(cmd.accept(visitor)).toBe('create:events'); + }); +}); + +describe('DropCollectionCommand', () => { + it('constructs correctly', () => { + const cmd = new DropCollectionCommand('events'); + expect(cmd.kind).toBe('dropCollection'); + expect(cmd.collection).toBe('events'); + }); + + it('is frozen', () => { + const cmd = new DropCollectionCommand('events'); + expect(Object.isFrozen(cmd)).toBe(true); + }); + + it('dispatches via DDL visitor', () => { + const cmd = new DropCollectionCommand('events'); + const visitor = makeDdlVisitor({ + dropCollection: (c) => `drop:${c.collection}`, + }); + expect(cmd.accept(visitor)).toBe('drop:events'); + }); +}); + +describe('CollModCommand', () => { + it('constructs with validator', () => { + const cmd = new CollModCommand('users', { + validator: { $jsonSchema: { bsonType: 'object' } }, + validationLevel: 'strict', + validationAction: 'error', + }); + expect(cmd.kind).toBe('collMod'); + expect(cmd.collection).toBe('users'); + expect(cmd.validator).toEqual({ $jsonSchema: { bsonType: 'object' } }); + expect(cmd.validationLevel).toBe('strict'); + expect(cmd.validationAction).toBe('error'); + }); + + it('constructs with changeStreamPreAndPostImages', () => { + const cmd = new CollModCommand('users', { + changeStreamPreAndPostImages: { enabled: true }, + }); + expect(cmd.changeStreamPreAndPostImages).toEqual({ enabled: true }); + }); + + it('is frozen', () => { + const cmd = new CollModCommand('users', { validationLevel: 'strict' }); + expect(Object.isFrozen(cmd)).toBe(true); + }); + + it('dispatches via DDL visitor', () => { + const cmd = new CollModCommand('users', { validationLevel: 'strict' }); + const visitor = makeDdlVisitor({ + collMod: (c) => `collMod:${c.collection}`, + }); + expect(cmd.accept(visitor)).toBe('collMod:users'); + }); +}); + describe('ListIndexesCommand', () => { it('constructs correctly', () => { const cmd = new ListIndexesCommand('users'); diff --git a/packages/2-mongo-family/9-family/src/core/control-instance.ts b/packages/2-mongo-family/9-family/src/core/control-instance.ts index c19d35e3dc..e38f3e2210 100644 --- a/packages/2-mongo-family/9-family/src/core/control-instance.ts +++ b/packages/2-mongo-family/9-family/src/core/control-instance.ts @@ -32,8 +32,6 @@ function extractDb(driver: ControlDriverInstance<'mongo', string>): Db { class MongoFamilyInstance implements MongoControlFamilyInstance { readonly familyId = 'mongo' as const; - constructor(_controlStack: ControlStack) {} - validateContract(contractJson: unknown): Contract { const validated = validateMongoContract(contractJson); // MongoContract and Contract share structure but are typed independently; @@ -77,6 +75,6 @@ class MongoFamilyInstance implements MongoControlFamilyInstance { } } -export function createMongoFamilyInstance(controlStack: ControlStack): MongoControlFamilyInstance { - return new MongoFamilyInstance(controlStack); +export function createMongoFamilyInstance(_controlStack: ControlStack): MongoControlFamilyInstance { + return new MongoFamilyInstance(); } diff --git a/packages/3-mongo-target/2-mongo-adapter/src/core/command-executor.ts b/packages/3-mongo-target/2-mongo-adapter/src/core/command-executor.ts index 8dc0cfc0d6..38ce66b7a0 100644 --- a/packages/3-mongo-target/2-mongo-adapter/src/core/command-executor.ts +++ b/packages/3-mongo-target/2-mongo-adapter/src/core/command-executor.ts @@ -1,5 +1,8 @@ import type { + CollModCommand, + CreateCollectionCommand, CreateIndexCommand, + DropCollectionCommand, DropIndexCommand, ListCollectionsCommand, ListIndexesCommand, @@ -22,12 +25,48 @@ export class MongoCommandExecutor implements MongoDdlCommandVisitor { await this.db.collection(cmd.collection).dropIndex(cmd.name); } + + async createCollection(cmd: CreateCollectionCommand): Promise { + const options: Record = {}; + if (cmd.capped !== undefined) options['capped'] = cmd.capped; + if (cmd.size !== undefined) options['size'] = cmd.size; + if (cmd.max !== undefined) options['max'] = cmd.max; + if (cmd.timeseries !== undefined) options['timeseries'] = cmd.timeseries; + if (cmd.collation !== undefined) options['collation'] = cmd.collation; + if (cmd.clusteredIndex !== undefined) options['clusteredIndex'] = cmd.clusteredIndex; + if (cmd.validator !== undefined) options['validator'] = cmd.validator; + if (cmd.validationLevel !== undefined) options['validationLevel'] = cmd.validationLevel; + if (cmd.validationAction !== undefined) options['validationAction'] = cmd.validationAction; + if (cmd.changeStreamPreAndPostImages !== undefined) + options['changeStreamPreAndPostImages'] = cmd.changeStreamPreAndPostImages; + await this.db.createCollection(cmd.collection, options); + } + + async dropCollection(cmd: DropCollectionCommand): Promise { + await this.db.collection(cmd.collection).drop(); + } + + async collMod(cmd: CollModCommand): Promise { + const command: Record = { collMod: cmd.collection }; + if (cmd.validator !== undefined) command['validator'] = cmd.validator; + if (cmd.validationLevel !== undefined) command['validationLevel'] = cmd.validationLevel; + if (cmd.validationAction !== undefined) command['validationAction'] = cmd.validationAction; + if (cmd.changeStreamPreAndPostImages !== undefined) + command['changeStreamPreAndPostImages'] = cmd.changeStreamPreAndPostImages; + await this.db.command(command); + } } export class MongoInspectionExecutor implements MongoInspectionCommandVisitor> { diff --git a/packages/3-mongo-target/2-mongo-adapter/src/core/contract-to-schema.ts b/packages/3-mongo-target/2-mongo-adapter/src/core/contract-to-schema.ts index 53a10f1d95..52bc29b067 100644 --- a/packages/3-mongo-target/2-mongo-adapter/src/core/contract-to-schema.ts +++ b/packages/3-mongo-target/2-mongo-adapter/src/core/contract-to-schema.ts @@ -1,34 +1,52 @@ import type { MongoContract, - MongoIndex, MongoStorageCollection, + MongoStorageCollectionOptions, + MongoStorageIndex, + MongoStorageValidator, } from '@prisma-next/mongo-contract'; import { MongoSchemaCollection, + MongoSchemaCollectionOptions, MongoSchemaIndex, type MongoSchemaIR, + MongoSchemaValidator, } from '@prisma-next/mongo-schema-ir'; -function convertIndex(index: MongoIndex): MongoSchemaIndex { - const keys = Object.entries(index.fields).map(([field, direction]) => ({ - field, - direction, - })); +function convertIndex(index: MongoStorageIndex): MongoSchemaIndex { return new MongoSchemaIndex({ - keys, - unique: index.options?.unique, - sparse: index.options?.sparse, - expireAfterSeconds: index.options?.expireAfterSeconds, - partialFilterExpression: index.options?.partialFilterExpression as - | Record - | undefined, + keys: index.keys, + unique: index.unique, + sparse: index.sparse, + expireAfterSeconds: index.expireAfterSeconds, + partialFilterExpression: index.partialFilterExpression, + wildcardProjection: index.wildcardProjection, + collation: index.collation, + weights: index.weights, + default_language: index.default_language, + language_override: index.language_override, }); } +function convertValidator(v: MongoStorageValidator): MongoSchemaValidator { + return new MongoSchemaValidator({ + jsonSchema: v.jsonSchema, + validationLevel: v.validationLevel, + validationAction: v.validationAction, + }); +} + +function convertOptions(o: MongoStorageCollectionOptions): MongoSchemaCollectionOptions { + return new MongoSchemaCollectionOptions(o); +} + function convertCollection(name: string, def: MongoStorageCollection): MongoSchemaCollection { + const indexes = (def.indexes ?? []).map(convertIndex); return new MongoSchemaCollection({ name, - indexes: (def.indexes ?? []).map(convertIndex), + indexes, + ...(def.validator != null && { validator: convertValidator(def.validator) }), + ...(def.options != null && { options: convertOptions(def.options) }), }); } diff --git a/packages/3-mongo-target/2-mongo-adapter/src/core/ddl-formatter.ts b/packages/3-mongo-target/2-mongo-adapter/src/core/ddl-formatter.ts index 052bab0357..965ab5c6c1 100644 --- a/packages/3-mongo-target/2-mongo-adapter/src/core/ddl-formatter.ts +++ b/packages/3-mongo-target/2-mongo-adapter/src/core/ddl-formatter.ts @@ -1,6 +1,9 @@ import type { MigrationPlanOperation } from '@prisma-next/framework-components/control'; import type { + CollModCommand, + CreateCollectionCommand, CreateIndexCommand, + DropCollectionCommand, DropIndexCommand, MongoDdlCommandVisitor, MongoIndexKey, @@ -18,6 +21,32 @@ function formatOptions(cmd: CreateIndexCommand): string | undefined { if (cmd.expireAfterSeconds !== undefined) parts.push(`expireAfterSeconds: ${cmd.expireAfterSeconds}`); if (cmd.name) parts.push(`name: ${JSON.stringify(cmd.name)}`); + if (cmd.collation) parts.push(`collation: ${JSON.stringify(cmd.collation)}`); + if (cmd.weights) parts.push(`weights: ${JSON.stringify(cmd.weights)}`); + if (cmd.default_language) parts.push(`default_language: ${JSON.stringify(cmd.default_language)}`); + if (cmd.language_override) + parts.push(`language_override: ${JSON.stringify(cmd.language_override)}`); + if (cmd.wildcardProjection) + parts.push(`wildcardProjection: ${JSON.stringify(cmd.wildcardProjection)}`); + if (cmd.partialFilterExpression) + parts.push(`partialFilterExpression: ${JSON.stringify(cmd.partialFilterExpression)}`); + if (parts.length === 0) return undefined; + return `{ ${parts.join(', ')} }`; +} + +function formatCreateCollectionOptions(cmd: CreateCollectionCommand): string | undefined { + const parts: string[] = []; + if (cmd.capped) parts.push('capped: true'); + if (cmd.size !== undefined) parts.push(`size: ${cmd.size}`); + if (cmd.max !== undefined) parts.push(`max: ${cmd.max}`); + if (cmd.timeseries) parts.push(`timeseries: ${JSON.stringify(cmd.timeseries)}`); + if (cmd.collation) parts.push(`collation: ${JSON.stringify(cmd.collation)}`); + if (cmd.clusteredIndex) parts.push(`clusteredIndex: ${JSON.stringify(cmd.clusteredIndex)}`); + if (cmd.validator) parts.push(`validator: ${JSON.stringify(cmd.validator)}`); + if (cmd.validationLevel) parts.push(`validationLevel: ${JSON.stringify(cmd.validationLevel)}`); + if (cmd.validationAction) parts.push(`validationAction: ${JSON.stringify(cmd.validationAction)}`); + if (cmd.changeStreamPreAndPostImages) + parts.push(`changeStreamPreAndPostImages: ${JSON.stringify(cmd.changeStreamPreAndPostImages)}`); if (parts.length === 0) return undefined; return `{ ${parts.join(', ')} }`; } @@ -34,6 +63,30 @@ class MongoDdlCommandFormatter implements MongoDdlCommandVisitor { dropIndex(cmd: DropIndexCommand): string { return `db.${cmd.collection}.dropIndex(${JSON.stringify(cmd.name)})`; } + + createCollection(cmd: CreateCollectionCommand): string { + const opts = formatCreateCollectionOptions(cmd); + return opts + ? `db.createCollection(${JSON.stringify(cmd.collection)}, ${opts})` + : `db.createCollection(${JSON.stringify(cmd.collection)})`; + } + + dropCollection(cmd: DropCollectionCommand): string { + return `db.${cmd.collection}.drop()`; + } + + collMod(cmd: CollModCommand): string { + const parts: string[] = [`collMod: ${JSON.stringify(cmd.collection)}`]; + if (cmd.validator) parts.push(`validator: ${JSON.stringify(cmd.validator)}`); + if (cmd.validationLevel) parts.push(`validationLevel: ${JSON.stringify(cmd.validationLevel)}`); + if (cmd.validationAction) + parts.push(`validationAction: ${JSON.stringify(cmd.validationAction)}`); + if (cmd.changeStreamPreAndPostImages) + parts.push( + `changeStreamPreAndPostImages: ${JSON.stringify(cmd.changeStreamPreAndPostImages)}`, + ); + return `db.runCommand({ ${parts.join(', ')} })`; + } } const formatter = new MongoDdlCommandFormatter(); diff --git a/packages/3-mongo-target/2-mongo-adapter/src/core/mongo-ops-serializer.ts b/packages/3-mongo-target/2-mongo-adapter/src/core/mongo-ops-serializer.ts index dc3b500247..b7f90b994a 100644 --- a/packages/3-mongo-target/2-mongo-adapter/src/core/mongo-ops-serializer.ts +++ b/packages/3-mongo-target/2-mongo-adapter/src/core/mongo-ops-serializer.ts @@ -2,7 +2,10 @@ import type { MigrationOperationClass } from '@prisma-next/framework-components/ import { type AnyMongoDdlCommand, type AnyMongoInspectionCommand, + CollModCommand, + CreateCollectionCommand, CreateIndexCommand, + DropCollectionCommand, DropIndexCommand, ListCollectionsCommand, ListIndexesCommand, @@ -30,6 +33,11 @@ const CreateIndexJson = type({ 'expireAfterSeconds?': 'number', 'partialFilterExpression?': 'Record', 'name?': 'string', + 'wildcardProjection?': 'Record', + 'collation?': 'Record', + 'weights?': 'Record', + 'default_language?': 'string', + 'language_override?': 'string', }); const DropIndexJson = type({ @@ -38,6 +46,35 @@ const DropIndexJson = type({ name: 'string', }); +const CreateCollectionJson = type({ + kind: '"createCollection"', + collection: 'string', + 'validator?': 'Record', + 'validationLevel?': '"strict" | "moderate"', + 'validationAction?': '"error" | "warn"', + 'capped?': 'boolean', + 'size?': 'number', + 'max?': 'number', + 'timeseries?': 'Record', + 'collation?': 'Record', + 'changeStreamPreAndPostImages?': 'Record', + 'clusteredIndex?': 'Record', +}); + +const DropCollectionJson = type({ + kind: '"dropCollection"', + collection: 'string', +}); + +const CollModJson = type({ + kind: '"collMod"', + collection: 'string', + 'validator?': 'Record', + 'validationLevel?': '"strict" | "moderate"', + 'validationAction?': '"error" | "warn"', + 'changeStreamPreAndPostImages?': 'Record', +}); + const ListIndexesJson = type({ kind: '"listIndexes"', collection: 'string', @@ -134,12 +171,49 @@ function deserializeDdlCommand(json: unknown): AnyMongoDdlCommand { expireAfterSeconds: data.expireAfterSeconds, partialFilterExpression: data.partialFilterExpression, name: data.name, + wildcardProjection: data.wildcardProjection as Record | undefined, + collation: data.collation, + weights: data.weights as Record | undefined, + default_language: data.default_language, + language_override: data.language_override, }); } case 'dropIndex': { const data = validate(DropIndexJson, json, 'dropIndex command'); return new DropIndexCommand(data.collection, data.name); } + case 'createCollection': { + const data = validate(CreateCollectionJson, json, 'createCollection command'); + return new CreateCollectionCommand(data.collection, { + validator: data.validator, + validationLevel: data.validationLevel, + validationAction: data.validationAction, + capped: data.capped, + size: data.size, + max: data.max, + timeseries: data.timeseries as CreateCollectionCommand['timeseries'], + collation: data.collation, + changeStreamPreAndPostImages: data.changeStreamPreAndPostImages as + | { enabled: boolean } + | undefined, + clusteredIndex: data.clusteredIndex as CreateCollectionCommand['clusteredIndex'], + }); + } + case 'dropCollection': { + const data = validate(DropCollectionJson, json, 'dropCollection command'); + return new DropCollectionCommand(data.collection); + } + case 'collMod': { + const data = validate(CollModJson, json, 'collMod command'); + return new CollModCommand(data.collection, { + validator: data.validator, + validationLevel: data.validationLevel, + validationAction: data.validationAction, + changeStreamPreAndPostImages: data.changeStreamPreAndPostImages as + | { enabled: boolean } + | undefined, + }); + } default: throw new Error(`Unknown DDL command kind: ${kind}`); } diff --git a/packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts b/packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts index 843de0cf82..4360c2b01b 100644 --- a/packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts +++ b/packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts @@ -8,16 +8,28 @@ import type { import type { MongoContract, MongoIndexKey } from '@prisma-next/mongo-contract'; import { buildIndexOpId, + CollModCommand, + CreateCollectionCommand, CreateIndexCommand, + DropCollectionCommand, DropIndexCommand, defaultMongoIndexName, keysToKeySpec, + ListCollectionsCommand, ListIndexesCommand, MongoAndExpr, MongoFieldFilter, type MongoMigrationPlanOperation, } from '@prisma-next/mongo-query-ast/control'; -import type { MongoSchemaIndex, MongoSchemaIR } from '@prisma-next/mongo-schema-ir'; +import { + canonicalize, + deepEqual, + type MongoSchemaCollection, + type MongoSchemaCollectionOptions, + type MongoSchemaIndex, + type MongoSchemaIR, + type MongoSchemaValidator, +} from '@prisma-next/mongo-schema-ir'; import { contractToMongoSchemaIR } from './contract-to-schema'; function buildIndexLookupKey(index: MongoSchemaIndex): string { @@ -26,7 +38,12 @@ function buildIndexLookupKey(index: MongoSchemaIndex): string { index.unique ? 'unique' : '', index.sparse ? 'sparse' : '', index.expireAfterSeconds != null ? `ttl:${index.expireAfterSeconds}` : '', - index.partialFilterExpression ? `pfe:${JSON.stringify(index.partialFilterExpression)}` : '', + index.partialFilterExpression ? `pfe:${canonicalize(index.partialFilterExpression)}` : '', + index.wildcardProjection ? `wp:${canonicalize(index.wildcardProjection)}` : '', + index.collation ? `col:${canonicalize(index.collation)}` : '', + index.weights ? `wt:${canonicalize(index.weights)}` : '', + index.default_language ? `dl:${index.default_language}` : '', + index.language_override ? `lo:${index.language_override}` : '', ] .filter(Boolean) .join(';'); @@ -37,10 +54,18 @@ function formatKeys(keys: ReadonlyArray): string { return keys.map((k) => `${k.field}:${k.direction}`).join(', '); } +function isTextIndex(keys: ReadonlyArray): boolean { + return keys.some((k) => k.direction === 'text'); +} + function planCreateIndex(collection: string, index: MongoSchemaIndex): MongoMigrationPlanOperation { const { keys } = index; const name = defaultMongoIndexName(keys); - const keyFilter = MongoFieldFilter.eq('key', keysToKeySpec(keys)); + + const textIndex = isTextIndex(keys); + const keyFilter = textIndex + ? MongoFieldFilter.eq('key._fts', 'text') + : MongoFieldFilter.eq('key', keysToKeySpec(keys)); const fullFilter = index.unique ? MongoAndExpr.of([keyFilter, MongoFieldFilter.eq('unique', true)]) : keyFilter; @@ -65,6 +90,11 @@ function planCreateIndex(collection: string, index: MongoSchemaIndex): MongoMigr sparse: index.sparse, expireAfterSeconds: index.expireAfterSeconds, partialFilterExpression: index.partialFilterExpression, + wildcardProjection: index.wildcardProjection, + collation: index.collation, + weights: index.weights, + default_language: index.default_language, + language_override: index.language_override, name, }), }, @@ -83,7 +113,10 @@ function planCreateIndex(collection: string, index: MongoSchemaIndex): MongoMigr function planDropIndex(collection: string, index: MongoSchemaIndex): MongoMigrationPlanOperation { const { keys } = index; const indexName = defaultMongoIndexName(keys); - const keyFilter = MongoFieldFilter.eq('key', keysToKeySpec(keys)); + const textIndex = isTextIndex(keys); + const keyFilter = textIndex + ? MongoFieldFilter.eq('key._fts', 'text') + : MongoFieldFilter.eq('key', keysToKeySpec(keys)); return { id: buildIndexOpId('drop', collection, keys), @@ -114,6 +147,210 @@ function planDropIndex(collection: string, index: MongoSchemaIndex): MongoMigrat }; } +function validatorsEqual( + a: MongoSchemaValidator | undefined, + b: MongoSchemaValidator | undefined, +): boolean { + if (!a && !b) return true; + if (!a || !b) return false; + return ( + a.validationLevel === b.validationLevel && + a.validationAction === b.validationAction && + canonicalize(a.jsonSchema) === canonicalize(b.jsonSchema) + ); +} + +function classifyValidatorUpdate( + origin: MongoSchemaValidator, + dest: MongoSchemaValidator, +): 'widening' | 'destructive' { + let hasDestructive = false; + + if (canonicalize(origin.jsonSchema) !== canonicalize(dest.jsonSchema)) { + hasDestructive = true; + } + + if (origin.validationAction !== dest.validationAction) { + if (dest.validationAction === 'error') hasDestructive = true; + } + + if (origin.validationLevel !== dest.validationLevel) { + if (dest.validationLevel === 'strict') hasDestructive = true; + } + + return hasDestructive ? 'destructive' : 'widening'; +} + +function planValidatorDiff( + collName: string, + originValidator: MongoSchemaValidator | undefined, + destValidator: MongoSchemaValidator | undefined, +): MongoMigrationPlanOperation | undefined { + if (validatorsEqual(originValidator, destValidator)) return undefined; + + const collExistsPrecheck = { + description: `collection ${collName} exists`, + source: new ListCollectionsCommand(), + filter: MongoFieldFilter.eq('name', collName), + expect: 'exists' as const, + }; + + if (destValidator) { + const operationClass = originValidator + ? classifyValidatorUpdate(originValidator, destValidator) + : 'destructive'; + return { + id: `validator.${collName}.${originValidator ? 'update' : 'add'}`, + label: `${originValidator ? 'Update' : 'Add'} validator on ${collName}`, + operationClass, + precheck: [collExistsPrecheck], + execute: [ + { + description: `set validator on ${collName}`, + command: new CollModCommand(collName, { + validator: { $jsonSchema: destValidator.jsonSchema }, + validationLevel: destValidator.validationLevel, + validationAction: destValidator.validationAction, + }), + }, + ], + postcheck: [ + { + description: `validator applied on ${collName}`, + source: new ListCollectionsCommand(), + filter: MongoAndExpr.of([ + MongoFieldFilter.eq('name', collName), + MongoFieldFilter.eq('options.validationLevel', destValidator.validationLevel), + MongoFieldFilter.eq('options.validationAction', destValidator.validationAction), + ]), + expect: 'exists' as const, + }, + ], + }; + } + + return { + id: `validator.${collName}.remove`, + label: `Remove validator on ${collName}`, + operationClass: 'widening', + precheck: [collExistsPrecheck], + execute: [ + { + description: `remove validator on ${collName}`, + command: new CollModCommand(collName, { + validator: {}, + validationLevel: 'strict', + validationAction: 'error', + }), + }, + ], + postcheck: [], + }; +} + +function hasImmutableOptionChange( + origin: MongoSchemaCollectionOptions | undefined, + dest: MongoSchemaCollectionOptions | undefined, +): string | undefined { + if (canonicalize(origin?.capped) !== canonicalize(dest?.capped)) return 'capped'; + if (canonicalize(origin?.timeseries) !== canonicalize(dest?.timeseries)) return 'timeseries'; + if (canonicalize(origin?.collation) !== canonicalize(dest?.collation)) return 'collation'; + if (canonicalize(origin?.clusteredIndex) !== canonicalize(dest?.clusteredIndex)) + return 'clusteredIndex'; + return undefined; +} + +function planCreateCollection( + collName: string, + dest: MongoSchemaCollection, +): MongoMigrationPlanOperation { + const opts = dest.options; + const validator = dest.validator; + return { + id: `collection.${collName}.create`, + label: `Create collection ${collName}`, + operationClass: 'additive', + precheck: [ + { + description: `collection ${collName} does not exist`, + source: new ListCollectionsCommand(), + filter: MongoFieldFilter.eq('name', collName), + expect: 'notExists', + }, + ], + execute: [ + { + description: `create collection ${collName}`, + command: new CreateCollectionCommand(collName, { + capped: opts?.capped ? true : undefined, + size: opts?.capped?.size, + max: opts?.capped?.max, + timeseries: opts?.timeseries, + collation: opts?.collation, + clusteredIndex: opts?.clusteredIndex + ? { + key: { _id: 1 } as Record, + unique: true as boolean, + ...(opts.clusteredIndex.name != null ? { name: opts.clusteredIndex.name } : {}), + } + : undefined, + validator: validator ? { $jsonSchema: validator.jsonSchema } : undefined, + validationLevel: validator?.validationLevel, + validationAction: validator?.validationAction, + changeStreamPreAndPostImages: opts?.changeStreamPreAndPostImages, + }), + }, + ], + postcheck: [], + }; +} + +function planDropCollection(collName: string): MongoMigrationPlanOperation { + return { + id: `collection.${collName}.drop`, + label: `Drop collection ${collName}`, + operationClass: 'destructive', + precheck: [], + execute: [ + { + description: `drop collection ${collName}`, + command: new DropCollectionCommand(collName), + }, + ], + postcheck: [], + }; +} + +function planMutableOptionsDiff( + collName: string, + origin: MongoSchemaCollectionOptions | undefined, + dest: MongoSchemaCollectionOptions | undefined, +): MongoMigrationPlanOperation | undefined { + const originCSPPI = origin?.changeStreamPreAndPostImages; + const destCSPPI = dest?.changeStreamPreAndPostImages; + if (deepEqual(originCSPPI, destCSPPI)) return undefined; + + return { + id: `options.${collName}.update`, + label: `Update mutable options on ${collName}`, + operationClass: destCSPPI?.enabled ? 'widening' : 'destructive', + precheck: [], + execute: [ + { + description: `update options on ${collName}`, + command: new CollModCommand(collName, { + changeStreamPreAndPostImages: destCSPPI, + }), + }, + ], + postcheck: [], + }; +} + +function collectionHasOptions(coll: MongoSchemaCollection): boolean { + return !!(coll.options || coll.validator); +} + export class MongoMigrationPlanner implements MigrationPlanner<'mongo', 'mongo'> { plan(options: { readonly contract: unknown; @@ -125,8 +362,13 @@ export class MongoMigrationPlanner implements MigrationPlanner<'mongo', 'mongo'> const originIR = options.schema as MongoSchemaIR; const destinationIR = contractToMongoSchemaIR(contract); + const collCreates: MongoMigrationPlanOperation[] = []; const drops: MongoMigrationPlanOperation[] = []; const creates: MongoMigrationPlanOperation[] = []; + const validatorOps: MongoMigrationPlanOperation[] = []; + const mutableOptionOps: MongoMigrationPlanOperation[] = []; + const collDrops: MongoMigrationPlanOperation[] = []; + const conflicts: MigrationPlannerConflict[] = []; const allCollectionNames = new Set([ ...Object.keys(originIR.collections), @@ -137,6 +379,29 @@ export class MongoMigrationPlanner implements MigrationPlanner<'mongo', 'mongo'> const originColl = originIR.collections[collName]; const destColl = destinationIR.collections[collName]; + if (!originColl && destColl) { + if (collectionHasOptions(destColl)) { + collCreates.push(planCreateCollection(collName, destColl)); + } + } else if (originColl && !destColl) { + collDrops.push(planDropCollection(collName)); + } else if (originColl && destColl) { + const immutableChange = hasImmutableOptionChange(originColl.options, destColl.options); + if (immutableChange) { + conflicts.push({ + kind: 'policy-violation', + summary: `Cannot change immutable collection option '${immutableChange}' on ${collName}`, + why: `MongoDB does not support modifying the '${immutableChange}' option after collection creation`, + }); + } + + const mutableOp = planMutableOptionsDiff(collName, originColl.options, destColl.options); + if (mutableOp) mutableOptionOps.push(mutableOp); + + const validatorOp = planValidatorDiff(collName, originColl.validator, destColl.validator); + if (validatorOp) validatorOps.push(validatorOp); + } + const originLookup = new Map(); if (originColl) { for (const idx of originColl.indexes) { @@ -164,9 +429,19 @@ export class MongoMigrationPlanner implements MigrationPlanner<'mongo', 'mongo'> } } - const allOps = [...drops, ...creates]; + if (conflicts.length > 0) { + return { kind: 'failure', conflicts }; + } + + const allOps = [ + ...collCreates, + ...drops, + ...creates, + ...validatorOps, + ...mutableOptionOps, + ...collDrops, + ]; - const conflicts: MigrationPlannerConflict[] = []; for (const op of allOps) { if (!options.policy.allowedOperationClasses.includes(op.operationClass)) { conflicts.push({ diff --git a/packages/3-mongo-target/2-mongo-adapter/test/command-executor.test.ts b/packages/3-mongo-target/2-mongo-adapter/test/command-executor.test.ts index b536ddf8a6..112f7aa546 100644 --- a/packages/3-mongo-target/2-mongo-adapter/test/command-executor.test.ts +++ b/packages/3-mongo-target/2-mongo-adapter/test/command-executor.test.ts @@ -1,5 +1,8 @@ import { + CollModCommand, + CreateCollectionCommand, CreateIndexCommand, + DropCollectionCommand, DropIndexCommand, ListCollectionsCommand, ListIndexesCommand, @@ -31,7 +34,9 @@ afterAll(async () => { beforeEach(async () => { const collections = await db.listCollections().toArray(); for (const col of collections) { - await db.dropCollection(col['name'] as string); + const name = col['name'] as string; + if (name.startsWith('system.')) continue; + await db.dropCollection(name); } }); @@ -96,6 +101,217 @@ describe('MongoCommandExecutor', () => { const titleIndex = indexes.find((idx) => idx['name'] === 'title_1'); expect(titleIndex).toBeUndefined(); }); + + it('createIndex passes M2 options (collation, wildcardProjection)', async () => { + await db.createCollection('products'); + const executor = new MongoCommandExecutor(db); + const cmd = new CreateIndexCommand('products', [{ field: 'name', direction: 1 }], { + collation: { locale: 'en', strength: 2 }, + }); + + await cmd.accept(executor); + + const indexes = await db.collection('products').listIndexes().toArray(); + const nameIndex = indexes.find((idx) => idx['key']?.['name'] === 1); + expect(nameIndex).toBeDefined(); + expect(nameIndex?.['collation']?.['locale']).toBe('en'); + }); + + it('createCollection creates a new collection', async () => { + const executor = new MongoCommandExecutor(db); + const cmd = new CreateCollectionCommand('events'); + + await cmd.accept(executor); + + const colls = await db.listCollections({ name: 'events' }).toArray(); + expect(colls).toHaveLength(1); + }); + + it('createCollection creates a capped collection', async () => { + const executor = new MongoCommandExecutor(db); + const cmd = new CreateCollectionCommand('logs', { + capped: true, + size: 1048576, + max: 1000, + }); + + await cmd.accept(executor); + + const colls = await db.listCollections({ name: 'logs' }).toArray(); + expect(colls).toHaveLength(1); + expect((colls[0] as Record)['options']).toHaveProperty('capped', true); + }); + + it('dropCollection drops an existing collection', async () => { + await db.createCollection('temp'); + const executor = new MongoCommandExecutor(db); + const cmd = new DropCollectionCommand('temp'); + + await cmd.accept(executor); + + const colls = await db.listCollections({ name: 'temp' }).toArray(); + expect(colls).toHaveLength(0); + }); + + it('collMod updates validator on a collection', async () => { + await db.createCollection('docs'); + const executor = new MongoCommandExecutor(db); + const cmd = new CollModCommand('docs', { + validator: { $jsonSchema: { bsonType: 'object', required: ['name'] } }, + validationLevel: 'strict', + validationAction: 'error', + }); + + await cmd.accept(executor); + + const colls = await db.listCollections({ name: 'docs' }).toArray(); + expect((colls[0] as Record)['options']).toHaveProperty('validator'); + }); + + it('createIndex passes text-index options (weights, default_language, language_override)', async () => { + await db.createCollection('articles'); + const executor = new MongoCommandExecutor(db); + const cmd = new CreateIndexCommand( + 'articles', + [ + { field: 'title', direction: 'text' }, + { field: 'body', direction: 'text' }, + ], + { + weights: { title: 10, body: 1 }, + default_language: 'english', + language_override: 'lang', + }, + ); + + await cmd.accept(executor); + + const indexes = await db.collection('articles').listIndexes().toArray(); + const textIndex = indexes.find( + (idx) => + idx['default_language'] === 'english' && + idx['language_override'] === 'lang' && + idx['weights'] !== undefined, + ); + expect(textIndex).toBeDefined(); + expect(textIndex?.['weights']).toEqual({ title: 10, body: 1 }); + expect(textIndex?.['default_language']).toBe('english'); + expect(textIndex?.['language_override']).toBe('lang'); + }); + + it('createIndex passes wildcardProjection option', async () => { + await db.createCollection('wildcard_items'); + const executor = new MongoCommandExecutor(db); + const cmd = new CreateIndexCommand('wildcard_items', [{ field: '$**', direction: 1 }], { + wildcardProjection: { name: 1 }, + }); + + await cmd.accept(executor); + + const indexes = await db.collection('wildcard_items').listIndexes().toArray(); + const wildcardIdx = indexes.find((idx) => idx['key']?.['$**'] === 1); + expect(wildcardIdx).toBeDefined(); + expect(wildcardIdx?.['wildcardProjection']).toEqual({ name: 1 }); + }); + + it('createCollection passes validator and validation options', async () => { + const executor = new MongoCommandExecutor(db); + const validator = { $jsonSchema: { bsonType: 'object', required: ['name'] } }; + const cmd = new CreateCollectionCommand('validated_coll', { + validator, + validationLevel: 'strict', + validationAction: 'error', + }); + + await cmd.accept(executor); + + const colls = await db.listCollections({ name: 'validated_coll' }).toArray(); + expect(colls).toHaveLength(1); + const opts = (colls[0] as Record)['options'] as Record; + expect(opts['validator']).toEqual(validator); + expect(opts['validationLevel']).toBe('strict'); + expect(opts['validationAction']).toBe('error'); + }); + + it('createCollection passes changeStreamPreAndPostImages option', async () => { + const executor = new MongoCommandExecutor(db); + const cmd = new CreateCollectionCommand('cs_images_coll', { + changeStreamPreAndPostImages: { enabled: true }, + }); + + await cmd.accept(executor); + + const colls = await db.listCollections({ name: 'cs_images_coll' }).toArray(); + expect(colls).toHaveLength(1); + const opts = (colls[0] as Record)['options'] as Record; + expect(opts['changeStreamPreAndPostImages']).toEqual({ enabled: true }); + }); + + it('createCollection passes collation option', async () => { + const executor = new MongoCommandExecutor(db); + const collation = { locale: 'en', strength: 2 }; + const cmd = new CreateCollectionCommand('collation_coll', { + collation, + }); + + await cmd.accept(executor); + + const colls = await db.listCollections({ name: 'collation_coll' }).toArray(); + expect(colls).toHaveLength(1); + const opts = (colls[0] as Record)['options'] as Record; + expect(opts['collation']).toMatchObject(collation); + }); + + it('collMod passes changeStreamPreAndPostImages option', async () => { + await db.createCollection('cs_mod_coll'); + const executor = new MongoCommandExecutor(db); + const cmd = new CollModCommand('cs_mod_coll', { + changeStreamPreAndPostImages: { enabled: true }, + }); + + await cmd.accept(executor); + + const colls = await db.listCollections({ name: 'cs_mod_coll' }).toArray(); + expect(colls).toHaveLength(1); + const opts = (colls[0] as Record)['options'] as Record; + expect(opts['changeStreamPreAndPostImages']).toEqual({ enabled: true }); + }); + + it('createCollection passes timeseries option', async () => { + const executor = new MongoCommandExecutor(db); + const cmd = new CreateCollectionCommand('ts_coll', { + timeseries: { timeField: 'ts', granularity: 'hours' }, + }); + + try { + await cmd.accept(executor); + } catch { + return; + } + + const colls = await db.listCollections({ name: 'ts_coll' }).toArray(); + expect(colls).toHaveLength(1); + const opts = (colls[0] as Record)['options'] as Record; + expect(opts['timeseries']).toMatchObject({ timeField: 'ts', granularity: 'hours' }); + }); + + it('createCollection passes clusteredIndex option', async () => { + const executor = new MongoCommandExecutor(db); + const cmd = new CreateCollectionCommand('clustered_coll', { + clusteredIndex: { key: { _id: 1 }, unique: true }, + }); + + try { + await cmd.accept(executor); + } catch { + return; + } + + const colls = await db.listCollections({ name: 'clustered_coll' }).toArray(); + expect(colls).toHaveLength(1); + const opts = (colls[0] as Record)['options'] as Record; + expect(opts['clusteredIndex']).toMatchObject({ key: { _id: 1 }, unique: true }); + }); }); describe('MongoInspectionExecutor', () => { diff --git a/packages/3-mongo-target/2-mongo-adapter/test/contract-to-schema.test.ts b/packages/3-mongo-target/2-mongo-adapter/test/contract-to-schema.test.ts index 95852df817..efd28f8bfc 100644 --- a/packages/3-mongo-target/2-mongo-adapter/test/contract-to-schema.test.ts +++ b/packages/3-mongo-target/2-mongo-adapter/test/contract-to-schema.test.ts @@ -1,10 +1,8 @@ -import type { MongoContract } from '@prisma-next/mongo-contract'; +import type { MongoContract, MongoStorageCollection } from '@prisma-next/mongo-contract'; import { describe, expect, it } from 'vitest'; import { contractToMongoSchemaIR } from '../src/core/contract-to-schema'; -function makeContract( - collections: Record> }>, -): MongoContract { +function makeContract(collections: Record): MongoContract { return { target: 'mongo', targetFamily: 'mongo', @@ -38,7 +36,7 @@ describe('contractToMongoSchemaIR', () => { const ir = contractToMongoSchemaIR( makeContract({ users: { - indexes: [{ fields: { email: 1 } }], + indexes: [{ keys: [{ field: 'email', direction: 1 }] }], }, }), ); @@ -52,7 +50,7 @@ describe('contractToMongoSchemaIR', () => { const ir = contractToMongoSchemaIR( makeContract({ users: { - indexes: [{ fields: { email: 1 }, options: { unique: true } }], + indexes: [{ keys: [{ field: 'email', direction: 1 }], unique: true }], }, }), ); @@ -63,8 +61,8 @@ describe('contractToMongoSchemaIR', () => { it('converts multiple collections', () => { const ir = contractToMongoSchemaIR( makeContract({ - users: { indexes: [{ fields: { email: 1 } }] }, - posts: { indexes: [{ fields: { title: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, + posts: { indexes: [{ keys: [{ field: 'title', direction: 1 }] }] }, }), ); expect(Object.keys(ir.collections)).toHaveLength(2); @@ -76,7 +74,7 @@ describe('contractToMongoSchemaIR', () => { const ir = contractToMongoSchemaIR( makeContract({ users: { - indexes: [{ fields: { nickname: 1 }, options: { sparse: true } }], + indexes: [{ keys: [{ field: 'nickname', direction: 1 }], sparse: true }], }, }), ); @@ -87,7 +85,7 @@ describe('contractToMongoSchemaIR', () => { const ir = contractToMongoSchemaIR( makeContract({ users: { - indexes: [{ fields: { createdAt: 1 }, options: { expireAfterSeconds: 3600 } }], + indexes: [{ keys: [{ field: 'createdAt', direction: 1 }], expireAfterSeconds: 3600 }], }, }), ); @@ -99,7 +97,7 @@ describe('contractToMongoSchemaIR', () => { const ir = contractToMongoSchemaIR( makeContract({ users: { - indexes: [{ fields: { status: 1 }, options: { partialFilterExpression: pfe } }], + indexes: [{ keys: [{ field: 'status', direction: 1 }], partialFilterExpression: pfe }], }, }), ); @@ -112,8 +110,11 @@ describe('contractToMongoSchemaIR', () => { users: { indexes: [ { - fields: { email: 1, tenantId: 1 }, - options: { unique: true }, + keys: [ + { field: 'email', direction: 1 }, + { field: 'tenantId', direction: 1 }, + ], + unique: true, }, ], }, @@ -125,4 +126,101 @@ describe('contractToMongoSchemaIR', () => { expect(idx.keys[1]!.field).toBe('tenantId'); expect(idx.unique).toBe(true); }); + + it('preserves M2 index options (weights, default_language, language_override, collation, wildcardProjection)', () => { + const ir = contractToMongoSchemaIR( + makeContract({ + users: { + indexes: [ + { + keys: [{ field: 'bio', direction: 'text' as const }], + weights: { bio: 10 }, + default_language: 'english', + language_override: 'lang', + collation: { locale: 'en', strength: 2 }, + }, + { + keys: [{ field: '$**', direction: 1 as const }], + wildcardProjection: { name: 1 as const, email: 1 as const }, + }, + ], + }, + }), + ); + const textIdx = ir.collections['users']!.indexes[0]!; + expect(textIdx.weights).toEqual({ bio: 10 }); + expect(textIdx.default_language).toBe('english'); + expect(textIdx.language_override).toBe('lang'); + expect(textIdx.collation).toEqual({ locale: 'en', strength: 2 }); + + const wildcardIdx = ir.collections['users']!.indexes[1]!; + expect(wildcardIdx.wildcardProjection).toEqual({ name: 1, email: 1 }); + }); + + it('converts collection with validator', () => { + const ir = contractToMongoSchemaIR( + makeContract({ + users: { + validator: { + jsonSchema: { bsonType: 'object', properties: { name: { bsonType: 'string' } } }, + validationLevel: 'strict', + validationAction: 'error', + }, + }, + }), + ); + const coll = ir.collections['users']!; + expect(coll.validator).toBeDefined(); + expect(coll.validator!.jsonSchema).toEqual({ + bsonType: 'object', + properties: { name: { bsonType: 'string' } }, + }); + expect(coll.validator!.validationLevel).toBe('strict'); + expect(coll.validator!.validationAction).toBe('error'); + }); + + it('converts collection with options', () => { + const ir = contractToMongoSchemaIR( + makeContract({ + events: { + options: { + capped: { size: 1048576, max: 1000 }, + collation: { locale: 'en' }, + changeStreamPreAndPostImages: { enabled: true }, + clusteredIndex: { name: 'myCluster' }, + }, + }, + }), + ); + const coll = ir.collections['events']!; + expect(coll.options).toBeDefined(); + expect(coll.options!.capped).toEqual({ size: 1048576, max: 1000 }); + expect(coll.options!.collation).toEqual({ locale: 'en' }); + expect(coll.options!.changeStreamPreAndPostImages).toEqual({ enabled: true }); + expect(coll.options!.clusteredIndex).toEqual({ name: 'myCluster' }); + }); + + it('converts collection with timeseries options', () => { + const ir = contractToMongoSchemaIR( + makeContract({ + metrics: { + options: { + timeseries: { timeField: 'ts', metaField: 'meta', granularity: 'hours' }, + }, + }, + }), + ); + const coll = ir.collections['metrics']!; + expect(coll.options!.timeseries).toEqual({ + timeField: 'ts', + metaField: 'meta', + granularity: 'hours', + }); + }); + + it('collection without validator or options has undefined for both', () => { + const ir = contractToMongoSchemaIR(makeContract({ users: {} })); + expect(ir.collections['users']!.validator).toBeUndefined(); + expect(ir.collections['users']!.options).toBeUndefined(); + }); }); diff --git a/packages/3-mongo-target/2-mongo-adapter/test/ddl-formatter.test.ts b/packages/3-mongo-target/2-mongo-adapter/test/ddl-formatter.test.ts index 60c1330b0d..455b63ace4 100644 --- a/packages/3-mongo-target/2-mongo-adapter/test/ddl-formatter.test.ts +++ b/packages/3-mongo-target/2-mongo-adapter/test/ddl-formatter.test.ts @@ -1,5 +1,8 @@ import { + CollModCommand, + CreateCollectionCommand, CreateIndexCommand, + DropCollectionCommand, DropIndexCommand, type MongoMigrationPlanOperation, } from '@prisma-next/mongo-query-ast/control'; @@ -156,4 +159,180 @@ describe('formatMongoOperations', () => { 'db.sessions.createIndex({ "expiresAt": 1 }, { sparse: true, expireAfterSeconds: 3600 })', ]); }); + + it('formats createIndex with M2 options (collation, weights, wildcardProjection)', () => { + const op: MongoMigrationPlanOperation = { + id: 'test', + label: 'test', + operationClass: 'additive', + precheck: [], + execute: [ + { + description: 'create text index', + command: new CreateIndexCommand('users', [{ field: 'bio', direction: 'text' }], { + weights: { bio: 10 }, + default_language: 'english', + language_override: 'lang', + collation: { locale: 'en', strength: 2 }, + }), + }, + ], + postcheck: [], + }; + const result = formatMongoOperations([op]); + expect(result[0]).toContain('createIndex'); + expect(result[0]).toContain('default_language: "english"'); + expect(result[0]).toContain('language_override: "lang"'); + }); + + it('formats createCollection with options', () => { + const op: MongoMigrationPlanOperation = { + id: 'test', + label: 'test', + operationClass: 'additive', + precheck: [], + execute: [ + { + description: 'create collection', + command: new CreateCollectionCommand('events', { + capped: true, + size: 1048576, + validator: { $jsonSchema: { bsonType: 'object' } }, + validationLevel: 'strict', + validationAction: 'error', + }), + }, + ], + postcheck: [], + }; + const result = formatMongoOperations([op]); + expect(result[0]).toContain('db.createCollection("events"'); + expect(result[0]).toContain('capped: true'); + }); + + it('formats createCollection with no options', () => { + const op: MongoMigrationPlanOperation = { + id: 'test', + label: 'test', + operationClass: 'additive', + precheck: [], + execute: [ + { + description: 'create collection', + command: new CreateCollectionCommand('events'), + }, + ], + postcheck: [], + }; + const result = formatMongoOperations([op]); + expect(result).toEqual(['db.createCollection("events")']); + }); + + it('formats dropCollection', () => { + const op: MongoMigrationPlanOperation = { + id: 'test', + label: 'test', + operationClass: 'destructive', + precheck: [], + execute: [ + { + description: 'drop collection', + command: new DropCollectionCommand('events'), + }, + ], + postcheck: [], + }; + const result = formatMongoOperations([op]); + expect(result).toEqual(['db.events.drop()']); + }); + + it('formats collMod with validator', () => { + const op: MongoMigrationPlanOperation = { + id: 'test', + label: 'test', + operationClass: 'destructive', + precheck: [], + execute: [ + { + description: 'update validator', + command: new CollModCommand('users', { + validator: { $jsonSchema: { bsonType: 'object' } }, + validationLevel: 'strict', + validationAction: 'error', + }), + }, + ], + postcheck: [], + }; + const result = formatMongoOperations([op]); + expect(result[0]).toContain('db.runCommand({ collMod: "users"'); + expect(result[0]).toContain('validationLevel: "strict"'); + }); + + it('formats createIndex with wildcardProjection and partialFilterExpression', () => { + const op: MongoMigrationPlanOperation = { + id: 'test', + label: 'test', + operationClass: 'additive', + precheck: [], + execute: [ + { + description: 'create wildcard index', + command: new CreateIndexCommand('users', [{ field: '$**', direction: 1 }], { + wildcardProjection: { bio: 1, name: 0 }, + partialFilterExpression: { active: true }, + }), + }, + ], + postcheck: [], + }; + const result = formatMongoOperations([op]); + expect(result[0]).toContain('wildcardProjection:'); + expect(result[0]).toContain('"bio":1'); + expect(result[0]).toContain('partialFilterExpression:'); + expect(result[0]).toContain('"active":true'); + }); + + it('formats createCollection with changeStreamPreAndPostImages', () => { + const op: MongoMigrationPlanOperation = { + id: 'test', + label: 'test', + operationClass: 'additive', + precheck: [], + execute: [ + { + description: 'create collection', + command: new CreateCollectionCommand('events', { + changeStreamPreAndPostImages: { enabled: true }, + }), + }, + ], + postcheck: [], + }; + const result = formatMongoOperations([op]); + expect(result[0]).toContain('changeStreamPreAndPostImages:'); + expect(result[0]).toContain('"enabled":true'); + }); + + it('formats collMod with changeStreamPreAndPostImages', () => { + const op: MongoMigrationPlanOperation = { + id: 'test', + label: 'test', + operationClass: 'destructive', + precheck: [], + execute: [ + { + description: 'enable change stream images', + command: new CollModCommand('users', { + changeStreamPreAndPostImages: { enabled: true }, + }), + }, + ], + postcheck: [], + }; + const result = formatMongoOperations([op]); + expect(result[0]).toContain('db.runCommand({ collMod: "users"'); + expect(result[0]).toContain('changeStreamPreAndPostImages:'); + expect(result[0]).toContain('"enabled":true'); + }); }); diff --git a/packages/3-mongo-target/2-mongo-adapter/test/mongo-ops-serializer.test.ts b/packages/3-mongo-target/2-mongo-adapter/test/mongo-ops-serializer.test.ts index 504acd4bc4..991d2ed0ed 100644 --- a/packages/3-mongo-target/2-mongo-adapter/test/mongo-ops-serializer.test.ts +++ b/packages/3-mongo-target/2-mongo-adapter/test/mongo-ops-serializer.test.ts @@ -1,6 +1,10 @@ import { + CollModCommand, + CreateCollectionCommand, CreateIndexCommand, + DropCollectionCommand, DropIndexCommand, + ListCollectionsCommand, ListIndexesCommand, MongoAndExpr, MongoExistsExpr, @@ -469,4 +473,225 @@ describe('serializeMongoOps / deserializeMongoOps', () => { expect(cmd.partialFilterExpression).toEqual(pfe); expect(cmd.name).toBe('status_1'); }); + + it('round-trips createIndex with M2 options', () => { + const op: MongoMigrationPlanOperation = { + id: 'test', + label: 'test', + operationClass: 'additive', + precheck: [], + execute: [ + { + description: 'create text index', + command: new CreateIndexCommand('users', [{ field: 'bio', direction: 'text' }], { + weights: { bio: 10 }, + default_language: 'english', + language_override: 'lang', + collation: { locale: 'en', strength: 2 }, + wildcardProjection: { name: 1, email: 1 }, + }), + }, + ], + postcheck: [], + }; + const deserialized = deserializeMongoOps(JSON.parse(serializeMongoOps([op])) as unknown[]); + const cmd = deserialized[0]!.execute[0]!.command as CreateIndexCommand; + expect(cmd.weights).toEqual({ bio: 10 }); + expect(cmd.default_language).toBe('english'); + expect(cmd.language_override).toBe('lang'); + expect(cmd.collation).toEqual({ locale: 'en', strength: 2 }); + expect(cmd.wildcardProjection).toEqual({ name: 1, email: 1 }); + }); + + it('round-trips createCollection command', () => { + const op: MongoMigrationPlanOperation = { + id: 'coll.events.create', + label: 'Create collection events', + operationClass: 'additive', + precheck: [ + { + description: 'collection does not exist', + source: new ListCollectionsCommand(), + filter: MongoFieldFilter.eq('name', 'events'), + expect: 'notExists', + }, + ], + execute: [ + { + description: 'create events collection', + command: new CreateCollectionCommand('events', { + capped: true, + size: 1048576, + max: 1000, + validator: { $jsonSchema: { bsonType: 'object' } }, + validationLevel: 'strict', + validationAction: 'error', + }), + }, + ], + postcheck: [], + }; + const deserialized = deserializeMongoOps(JSON.parse(serializeMongoOps([op])) as unknown[]); + const cmd = deserialized[0]!.execute[0]!.command as CreateCollectionCommand; + expect(cmd.kind).toBe('createCollection'); + expect(cmd.collection).toBe('events'); + expect(cmd.capped).toBe(true); + expect(cmd.size).toBe(1048576); + expect(cmd.max).toBe(1000); + expect(cmd.validator).toEqual({ $jsonSchema: { bsonType: 'object' } }); + expect(cmd.validationLevel).toBe('strict'); + expect(cmd.validationAction).toBe('error'); + }); + + it('round-trips dropCollection command', () => { + const op: MongoMigrationPlanOperation = { + id: 'coll.events.drop', + label: 'Drop collection events', + operationClass: 'destructive', + precheck: [], + execute: [ + { + description: 'drop events collection', + command: new DropCollectionCommand('events'), + }, + ], + postcheck: [], + }; + const deserialized = deserializeMongoOps(JSON.parse(serializeMongoOps([op])) as unknown[]); + const cmd = deserialized[0]!.execute[0]!.command as DropCollectionCommand; + expect(cmd.kind).toBe('dropCollection'); + expect(cmd.collection).toBe('events'); + }); + + it('round-trips collMod command', () => { + const op: MongoMigrationPlanOperation = { + id: 'validator.users.update', + label: 'Update validator on users', + operationClass: 'destructive', + precheck: [], + execute: [ + { + description: 'update validator on users', + command: new CollModCommand('users', { + validator: { $jsonSchema: { bsonType: 'object' } }, + validationLevel: 'strict', + validationAction: 'error', + changeStreamPreAndPostImages: { enabled: true }, + }), + }, + ], + postcheck: [], + }; + const deserialized = deserializeMongoOps(JSON.parse(serializeMongoOps([op])) as unknown[]); + const cmd = deserialized[0]!.execute[0]!.command as CollModCommand; + expect(cmd.kind).toBe('collMod'); + expect(cmd.collection).toBe('users'); + expect(cmd.validator).toEqual({ $jsonSchema: { bsonType: 'object' } }); + expect(cmd.validationLevel).toBe('strict'); + expect(cmd.validationAction).toBe('error'); + expect(cmd.changeStreamPreAndPostImages).toEqual({ enabled: true }); + }); + + it('rejects and filter with non-array exprs', () => { + const json = [ + { + id: 'test', + label: 'test', + operationClass: 'additive', + precheck: [ + { + description: 'test', + source: { kind: 'listIndexes', collection: 'users' }, + filter: { kind: 'and', exprs: 'not-array' }, + expect: 'exists', + }, + ], + execute: [], + postcheck: [], + }, + ]; + expect(() => deserializeMongoOps(json)).toThrow(/Invalid and filter/); + }); + + it('rejects or filter with non-array exprs', () => { + const json = [ + { + id: 'test', + label: 'test', + operationClass: 'additive', + precheck: [ + { + description: 'test', + source: { kind: 'listIndexes', collection: 'users' }, + filter: { kind: 'or', exprs: 'not-array' }, + expect: 'exists', + }, + ], + execute: [], + postcheck: [], + }, + ]; + expect(() => deserializeMongoOps(json)).toThrow(/Invalid or filter/); + }); + + it('rejects not filter with missing expr', () => { + const json = [ + { + id: 'test', + label: 'test', + operationClass: 'additive', + precheck: [ + { + description: 'test', + source: { kind: 'listIndexes', collection: 'users' }, + filter: { kind: 'not' }, + expect: 'exists', + }, + ], + execute: [], + postcheck: [], + }, + ]; + expect(() => deserializeMongoOps(json)).toThrow(/Invalid not filter/); + }); + + it('round-trips createCollection with M2 options', () => { + const timeseries = { timeField: 'ts', metaField: 'meta', granularity: 'hours' as const }; + const collation = { locale: 'en', strength: 2 }; + const changeStreamPreAndPostImages = { enabled: true }; + const clusteredIndex = { key: { _id: 1 }, unique: true, name: 'clustered' }; + const op: MongoMigrationPlanOperation = { + id: 'coll.ts.create', + label: 'Create time series collection', + operationClass: 'additive', + precheck: [ + { + description: 'collection does not exist', + source: new ListCollectionsCommand(), + filter: MongoFieldFilter.eq('name', 'metrics'), + expect: 'notExists', + }, + ], + execute: [ + { + description: 'create metrics collection', + command: new CreateCollectionCommand('metrics', { + timeseries, + collation, + changeStreamPreAndPostImages, + clusteredIndex, + }), + }, + ], + postcheck: [], + }; + const deserialized = deserializeMongoOps(JSON.parse(serializeMongoOps([op])) as unknown[]); + const cmd = deserialized[0]!.execute[0]!.command as CreateCollectionCommand; + expect(cmd.kind).toBe('createCollection'); + expect(cmd.collection).toBe('metrics'); + expect(cmd.timeseries).toEqual(timeseries); + expect(cmd.collation).toEqual(collation); + expect(cmd.changeStreamPreAndPostImages).toEqual(changeStreamPreAndPostImages); + expect(cmd.clusteredIndex).toEqual(clusteredIndex); + }); }); diff --git a/packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts b/packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts index 29d7a3053f..c70b44db7d 100644 --- a/packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts +++ b/packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts @@ -1,14 +1,19 @@ import type { MigrationOperationPolicy } from '@prisma-next/framework-components/control'; -import type { MongoContract } from '@prisma-next/mongo-contract'; +import type { MongoContract, MongoStorageCollection } from '@prisma-next/mongo-contract'; import type { + CollModCommand, + CreateCollectionCommand, CreateIndexCommand, + DropCollectionCommand, DropIndexCommand, MongoMigrationPlanOperation, } from '@prisma-next/mongo-query-ast/control'; import { MongoSchemaCollection, + MongoSchemaCollectionOptions, MongoSchemaIndex, type MongoSchemaIR, + MongoSchemaValidator, } from '@prisma-next/mongo-schema-ir'; import { describe, expect, it } from 'vitest'; import { MongoMigrationPlanner } from '../src/core/mongo-planner'; @@ -21,9 +26,7 @@ const ADDITIVE_ONLY_POLICY: MigrationOperationPolicy = { allowedOperationClasses: ['additive'], }; -function makeContract( - collections: Record> }>, -): MongoContract { +function makeContract(collections: Record): MongoContract { return { target: 'mongo', targetFamily: 'mongo', @@ -79,7 +82,7 @@ describe('MongoMigrationPlanner', () => { describe('index diffing', () => { it('emits createIndex when destination has an index origin lacks', () => { const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, }); const plan = planSuccess(planner, contract, emptyIR()); @@ -109,7 +112,7 @@ describe('MongoMigrationPlanner', () => { it('emits no operations when indexes are identical', () => { const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, }); const origin = irWithCollection('users', [ascIndex('email')]); const plan = planSuccess(planner, contract, origin); @@ -118,7 +121,7 @@ describe('MongoMigrationPlanner', () => { it('treats indexes with same keys but different name as equivalent (no-op)', () => { const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, }); const origin = irWithCollection('users', [ascIndex('email')]); const plan = planSuccess(planner, contract, origin); @@ -127,7 +130,7 @@ describe('MongoMigrationPlanner', () => { it('treats indexes with same keys but different options as different', () => { const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 }, options: { unique: true } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }], unique: true }] }, }); const origin = irWithCollection('users', [ascIndex('email')]); const plan = planSuccess(planner, contract, origin); @@ -142,7 +145,7 @@ describe('MongoMigrationPlanner', () => { it('treats indexes with same keys but different TTL as different', () => { const contract = makeContract({ sessions: { - indexes: [{ fields: { createdAt: 1 }, options: { expireAfterSeconds: 3600 } }], + indexes: [{ keys: [{ field: 'createdAt', direction: 1 }], expireAfterSeconds: 3600 }], }, }); const origin = irWithCollection('sessions', [ @@ -160,8 +163,8 @@ describe('MongoMigrationPlanner', () => { items: { indexes: [ { - fields: { status: 1 }, - options: { partialFilterExpression: { active: true } }, + keys: [{ field: 'status', direction: 1 }], + partialFilterExpression: { active: true }, }, ], }, @@ -179,7 +182,10 @@ describe('MongoMigrationPlanner', () => { it('handles multiple indexes on same collection', () => { const contract = makeContract({ users: { - indexes: [{ fields: { email: 1 } }, { fields: { name: 1 } }], + indexes: [ + { keys: [{ field: 'email', direction: 1 }] }, + { keys: [{ field: 'name', direction: 1 }] }, + ], }, }); const plan = planSuccess(planner, contract, emptyIR()); @@ -189,14 +195,14 @@ describe('MongoMigrationPlanner', () => { it('handles multiple collections', () => { const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 } }] }, - posts: { indexes: [{ fields: { title: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, + posts: { indexes: [{ keys: [{ field: 'title', direction: 1 }] }] }, }); const plan = planSuccess(planner, contract, emptyIR()); expect(plan.operations).toHaveLength(2); }); - it('drops all indexes when collection removed from destination', () => { + it('drops all indexes and the collection when collection removed from destination', () => { const contract = makeContract({}); const origin: MongoSchemaIR = { collections: { @@ -207,14 +213,18 @@ describe('MongoMigrationPlanner', () => { }, }; const plan = planSuccess(planner, contract, origin); - expect(plan.operations).toHaveLength(2); + expect(plan.operations).toHaveLength(3); expect(plan.operations.every((op) => op.operationClass === 'destructive')).toBe(true); + expect(plan.operations[2]!.id).toBe('collection.users.drop'); }); it('handles empty origin (all creates)', () => { const contract = makeContract({ users: { - indexes: [{ fields: { email: 1 }, options: { unique: true } }, { fields: { name: 1 } }], + indexes: [ + { keys: [{ field: 'email', direction: 1 }], unique: true }, + { keys: [{ field: 'name', direction: 1 }] }, + ], }, }); const plan = planSuccess(planner, contract, emptyIR()); @@ -226,7 +236,7 @@ describe('MongoMigrationPlanner', () => { describe('ordering', () => { it('orders drops before creates', () => { const contract = makeContract({ - users: { indexes: [{ fields: { name: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'name', direction: 1 }] }] }, }); const origin = irWithCollection('users', [ascIndex('email')]); const plan = planSuccess(planner, contract, origin); @@ -238,8 +248,8 @@ describe('MongoMigrationPlanner', () => { it('orders operations deterministically by collection then keys', () => { const contract = makeContract({ - beta: { indexes: [{ fields: { x: 1 } }] }, - alpha: { indexes: [{ fields: { y: 1 } }] }, + beta: { indexes: [{ keys: [{ field: 'x', direction: 1 }] }] }, + alpha: { indexes: [{ keys: [{ field: 'y', direction: 1 }] }] }, }); const plan = planSuccess(planner, contract, emptyIR()); @@ -268,7 +278,7 @@ describe('MongoMigrationPlanner', () => { it('allows additive operations with additive-only policy', () => { const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, }); const plan = planSuccess(planner, contract, emptyIR(), ADDITIVE_ONLY_POLICY); expect(plan.operations).toHaveLength(1); @@ -292,14 +302,59 @@ describe('MongoMigrationPlanner', () => { }); expect(result.kind).toBe('failure'); if (result.kind !== 'failure') throw new Error('Expected failure'); - expect(result.conflicts).toHaveLength(2); + expect(result.conflicts).toHaveLength(3); + }); + + it('rejects destructive validator add with additive-only policy', () => { + const contract = makeContract({ + users: { + validator: { + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }, + }, + }); + const origin = irWithCollection('users', []); + const result = planner.plan({ + contract, + schema: origin, + policy: ADDITIVE_ONLY_POLICY, + frameworkComponents: [], + }); + expect(result.kind).toBe('failure'); + if (result.kind !== 'failure') throw new Error('Expected failure'); + expect(result.conflicts).toHaveLength(1); + expect(result.conflicts[0]!.summary).toContain('destructive'); + }); + + it('allows widening validator removal with widening policy', () => { + const wideningPolicy: MigrationOperationPolicy = { + allowedOperationClasses: ['additive', 'widening'], + }; + const contract = makeContract({ users: {} }); + const origin: MongoSchemaIR = { + collections: { + users: new MongoSchemaCollection({ + name: 'users', + validator: new MongoSchemaValidator({ + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin, wideningPolicy); + expect(plan.operations).toHaveLength(1); + expect(plan.operations[0]!.operationClass).toBe('widening'); }); }); describe('operation structure', () => { it('createIndex has correct precheck/execute/postcheck', () => { const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, }); const plan = planSuccess(planner, contract, emptyIR()); const op = plan.operations[0] as MongoMigrationPlanOperation; @@ -336,7 +391,7 @@ describe('MongoMigrationPlanner', () => { it('unique index postcheck includes unique filter', () => { const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 }, options: { unique: true } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }], unique: true }] }, }); const plan = planSuccess(planner, contract, emptyIR()); const op = plan.operations[0] as MongoMigrationPlanOperation; @@ -346,7 +401,7 @@ describe('MongoMigrationPlanner', () => { it('non-unique index postcheck uses simple field filter', () => { const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, }); const plan = planSuccess(planner, contract, emptyIR()); const op = plan.operations[0] as MongoMigrationPlanOperation; @@ -356,7 +411,7 @@ describe('MongoMigrationPlanner', () => { it('createIndex sets a deterministic operation id', () => { const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, }); const plan = planSuccess(planner, contract, emptyIR()); expect(plan.operations[0]!.id).toBe('index.users.create(email:1)'); @@ -370,6 +425,645 @@ describe('MongoMigrationPlanner', () => { }); }); + describe('M2 index vocabulary', () => { + it('detects different wildcardProjection as distinct indexes', () => { + const contract = makeContract({ + users: { + indexes: [ + { + keys: [{ field: '$**', direction: 1 }], + wildcardProjection: { name: 1, email: 1 }, + }, + ], + }, + }); + const origin = irWithCollection('users', [ + new MongoSchemaIndex({ + keys: [{ field: '$**', direction: 1 }], + wildcardProjection: { name: 1 }, + }), + ]); + const plan = planSuccess(planner, contract, origin); + expect(plan.operations).toHaveLength(2); + }); + + it('detects different collation as distinct indexes', () => { + const contract = makeContract({ + users: { + indexes: [ + { + keys: [{ field: 'name', direction: 1 }], + collation: { locale: 'en', strength: 2 }, + }, + ], + }, + }); + const origin = irWithCollection('users', [ + new MongoSchemaIndex({ + keys: [{ field: 'name', direction: 1 }], + collation: { locale: 'fr', strength: 2 }, + }), + ]); + const plan = planSuccess(planner, contract, origin); + expect(plan.operations).toHaveLength(2); + }); + + it('treats same collation with different key order as identical', () => { + const contract = makeContract({ + users: { + indexes: [ + { + keys: [{ field: 'name', direction: 1 }], + collation: { strength: 2, locale: 'en' }, + }, + ], + }, + }); + const origin = irWithCollection('users', [ + new MongoSchemaIndex({ + keys: [{ field: 'name', direction: 1 }], + collation: { locale: 'en', strength: 2 }, + }), + ]); + const plan = planSuccess(planner, contract, origin); + expect(plan.operations).toHaveLength(0); + }); + + it('detects different weights as distinct indexes', () => { + const contract = makeContract({ + users: { + indexes: [ + { + keys: [{ field: 'bio', direction: 'text' }], + weights: { bio: 10 }, + }, + ], + }, + }); + const origin = irWithCollection('users', [ + new MongoSchemaIndex({ + keys: [{ field: 'bio', direction: 'text' }], + weights: { bio: 5 }, + }), + ]); + const plan = planSuccess(planner, contract, origin); + expect(plan.operations).toHaveLength(2); + }); + + it('passes M2 options through to CreateIndexCommand', () => { + const contract = makeContract({ + users: { + indexes: [ + { + keys: [{ field: 'bio', direction: 'text' }], + weights: { bio: 10 }, + default_language: 'english', + language_override: 'lang', + collation: { locale: 'en' }, + wildcardProjection: { bio: 1 }, + }, + ], + }, + }); + const plan = planSuccess(planner, contract, emptyIR()); + expect(plan.operations).toHaveLength(1); + const cmd = (plan.operations[0] as MongoMigrationPlanOperation).execute[0]! + .command as CreateIndexCommand; + expect(cmd.weights).toEqual({ bio: 10 }); + expect(cmd.default_language).toBe('english'); + expect(cmd.language_override).toBe('lang'); + expect(cmd.collation).toEqual({ locale: 'en' }); + expect(cmd.wildcardProjection).toEqual({ bio: 1 }); + }); + }); + + describe('validator diffing', () => { + it('emits collMod when validator is added', () => { + const contract = makeContract({ + users: { + validator: { + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }, + }, + }); + const origin = irWithCollection('users', []); + const plan = planSuccess(planner, contract, origin); + const collModOps = (plan.operations as MongoMigrationPlanOperation[]).filter( + (op) => op.execute[0]?.command.kind === 'collMod', + ); + expect(collModOps).toHaveLength(1); + const cmd = collModOps[0]!.execute[0]!.command as CollModCommand; + expect(cmd.validator).toEqual({ $jsonSchema: { bsonType: 'object' } }); + expect(cmd.validationLevel).toBe('strict'); + }); + + it('validator add has precheck (collection exists) and postcheck (validator applied)', () => { + const contract = makeContract({ + users: { + validator: { + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }, + }, + }); + const origin = irWithCollection('users', []); + const plan = planSuccess(planner, contract, origin); + const op = (plan.operations as MongoMigrationPlanOperation[]).find( + (o) => o.execute[0]?.command.kind === 'collMod', + )!; + + expect(op.precheck).toHaveLength(1); + expect(op.precheck[0]!.source.kind).toBe('listCollections'); + expect(op.precheck[0]!.expect).toBe('exists'); + + expect(op.postcheck).toHaveLength(1); + expect(op.postcheck[0]!.source.kind).toBe('listCollections'); + expect(op.postcheck[0]!.expect).toBe('exists'); + }); + + it('validator remove has precheck and empty postcheck', () => { + const contract = makeContract({ users: {} }); + const origin: MongoSchemaIR = { + collections: { + users: new MongoSchemaCollection({ + name: 'users', + validator: new MongoSchemaValidator({ + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + const op = (plan.operations as MongoMigrationPlanOperation[]).find( + (o) => o.execute[0]?.command.kind === 'collMod', + )!; + + expect(op.precheck).toHaveLength(1); + expect(op.precheck[0]!.source.kind).toBe('listCollections'); + expect(op.precheck[0]!.expect).toBe('exists'); + + expect(op.postcheck).toHaveLength(0); + }); + + it('classifies validator removal as widening', () => { + const contract = makeContract({ users: {} }); + const origin: MongoSchemaIR = { + collections: { + users: new MongoSchemaCollection({ + name: 'users', + validator: new MongoSchemaValidator({ + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + const collModOps = (plan.operations as MongoMigrationPlanOperation[]).filter( + (op) => op.execute[0]?.command.kind === 'collMod', + ); + expect(collModOps).toHaveLength(1); + expect(collModOps[0]!.operationClass).toBe('widening'); + }); + + it('classifies jsonSchema body change as destructive', () => { + const contract = makeContract({ + users: { + validator: { + jsonSchema: { bsonType: 'object', properties: { name: { bsonType: 'string' } } }, + validationLevel: 'strict', + validationAction: 'error', + }, + }, + }); + const origin: MongoSchemaIR = { + collections: { + users: new MongoSchemaCollection({ + name: 'users', + validator: new MongoSchemaValidator({ + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + const collModOps = (plan.operations as MongoMigrationPlanOperation[]).filter( + (op) => op.execute[0]?.command.kind === 'collMod', + ); + expect(collModOps).toHaveLength(1); + expect(collModOps[0]!.operationClass).toBe('destructive'); + }); + + it('treats reordered jsonSchema keys as equivalent (no operation emitted)', () => { + const contract = makeContract({ + users: { + validator: { + jsonSchema: { properties: { name: { bsonType: 'string' } }, bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }, + }, + }); + const origin: MongoSchemaIR = { + collections: { + users: new MongoSchemaCollection({ + name: 'users', + validator: new MongoSchemaValidator({ + jsonSchema: { bsonType: 'object', properties: { name: { bsonType: 'string' } } }, + validationLevel: 'strict', + validationAction: 'error', + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + const collModOps = (plan.operations as MongoMigrationPlanOperation[]).filter( + (op) => op.execute[0]?.command.kind === 'collMod', + ); + expect(collModOps).toHaveLength(0); + }); + + it('classifies validationAction error->warn as widening', () => { + const contract = makeContract({ + users: { + validator: { + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'warn', + }, + }, + }); + const origin: MongoSchemaIR = { + collections: { + users: new MongoSchemaCollection({ + name: 'users', + validator: new MongoSchemaValidator({ + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + const collModOps = (plan.operations as MongoMigrationPlanOperation[]).filter( + (op) => op.execute[0]?.command.kind === 'collMod', + ); + expect(collModOps).toHaveLength(1); + expect(collModOps[0]!.operationClass).toBe('widening'); + }); + + it('classifies validationAction warn->error as destructive', () => { + const contract = makeContract({ + users: { + validator: { + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }, + }, + }); + const origin: MongoSchemaIR = { + collections: { + users: new MongoSchemaCollection({ + name: 'users', + validator: new MongoSchemaValidator({ + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'warn', + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + const collModOps = (plan.operations as MongoMigrationPlanOperation[]).filter( + (op) => op.execute[0]?.command.kind === 'collMod', + ); + expect(collModOps).toHaveLength(1); + expect(collModOps[0]!.operationClass).toBe('destructive'); + }); + + it('classifies validationLevel strict->moderate as widening', () => { + const contract = makeContract({ + users: { + validator: { + jsonSchema: { bsonType: 'object' }, + validationLevel: 'moderate', + validationAction: 'error', + }, + }, + }); + const origin: MongoSchemaIR = { + collections: { + users: new MongoSchemaCollection({ + name: 'users', + validator: new MongoSchemaValidator({ + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + const collModOps = (plan.operations as MongoMigrationPlanOperation[]).filter( + (op) => op.execute[0]?.command.kind === 'collMod', + ); + expect(collModOps).toHaveLength(1); + expect(collModOps[0]!.operationClass).toBe('widening'); + }); + + it('classifies validationLevel moderate->strict as destructive', () => { + const contract = makeContract({ + users: { + validator: { + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }, + }, + }); + const origin: MongoSchemaIR = { + collections: { + users: new MongoSchemaCollection({ + name: 'users', + validator: new MongoSchemaValidator({ + jsonSchema: { bsonType: 'object' }, + validationLevel: 'moderate', + validationAction: 'error', + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + const collModOps = (plan.operations as MongoMigrationPlanOperation[]).filter( + (op) => op.execute[0]?.command.kind === 'collMod', + ); + expect(collModOps).toHaveLength(1); + expect(collModOps[0]!.operationClass).toBe('destructive'); + }); + + it('classifies mixed widening+destructive changes as destructive', () => { + const contract = makeContract({ + users: { + validator: { + jsonSchema: { bsonType: 'object' }, + validationLevel: 'moderate', + validationAction: 'error', + }, + }, + }); + const origin: MongoSchemaIR = { + collections: { + users: new MongoSchemaCollection({ + name: 'users', + validator: new MongoSchemaValidator({ + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'warn', + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + const collModOps = (plan.operations as MongoMigrationPlanOperation[]).filter( + (op) => op.execute[0]?.command.kind === 'collMod', + ); + expect(collModOps).toHaveLength(1); + expect(collModOps[0]!.operationClass).toBe('destructive'); + }); + + it('no-ops when validators are identical', () => { + const contract = makeContract({ + users: { + validator: { + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }, + }, + }); + const origin: MongoSchemaIR = { + collections: { + users: new MongoSchemaCollection({ + name: 'users', + validator: new MongoSchemaValidator({ + jsonSchema: { bsonType: 'object' }, + validationLevel: 'strict', + validationAction: 'error', + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + expect(plan.operations).toHaveLength(0); + }); + }); + + describe('collection lifecycle', () => { + it('emits createCollection for new collections with options', () => { + const contract = makeContract({ + events: { + options: { capped: { size: 1048576, max: 1000 } }, + }, + }); + const plan = planSuccess(planner, contract, emptyIR()); + const createOps = (plan.operations as MongoMigrationPlanOperation[]).filter( + (op) => op.execute[0]?.command.kind === 'createCollection', + ); + expect(createOps).toHaveLength(1); + const cmd = createOps[0]!.execute[0]!.command as CreateCollectionCommand; + expect(cmd.collection).toBe('events'); + expect(cmd.capped).toBe(true); + expect(cmd.size).toBe(1048576); + }); + + it('emits dropCollection for removed collections', () => { + const contract = makeContract({}); + const origin: MongoSchemaIR = { + collections: { + events: new MongoSchemaCollection({ + name: 'events', + options: new MongoSchemaCollectionOptions({ + capped: { size: 1048576 }, + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + const dropOps = (plan.operations as MongoMigrationPlanOperation[]).filter( + (op) => op.execute[0]?.command.kind === 'dropCollection', + ); + expect(dropOps).toHaveLength(1); + const cmd = dropOps[0]!.execute[0]!.command as DropCollectionCommand; + expect(cmd.collection).toBe('events'); + }); + + it('reports conflict for immutable option change (capped)', () => { + const contract = makeContract({ + events: { + options: { capped: { size: 2097152 } }, + }, + }); + const origin: MongoSchemaIR = { + collections: { + events: new MongoSchemaCollection({ + name: 'events', + options: new MongoSchemaCollectionOptions({ + capped: { size: 1048576 }, + }), + }), + }, + }; + const result = planner.plan({ + contract, + schema: origin, + policy: ALL_CLASSES_POLICY, + frameworkComponents: [], + }); + expect(result.kind).toBe('failure'); + if (result.kind !== 'failure') throw new Error('Expected failure'); + expect(result.conflicts.some((c) => c.summary.includes('immutable'))).toBe(true); + }); + + it('reports conflict when adding collation to existing collection without options', () => { + const contract = makeContract({ + users: { + options: { collation: { locale: 'en', strength: 2 } }, + }, + }); + const origin: MongoSchemaIR = { + collections: { + users: new MongoSchemaCollection({ name: 'users' }), + }, + }; + const result = planner.plan({ + contract, + schema: origin, + policy: ALL_CLASSES_POLICY, + frameworkComponents: [], + }); + expect(result.kind).toBe('failure'); + if (result.kind !== 'failure') throw new Error('Expected failure'); + expect(result.conflicts.some((c) => c.summary.includes('immutable'))).toBe(true); + }); + + it('reports conflict when removing capped from existing collection', () => { + const contract = makeContract({ events: {} }); + const origin: MongoSchemaIR = { + collections: { + events: new MongoSchemaCollection({ + name: 'events', + options: new MongoSchemaCollectionOptions({ + capped: { size: 1048576 }, + }), + }), + }, + }; + const result = planner.plan({ + contract, + schema: origin, + policy: ALL_CLASSES_POLICY, + frameworkComponents: [], + }); + expect(result.kind).toBe('failure'); + if (result.kind !== 'failure') throw new Error('Expected failure'); + expect(result.conflicts.some((c) => c.summary.includes('immutable'))).toBe(true); + }); + + it('treats reordered collation keys as equivalent (no conflict)', () => { + const contract = makeContract({ + users: { + options: { collation: { strength: 2, locale: 'en' } }, + }, + }); + const origin: MongoSchemaIR = { + collections: { + users: new MongoSchemaCollection({ + name: 'users', + options: new MongoSchemaCollectionOptions({ + collation: { locale: 'en', strength: 2 }, + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + expect(plan.operations).toHaveLength(0); + }); + + it('classifies enabling changeStreamPreAndPostImages as widening', () => { + const contract = makeContract({ + events: { + options: { changeStreamPreAndPostImages: { enabled: true } }, + }, + }); + const origin: MongoSchemaIR = { + collections: { + events: new MongoSchemaCollection({ + name: 'events', + options: new MongoSchemaCollectionOptions({ + changeStreamPreAndPostImages: { enabled: false }, + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + const collModOps = (plan.operations as MongoMigrationPlanOperation[]).filter( + (op) => op.execute[0]?.command.kind === 'collMod', + ); + expect(collModOps).toHaveLength(1); + expect(collModOps[0]!.operationClass).toBe('widening'); + }); + + it('classifies disabling changeStreamPreAndPostImages as destructive', () => { + const contract = makeContract({ + events: { + options: { changeStreamPreAndPostImages: { enabled: false } }, + }, + }); + const origin: MongoSchemaIR = { + collections: { + events: new MongoSchemaCollection({ + name: 'events', + options: new MongoSchemaCollectionOptions({ + changeStreamPreAndPostImages: { enabled: true }, + }), + }), + }, + }; + const plan = planSuccess(planner, contract, origin); + const collModOps = (plan.operations as MongoMigrationPlanOperation[]).filter( + (op) => op.execute[0]?.command.kind === 'collMod', + ); + expect(collModOps).toHaveLength(1); + expect(collModOps[0]!.operationClass).toBe('destructive'); + }); + + it('orders creates before indexes, drops after', () => { + const contract = makeContract({ + events: { + indexes: [{ keys: [{ field: 'ts', direction: 1 as const }] }], + options: { capped: { size: 1048576 } }, + }, + }); + const plan = planSuccess(planner, contract, emptyIR()); + const kinds = (plan.operations as MongoMigrationPlanOperation[]).map( + (op) => op.execute[0]!.command.kind, + ); + const createCollIdx = kinds.indexOf('createCollection'); + const createIdxIdx = kinds.indexOf('createIndex'); + expect(createCollIdx).toBeLessThan(createIdxIdx); + }); + }); + describe('plan metadata', () => { it('sets targetId to mongo', () => { const contract = makeContract({ users: {} }); diff --git a/packages/3-mongo-target/2-mongo-adapter/test/mongo-runner.test.ts b/packages/3-mongo-target/2-mongo-adapter/test/mongo-runner.test.ts index bb5a19d223..81df907ac0 100644 --- a/packages/3-mongo-target/2-mongo-adapter/test/mongo-runner.test.ts +++ b/packages/3-mongo-target/2-mongo-adapter/test/mongo-runner.test.ts @@ -5,7 +5,8 @@ import type { import type { MongoMigrationPlanOperation } from '@prisma-next/mongo-query-ast/control'; import { type Db, MongoClient } from 'mongodb'; import { MongoMemoryReplSet } from 'mongodb-memory-server'; -import { afterAll, beforeAll, beforeEach, describe, expect, it } from 'vitest'; +import { afterAll, beforeAll, beforeEach, describe, expect, it, vi } from 'vitest'; +import * as markerLedger from '../src/core/marker-ledger'; import { initMarker, readMarker } from '../src/core/marker-ledger'; import { createMongoControlDriver } from '../src/core/mongo-control-driver'; import { serializeMongoOps } from '../src/core/mongo-ops-serializer'; @@ -41,7 +42,13 @@ beforeEach(async () => { function makeContract( collections: Record< string, - { indexes?: Array<{ fields: Record; options?: { unique?: boolean } }> } + { + indexes?: Array<{ + keys: Array<{ field: string; direction: 1 | -1 }>; + unique?: boolean; + sparse?: boolean; + }>; + } >, storageHash = 'sha256:dest', ) { @@ -86,7 +93,7 @@ function makeDriver() { describe('MongoMigrationRunner', () => { it('creates an index on a real MongoDB instance', async () => { const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 }, options: { unique: true } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }], unique: true }] }, }); const plan = planForContract(contract); const serialized = serializePlan(plan); @@ -155,7 +162,7 @@ describe('MongoMigrationRunner', () => { await db.collection('items').createIndex({ sku: 1 }, { unique: true, name: 'sku_1' }); const contract = makeContract({ - items: { indexes: [{ fields: { sku: 1 }, options: { unique: true } }] }, + items: { indexes: [{ keys: [{ field: 'sku', direction: 1 }], unique: true }] }, }); const plan = planForContract(contract); const serialized = serializePlan(plan); @@ -180,7 +187,7 @@ describe('MongoMigrationRunner', () => { await db.collection('users').createIndex({ email: 1 }, { name: 'email_1' }); const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, }); const plan = planForContract(contract); const serialized = serializePlan(plan); @@ -203,8 +210,8 @@ describe('MongoMigrationRunner', () => { it('executes multiple operations in order', async () => { const contract = makeContract({ - alpha: { indexes: [{ fields: { a: 1 } }] }, - beta: { indexes: [{ fields: { b: 1 } }] }, + alpha: { indexes: [{ keys: [{ field: 'a', direction: 1 }] }] }, + beta: { indexes: [{ keys: [{ field: 'b', direction: 1 }] }] }, }); const plan = planForContract(contract); const serialized = serializePlan(plan); @@ -241,7 +248,7 @@ describe('MongoMigrationRunner', () => { await initMarker(db, { storageHash: 'sha256:different', profileHash: 'sha256:p1' }); const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, }); const plan = planForContract(contract); const serialized = serializePlan({ @@ -264,9 +271,90 @@ describe('MongoMigrationRunner', () => { } }); + it('returns MARKER_ORIGIN_MISMATCH when marker exists but plan has no origin', async () => { + await initMarker(db, { storageHash: 'sha256:existing', profileHash: 'sha256:p1' }); + + const contract = makeContract({ + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, + }); + const plan = planForContract(contract); + const serialized = serializePlan(plan); + + const runner = new MongoMigrationRunner(); + const result = await runner.execute({ + plan: serialized, + driver: makeDriver(), + destinationContract: contract, + policy: { allowedOperationClasses: ['additive', 'widening', 'destructive'] }, + frameworkComponents: [], + }); + + expect(result.ok).toBe(false); + if (!result.ok) { + expect(result.failure.code).toBe('MARKER_ORIGIN_MISMATCH'); + } + }); + + it('returns MARKER_ORIGIN_MISMATCH when no marker but plan has origin', async () => { + const contract = makeContract({ + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, + }); + const plan = planForContract(contract); + const serialized = serializePlan({ + ...plan, + origin: { storageHash: 'sha256:something' }, + }); + + const runner = new MongoMigrationRunner(); + const result = await runner.execute({ + plan: serialized, + driver: makeDriver(), + destinationContract: contract, + policy: { allowedOperationClasses: ['additive', 'widening', 'destructive'] }, + frameworkComponents: [], + }); + + expect(result.ok).toBe(false); + if (!result.ok) { + expect(result.failure.code).toBe('MARKER_ORIGIN_MISMATCH'); + } + }); + + it('returns MARKER_CAS_FAILURE when marker update loses compare-and-swap', async () => { + await initMarker(db, { storageHash: 'sha256:origin', profileHash: 'sha256:profile' }); + + const contract = makeContract({ + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, + }); + const plan = planForContract(contract); + const serialized = serializePlan({ + ...plan, + origin: { storageHash: 'sha256:origin' }, + }); + + const updateSpy = vi.spyOn(markerLedger, 'updateMarker').mockResolvedValue(false); + try { + const runner = new MongoMigrationRunner(); + const result = await runner.execute({ + plan: serialized, + driver: makeDriver(), + destinationContract: contract, + policy: { allowedOperationClasses: ['additive', 'widening', 'destructive'] }, + frameworkComponents: [], + }); + + expect(result.ok).toBe(false); + if (!result.ok) { + expect(result.failure.code).toBe('MARKER_CAS_FAILURE'); + } + } finally { + updateSpy.mockRestore(); + } + }); + it('returns POLICY_VIOLATION for disallowed operation class', async () => { const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, }); const plan = planForContract(contract); const serialized = serializePlan(plan); @@ -288,7 +376,7 @@ describe('MongoMigrationRunner', () => { it('updates marker and writes ledger entry after successful execution', async () => { const contract = makeContract({ - users: { indexes: [{ fields: { email: 1 } }] }, + users: { indexes: [{ keys: [{ field: 'email', direction: 1 }] }] }, }); const plan = planForContract(contract); const serialized = serializePlan(plan); diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index c7a1b96ea3..72f5cd4364 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -883,6 +883,9 @@ importers: '@prisma-next/contract': specifier: workspace:* version: link:../../../1-framework/0-foundation/contract + '@prisma-next/mongo-contract': + specifier: workspace:* + version: link:../../1-foundation/mongo-contract '@prisma-next/psl-parser': specifier: workspace:* version: link:../../../1-framework/2-authoring/psl-parser diff --git a/projects/mongo-schema-migrations/plan.md b/projects/mongo-schema-migrations/plan.md index 82bffd3695..167f5aac85 100644 --- a/projects/mongo-schema-migrations/plan.md +++ b/projects/mongo-schema-migrations/plan.md @@ -74,35 +74,49 @@ Proves the full migration architecture works for MongoDB by cutting a thin verti ### Milestone 2: Full index vocabulary + validators + collection options -Extends every layer to cover the full breadth of MongoDB server-side configuration. Still validates with hand-crafted contracts for indexes/validators/options, plus adds authoring + emitter support. +Extends every layer to cover the full breadth of MongoDB server-side configuration: all index types and options, `$jsonSchema` validators, collection options, and PSL authoring support. + +**Spec:** [m2-full-vocabulary.spec.md](specs/m2-full-vocabulary.spec.md) +**Plan:** [m2-plan.md](plans/m2-plan.md) **Tasks:** -**Full index vocabulary:** +**Phase 1 — Foundation types:** -- [ ] **2.1 Extend index key types.** Support compound indexes, descending (`-1`), text (`"text"`), geospatial (`"2dsphere"`), wildcard (`"$**"`). Update `MongoStorageIndex` type, schema IR, Arktype validation, planner diffing, and runner execution. Tests for each key type. -- [ ] **2.2 Extend index options.** Support `sparse`, `expireAfterSeconds` (TTL), `partialFilterExpression` (partial indexes). Update types, IR, validation, planner, runner. Tests for each option. Include index identity tests: same keys + different options = different index. **Note:** `buildIndexLookupKey` in the planner uses `JSON.stringify` for `partialFilterExpression`, which is key-order dependent. Replace with canonical serialization so that structurally equivalent partial filter expressions with different key ordering produce the same lookup key (important when comparing contract-derived IR against live-introspected IR in M4). +- [ ] **2.1 Extend index options in contract types.** Add `wildcardProjection`, `collation`, `weights`, `default_language`, `language_override` to `MongoStorageIndex`. Update Arktype schema. +- [ ] **2.2 Add validator and collection options to contract types.** Define `MongoStorageValidator` and `MongoStorageCollectionOptions` types. Update `MongoStorageCollection` and Arktype schema. +- [ ] **2.3 Extend schema IR with new index options.** Add new options to `MongoSchemaIndex`, update `indexesEquivalent`. +- [ ] **2.4 Add MongoSchemaValidator and MongoSchemaCollectionOptions to schema IR.** Implement node classes, update visitor interface from `unknown` to concrete types. +- [ ] **2.5 Add new DDL command classes.** `CreateCollectionCommand`, `DropCollectionCommand`, `CollModCommand`. Update `CreateIndexCommand` with new options. Update `MongoDdlCommandVisitor`. +- [ ] **2.6 Implement canonical serialization utility.** Key-order-independent serialization for index lookup keys. -**Validators:** +**Phase 2 — Composition:** -- [ ] **2.3 Extend `MongoStorageCollection` with validator.** Add `validator?: { jsonSchema: Record; validationLevel: 'strict' | 'moderate'; validationAction: 'error' | 'warn' }` to the contract type. Update Arktype schema. Type and validation tests. -- [ ] **2.4 Extend schema IR and planner for validators.** Add validator representation to `MongoSchemaIR`. Add `CollModCommand` DDL command class. Planner generates `collMod` operations with appropriate `MongoFilterExpr`-based postchecks against `listCollections` results. Classify: relaxing validation = `widening`, tightening = `destructive`. Unit tests for add/remove/change validator. Integration test: planner output → runner → verify validator applied on `mongodb-memory-server`. -- [ ] **2.5 Runner executes validator operations.** Command executor handles `CollModCommand` with `$jsonSchema`, `validationLevel`, `validationAction`. Integration tests. +- [ ] **2.7 Update `contractToSchema` for validators, options, and new index options.** +- [ ] **2.8 Update serializer/deserializer for new DDL commands and index options.** +- [ ] **2.9 Update DDL formatter for new commands and index options.** -**Collection options:** +**Phase 3 — Planner extensions:** -- [ ] **2.6 Extend `MongoStorageCollection` with collection options.** Capped settings, time series configuration, collation, change stream pre/post images. Update Arktype schema. Tests. -- [ ] **2.7 Extend schema IR and planner for collection options.** Add `CreateCollectionCommand` and `DropCollectionCommand` DDL command classes. Planner generates operations with `listCollections`-based checks for new collections and option changes. Unit tests. Integration test: planner output → runner → verify collection options on `mongodb-memory-server`. -- [ ] **2.8 Runner executes collection option operations.** Command executor handles `CreateCollectionCommand` (with options), `DropCollectionCommand`, and `CollModCommand` for option changes. Integration tests. +- [ ] **2.10 Extend planner for full index vocabulary.** Canonical lookup keys, new index options in diffing. +- [ ] **2.11 Extend planner for validators.** `collMod` generation, widening/destructive classification. +- [ ] **2.12 Extend planner for collection options.** `createCollection`/`dropCollection` generation, immutable option conflicts. -**Authoring + emitter:** +**Phase 4 — Runner + command executor:** -- [ ] **2.9 Add PSL authoring support for Mongo indexes.** Support `@@index` and `@@unique` annotations in the Mongo PSL interpreter. Update `@prisma-next/mongo-contract-psl` to populate `storage.collections[].indexes` from annotations. Tests with PSL fixtures. -- [ ] **2.10 Update Mongo emitter to populate enriched `storage.collections`.** Emit index definitions, validator (auto-derived `$jsonSchema` from model field definitions), and collection options into the contract. Tests verifying emitted contracts match expected shapes. +- [ ] **2.13 Extend command executor for new DDL commands.** `createCollection`, `dropCollection`, `collMod`, updated `createIndex` with new options. +- [ ] **2.14 Extend inspection executor for collection option/validator checks.** -**End-to-end proof:** +**Phase 5 — PSL authoring + emitter:** + +- [ ] **2.15 Add `@@index` and `@@unique` to Mongo PSL interpreter.** +- [ ] **2.16 Auto-derive `$jsonSchema` validator from model fields.** +- [ ] **2.17 Update Mongo emitter to populate enriched `storage.collections`.** + +**Phase 6 — End-to-end proof:** -- [ ] **2.11 End-to-end test: full vocabulary against real MongoDB.** Hand-crafted contracts exercising compound indexes, TTL indexes, partial indexes, validators (`$jsonSchema` + `validationLevel`), and collection options (capped, collation). Verify: `migration plan` produces correct operations → `migration apply` applies them on `mongodb-memory-server` → introspect database to confirm all configuration matches. Second contract modifying validators and removing indexes → plan produces correct `collMod`/`dropIndex` → apply succeeds. +- [ ] **2.18 End-to-end integration tests: full vocabulary.** Compound, text, wildcard, TTL, partial, geospatial, hashed indexes + validators + collection options against `mongodb-memory-server`. +- [ ] **2.19 End-to-end PSL authoring test.** PSL → contract → plan → apply → verify. ### Milestone 3: Polymorphic index generation @@ -171,16 +185,25 @@ Adds Mongo support to all CLI commands that interact with a live database. The o | CLI `migration plan` works with Mongo | Integration | 1.12 | End-to-end with hand-crafted contract | | CLI `migration apply` works with Mongo | Integration | 1.12 | End-to-end with `mongodb-memory-server` | | End-to-end single index | Integration | 1.12 | Plan → apply → verify index exists | -| All index key types | Unit + Integration | 2.1 | Each key type tested | -| All index options | Unit + Integration | 2.2 | Each option tested, identity tests | -| Validator in contract | Unit | 2.3 | Type + Arktype validation | -| Planner generates `collMod` for validators | Unit + Integration | 2.4 | Widening/destructive classification; applied on `mongodb-memory-server` | -| Runner executes validator operations | Integration | 2.5 | `mongodb-memory-server` | -| Collection options in contract | Unit | 2.6 | Type + Arktype validation | -| Planner generates collection option ops | Unit + Integration | 2.7 | New collection + option changes; applied on `mongodb-memory-server` | -| Runner executes collection option ops | Integration | 2.8 | `mongodb-memory-server` | -| Emitter populates enriched collections | Unit | 2.10 | PSL → contract verification | -| End-to-end full vocabulary | Integration | 2.11 | Compound/TTL/partial indexes + validators + collection options on `mongodb-memory-server` | +| New index options in contract (`wildcardProjection`, `collation`, `weights`, etc.) | Unit | 2.1 | Arktype validates all option shapes | +| Validator + collection options in contract | Unit | 2.2 | Type shapes + Arktype validation | +| New index options in schema IR | Unit | 2.3 | `indexesEquivalent` identity tests | +| `MongoSchemaValidator` + `MongoSchemaCollectionOptions` in IR | Unit | 2.4 | Construction, freeze, visitor dispatch | +| New DDL commands (`CreateCollectionCommand`, etc.) | Unit | 2.5 | Construction, freeze, visitor dispatch | +| Canonical serialization for lookup keys | Unit | 2.6 | Key-order independence | +| `contractToSchema` handles validators, options, new index options | Unit | 2.7 | Full contract → IR conversion | +| Serializer/deserializer handles new commands + options | Unit | 2.8 | Round-trip equality | +| DDL formatter renders new commands | Unit | 2.9 | Correct display strings | +| Planner: full index vocabulary with canonical keys | Unit + Integration | 2.10 | Text/wildcard/compound wildcard diffing | +| Planner generates `collMod` for validators | Unit + Integration | 2.11 | Widening/destructive classification | +| Planner: collection lifecycle + option conflicts | Unit + Integration | 2.12 | `createCollection`/`dropCollection`/conflicts | +| Command executor handles new DDL commands | Integration | 2.13 | `mongodb-memory-server` | +| Inspection executor validates option checks | Integration | 2.14 | `listCollections` filter evaluation | +| PSL `@@index`/`@@unique`/`@unique` → contract indexes | Unit | 2.15 | All index attribute combinations | +| `$jsonSchema` derivation from model fields | Unit | 2.16 | Type mapping, nullable, arrays, value objects | +| Emitter populates enriched collections | Unit | 2.17 | PSL → contract with indexes + validator | +| End-to-end full vocabulary | Integration | 2.18 | All index types + validators + options on `mongodb-memory-server` | +| End-to-end PSL authoring | Integration | 2.19 | PSL → contract → plan → apply → verify | | Polymorphic partial indexes auto-generated | Unit + Integration | 3.1 | Discriminator → partialFilterExpression; planner output applied on `mongodb-memory-server` | | End-to-end polymorphic proof | Integration | 3.2 | `mongodb-memory-server` | | Live introspection produces `MongoSchemaIR` | Integration | 4.1 | `mongodb-memory-server` | diff --git a/projects/mongo-schema-migrations/plans/m2-plan.md b/projects/mongo-schema-migrations/plans/m2-plan.md new file mode 100644 index 0000000000..bc332ecc7a --- /dev/null +++ b/projects/mongo-schema-migrations/plans/m2-plan.md @@ -0,0 +1,612 @@ +# M2 Implementation Plan: Full Index Vocabulary + Validators + Collection Options + +## Goal + +Extend every layer of the MongoDB migration pipeline to cover the full breadth of MongoDB server-side configuration. M1 proved the architecture with a thin vertical slice (single ascending index). M2 fills in the vocabulary: all index types and options, `$jsonSchema` validators, collection options, and PSL authoring support. + +## Design references + +| Area | Design doc | +|---|---| +| M2 spec | [m2-full-vocabulary.spec.md](../specs/m2-full-vocabulary.spec.md) | +| Schema IR | [schema-ir.spec.md](../specs/schema-ir.spec.md) | +| DDL commands + operation envelope | [operation-ast.spec.md](../specs/operation-ast.spec.md) | +| Operation envelope + serialization | [operation-envelope.spec.md](../specs/operation-envelope.spec.md) | +| DDL command dispatch | [ddl-command-dispatch.spec.md](../specs/ddl-command-dispatch.spec.md) | +| Check evaluator | [check-evaluator.spec.md](../specs/check-evaluator.spec.md) | +| Contract types + contractToSchema | [contract-to-schema-and-introspection.spec.md](../specs/contract-to-schema-and-introspection.spec.md) | +| Planner + runner | [planner-runner.spec.md](../specs/planner-runner.spec.md) | +| CLI display | [cli-display.spec.md](../specs/cli-display.spec.md) | +| ADR 187 — MongoDB schema representation | [ADR 187](../../../docs/architecture%20docs/adrs/ADR%20187%20-%20MongoDB%20schema%20representation%20for%20migration%20diffing.md) | +| ADR 188 — MongoDB migration operation model | [ADR 188](../../../docs/architecture%20docs/adrs/ADR%20188%20-%20MongoDB%20migration%20operation%20model.md) | +| ADR 189 — Structural index matching | [ADR 189](../../../docs/architecture%20docs/adrs/ADR%20189%20-%20Structural%20index%20matching%20for%20MongoDB%20migrations.md) | + +## Implementation sequence + +Tasks are grouped into **phases** by dependency. Tasks within a phase are independent and can be worked in parallel. + +--- + +### Phase 1: Foundation types (no inter-task dependencies) + +#### 2.1 Extend index options in contract types + +**Goal:** Add the remaining index options to `MongoStorageIndex` and update Arktype validation. + +**What to do:** +- Add to `MongoStorageIndex` (in `@prisma-next/mongo-contract`): + - `wildcardProjection?: Record` + - `collation?: Record` + - `weights?: Record` + - `default_language?: string` + - `language_override?: string` +- Add corresponding Arktype schema entries in `MongoStorageIndexSchema` +- Export new types from `exports/index.ts` + +**Tests:** +- Arktype validation: valid index with each new option, invalid shapes rejected +- Type-level: `MongoStorageIndex` accepts all option combinations + +**Package:** `packages/2-mongo-family/1-foundation/mongo-contract/` + +--- + +#### 2.2 Add validator and collection options to contract types + +**Goal:** Add `MongoStorageValidator` and `MongoStorageCollectionOptions` types and update Arktype validation. + +**What to do:** +- Define `MongoStorageValidator`: + ```typescript + interface MongoStorageValidator { + readonly jsonSchema: Record; + readonly validationLevel: 'strict' | 'moderate'; + readonly validationAction: 'error' | 'warn'; + } + ``` +- Define `MongoStorageCollectionOptions`: + ```typescript + interface MongoStorageCollectionOptions { + readonly capped?: { size: number; max?: number }; + readonly timeseries?: { timeField: string; metaField?: string; granularity?: 'seconds' | 'minutes' | 'hours' }; + readonly collation?: Record; + readonly changeStreamPreAndPostImages?: { enabled: boolean }; + readonly clusteredIndex?: { name?: string }; + } + ``` +- Update `MongoStorageCollection` to include `validator?` and `options?` fields +- Add Arktype schemas: `MongoStorageValidatorSchema`, `MongoCollectionOptionsSchema` +- Update `StorageCollectionSchema` to accept `validator?` and `options?` + +**Tests:** +- Arktype validation: valid/invalid validator shapes, valid/invalid option shapes +- Backward compat: collection with no validator/options still passes validation + +**Package:** `packages/2-mongo-family/1-foundation/mongo-contract/` + +--- + +#### 2.3 Extend schema IR with new index options + +**Goal:** Add the new index options to `MongoSchemaIndex` and update `indexesEquivalent`. + +**What to do:** +- Add to `MongoSchemaIndex` and `MongoSchemaIndexOptions`: + - `wildcardProjection?: Record` + - `collation?: Record` + - `weights?: Record` + - `default_language?: string` + - `language_override?: string` +- Update `indexesEquivalent` to compare new options (using `deepEqual` for object-valued ones) +- Export updated types + +**Tests:** +- `indexesEquivalent`: same keys + different `wildcardProjection` → not equivalent +- `indexesEquivalent`: same keys + same `collation` → equivalent +- `indexesEquivalent`: same keys + different `weights` → not equivalent +- Construction and freeze behavior for indexes with new options + +**Package:** `packages/2-mongo-family/3-tooling/mongo-schema-ir/` + +--- + +#### 2.4 Add MongoSchemaValidator and MongoSchemaCollectionOptions to schema IR + +**Goal:** Implement the validator and collection options node classes, update the visitor interface. + +**What to do:** +- Create `MongoSchemaValidator` class: + ```typescript + class MongoSchemaValidator extends MongoSchemaNode { + readonly kind = 'validator' as const; + readonly jsonSchema: Record; + readonly validationLevel: 'strict' | 'moderate'; + readonly validationAction: 'error' | 'warn'; + } + ``` +- Create `MongoSchemaCollectionOptions` class: + ```typescript + class MongoSchemaCollectionOptions extends MongoSchemaNode { + readonly kind = 'collectionOptions' as const; + readonly capped?: { size: number; max?: number }; + readonly timeseries?: { timeField: string; metaField?: string; granularity?: 'seconds' | 'minutes' | 'hours' }; + readonly collation?: Record; + readonly changeStreamPreAndPostImages?: { enabled: boolean }; + readonly clusteredIndex?: { name?: string }; + } + ``` +- Update `MongoSchemaCollection` to accept `validator?` and `options?` +- Update `MongoSchemaVisitor`: change `validator(node: unknown)` → `validator(node: MongoSchemaValidator)` and `collectionOptions(node: unknown)` → `collectionOptions(node: MongoSchemaCollectionOptions)` +- Update `AnyMongoSchemaNode` union type +- Export new classes from `exports/index.ts` + +**Tests:** +- Validator: construction, freeze, visitor dispatch +- CollectionOptions: construction with each option, freeze, visitor dispatch +- Collection with validator and options: construction, nested freeze + +**Package:** `packages/2-mongo-family/3-tooling/mongo-schema-ir/` + +--- + +#### 2.5 Add new DDL command classes + +**Goal:** Add `CreateCollectionCommand`, `DropCollectionCommand`, and `CollModCommand` to the DDL command AST. + +**What to do:** +- Create `CreateCollectionCommand extends MongoAstNode`: + - Fields: `collection`, `validator?`, `validationLevel?`, `validationAction?`, `capped?`, `size?`, `max?`, `timeseries?`, `collation?`, `changeStreamPreAndPostImages?`, `clusteredIndex?` + - `accept(visitor: MongoDdlCommandVisitor): R` +- Create `DropCollectionCommand extends MongoAstNode`: + - Fields: `collection` + - `accept(visitor)` +- Create `CollModCommand extends MongoAstNode`: + - Fields: `collection`, `validator?`, `validationLevel?`, `validationAction?`, `changeStreamPreAndPostImages?` + - `accept(visitor)` +- Update `AnyMongoDdlCommand` union type +- Update `MongoDdlCommandVisitor` with new methods: `createCollection`, `dropCollection`, `collMod` +- Add new index options to `CreateIndexCommand`: `wildcardProjection?`, `collation?`, `weights?`, `default_language?`, `language_override?` + +**Tests:** +- Each new command: construction, freeze, kind discriminant, visitor dispatch +- Updated `CreateIndexCommand` with new options: construction, freeze + +**Package:** `packages/2-mongo-family/4-query/query-ast/` + +--- + +#### 2.6 Implement canonical serialization utility + +**Goal:** Replace `JSON.stringify` with a key-order-independent canonical serialization for index lookup keys. + +**What to do:** +- Implement `canonicalize(obj: unknown): string` — recursively sorts object keys, handles arrays, primitives, null/undefined +- Can live in `@prisma-next/mongo-schema-ir` alongside `indexesEquivalent`, or as a shared utility + +**Tests:** +- `canonicalize({ b: 1, a: 2 })` === `canonicalize({ a: 2, b: 1 })` +- Nested objects: key order independent at all levels +- Arrays: order preserved (not sorted) +- Primitives, null, undefined handled correctly + +**Package:** `packages/2-mongo-family/3-tooling/mongo-schema-ir/` (or utility package) + +--- + +### Phase 2: Composition (depends on Phase 1 types) + +#### 2.7 Update `contractToSchema` for validators, options, and new index options + +**Goal:** Extend `contractToMongoSchemaIR` to convert validators, collection options, and new index options from contract to IR. + +**What to do:** +- Update `convertIndex` to pass through new options (`wildcardProjection`, `collation`, `weights`, `default_language`, `language_override`) +- Add `convertValidator(v: MongoStorageValidator): MongoSchemaValidator` +- Add `convertOptions(o: MongoStorageCollectionOptions): MongoSchemaCollectionOptions` +- Update `convertCollection` to include validator and options + +**Tests:** +- Index with each new option → correct IR +- Collection with validator → IR has `MongoSchemaValidator` +- Collection with options (capped, timeseries, collation, changeStreamPreAndPostImages, clusteredIndex) → IR has `MongoSchemaCollectionOptions` +- Null contract → empty IR (still works) +- Collection with no validator/options → IR without them (backward compat) + +**Package:** `packages/3-mongo-target/2-mongo-adapter/` + +**Depends on:** 2.1, 2.2, 2.3, 2.4 + +--- + +#### 2.8 Update serializer/deserializer for new DDL commands and index options + +**Goal:** Extend `mongo-ops-serializer` to handle the new command kinds and index options. + +**What to do:** +- Add Arktype validation schemas for `CreateCollectionCommand`, `DropCollectionCommand`, `CollModCommand` JSON shapes +- Add deserializer cases in `deserializeDdlCommand` for new `kind` values +- Update `CreateIndexJson` schema and `deserializeDdlCommand` case to handle new index options +- Verify serialization: new commands serialize correctly via `JSON.stringify` (frozen AST nodes) + +**Tests:** +- Round-trip: construct each new command → serialize → deserialize → structurally equal +- Round-trip: `CreateIndexCommand` with new options → serialize → deserialize → equal +- Invalid JSON shapes for new commands → validation error + +**Package:** `packages/3-mongo-target/2-mongo-adapter/` + +**Depends on:** 2.5 + +--- + +#### 2.9 Update DDL formatter for new commands and index options + +**Goal:** Extend `MongoDdlCommandFormatter` to render new commands and index options as display strings. + +**What to do:** +- Add `createCollection` method: `db.createCollection("name", { ...options })` +- Add `dropCollection` method: `db.name.drop()` +- Add `collMod` method: `db.runCommand({ collMod: "name", validator: ..., ... })` +- Update `createIndex` formatter to include new options (`collation`, `weights`, `default_language`, `language_override`, `wildcardProjection`) + +**Tests:** +- `CreateCollectionCommand` with options → correct display string +- `DropCollectionCommand` → correct display string +- `CollModCommand` with validator → correct display string +- `CreateIndexCommand` with text index options → correct display string +- `CreateIndexCommand` with wildcard projection → correct display string + +**Package:** `packages/3-mongo-target/2-mongo-adapter/` + +**Depends on:** 2.5 + +--- + +### Phase 3: Planner extensions (depends on Phase 2) + +#### 2.10 Extend planner for full index vocabulary + +**Goal:** Update the planner's index diffing to include new options in the lookup key, using canonical serialization. + +**What to do:** +- Update `buildIndexLookupKey` to include `wildcardProjection`, `collation`, `weights`, `default_language`, `language_override` using `canonicalize()` +- Replace existing `JSON.stringify(index.partialFilterExpression)` with `canonicalize(index.partialFilterExpression)` +- Update `planCreateIndex` to pass new options to `CreateIndexCommand` +- Update precheck/postcheck filter expressions for indexes with new options + +**Tests:** +- Same keys + different `wildcardProjection` → detected as different indexes +- Same keys + different `collation` → detected as different indexes +- Same keys + different `weights` → detected as different indexes +- `partialFilterExpression` with different key order → treated as same (canonical serialization) +- Text index: add → `createIndex`, remove → `dropIndex` +- Wildcard index: add → `createIndex`, remove → `dropIndex` +- Compound wildcard: add → `createIndex`, correct key spec + +**Package:** `packages/3-mongo-target/2-mongo-adapter/` + +**Depends on:** 2.6, 2.7 + +--- + +#### 2.11 Extend planner for validators + +**Goal:** Add validator diffing to the planner. + +**What to do:** +- Compare origin and destination collection validators +- No validator → validator added: emit `collMod` (or include in `createCollection` if collection is also new) +- Validator → no validator: emit `collMod` to remove validator +- Validator changed: emit `collMod` with new validator +- Classify validator changes per the spec (widening vs destructive) +- Implement `planUpdateValidator(collection, oldValidator, newValidator): MongoMigrationPlanOperation` +- Postchecks use `ListCollectionsCommand` + filter on `options.validationLevel` + +**Tests:** +- No validator → add validator: `collMod` operation, classified as `destructive` +- Remove validator: `collMod` operation, classified as `widening` +- Change `validationAction` error → warn: classified as `widening` +- Change `validationLevel` moderate → strict: classified as `destructive` +- Change `$jsonSchema` body: classified as `destructive` (conservative default) +- New collection with validator: validator included in `createCollection` (not a separate `collMod`) +- Policy gate: destructive validator change blocked when policy disallows destructive + +**Package:** `packages/3-mongo-target/2-mongo-adapter/` + +**Depends on:** 2.7 + +--- + +#### 2.12 Extend planner for collection options + +**Goal:** Add collection lifecycle and option diffing to the planner. + +**What to do:** +- New collection in destination but not in origin: + - If collection has options or validator → emit `createCollection` with all options (additive) + - Indexes on new collections are emitted as `createIndex` operations (after collection creation) +- Collection in origin but not in destination → emit `dropCollection` (destructive) +- Option changes on existing collections: + - `changeStreamPreAndPostImages` changed → emit `collMod` (widening if enabling, destructive if disabling) + - Immutable option changed (capped, timeseries, collation, clusteredIndex) → emit `MigrationPlannerConflict` with guidance +- Implement `planCreateCollection`, `planDropCollection` +- Operation ordering: collection creates first, collection drops last + +**Tests:** +- New collection with no options → no `createCollection` (MongoDB auto-creates on first write; indexes are sufficient) +- New collection with capped option → `createCollection` with capped +- New collection with clusteredIndex → `createCollection` with clusteredIndex +- New collection with timeseries → `createCollection` with timeseries +- New collection with validator + options → single `createCollection` with both +- Collection removed → `dropCollection` (destructive) +- Capped → non-capped on existing collection → conflict +- Collation change on existing collection → conflict +- `changeStreamPreAndPostImages` toggle → `collMod` +- Policy gate: `dropCollection` blocked when destructive disallowed + +**Package:** `packages/3-mongo-target/2-mongo-adapter/` + +**Depends on:** 2.7 + +--- + +### Phase 4: Runner + command executor (depends on Phase 3) + +#### 2.13 Extend command executor for new DDL commands + +**Goal:** Add handler methods to `MongoCommandExecutor` for the new DDL command kinds. + +**What to do:** +- Add `createCollection(cmd)` → `db.createCollection(cmd.collection, { ...options })` +- Add `dropCollection(cmd)` → `db.collection(cmd.collection).drop()` +- Add `collMod(cmd)` → `db.command({ collMod: cmd.collection, validator: ..., validationLevel: ..., validationAction: ..., ... })` +- Update `createIndex(cmd)` to pass new options (wildcardProjection, collation, weights, default_language, language_override) to the MongoDB driver + +**Tests (integration, using `mongodb-memory-server`):** +- `createCollection` with capped options → collection exists with correct options +- `createCollection` with clusteredIndex → collection exists as clustered +- `createCollection` with validator → collection has validator +- `createCollection` with timeseries → time series collection exists +- `createCollection` with collation → collection has collation +- `dropCollection` → collection no longer exists +- `collMod` with validator → validator updated on collection +- `collMod` with changeStreamPreAndPostImages → option updated +- `createIndex` with text options (weights, default_language) → text index created correctly +- `createIndex` with wildcardProjection → wildcard index with projection +- `createIndex` with collation → index has collation + +**Package:** `packages/3-mongo-target/2-mongo-adapter/` + +**Depends on:** 2.5 + +--- + +#### 2.14 Extend inspection executor for collection option/validator checks + +**Goal:** Ensure the inspection executor and filter evaluator handle `listCollections` results with nested option fields. + +**What to do:** +- Verify `ListCollectionsCommand` returns collection info including `options.validator`, `options.validationLevel`, `options.validationAction`, `options.capped`, etc. +- Verify `FilterEvaluator` handles dotted paths like `options.validator.$jsonSchema` and `options.validationLevel` correctly (already supported — verify with tests) + +**Tests (integration):** +- Create collection with validator → `listCollections` → filter on `options.validationLevel` matches +- Create capped collection → `listCollections` → filter on `options.capped` matches + +**Package:** `packages/3-mongo-target/2-mongo-adapter/` + +**Depends on:** 2.13 + +--- + +### Phase 5: PSL authoring + emitter (can start after Phase 1; full integration after Phase 3) + +#### 2.15 Add `@@index` and `@@unique` to Mongo PSL interpreter + +**Goal:** Handle index-related model attributes in the Mongo PSL interpreter, populating `storage.collections[].indexes`. + +**What to do:** +- In `interpretPslDocumentToMongoContract()`, handle `modelAttribute.name === 'index'` and `modelAttribute.name === 'unique'`: + - Parse field list using `parseAttributeFieldList` (shared with SQL PSL) + - Parse Mongo-specific named arguments: `sparse`, `expireAfterSeconds`, `type` (for text direction), `weights`, `defaultLanguage`, `languageOverride`, `collation` + - Map field names to storage field names (respecting `@map`) + - Construct `MongoStorageIndex` entries +- Handle field-level `@unique` attribute → single-field unique index +- Populate `storage.collections[collectionName].indexes` instead of `{}` + +**Tests:** +- `@@index([field1, field2])` → compound ascending index +- `@@unique([field])` → unique index +- `@unique` on field → single-field unique index +- `@@index([field], expireAfterSeconds: 3600)` → TTL index +- `@@index([field], sparse: true)` → sparse index +- `@@index([field], type: "text", weights: { field: 10 })` → text index with weights +- `@@index([field1, field2], map: "custom_name")` → index (name is metadata, not identity) +- Fields with `@map` → index uses mapped field names +- Invalid: `@@index` with no fields → diagnostic +- Multiple `@@index` on same model → multiple indexes on same collection + +**Package:** `packages/2-mongo-family/2-authoring/contract-psl/` + +**Depends on:** 2.1 + +--- + +#### 2.16 Auto-derive `$jsonSchema` validator from model fields + +**Goal:** Implement a utility that produces a `$jsonSchema` document from a contract's model field definitions. + +**What to do:** +- Implement `deriveJsonSchema(model: MongoModelDefinition): Record` +- Map contract field types to BSON types: + - Scalar types via codec ID → BSON type mapping + - Value objects → `"object"` with nested `properties` (recursive) + - Nullable fields → `bsonType: ["", "null"]` + - Array fields (`many: true`) → `bsonType: "array"` with `items` +- Non-nullable fields added to `required` array +- Return a complete `$jsonSchema` object with `bsonType: "object"`, `required`, `properties` +- This is a standalone utility function, testable independently + +**Tests:** +- Model with String, Int, Float, Boolean, DateTime fields → correct BSON types +- Nullable field → `["string", "null"]` +- Array field → `"array"` with items +- Value object field → nested `"object"` with properties +- Mixed nullable + array combinations +- Empty model → minimal schema + +**Package:** `packages/2-mongo-family/2-authoring/contract-psl/` or `packages/2-mongo-family/3-tooling/emitter/` + +**Depends on:** 2.2 + +--- + +#### 2.17 Update Mongo emitter to populate enriched `storage.collections` + +**Goal:** Wire the PSL interpreter output and `$jsonSchema` derivation into the emitter pipeline so emitted contracts carry indexes and validators. + +**What to do:** +- After PSL interpretation produces the contract, run `deriveJsonSchema` on each model +- Populate `storage.collections[collectionName].validator` with the derived schema + default validation policy (`validationLevel: 'strict'`, `validationAction: 'error'`) +- Merge PSL-derived indexes (from 2.15) into `storage.collections[collectionName].indexes` +- Update the emitter's validation to accept the enriched collection shape + +**Tests:** +- PSL with `@@index` + model fields → emitted contract has both indexes and validator +- Emitted `$jsonSchema` matches expected derivation from model fields +- No `@@index` attributes → emitted contract has validator but no indexes +- Emitted contract passes Arktype validation + +**Package:** `packages/2-mongo-family/3-tooling/emitter/` and/or `packages/2-mongo-family/2-authoring/contract-psl/` + +**Depends on:** 2.15, 2.16 + +--- + +### Phase 6: End-to-end proof (depends on all previous phases) + +#### 2.18 End-to-end integration tests: full vocabulary + +**Goal:** Prove the full pipeline works against a real MongoDB instance for all M2 features. + +**What to do:** +- Hand-craft contracts exercising: + - Compound indexes (multi-field, ascending + descending) + - Text indexes with weights, default_language + - Wildcard indexes with wildcardProjection + - TTL indexes + - Partial indexes + - Geospatial indexes (2dsphere) + - Hashed indexes + - Indexes with collation (case-insensitive) +- Test sequence: + 1. Contract v1 (empty) → plan → apply → verify indexes exist + 2. Contract v2 (modify: change some indexes, add validator, add collection options) → plan → apply → verify changes + 3. Contract v3 (remove indexes, relax validator) → plan → apply → verify removals + +**Tests (integration, using `mongodb-memory-server`):** +- Create cycle: plan → apply → listIndexes confirms each index type +- Modify cycle: plan → apply → validator changed, indexes changed +- Remove cycle: plan → apply → dropIndex, validator relaxed +- Collection with options: createCollection with capped → verify +- Collection with clusteredIndex: createCollection → verify +- Idempotent re-apply: already-applied plan → no-op +- Validator: collMod → listCollections confirms validationLevel/validationAction + +**Package:** `test/integration/test/mongo/` and/or `packages/3-mongo-target/2-mongo-adapter/test/` + +**Depends on:** all previous tasks + +--- + +#### 2.19 End-to-end PSL authoring test + +**Goal:** Prove the full PSL → contract → plan → apply flow. + +**What to do:** +- Write a PSL schema with `@@index`, `@@unique`, `@unique`, and model fields +- Run through the emitter to produce a contract with indexes and validator +- Feed the contract through `migration plan` → `migration apply` +- Verify indexes and validator exist on `mongodb-memory-server` + +**Tests (integration):** +- PSL → emitter → contract → plan → apply → verify indexes +- PSL → emitter → contract with derived validator → plan → apply → verify validator + +**Package:** `test/integration/test/mongo/` + +**Depends on:** 2.17, 2.18 + +--- + +## Package summary + +| Modified package | Changes | +|---|---| +| `@prisma-next/mongo-contract` | New index options, `MongoStorageValidator`, `MongoStorageCollectionOptions`, Arktype schemas | +| `@prisma-next/mongo-schema-ir` | New index options on `MongoSchemaIndex`, `MongoSchemaValidator` class, `MongoSchemaCollectionOptions` class, canonical serialization, updated visitor | +| `@prisma-next/mongo-query-ast` | `CreateCollectionCommand`, `DropCollectionCommand`, `CollModCommand`, updated `CreateIndexCommand`, updated visitor | +| `@prisma-next/adapter-mongo` | Extended planner (index, validator, collection diffs), extended command executor, extended formatter, updated serializer | +| `@prisma-next/mongo-contract-psl` | `@@index`, `@@unique`, `@unique` handling, `$jsonSchema` derivation | +| `@prisma-next/mongo-emitter` | Enriched `storage.collections` with indexes + validator | + +## Dependency graph + +``` +Phase 1 (parallel — foundation types): + 2.1 Index options in contract ──────────────────────┐ + 2.2 Validator + options in contract ─────────────────┤ + 2.3 Index options in schema IR ──────────────────────┤ + 2.4 Validator + options nodes in schema IR ───────────┤ + 2.5 New DDL command classes ─────────────────────────┤ + 2.6 Canonical serialization utility ─────────────────┤ + │ +Phase 2 (depends on Phase 1): │ + 2.7 contractToSchema extensions ─────────────────────┤ (needs 2.1–2.4) + 2.8 Serializer/deserializer extensions ──────────────┤ (needs 2.5) + 2.9 DDL formatter extensions ────────────────────────┤ (needs 2.5) + │ +Phase 3 (depends on Phase 2): │ + 2.10 Planner: full index vocabulary ──────────────────┤ (needs 2.6, 2.7) + 2.11 Planner: validators ─────────────────────────────┤ (needs 2.7) + 2.12 Planner: collection options ─────────────────────┤ (needs 2.7) + │ +Phase 4 (depends on Phase 1 DDL types): │ + 2.13 Command executor: new commands ──────────────────┤ (needs 2.5) + 2.14 Inspection executor: option checks ──────────────┤ (needs 2.13) + │ +Phase 5 (PSL — can start after Phase 1): │ + 2.15 PSL interpreter: @@index/@@unique ───────────────┤ (needs 2.1) + 2.16 $jsonSchema derivation ──────────────────────────┤ (needs 2.2) + 2.17 Emitter: enriched storage.collections ───────────┤ (needs 2.15, 2.16) + │ +Phase 6 (E2E — depends on all): │ + 2.18 E2E integration tests ───────────────────────────┤ (needs Phases 1–4) + 2.19 E2E PSL authoring test ──────────────────────────┘ (needs 2.17, 2.18) +``` + +Note: Phase 4 (command executor) and Phase 5 (PSL authoring) can proceed in parallel with Phase 3 (planner). The command executor needs only DDL command types (Phase 1), not planner output. PSL authoring needs only contract types (Phase 1). + +## Testing strategy + +| Test type | Location | Framework | Infrastructure | +|---|---|---|---| +| Unit (types, AST, Arktype, canonical serialization) | Colocated `test/` in each package | Vitest | None | +| Unit (planner diffs, formatter, serialization) | Colocated `test/` in each package | Vitest | None | +| Integration (command executor, runner, E2E) | Package `test/` or `test/integration/test/mongo/` | Vitest | `mongodb-memory-server` via `MongoMemoryReplSet` | +| Unit (PSL interpreter) | `packages/2-mongo-family/2-authoring/contract-psl/test/` | Vitest | None | + +**Mongo test setup patterns:** +- Use `MongoMemoryReplSet` for integration tests +- Use `describeWithMongoDB` or `withMongod` helpers +- Set `timeout` and `hookTimeout` to `timeouts.spinUpDbServer` +- `beforeEach`: drop test database for isolation +- `fileParallelism: false` in vitest config for DB tests + +## Risk and open items + +- **`$jsonSchema` derivation depth:** The derivation maps contract field types to BSON types. Edge cases with deeply nested value objects, union types, or dict fields may require iteration. Start with basic types and build up. +- **Text index restrictions:** Text indexes have unique constraints (only one per collection, special compound rules). The planner should detect and report conflicts if a user defines multiple text indexes on the same collection. +- **Time series collections:** `mongodb-memory-server` support for time series collections should be verified early. If not supported, those tests may need to be deferred or use a different test infrastructure. +- **PSL grammar expressibility:** Some Mongo-specific index options (e.g., `weights: { bio: 10 }`) require object-valued named arguments in PSL attributes. Verify the PSL parser can handle this; if not, consider alternative syntax. +- **Emitter integration:** The relationship between the PSL interpreter and the emitter needs clarification — does the interpreter produce the enriched contract directly, or does the emitter post-process? The SQL pattern should guide this. diff --git a/projects/mongo-schema-migrations/specs/m2-full-vocabulary.spec.md b/projects/mongo-schema-migrations/specs/m2-full-vocabulary.spec.md new file mode 100644 index 0000000000..ddce7ca483 --- /dev/null +++ b/projects/mongo-schema-migrations/specs/m2-full-vocabulary.spec.md @@ -0,0 +1,474 @@ +# M2: Full Index Vocabulary, Validators, and Collection Options + +## Summary + +Extend every layer of the MongoDB migration pipeline — contract types, schema IR, Arktype validation, DDL commands, planner, runner, serializer, CLI formatter, and PSL authoring — to cover the full breadth of MongoDB server-side configuration: all index types and options, `$jsonSchema` validators, and collection options. M1 proved the architecture works for a single ascending index; M2 fills in the vocabulary. + +## Grounding example + +A Prisma schema declares a `users` collection with a unique email index, a text index on `bio` with weights and language, a TTL index on `lastSeen`, and a `$jsonSchema` validator derived from model fields: + +```prisma +model User { + id String @id @map("_id") + email String @unique + bio String + lastSeen DateTime + + @@index([bio], type: "text", weights: { bio: 10 }, defaultLanguage: "english") + @@index([lastSeen], expireAfterSeconds: 2592000) +} +``` + +The contract's `storage.collections.users` carries: + +```json +{ + "indexes": [ + { "keys": [{ "field": "email", "direction": 1 }], "unique": true }, + { + "keys": [{ "field": "bio", "direction": "text" }], + "weights": { "bio": 10 }, + "default_language": "english" + }, + { "keys": [{ "field": "lastSeen", "direction": 1 }], "expireAfterSeconds": 2592000 } + ], + "validator": { + "jsonSchema": { + "bsonType": "object", + "required": ["email", "bio", "lastSeen"], + "properties": { + "email": { "bsonType": "string" }, + "bio": { "bsonType": "string" }, + "lastSeen": { "bsonType": "date" } + } + }, + "validationLevel": "strict", + "validationAction": "error" + } +} +``` + +The planner diffs this against the prior contract's schema IR and emits `createIndex`, `dropIndex`, and `collMod` operations as needed. The runner applies them against a live MongoDB instance. + +## What M2 adds (layer by layer) + +### 1. Index options + +M1 established the contract type `MongoStorageIndex` with `keys`, `unique`, `sparse`, `expireAfterSeconds`, and `partialFilterExpression`. The direction type `MongoIndexKeyDirection` already covers `1 | -1 | 'text' | '2dsphere' | '2d' | 'hashed'`. + +M2 adds these index options to complete the vocabulary: + +| Option | Type | Used by | Notes | +|---|---|---|---| +| `wildcardProjection` | `Record` | Wildcard indexes (`$**`) | Include/exclude fields from wildcard index | +| `collation` | `Record` | Any index | Per-index collation for case-insensitive matching | +| `weights` | `Record` | Text indexes | Field weight assignment | +| `default_language` | `string` | Text indexes | Default language for text analysis | +| `language_override` | `string` | Text indexes | Per-document language override field name | + +These options affect index behavior and must be part of the index lookup key for structural matching. + +#### Wildcard indexes + +Wildcard indexes use `$**` (or `path.$**`) as the **field name**, not as a direction value. Examples: + +```typescript +// All fields +{ field: '$**', direction: 1 } + +// Path-specific +{ field: 'attributes.$**', direction: 1 } + +// Compound wildcard (MongoDB 7.0+) +[ + { field: 'tenantId', direction: 1 }, + { field: '$**', direction: 1 }, +] +``` + +The `wildcardProjection` option only applies when the wildcard key is `$**` (not path-specific). `wildcardProjection` is part of the wildcard index's identity (it changes which fields are indexed). + +#### Index options NOT modeled + +These are excluded as legacy, debugging, or internal metadata: + +- `hidden` — debugging tool (hide index from query planner without dropping) +- `textIndexVersion`, `2dsphereIndexVersion` — internal version metadata +- `min` / `max` / `bits` — legacy 2d index coordinate bounds +- `storageEngine` — per-index storage engine config (niche) + +### 2. Validators + +MongoDB supports `$jsonSchema` validators that reject documents not matching a schema. The validator is set at collection creation or modified via `collMod`. + +#### Contract type + +```typescript +interface MongoStorageValidator { + readonly jsonSchema: Record; + readonly validationLevel: 'strict' | 'moderate'; + readonly validationAction: 'error' | 'warn'; +} +``` + +Added as an optional field on `MongoStorageCollection`: + +```typescript +interface MongoStorageCollection { + readonly indexes?: ReadonlyArray; + readonly validator?: MongoStorageValidator; + readonly options?: MongoStorageCollectionOptions; +} +``` + +#### Schema IR + +`MongoSchemaValidator` is a new concrete class extending `MongoSchemaNode`: + +```typescript +class MongoSchemaValidator extends MongoSchemaNode { + readonly kind = 'validator' as const; + readonly jsonSchema: Record; + readonly validationLevel: 'strict' | 'moderate'; + readonly validationAction: 'error' | 'warn'; +} +``` + +The existing `MongoSchemaVisitor` already has a `validator(node)` method (typed as `unknown` in M1). M2 changes this to `validator(node: MongoSchemaValidator)`. + +#### Validator diff classification + +The planner classifies validator changes: + +| Change | Classification | Rationale | +|---|---|---| +| No validator → validator added | `destructive` | New validation may reject existing documents | +| Validator removed | `widening` | Removes constraints | +| `validationAction` error → warn | `widening` | Relaxes enforcement | +| `validationAction` warn → error | `destructive` | Tightens enforcement | +| `validationLevel` strict → moderate | `widening` | Relaxes scope | +| `validationLevel` moderate → strict | `destructive` | Tightens scope | +| `$jsonSchema` body changed | `destructive` (default) | Conservative: any structural change to the schema body is treated as destructive | + +The conservative default for `$jsonSchema` body changes is intentional. Proper JSON Schema subset detection is complex and out of scope for M2. The architecture validates that the migration system can handle validator operations; comprehensive diff classification can be refined later. + +#### DDL command + +A single `CollModCommand` handles both validator and collection option changes, matching how MongoDB's `collMod` command works: + +```typescript +class CollModCommand extends MongoAstNode { + readonly kind = 'collMod' as const; + readonly collection: string; + readonly validator?: Record; + readonly validationLevel?: 'strict' | 'moderate'; + readonly validationAction?: 'error' | 'warn'; + readonly changeStreamPreAndPostImages?: { enabled: boolean }; +} +``` + +#### Operation structure + +Validator operations use `listCollections` for checks (querying `options.validator`): + +```typescript +const op: MongoMigrationPlanOperation = { + id: 'validator.users.update', + label: 'Update validator on users', + operationClass: 'destructive', + precheck: [{ + description: 'collection exists', + source: new ListCollectionsCommand(), + filter: MongoFieldFilter.eq('name', 'users'), + expect: 'exists', + }], + execute: [{ + description: 'update validator on users', + command: new CollModCommand('users', { + validator: { $jsonSchema: { ... } }, + validationLevel: 'strict', + validationAction: 'error', + }), + }], + postcheck: [{ + description: 'validator applied', + source: new ListCollectionsCommand(), + filter: MongoAndExpr.of([ + MongoFieldFilter.eq('name', 'users'), + MongoFieldFilter.eq('options.validationLevel', 'strict'), + ]), + expect: 'exists', + }], +}; +``` + +### 3. Collection options + +MongoDB collections can be created with options that affect storage and behavior. Some options are immutable after creation; others can be modified via `collMod`. + +#### Contract type + +```typescript +interface MongoStorageCollectionOptions { + readonly capped?: { size: number; max?: number }; + readonly timeseries?: { + timeField: string; + metaField?: string; + granularity?: 'seconds' | 'minutes' | 'hours'; + }; + readonly collation?: Record; + readonly changeStreamPreAndPostImages?: { enabled: boolean }; + readonly clusteredIndex?: { name?: string }; +} +``` + +`clusteredIndex` is modeled minimally — the key is always `{ _id: 1 }` and unique is always `true`, so only the optional name needs storing. + +#### Schema IR + +`MongoSchemaCollectionOptions` is a new concrete class extending `MongoSchemaNode`: + +```typescript +class MongoSchemaCollectionOptions extends MongoSchemaNode { + readonly kind = 'collectionOptions' as const; + readonly capped?: { size: number; max?: number }; + readonly timeseries?: { timeField: string; metaField?: string; granularity?: 'seconds' | 'minutes' | 'hours' }; + readonly collation?: Record; + readonly changeStreamPreAndPostImages?: { enabled: boolean }; + readonly clusteredIndex?: { name?: string }; +} +``` + +The `MongoSchemaVisitor`'s existing `collectionOptions(node)` method (typed as `unknown` in M1) is updated to `collectionOptions(node: MongoSchemaCollectionOptions)`. + +#### Mutability rules + +| Option | Set at creation | Modifiable via `collMod` | Planner behavior | +|---|---|---|---| +| `capped` | Yes | No | Conflict if changed on existing collection | +| `timeseries` | Yes | No | Conflict if changed on existing collection | +| `collation` | Yes | No | Conflict if changed on existing collection | +| `changeStreamPreAndPostImages` | Yes | Yes | `collMod` operation (widening or destructive) | +| `clusteredIndex` | Yes | No | Conflict if changed on existing collection | + +For immutable options, the planner emits a `MigrationPlannerConflict` with guidance suggesting a manual migration (drop + recreate with new options). + +#### DDL commands + +```typescript +class CreateCollectionCommand extends MongoAstNode { + readonly kind = 'createCollection' as const; + readonly collection: string; + readonly validator?: Record; + readonly validationLevel?: 'strict' | 'moderate'; + readonly validationAction?: 'error' | 'warn'; + readonly capped?: boolean; + readonly size?: number; + readonly max?: number; + readonly timeseries?: { timeField: string; metaField?: string; granularity?: 'seconds' | 'minutes' | 'hours' }; + readonly collation?: Record; + readonly changeStreamPreAndPostImages?: { enabled: boolean }; + readonly clusteredIndex?: { key: Record; unique: boolean; name?: string }; +} + +class DropCollectionCommand extends MongoAstNode { + readonly kind = 'dropCollection' as const; + readonly collection: string; +} +``` + +`CollModCommand` (defined above under Validators) also handles the mutable collection options (`changeStreamPreAndPostImages`). + +#### Operation ordering + +The planner emits operations in this deterministic order: + +1. **Collection creates** (additive) — new collections first, so indexes on them can be created +2. **Index drops** (destructive) — drop obsolete indexes before creating replacements +3. **Index creates** (additive) — new indexes +4. **Validator updates** — after structural changes +5. **Collection option updates** — after structural changes +6. **Collection drops** (destructive) — last, most destructive + +Within each category, operations are ordered lexicographically by collection name, then by index key spec. + +### 4. PSL authoring + +The Mongo PSL interpreter (`@prisma-next/mongo-contract-psl`) currently fills `storage.collections[name] = {}` — empty objects with no indexes, validators, or options. M2 adds: + +#### Index attributes + +Following the SQL PSL interpreter's pattern (`modelAttribute.name === 'index' | 'unique'`): + +- **`@@index([field1, field2], ...options)`** — compound index on named fields +- **`@@unique([field1, field2], ...options)`** — same with `unique: true` +- **`@unique`** (field-level) — single-field unique index + +Named arguments for Mongo-specific options: + +```prisma +model User { + email String @unique + bio String + lastSeen DateTime + + @@index([bio], type: "text", weights: { bio: 10 }, defaultLanguage: "english") + @@index([lastSeen], expireAfterSeconds: 2592000) + @@index([email, tenantId], sparse: true) +} +``` + +The interpreter maps these to `MongoStorageIndex` entries in `storage.collections[collectionName].indexes`. + +#### Validator derivation + +The emitter auto-derives a `$jsonSchema` validator from model field definitions. The derivation maps contract field types to BSON types: + +| Contract field type | BSON type | +|---|---| +| `String` | `"string"` | +| `Int` | `"int"` | +| `Float` | `"double"` | +| `Boolean` | `"bool"` | +| `DateTime` | `"date"` | +| `ObjectId` | `"objectId"` | +| Value object | `"object"` with nested properties | + +Non-nullable fields are added to the `required` array. Nullable fields use `bsonType: ["", "null"]`. Array fields (`many: true`) use `bsonType: "array"` with `items`. + +The depth of derivation is bounded by the contract's expressiveness — the contract carries field types, nullability, and cardinality, which maps directly to a `$jsonSchema`. Features beyond contract expressiveness (enums, pattern validation, min/max constraints) are not auto-derived. + +### 5. Canonical serialization for structural matching + +M1's `buildIndexLookupKey` uses `JSON.stringify` for `partialFilterExpression`, which is key-order dependent. M2 adds new object-valued options (`wildcardProjection`, `collation`, `weights`) that have the same issue. The lookup key computation must use a canonical serialization that produces the same string regardless of key order. + +```typescript +function canonicalize(obj: unknown): string { + if (obj === null || obj === undefined) return String(obj); + if (typeof obj !== 'object') return JSON.stringify(obj); + if (Array.isArray(obj)) return `[${obj.map(canonicalize).join(',')}]`; + const sorted = Object.keys(obj as Record).sort(); + const entries = sorted.map(k => `${JSON.stringify(k)}:${canonicalize((obj as Record)[k])}`); + return `{${entries.join(',')}}`; +} +``` + +The lookup key for M2 includes all identity-significant options: + +```typescript +function buildIndexLookupKey(index: MongoSchemaIndex): string { + const keys = index.keys.map(k => `${k.field}:${k.direction}`).join(','); + const opts = [ + index.unique ? 'unique' : '', + index.sparse ? 'sparse' : '', + index.expireAfterSeconds != null ? `ttl:${index.expireAfterSeconds}` : '', + index.partialFilterExpression ? `pfe:${canonicalize(index.partialFilterExpression)}` : '', + index.wildcardProjection ? `wp:${canonicalize(index.wildcardProjection)}` : '', + index.collation ? `col:${canonicalize(index.collation)}` : '', + index.weights ? `wt:${canonicalize(index.weights)}` : '', + index.default_language ? `lang:${index.default_language}` : '', + index.language_override ? `lo:${index.language_override}` : '', + ].filter(Boolean).join(';'); + return opts ? `${keys}|${opts}` : keys; +} +``` + +### 6. Updated DDL visitor and command executor + +The `MongoDdlCommandVisitor` interface gains methods for the new DDL commands: + +```typescript +interface MongoDdlCommandVisitor { + createIndex(command: CreateIndexCommand): R; + dropIndex(command: DropIndexCommand): R; + createCollection(command: CreateCollectionCommand): R; + dropCollection(command: DropCollectionCommand): R; + collMod(command: CollModCommand): R; +} +``` + +Adding these methods forces compile-time updates to: +- `MongoCommandExecutor` (runner — maps to MongoDB driver calls) +- `MongoDdlCommandFormatter` (CLI — produces display strings) +- `mongo-ops-serializer` (serialization/deserialization) + +#### Command executor mappings + +| Visitor method | MongoDB driver call | +|---|---| +| `createIndex` | `collection.createIndex(keySpec, options)` | +| `dropIndex` | `collection.dropIndex(name)` | +| `createCollection` | `db.createCollection(name, options)` | +| `dropCollection` | `collection.drop()` | +| `collMod` | `db.command({ collMod: name, validator: ..., ... })` | + +#### CLI formatter output + +``` +db.users.createIndex({ "bio": "text" }, { weights: { "bio": 10 }, default_language: "english" }) +db.users.createIndex({ "lastSeen": 1 }, { expireAfterSeconds: 2592000 }) +db.createCollection("events", { capped: true, size: 1048576 }) +db.runCommand({ collMod: "users", validator: { $jsonSchema: { ... } }, validationLevel: "strict" }) +db.orders.drop() +``` + +## Acceptance criteria + +### Index vocabulary +- [ ] All index key types tested through full pipeline: ascending, descending, compound, text, geospatial (2d, 2dsphere), hashed, wildcard (`$**`, `path.$**`), compound wildcard +- [ ] All index options supported and tested: `unique`, `sparse`, `expireAfterSeconds`, `partialFilterExpression`, `wildcardProjection`, `collation`, `weights`, `default_language`, `language_override` +- [ ] Index lookup key uses canonical serialization (key-order independent) for object-valued options +- [ ] DDL formatter renders all index types and options correctly + +### Validators +- [ ] `MongoStorageValidator` type in contract with `jsonSchema`, `validationLevel`, `validationAction` +- [ ] `MongoSchemaValidator` node in schema IR +- [ ] Arktype validation accepts validator definitions +- [ ] Planner generates `collMod` operations for validator changes +- [ ] Validator changes classified as widening or destructive +- [ ] Runner executes `collMod` with validator against real MongoDB +- [ ] `contractToSchema` converts validator from contract to IR + +### Collection options +- [ ] `MongoStorageCollectionOptions` type with capped, timeseries, collation, changeStreamPreAndPostImages, clusteredIndex +- [ ] `MongoSchemaCollectionOptions` node in schema IR +- [ ] Arktype validation accepts collection option definitions +- [ ] `CreateCollectionCommand` DDL class with all options +- [ ] `DropCollectionCommand` DDL class +- [ ] `CollModCommand` DDL class for mutable options and validators +- [ ] Planner generates `createCollection` with options for new collections +- [ ] Planner emits conflicts for unsupported transitions (capped changes, timeseries changes, collation changes, clustered changes) +- [ ] Runner executes collection commands against real MongoDB + +### PSL authoring +- [ ] Mongo PSL interpreter handles `@@index`, `@@unique`, `@unique` +- [ ] Mongo-specific index options parsed as named arguments (sparse, expireAfterSeconds, collation, type, weights, etc.) +- [ ] Emitter populates `storage.collections[].indexes` from PSL annotations +- [ ] Emitter auto-derives `$jsonSchema` validator from model field definitions +- [ ] Round-trip: PSL → contract → `contractToSchema` → planner produces correct operations + +### Serialization +- [ ] New DDL commands (`CreateCollectionCommand`, `DropCollectionCommand`, `CollModCommand`) serialize/deserialize correctly +- [ ] All new index options serialize/deserialize in `CreateIndexCommand` +- [ ] Arktype validation schemas for all new command kinds + +### End-to-end +- [ ] Integration test: compound indexes, TTL indexes, partial indexes, text indexes with weights, wildcard indexes → plan → apply → verify on `mongodb-memory-server` +- [ ] Integration test: validator added → `collMod` applied → verify validator on `mongodb-memory-server` +- [ ] Integration test: collection created with options (capped, collation, clusteredIndex) → verify options on `mongodb-memory-server` +- [ ] Integration test: second contract modifying validators and removing indexes → correct `collMod`/`dropIndex` operations → apply succeeds + +## Alternatives considered + +### Full JSON Schema subset detection for validator diffs + +We could implement proper JSON Schema containment checking to classify validator changes as widening vs destructive more precisely (e.g., adding an optional field is widening, adding a required field is destructive). This is algorithmically complex (JSON Schema subsumption is co-NP-hard in the general case) and not needed for the architecture validation goals of M2. The conservative approach (any body change = destructive) is safe and can be refined later. + +### Separate DDL commands per concern + +We could have `UpdateValidatorCommand`, `UpdateCollectionOptionsCommand` as separate DDL classes instead of a single `CollModCommand`. MongoDB's `collMod` is one command that handles both, and splitting it would mean the command model diverges from MongoDB's actual API. A single `CollModCommand` is simpler, matches the database, and carries optional fields for each concern. + +### Skip PSL authoring for M2 + +We could defer PSL authoring to a later milestone and continue using hand-crafted contracts. The authoring infrastructure already exists (`@prisma-next/mongo-contract-psl` with a working interpreter) and the SQL pattern for `@@index` / `@@unique` is proven. Adding Mongo index support is incremental work on existing infrastructure, and it's needed to prove the full contract-first flow (PSL → contract → migration) for MongoDB. diff --git a/projects/mongo-schema-migrations/specs/psl-binding-model.md b/projects/mongo-schema-migrations/specs/psl-binding-model.md new file mode 100644 index 0000000000..5fbb9f6461 --- /dev/null +++ b/projects/mongo-schema-migrations/specs/psl-binding-model.md @@ -0,0 +1,478 @@ +# PSL Binding Model + +The PSL parser produces an untyped AST of blocks, members, attributes, and values. The **binding layer** sits between the parser and target-specific interpreters. It resolves identifiers, validates structure, and produces a validated AST that interpreters can consume without manual parsing. + +This document defines the binding model: scopes, name resolution, block type definitions, and attribute definitions. + +## Three layers + +``` +PSL text + │ + ▼ +┌──────────────────────┐ +│ Parser (grammar) │ Tokenize, parse blocks/members/attributes/values. +│ │ No knowledge of what keywords or attributes mean. +│ │ Produces untyped AST with typed values. +└──────────┬───────────┘ + │ + ▼ +┌──────────────────────┐ +│ Binding layer │ Resolve identifiers against scopes. +│ │ Validate structure against block/attribute definitions. +│ │ Produce validated, resolved AST. +│ │ Driven by a BindingContext provided by framework components. +└──────────┬───────────┘ + │ + ▼ +┌──────────────────────┐ +│ Interpreter │ Target-specific semantics (Mongo, SQL). +│ (family-specific) │ Receives resolved AST — no raw string parsing. +└──────────────────────┘ +``` + +The parser is generic — it knows PSL syntax but not semantics. The binding layer is also generic — it resolves names and validates structure, but doesn't know what a "model" or "index" means. Meaning is assigned by the interpreter, which receives a clean, validated AST. + +## Binding context + +The binding context is the configuration that tells the binding layer what constructs are valid. It is assembled from framework component contributions — the core framework provides base block types and common attributes, and each family/extension adds its own. + +```typescript +interface BindingContext { + readonly blockTypes: ReadonlyArray; + readonly contextDirectives: ReadonlyArray; + readonly builtinTypes: ReadonlyArray; + readonly typeConstructors: ReadonlyArray; + readonly tagHandlers: ReadonlyArray; +} +``` + +### Builtin types + +The binding context declares what scalar type names are available in the document scope: + +```typescript +interface BuiltinTypeEntry { + readonly name: string; // "String", "Int", "ObjectId", ... + readonly category: "scalar"; // all builtins are scalars +} +``` + +The core framework provides `String`, `Int`, `Boolean`, `DateTime`, `Float`, `Decimal`, `Json`, `Bytes`, `BigInt`. Target families add their own (e.g. Mongo adds `ObjectId`). + +### Block type definitions + +A block type definition describes a kind of entity declaration — what keyword introduces it, what members look like, and what attributes are valid. + +```typescript +interface BlockTypeDefinition { + readonly keyword: string; + readonly memberSchema: MemberSchema; + readonly memberAttributes: ReadonlyArray; + readonly blockAttributes: ReadonlyArray; +} +``` + +The framework registers block types. Different families can extend the same keyword's definition (e.g. both SQL and Mongo contribute block attributes to `model`). + +#### Member schema + +The member schema describes what members look like inside this block type: + +```typescript +interface MemberSchema { + readonly hasTypeExpression: boolean; // fields have types; enum values don't + readonly hasAssignment: boolean; // types block uses `name = expr`; others don't + readonly typeCategories: ReadonlyArray; // what categories of type names are valid +} +``` + +For example: +- `model`: members have type expressions, types can be scalars, models (relations), enums, composite types, or aliases +- `enum`: members have no type expression and no assignment — they're bare names +- `types` (context directive): members have assignments + +### Context directive definitions + +Context directives are simpler — they modify the interpretation environment rather than declaring entities: + +```typescript +interface ContextDirectiveDefinition { + readonly keyword: string; + readonly memberSchema: MemberSchema; + readonly memberAttributes: ReadonlyArray; +} +``` + +The `types` directive introduces type aliases into the document scope. + +### Attribute definitions + +An attribute definition describes a single attribute — its name, what arguments it accepts, and what functions are available within its argument scope. + +```typescript +interface AttributeDefinition { + readonly name: string; + readonly arguments: ReadonlyArray; + readonly functions?: ReadonlyArray; +} + +interface ArgumentDefinition { + readonly name?: string; // undefined = positional + readonly type: ValueTypeConstraint; + readonly required: boolean; + readonly scope?: ScopeDirective; // how to narrow the scope for this argument's values + readonly dictionary?: DictionaryConstraint; // for typed dictionary arguments + readonly valueSchema?: JsonSchema; // JSON Schema applied to opaque values +} + +interface FunctionDefinition { + readonly name: string; + readonly arguments: ReadonlyArray; +} +``` + +Functions declared on an attribute are available only within that attribute's argument values. For example, `wildcard()` is scoped to `@@index`'s field list. + +### Typed dictionary arguments + +A **typed dictionary** is an Object argument where the keys and values have separate validation strategies. This is declared via `dictionary` and `valueSchema` on an `ArgumentDefinition`: + +```typescript +interface DictionaryConstraint { + readonly keyScope: ScopeDirective; // how to resolve keys (e.g. as field refs) + readonly valueValidation: "opaque"; // values are not resolved, just schema-validated +} +``` + +When `dictionary` is set, the binding layer: +1. Resolves each top-level key as an identifier in the scope specified by `keyScope` (e.g. `enclosingEntity` for field references) +2. Does **not** resolve identifiers within the values — they are opaque structured data +3. If `valueSchema` is set, validates each value against the provided JSON Schema + +This is distinct from: +- A plain **Object** argument (fully opaque — no key resolution, no schema validation) +- A **List** argument with scope (every element is resolved) + +The primary use case is `filter` on `@@index`, where keys are model field names (validated at authoring time) and values are MongoDB filter operator expressions (validated by JSON Schema): + +```prisma +@@index([status, age], filter: { status: "active", age: { "$gte": 18 } }) +// ^^^^^^ ^^^ +// field ref ✓ field ref ✓ +// ^^^^^^^^ ^^^^^^^^^^^^^ +// opaque value opaque value (schema-validated) +``` + +## Scopes and name resolution + +### Core principle + +The binding layer resolves identifiers by combining two inputs: + +1. **What scope is active** — what names are visible at this point in the document +2. **What type is expected** — what kind of value is valid in this position + +The combination narrows resolution. The binding layer doesn't need annotations telling it "this is a field reference" — it determines that from context: this position expects an identifier, and the active scope contains field names from the enclosing entity, so the identifier resolves as a field reference. + +### Scope hierarchy + +Scopes are nested containers of named entries. Each entry has a name and a category. Inner scopes see everything in their parent. + +``` +Document scope +├── String : scalar +├── Int : scalar +├── ObjectId : scalar (target-provided) +├── User : model (declared entity) +├── Post : model (declared entity) +├── Role : enum (declared entity) +├── Address : compositeType (declared entity) +├── Email : alias → String (from context directive) +│ +└── Entity "User" scope (extends document scope) + ├── id : field(ObjectId) + ├── email : field(String) + ├── role : field(Role) + └── posts : field(Post[]) +``` + +The document scope is populated from: +- Builtin types declared in the binding context +- Entity names introduced by entity declarations in the file +- Aliases introduced by context directives (e.g. `types` block) + +Each entity block has its own scope extending the document scope, populated with its member names. + +### How identifier resolution works + +When the binding layer encounters an identifier, it looks it up in the active scope and checks whether the result is compatible with the expected type constraint: + +**Field type position** — active scope: document scope, expected: type name (any category): + +```prisma +model User { + email String // "String" found in document scope, category: scalar ✓ + role Role // "Role" found in document scope, category: enum ✓ + posts Post[] // "Post" found in document scope, category: model ✓ +} +``` + +**`@@index` field list** — active scope: narrowed to enclosing entity's members, expected: field name: + +```prisma +@@index([email, name]) +// "email" found in entity scope, category: field ✓ +// "name" found in entity scope, category: field ✓ +// "String" — found in document scope but category: scalar, not field → error +``` + +**Named argument value** — active scope: attribute argument scope (may include constants), expected: per argument definition: + +```prisma +@@index([status], type: hashed) +// "hashed" resolved as a valid constant for this argument position +``` + +### Scope narrowing + +Attribute argument definitions can specify a scope directive that narrows the active scope for their values. This is how `@@index` restricts its field list to field names rather than all document-scope identifiers: + +```typescript +type ScopeDirective = + | { kind: "enclosingEntity" } // field names of the current entity + | { kind: "referencedEntity", via: string } // field names of a related entity + | { kind: "document" } // full document scope (default) +``` + +The `enclosingEntity` directive says: "for this argument, only names from the enclosing entity's member scope are visible." The binding layer doesn't know *why* — it just narrows the scope. + +The `referencedEntity` directive handles the cross-entity case like `@relation(references: [id])`, where `id` must resolve against the related model's fields. The `via` field indicates which piece of context determines the target entity (e.g. the field's own type expression). + +### Future: namespaced scopes + +Today, the document scope is flat — all entity names live at the top level. When we add namespace support (Postgres schemas, MySQL databases), the scope hierarchy deepens: + +``` +Document scope +├── public (namespace) +│ ├── User : model +│ └── Post : model +└── analytics (namespace) + ├── Event : model + └── Metric : model +``` + +Qualified name resolution (`analytics.Event`) traverses namespace scopes. The binding layer's resolution algorithm stays the same — it just follows dotted paths through nested scopes. + +### Type constructor resolution + +When the binding layer encounters a `DottedIdent(args)` value, it must determine whether this is a `FunctionCall` or a `TypeConstructor`. Resolution works as follows: + +1. Look up the name in the attribute's declared functions (from `AttributeDefinition.functions`). If found → `FunctionCall`. +2. Look up the name in the binding context's type constructor registry. If found → `TypeConstructor`. +3. If neither → binding error ("unknown function or type constructor"). + +Type constructors are contributed by extension packs (e.g. `pgvector` registers `pgvector.zero`, `pgvector.random`). They produce a value of the corresponding type and are primarily useful in `@default` arguments: + +```prisma +model Items { + embedding pgvector.Vector(1536) @default(pgvector.zero(1536)) +} +``` + +```typescript +interface TypeConstructorDefinition { + readonly name: string; // "pgvector.zero" + readonly producesType: string; // "pgvector.Vector" + readonly arguments: ReadonlyArray; +} +``` + +### Tagged literal resolution + +When the binding layer encounters a `TaggedLiteral`, it resolves the tag (a dotted identifier) against registered tag handlers in the binding context: + +```typescript +interface TagHandlerDefinition { + readonly tag: string; // "pg.sql", "pg.predicate", "mongo.expr" + readonly packName: string; +} +``` + +The binding layer verifies that the tag is registered, routes the body to the owning pack for validation, and produces a validated `TaggedLiteral` node. The body content is opaque to the binding layer — the pack determines whether it's valid SQL, a valid predicate expression, etc. + +```prisma +@@policy(pg.predicate`status = 'active'`) +// tag "pg.predicate" → look up in tag registry → found, owned by "pg" pack ✓ +// body "status = 'active'" → passed to pg pack for validation +``` + +If the tag is not registered, the binding layer reports an error: "unknown tag 'pg.predicate' — ensure the 'pg' extension pack is in extensionPacks." + +## How framework components contribute + +The binding context is assembled from contributions. Each framework component declares what it adds: + +### Core framework + +Provides the base block types and common attributes: + +``` +Block types: + model — members with type expressions, standard field/block attributes + enum — members are bare names, block attributes only + type — same member structure as model (composite types) + +Context directives: + types — assignment members, introduces aliases into document scope + +Builtin types: + String, Int, Boolean, DateTime, Float, Decimal, Json, Bytes, BigInt + +Common member attributes: + @id — no arguments + @unique — no arguments + @map — positional String argument + @default — positional Value argument, with functions: now(), autoincrement() + @relation — named arguments: fields (List), references (List), etc. + +Common block attributes: + @@map — positional String argument + @@index — positional List, named arguments (map, etc.) + @@unique — positional List, named arguments +``` + +### Mongo family + +Extends `model` with additional block attributes and functions: + +``` +Additional block attributes for model: + @@index — extended with: + type: Identifier (hashed | 2dsphere | 2d) + sparse: Boolean + expireAfterSeconds: Number + filter: TypedDictionary (keys: field refs, values: schema-validated filter operators) + collation: Object (schema-validated) + include: List + exclude: List + Functions: wildcard(field ref?) + + @@textIndex — positional List, named arguments: + weights: Object (schema-validated) + language: String + languageOverride: String + +Additional builtin types: + ObjectId +``` + +### SQL family + +Extends `model` with different attributes, extension values, and may add new block types: + +``` +Future block types: + view — same member structure as model, with tagged literal body + +Additional block attributes for model: + @@index — extended with: type (Hash | Gin | Gist | SpGist | Brin), etc. + @@policy — positional TaggedLiteral argument (pg.predicate) + +Type constructors: + pgvector.zero(dimensions) — produces pgvector.Vector + pgvector.random(dimensions) — produces pgvector.Vector + +Tag handlers: + pg.sql — SQL expression (view definitions, raw SQL) + pg.predicate — SQL predicate (RLS policies, check constraints) +``` + +## Example: binding `@@index` in Mongo + +Given this PSL: + +```prisma +model Events { + id ObjectId @id @map("_id") + status String + tenantId String + metadata Json + + @@index([tenantId, wildcard(metadata)], filter: { status: "active" }, sparse: true) +} +``` + +The binding layer processes `@@index` as follows: + +1. **Locate the attribute definition** — `@@index` is a block attribute on the `model` block type. Found in the Mongo family's contribution. + +2. **Resolve the first positional argument** — expected type: `List`, scope: `enclosingEntity`. + - `tenantId` → look up in entity "Events" scope → found, category: field ✓ + - `wildcard(metadata)` → `wildcard` is a function declared on `@@index`. Resolve its argument: `metadata` → look up in entity "Events" scope → found, category: field ✓ + +3. **Resolve named arguments**: + - `filter: { status: "active" }` → typed dictionary argument. Resolve top-level keys: + - `status` → look up in entity "Events" scope → found, category: field ✓ + - Value `String("active")` → validate against filter operator JSON Schema (implicit `$eq`) ✓ + - `sparse: true` → expected type: Boolean. Value is Boolean(true) ✓ + +4. **Produce validated AST** — the interpreter receives: + - fields: `[FieldRef("tenantId"), FunctionCall("wildcard", [FieldRef("metadata")])]` + - filter: `TypedDictionary({ FieldRef("status"): String("active") })` + - sparse: `Boolean(true)` + +The interpreter never parses raw strings. It receives typed, resolved values. + +## Example: binding extension values in SQL + +Given this PSL: + +```prisma +model Items { + id Int @id + embedding pgvector.Vector(1536) @default(pgvector.zero(1536)) + + @@policy(pg.predicate`tenant_id = current_setting('app.tenant_id')`) +} + +view ActiveUsers { + definition pg.sql` + SELECT id, email FROM "User" WHERE status = 'active' + ` +} +``` + +The binding layer processes these as follows: + +1. **Type constructor in `@default`** — The value `pgvector.zero(1536)` is a `DottedIdent(args)`. + - Not found in `@default`'s declared functions (`now`, `autoincrement`) → not a FunctionCall. + - Found in `typeConstructors` registry as `pgvector.zero` → TypeConstructor ✓ + - Validates that `pgvector.zero` produces `pgvector.Vector`, matching the field's type ✓ + +2. **Tagged literal in `@@policy`** — The value `` pg.predicate`tenant_id = ...` `` is a `TaggedLiteral`. + - Tag `pg.predicate` → look up in `tagHandlers` registry → found, owned by "pg" pack ✓ + - Body passed to the pg pack for validation (is this a valid SQL predicate?) ✓ + +3. **Tagged literal as member value** — In the `view` block, `definition` is a member with a tagged literal value. + - Tag `pg.sql` → found in `tagHandlers` registry ✓ + - Body is a SQL SELECT statement → pg pack validates it ✓ + +## Design decisions + +### Why scopes, not reference-kind annotations + +An earlier design annotated each argument definition with a `referenceKind` (e.g. `"fieldRef"`, `"typeRef"`). This was rejected because it duplicates information that's already implicit in the scope hierarchy. The binding layer determines what an identifier refers to by combining the active scope with the expected type — no additional annotations needed. This also scales better to future namespace support, where resolution rules become more complex. + +### Why block types own their attributes + +An earlier design had attribute definitions declare their own `target: "member" | "block"`. This was inverted — block type definitions own their attribute lists. This is more natural: a block type is the authority on what's valid within it. When registering a new block type (e.g. `view`), you specify everything about it in one place. Attributes don't need to know about the blocks they appear in. + +### Why entity declarations vs context directives + +An earlier design treated all top-level blocks uniformly. We split them into entity declarations (which introduce named things with identity) and context directives (which modify the interpretation environment). The `types` block is fundamentally different from `model` — it doesn't declare a domain entity, it configures type aliases. This distinction maps to familiar concepts in other languages (declarations vs imports/macros) and will accommodate future environment-modifying constructs. + +### Why the parser doesn't special-case keywords + +The parser could have separate AST node types for models, enums, etc. (and currently does). The target design uses a single generic block AST node where the keyword is a plain string. This means adding a new block type (e.g. `view`) requires no parser changes — just a new block type definition in the binding context. The parser's job is purely syntactic. diff --git a/projects/mongo-schema-migrations/specs/psl-index-authoring-surface.md b/projects/mongo-schema-migrations/specs/psl-index-authoring-surface.md new file mode 100644 index 0000000000..7dca9b451b --- /dev/null +++ b/projects/mongo-schema-migrations/specs/psl-index-authoring-surface.md @@ -0,0 +1,292 @@ +# PSL Index Authoring Surface for MongoDB + +## Grounding example + +Today, a user can declare basic indexes in PSL: + +```prisma +model User { + id ObjectId @id @map("_id") + email String @unique + bio String + + @@index([email, bio]) + @@index([bio], type: "text", weights: "{\"bio\": 10}") +} +``` + +But MongoDB supports several index features that users **cannot yet express** in PSL: + +- **Wildcard indexes** — index all subpaths of a nested document (e.g. a schemaless `metadata` field) without naming each path upfront +- **Collation** — control how string comparison works (locale, case-sensitivity, accent-sensitivity) so that queries can match `"café"` and `"Café"` correctly +- **Partial indexes** — index only documents matching a filter (e.g. only `status: "active"` documents), saving storage and write overhead +- **Wildcard projections** — when using a wildcard index, limit coverage to specific subtrees (include) or exclude certain paths + +These features are already supported in the contract types and migration pipeline. This doc defines how users author them via PSL. + +This design depends on the PSL value model extensions described in [PSL Language Specification](psl-language-spec.md) (object literals, function calls as first-class values) and the binding model described in [PSL Binding Model](psl-binding-model.md) (scoped function definitions, attribute argument validation). + +## Decision summary + +**Four key decisions:** + +1. **`wildcard()` function in field lists.** MongoDB's wildcard key is `$**`, but `$` and `*` would break the PSL grammar. Instead, users write `wildcard()` (optionally scoped: `wildcard(metadata)`). This maps to `$**` / `metadata.$**` in the contract. `wildcard` is a function scoped to the `@@index` attribute definition. + +2. **`@@textIndex` as a dedicated attribute.** Text indexes have a fundamentally different option set (`weights`, `language`, `languageOverride`) and different query semantics (queried via `$text`, not standard comparison). A separate `@@textIndex` attribute makes each form self-documenting, avoiding a complex compatibility matrix. + +3. **Object literals for structured options.** `collation` and `weights` use PSL object literals — `{ locale: "fr", strength: 2 }`, `{ title: 10, body: 5 }` — instead of escaped JSON strings. This is enabled by the typed value model extension to the PSL parser. + +4. **`filter` as a typed dictionary.** The `filter` argument's top-level keys are model field references resolved by the binding layer (validated against the entity's fields). The values are opaque JSON structures validated against a JSON Schema constraining them to MongoDB's filter operators. Multiple keys are implicitly AND'd. Top-level `$or`/`$and` are not supported initially. + +5. **`include`/`exclude` field lists for wildcard projections.** Rather than MongoDB's `0`/`1` projection document, users specify field lists: `include: [metadata, tags]`. The binding layer resolves these as field references in the enclosing entity's scope. + +## Syntax by example + +### Regular indexes — `@@index` + +The common case. Fields are ordered, direction defaults to ascending. + +```prisma +model Events { + id ObjectId @id @map("_id") + status String + tenantId String + expiresAt DateTime + + @@index([status]) // ascending on one field + @@index([status, tenantId]) // compound ascending +} +``` + +#### TTL and sparse + +A TTL index automatically deletes documents after a duration. `sparse` skips documents where the indexed field is missing. + +```prisma + @@index([expiresAt], sparse: true, expireAfterSeconds: 3600) +``` + +#### Partial indexes with `filter` + +A *partial index* only covers documents matching a MongoDB query filter. This reduces index size and write cost when queries always target a subset. + +```prisma + @@index([status], filter: { status: "active" }) + @@index([age], filter: { age: { "$gte": 18 } }) + @@index([status, age], filter: { status: "active", age: { "$gte": 18, "$lte": 65 } }) +``` + +`filter` is a **typed dictionary**: the top-level keys are model field references resolved by the binding layer against the enclosing entity's scope (catches typos and stale references on rename). The values are opaque JSON structures validated against a JSON Schema that constrains them to MongoDB's supported filter operators: `$eq` (implicit when value is a scalar), `$exists`, `$gt`, `$gte`, `$lt`, `$lte`, `$type`. + +Multiple field keys in the same filter object are implicitly AND'd — `{ status: "active", age: { "$gte": 18 } }` means both conditions must hold. This covers the vast majority of real-world partial filter expressions. + +`$`-prefixed operator keys within values use quoted strings (`"$gte"`, `"$exists"`) since `$` is not a valid identifier character in PSL. + +**Scope limitation:** Top-level logical operators `$or` and explicit `$and` are not supported in this initial design. These are rare in partial filter expressions — implicit AND via multiple field keys handles nearly all cases. If logical operators are needed, the index can be created outside PSL. + +#### Collation + +Collation controls locale-aware string comparison for the index. A query can only *use* a collated index if it specifies the same collation, so this is a deliberate user choice. + +```prisma + @@index([status], collation: { locale: "fr", strength: 2 }) +``` + +`strength` controls what differences matter: +- **1**: base characters only (a = A = á) +- **2**: base + accents (a = A, but a ≠ á) +- **3**: base + accents + case (default) + +The full set of collation fields: + +| Key | Type | Description | +|-----|------|-------------| +| `locale` | String | ICU locale (required) | +| `strength` | Number (1–5) | Comparison sensitivity level | +| `caseLevel` | Boolean | Enable case-level comparisons | +| `caseFirst` | String (`"upper"`, `"lower"`, `"off"`) | Sort order of case differences | +| `numericOrdering` | Boolean | Numeric string comparison (`"10"` after `"9"`) | +| `alternate` | String (`"non-ignorable"`, `"shifted"`) | Whitespace/punctuation handling | +| `maxVariable` | String (`"punct"`, `"space"`) | Characters affected by `alternate: "shifted"` | +| `backwards` | Boolean | Reverse secondary differences (French dictionary order) | +| `normalization` | Boolean | Unicode normalization | + +#### Hashed and geospatial indexes + +These are rare, specialized index types. Hashed indexes are used for shard keys. Geospatial indexes (`2dsphere`, `2d`) support location queries. They stay under `@@index` with a `type` discriminator: + +```prisma + @@index([tenantId], type: hashed) // shard key + @@index([location], type: "2dsphere") // geospatial +``` + +Hashed indexes must have exactly one field. Neither hashed nor geo indexes support wildcard fields or uniqueness. + +#### Wildcard indexes + +A wildcard index covers all subpaths of a document (or a subtree) without naming them upfront. This is useful for schemaless nested data — e.g. a `metadata` field with arbitrary user-defined keys. + +In MongoDB, the wildcard key is `$**`, meaning "every field path, recursively." In PSL, we represent this with the `wildcard()` function in the field list. `wildcard` is a function scoped to the `@@index` attribute — it is only available within `@@index`'s field list argument. + +```prisma +model Events { + id ObjectId @id @map("_id") + tenantId String + metadata Json + tags Json + + // All fields in the document + @@index([wildcard()]) + + // Scoped to a subtree — all paths under metadata + @@index([wildcard(metadata)]) + + // Compound wildcard (MongoDB 7.0+) — fixed field + wildcard + @@index([tenantId, wildcard(metadata)]) +} +``` + +`wildcard()` maps to `$**` in the contract. `wildcard(metadata)` maps to `metadata.$**`. The `$**` is always a terminal — it means "recurse from this point down." The argument to `wildcard()` is a field reference resolved in the enclosing entity's scope. + +**Projections with `include`/`exclude`.** When using `wildcard()` without a scope argument, you can narrow coverage to specific subtrees with `include`, or index everything except certain paths with `exclude`: + +```prisma + // Only index metadata and tags subtrees + @@index([wildcard()], include: [metadata, tags]) + + // Index everything except _class and internalLog + @@index([wildcard()], exclude: [_class, internalLog]) +``` + +`include` and `exclude` are mutually exclusive. Both are field-reference lists resolved in the enclosing entity's scope. The interpreter converts them to the contract's `wildcardProjection`: +- `include: [a, b]` → `{ "a": 1, "b": 1 }` +- `exclude: [a, b]` → `{ "a": 0, "b": 0 }` + +**Constraints on wildcard fields:** +- At most **one** `wildcard()` per index +- Cannot be combined with `@@unique` / `@unique` — MongoDB does not support unique wildcard indexes +- Cannot be combined with `expireAfterSeconds` — TTL requires a single concrete date field +- Cannot be combined with `type: hashed`, `"2dsphere"`, or `"2d"` + +### Unique indexes — `@@unique` / `@unique` + +Shorthand for a regular index with `unique: true`. Supports `filter`, `collation`, `sparse`, and `expireAfterSeconds`, but **not** wildcard fields. + +```prisma +model User { + id ObjectId @id @map("_id") + email String @unique // field-level + + @@unique([email, tenantId]) // compound + + @@unique([email], collation: { locale: "en", strength: 2 }) // case-insensitive unique + + @@unique([email], filter: { active: true }) // partial unique +} +``` + +### Text indexes — `@@textIndex` + +Text indexes power MongoDB's full-text search (`$text` queries). They have a fundamentally different option surface from regular indexes, which is why they get their own attribute. + +```prisma +model Article { + id ObjectId @id @map("_id") + title String + body String + + @@textIndex([title, body]) + + @@textIndex([title, body], weights: { title: 10, body: 5 }, language: "english", languageOverride: "idioma") +} +``` + +Only **one** `@@textIndex` is permitted per collection (MongoDB limitation). + +## Validation rules + +The binding layer and interpreter validate these constraints at authoring time and produce clear diagnostics: + +1. **At most one `wildcard()` per index** — "An index can contain at most one wildcard() field" +2. **No wildcard in unique indexes** — "Unique indexes cannot use wildcard() fields" +3. **`include`/`exclude` mutual exclusivity** — "Cannot specify both include and exclude on the same index" +4. **`include`/`exclude` requires wildcard** — "include/exclude options are only valid when the index contains a wildcard() field" +5. **No TTL with wildcard** — "expireAfterSeconds cannot be combined with wildcard() fields" +6. **No wildcard with hashed/geo/text** — "wildcard() fields cannot be combined with type: hashed/2dsphere/2d or @@textIndex" +7. **One text index per collection** — "Only one @@textIndex is allowed per collection" +8. **Hashed single-field** — "Hashed indexes must have exactly one field" +9. **Collation locale required** — "`locale` is required in the collation object" +10. **Filter keys are field references** — "Unknown field 'staus' in filter — did you mean 'status'?" +11. **Filter values match JSON Schema** — "Invalid filter operator '$foo' for field 'age' — supported operators: $eq, $exists, $gt, $gte, $lt, $lte, $type" +12. **No top-level logical operators in filter** — "Top-level $or/$and in filter is not supported — use multiple field keys for implicit AND" + +## Contract mapping + +The PSL surface maps directly to the existing `MongoStorageIndex` contract type. No contract type changes are required. + +```typescript +interface MongoStorageIndex { + readonly keys: ReadonlyArray; // from field list + wildcard() + readonly unique?: boolean; // from @@unique or @unique + readonly sparse?: boolean; // from sparse: arg + readonly expireAfterSeconds?: number; // from expireAfterSeconds: arg + readonly partialFilterExpression?: Record; // from filter: typed dictionary (field-ref keys, schema-validated values) + readonly wildcardProjection?: Record; // from include/exclude field lists + readonly collation?: Record; // from collation: object literal + readonly weights?: Record; // from weights: object literal (@@textIndex) + readonly default_language?: string; // from language: arg (@@textIndex) + readonly language_override?: string; // from languageOverride: arg (@@textIndex) +} +``` + +`wildcard()` maps to the contract's key representation: + +| PSL form | Contract key | +|----------|-------------| +| `wildcard()` | `{ field: "$**", direction: 1 }` | +| `wildcard(metadata)` | `{ field: "metadata.$**", direction: 1 }` | +| `wildcard(foo.bar)` | `{ field: "foo.bar.$**", direction: 1 }` | + +## Scope + +This design covers the PSL authoring surface only. The TS authoring surface (`contract-ts`) is out of scope for now. + +## Alternatives considered + +### `$**` as a literal in the field list + +The most direct mapping would be `@@index([$**])`, mirroring MongoDB syntax exactly. We rejected this because `$` and `*` are `Invalid` tokens in the PSL tokenizer — they'd require grammar changes or escaping, adding complexity for a rare feature. The `wildcard()` function syntax uses the existing PSL function-call value type and is scoped to the `@@index` attribute via the binding model. + +### Escaped JSON strings for structured values + +The initial design used JSON strings for `filter`, `weights`, and `collation`: `filter: "{\"status\": \"active\"}"`. This works without parser changes but produces unreadable PSL and is error-prone (escaping, no validation). The typed value model extension adds object literal support to the parser, making `filter: { status: "active" }` possible. Since we need the value model extension anyway (for general PSL improvement), there's no reason to keep the JSON-string workaround. + +### Collation as decomposed named scalar arguments + +We considered surfacing collation fields as individual PSL arguments: `collationLocale: "fr", collationStrength: 2`. This avoids JSON strings but pollutes the `@@index` argument namespace with 9 prefixed arguments. With object literal support, `collation: { locale: "fr", strength: 2 }` is cleaner — the structured value groups naturally, and the binding layer can validate the object's shape. + +### Single `@@index` for everything (no `@@textIndex`) + +We considered keeping all index types under `@@index` with a `type` discriminator. This works for hashed and geo (rare, simple option sets), but text indexes have a completely different option surface (`weights`, `language`, `languageOverride`) and none of the regular options (`unique`, `expireAfterSeconds`). Overloading `@@index` for text would require a large compatibility matrix and produce confusing validation errors. A dedicated `@@textIndex` makes each attribute's valid options obvious. + +### Splitting all index types into separate attributes + +We considered `@@hashedIndex`, `@@geoIndex`, `@@wildcardIndex`, etc. We rejected this because: +- Hashed and geo are rare — dedicated attributes add surface area without much benefit +- Wildcard is a field-level concern, not a type — a wildcard index is just a regular index with a glob field +- Three attributes (`@@index`, `@@unique`, `@@textIndex`) cover the space well. The `type` discriminator handles the remaining rare cases. + +### Filter as a fully opaque Object + +We considered making `filter` a plain opaque Object with no key resolution — the binding layer would pass the entire structure through unvalidated. This is simpler but misses the opportunity to validate field names at authoring time. Since partial filter keys are almost always model field names, resolving them as field references catches typos and keeps filters consistent through renames. The opaque JSON values (with `$`-prefixed operators) are validated via JSON Schema instead. + +### Filter with top-level logical operators (`$or`, `$and`) + +We considered supporting `$or` and `$and` at the top level of filter expressions. This would require special-casing non-field keys or allowing quoted string keys to bypass field-ref resolution. We deferred this because: (a) implicit AND via multiple field keys covers the vast majority of real partial filters, (b) explicit `$and` is almost never needed (multiple operators on the same field go inside that field's value object), and (c) `$or` in a partial filter is genuinely rare. If needed, the index can be created outside PSL. + +### Wildcard projections as a JSON-style projection document + +MongoDB represents wildcard projections as `{ "metadata": 1, "tags": 1 }` (include) or `{ "_class": 0 }` (exclude). We considered mirroring this as an object literal: `projection: { metadata: 1, tags: 1 }`. We chose `include`/`exclude` field lists instead because: +- They make the intent explicit (no `0`/`1` syntax to learn) +- They are mutually exclusive by design, preventing invalid mixed projections +- The field names are resolved as references in the enclosing entity's scope, enabling validation diff --git a/projects/mongo-schema-migrations/specs/psl-language-spec.md b/projects/mongo-schema-migrations/specs/psl-language-spec.md new file mode 100644 index 0000000000..de26557199 --- /dev/null +++ b/projects/mongo-schema-migrations/specs/psl-language-spec.md @@ -0,0 +1,316 @@ +# PSL Language Specification + +This document specifies the Prisma Schema Language (PSL) grammar, value model, and document structure. It describes both the current implementation in `@prisma-next/psl-parser` and the target design we are evolving toward. Sections marked **(current)** describe what exists today; sections marked **(target)** describe the design direction. + +## Lexical elements + +The tokenizer produces the following token kinds: + +| Token kind | Pattern | Examples | +|-----------|---------|---------| +| `Ident` | Unicode letter or `_`, then letters, digits, or `-` | `email`, `my-pack`, `_id` | +| `StringLiteral` | `"` ... `"` with `\` escapes | `"hello"`, `"C:\\\\"` | +| `NumberLiteral` | Optional `-`, digits, optional `.` + digits | `42`, `-1`, `3.14` | +| `At` | `@` | `@` | +| `DoubleAt` | `@@` | `@@` | +| `LBrace` / `RBrace` | `{` / `}` | | +| `LParen` / `RParen` | `(` / `)` | | +| `LBracket` / `RBracket` | `[` / `]` | | +| `Equals` | `=` | | +| `Question` | `?` | | +| `Dot` | `.` | | +| `Comma` | `,` | | +| `Colon` | `:` | | +| `Whitespace` | Spaces and tabs (not newlines) | | +| `Newline` | `\n` or `\r\n` | | +| `Comment` | `//` to end of line | `// a comment` | +| `Invalid` | Any unrecognized character | `$`, `*`, `#` | + +Key observations: +- Identifiers support Unicode letters and hyphens (`my-pack`), but not `$` or `*`. +- Only double-quoted strings are supported. +- There is no boolean literal token — `true` and `false` are `Ident` tokens distinguished by the value model (see below). + +## Document structure + +A PSL document contains an ordered sequence of top-level constructs. These fall into two categories: + +### Entity declarations + +An entity declaration introduces a named thing with identity — a data model, an enum, an embedded type, a view. It has a keyword, a name, and a body containing members and block-level attributes. + +``` +EntityDeclaration = Keyword Ident "{" (Member | BlockAttribute)* "}" +``` + +```prisma +model User { + id Int @id + email String @unique + @@map("users") +} + +enum Role { + USER + ADMIN + @@map("user_role") +} + +type Address { + street String + city String +} +``` + +Entity declarations introduce a type name into the document scope. The keyword determines the entity's *category* (model, enum, compositeType, etc.), but the parser treats them uniformly — it does not assign special structure to any keyword. What members look like, what attributes are valid, and what the entity means are all determined by the binding layer (see [PSL Binding Model](psl-binding-model.md)). + +### Context directives + +A context directive modifies the interpretation environment. Unlike entity declarations, it does not introduce a named entity — it changes how the rest of the file is interpreted. + +``` +ContextDirective = Keyword "{" ... "}" +``` + +The `types` block is a context directive. It introduces type aliases that affect how field type expressions are resolved: + +```prisma +types { + Email = String + ShortName = sql.String(length: 35) +} +``` + +Context directives are analogous to `import` or `using` in other languages. They exist to configure the interpretation context, not to declare domain entities. + +### Current vs target (current) + +The current parser hardcodes four block types with distinct AST nodes (`PslModel`, `PslEnum`, `PslCompositeType`, `PslTypesBlock`). The target design replaces these with a single generic block AST node, where the keyword is a plain string and the binding layer determines what's valid. + +## Members + +Members are the declarations inside entity blocks. A member has a name and optional components depending on the entity's kind. + +``` +Member = Ident TypeExpression? ("=" ValueExpression)? MemberAttribute* +``` + +The three observed member shapes: + +| Shape | Example | Used by | +|-------|---------|---------| +| Field | `email String @unique` | model, compositeType, view | +| Value | `USER` | enum | +| Assignment | `Email = String` | types (context directive) | + +These are syntactic variations of the same member grammar. A field has a name and a type expression. An enum value has only a name (implicit type, no attributes). A type alias has a name and an assignment. The parser can treat all three as members with optional components; the binding layer validates that the member shape matches what the entity's block type expects. + +### Type expressions + +A type expression specifies a member's type, with optional modifiers: + +``` +TypeExpression = TypeBase ("?" | "[]")? +TypeBase = Ident | TypeConstructorCall +``` + +| Form | Meaning | +|------|---------| +| `String` | Required scalar | +| `String?` | Optional (nullable) | +| `String[]` | List (array) | +| `pgvector.Vector(1536)` | Type constructor call | +| `pgvector.Vector(1536)?` | Optional type constructor | + +Modifiers `?` (optional) and `[]` (list) are mutually exclusive. + +### Type constructor calls + +``` +TypeConstructorCall = DottedIdent "(" ArgumentList ")" +DottedIdent = Ident ("." Ident)* +``` + +A type constructor call is a namespaced identifier with arguments. It appears as a field type or a named type alias RHS. + +```prisma +embedding pgvector.Vector(1536) +``` + +## Attributes + +Attributes are annotations on members and blocks. They have a prefix (`@` for member-level, `@@` for block-level), a dotted name, and an optional argument list. + +``` +MemberAttribute = "@" DottedIdent ("(" ArgumentList ")")? +BlockAttribute = "@@" DottedIdent ("(" ArgumentList ")")? +``` + +```prisma +id Int @id @default(42) +email String @unique @map("email_address") +data Bytes @vendor.column(length: 1536) + +@@map("users") +@@index([email]) +@@unique([title, userId]) +``` + +### Attribute names + +Attribute names are dotted identifiers: `Ident ("." Ident)*`. Each segment can contain letters, digits, underscores, and hyphens. + +| Form | Example | +|------|---------| +| Simple | `@id`, `@@map`, `@unique` | +| Namespaced | `@db.VarChar`, `@vendor.column`, `@my-pack.column` | + +The parser does not validate attribute names — it accepts any dotted identifier. The binding layer determines which attributes are valid in which context. + +## Arguments + +Arguments appear inside `(` `)` delimiters on attributes and type constructors. + +``` +ArgumentList = (Argument ("," Argument)*)? +Argument = PositionalArgument | NamedArgument +PositionalArgument = Value +NamedArgument = Ident ":" Value +``` + +## Value model (target) + +PSL has nine value types. These are the building blocks for all attribute and type constructor arguments. + +### Simple values + +| Type | Surface syntax | Examples | +|------|---------------|----------| +| **Boolean** | `true` \| `false` | `true`, `false` | +| **Number** | Digits, optional `-`, optional `.` | `42`, `-1`, `3.14` | +| **String** | `"..."` with `\` escapes | `"hello"`, `"en"` | +| **Identifier** | Bare word (not `true`/`false`) | `Cascade`, `Desc`, `hashed` | + +### Compound values + +| Type | Surface syntax | Examples | +|------|---------------|----------| +| **List** | `[` Value (`,` Value)* `]` | `[email, name]`, `[1, 2]` | +| **Object** | `{` (Ident `:` Value (`,` ...)* )? `}` | `{ status: "active" }` | +| **FunctionCall** | DottedIdent `(` ArgumentList `)` | `now()`, `wildcard(metadata)` | + +### Extension values + +| Type | Surface syntax | Examples | +|------|---------------|----------| +| **TypeConstructor** | DottedIdent `(` ArgumentList `)` | `pgvector.zero(1536)` | +| **TaggedLiteral** | DottedIdent `` ` `` ... `` ` `` | `` pg.sql`SELECT 1` `` | + +### Grammar + +``` +Value = Boolean | Number | String | Identifier + | List | Object | FunctionCall + | TypeConstructor | TaggedLiteral + +Boolean = "true" | "false" +Number = "-"? Digit+ ("." Digit+)? +String = '"' (EscapeSeq | [^"\\])* '"' +Identifier = Ident (where Ident ∉ { "true", "false" }) +List = "[" (Value ("," Value)*)? "]" +Object = "{" (Ident ":" Value ("," Ident ":" Value)*)? "}" +FunctionCall = DottedIdent "(" (Argument ("," Argument)*)? ")" +TypeConstructor = DottedIdent "(" (Argument ("," Argument)*)? ")" +TaggedLiteral = DottedIdent "`" Body "`" +``` + +### FunctionCall vs TypeConstructor + +`FunctionCall` and `TypeConstructor` have the same surface syntax: `Name(args)`. The parser produces them identically. The binding layer distinguishes them based on what the name resolves to in the active scope — if it resolves to a declared function, it's a `FunctionCall`; if it resolves to a type constructor, it's a `TypeConstructor`. This parallels how the same identifier `Role` could be a type name or something else depending on context. + +A `FunctionCall` invokes a function declared in the binding context (e.g. `now()`, `autoincrement()`, `wildcard(metadata)`). Functions are scoped to specific attributes — `wildcard` is available in `@@index`'s field list, `now` and `autoincrement` are available in `@default`. + +A `TypeConstructor` constructs a value of a non-primitive type. Where type constructor calls in type positions declare the *type* of a field (e.g. `embedding pgvector.Vector(1536)`), a `TypeConstructor` in value position constructs an *instance* of that type (e.g. `@default(pgvector.zero(1536))`). The binding layer resolves the name against available type constructors contributed by extension packs. + +### TaggedLiteral + +A tagged literal is an opaque, domain-specific text payload owned by an extension pack. It consists of a dotted identifier (the tag) followed by a backtick-delimited body: + +```prisma +@@policy(pg.predicate`status = 'active' AND role = 'admin'`) + +view ActiveUsers { + definition pg.sql` + SELECT id, email FROM "User" WHERE status = 'active' + ` +} +``` + +The tag is a qualified identifier (`pack.flavor`) that the binding layer routes to the owning pack. The body is opaque to the parser and binding layer — the pack validates and normalizes it. No interpolation is allowed (`${` is a hard error). + +Tagged literals serve as an escape hatch for domain-specific sub-languages that PSL's value model cannot express natively: SQL expressions, query predicates, aggregation pipelines, policy definitions. Rather than embedding a sub-language into PSL's grammar, the tagged literal delegates ownership to the pack while preserving precise source spans for diagnostics. + +Canonicalization rules, contract encoding, and the full AST node type are defined in [ADR 129 — Template-Tagged Literals for Extensions](../../docs/architecture%20docs/adrs/ADR%20129%20-%20Template-Tagged%20Literals%20for%20Extensions.md). + +### Type recursion + +Values are recursive — Lists and Objects contain Values, and FunctionCall/TypeConstructor arguments are Values: + +```prisma +@@index([tenantId, wildcard(metadata)]) +// ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +// List containing: +// Identifier("tenantId") +// FunctionCall("wildcard", [Identifier("metadata")]) + +@@index([status], filter: { age: { $gte: 18 } }) +// ^^^^^^^^^^^^^^^^^^^^^^^^ +// Object containing: +// "age" → Object { "$gte" → Number(18) } + +@@policy(pg.predicate`status = 'active'`) +// ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +// TaggedLiteral(tag: "pg.predicate", body: "status = 'active'") +``` + +### What identifiers mean + +An `Identifier` value is a bare word that isn't `true` or `false`. What it *refers to* depends on where it appears — this is determined by the binding layer, not the parser. The same identifier `email` could be: + +- A type name in a field declaration: `email String` (the field name is `email`) +- A field reference in an index: `@@index([email])` (refers to the `email` field) +- A symbolic constant: `type: hashed` (a fixed domain value) + +The parser produces the identifier; the binding layer resolves it against the active scope. + +### Current implementation (current) + +The current parser stores all argument values as raw strings. The target value model described above is not yet implemented — interpreters manually parse raw strings using `parseBooleanArg`, `parseNumericArg`, `parseFieldList`, `parseJsonArg`, etc. Additionally, the parser special-cases `@default` to produce typed `PslDefaultFunctionValue` and `PslDefaultLiteralValue` nodes for `now()`, `autoincrement()`, `true`, `42`, and `"hello"`. Under the target value model, `@default` becomes a regular attribute — function calls (`now()`, `autoincrement()`) are just `FunctionCall` values, and literals (`true`, `42`) are just `Boolean` and `Number` values. No special case is needed. `TypeConstructor` and `TaggedLiteral` are new value types not present in the current parser. + +## Comments + +Line comments start with `//` and extend to the end of the line. They are stripped during parsing and do not appear in the AST. + +```prisma +model User { + id Int @id // primary key +} +``` + +Comments within quoted strings are not treated as comments. + +## Summary of language constructs + +| Construct | Description | Context | +|-----------|------------|---------| +| Document | Root container | — | +| Entity declaration | Named block (`model`, `enum`, `type`, `view`, ...) | Top-level | +| Context directive | Environment modifier (`types`, future imports) | Top-level | +| Member | Named declaration within a block | Inside entity/directive | +| Type expression | Type reference with optional `?`/`[]` | Field type position | +| Type constructor call | Namespaced parameterized type | Type expression or alias RHS | +| Member attribute | `@`-prefixed annotation | On a member | +| Block attribute | `@@`-prefixed annotation | Inside an entity block | +| Positional argument | Unnamed value in an argument list | Inside `(` `)` | +| Named argument | `name: value` pair in an argument list | Inside `(` `)` | +| Value | Boolean, Number, String, Identifier, List, Object, FunctionCall, TypeConstructor, TaggedLiteral | Argument positions | diff --git a/test/integration/test/authoring/side-by-side-contracts.test.ts b/test/integration/test/authoring/side-by-side-contracts.test.ts index 9a5e874a52..72c38c0123 100644 --- a/test/integration/test/authoring/side-by-side-contracts.test.ts +++ b/test/integration/test/authoring/side-by-side-contracts.test.ts @@ -218,14 +218,44 @@ describe('side-by-side contract examples', () => { enrichContract(providerResult.value, frameworkComponents), ); - expect(normalizedTs).toEqual(normalizedPsl); + // PSL auto-derives validators from model schemas; TS builder doesn't yet. + // Compare everything except storage.collections[].validator and storageHash + // (which changes because the validator is part of the hashed storage). + const stripValidatorFields = (contract: typeof normalizedTs) => { + const storage = contract.storage as unknown as Record; + const collections = storage['collections'] as Record>; + const stripped: Record = {}; + for (const [name, coll] of Object.entries(collections)) { + const { validator: _, ...rest } = coll; + stripped[name] = rest; + } + const { storageHash: _sh, ...restStorage } = storage; + return { ...contract, storage: { ...restStorage, collections: stripped } }; + }; + expect(stripValidatorFields(normalizedTs)).toEqual(stripValidatorFields(normalizedPsl)); const emittedTs = await emit(normalizedTs, stack, mongoFamilyDescriptor.emission); const emittedPsl = await emit(normalizedPsl, stack, mongoFamilyDescriptor.emission); - expect(emittedTs.contractJson).toBe(emittedPsl.contractJson); + // Emitted JSON differs because PSL adds validators + different storageHash. + // Compare structurally without validators and storageHash. + const stripForComparison = (json: string) => { + const parsed = JSON.parse(json) as Record; + const storage = parsed['storage'] as Record; + const collections = storage['collections'] as Record>; + const strippedCollections: Record = {}; + for (const [name, coll] of Object.entries(collections)) { + const { validator: _, ...rest } = coll; + strippedCollections[name] = rest; + } + const { storageHash: _sh, ...restStorage } = storage; + return { ...parsed, storage: { ...restStorage, collections: strippedCollections } }; + }; + expect(stripForComparison(emittedTs.contractJson)).toEqual( + stripForComparison(emittedPsl.contractJson), + ); - const emittedContractJson = parseContractJson(emittedTs.contractJson); + const emittedContractJson = parseContractJson(emittedPsl.contractJson); const validatedContract = validateEmittedMongoContract(emittedContractJson); expect(validatedContract.contract.roots).toEqual({ @@ -250,10 +280,10 @@ describe('side-by-side contract examples', () => { }); if (shouldUpdateExpected) { - writeExpectedContractJson(fixtureCase, emittedTs.contractJson); + writeExpectedContractJson(fixtureCase, emittedPsl.contractJson); } - expect(emittedTs.contractJson).toBe(readExpectedContractJson(fixtureCase)); + expect(emittedPsl.contractJson).toBe(readExpectedContractJson(fixtureCase)); }, timeouts.typeScriptCompilation, ); diff --git a/test/integration/test/authoring/side-by-side/mongo/contract.json b/test/integration/test/authoring/side-by-side/mongo/contract.json index 75a32fb858..15f9577d68 100644 --- a/test/integration/test/authoring/side-by-side/mongo/contract.json +++ b/test/integration/test/authoring/side-by-side/mongo/contract.json @@ -109,10 +109,70 @@ }, "storage": { "collections": { - "posts": {}, - "users": {} + "posts": { + "validator": { + "jsonSchema": { + "bsonType": "object", + "properties": { + "_id": { + "bsonType": "objectId" + }, + "authorId": { + "bsonType": "objectId" + }, + "publishedAt": { + "bsonType": [ + "null", + "date" + ] + }, + "title": { + "bsonType": "string" + } + }, + "required": [ + "_id", + "authorId", + "title" + ] + }, + "validationAction": "error", + "validationLevel": "strict" + } + }, + "users": { + "validator": { + "jsonSchema": { + "bsonType": "object", + "properties": { + "_id": { + "bsonType": "objectId" + }, + "bio": { + "bsonType": [ + "null", + "string" + ] + }, + "email": { + "bsonType": "string" + }, + "name": { + "bsonType": "string" + } + }, + "required": [ + "_id", + "email", + "name" + ] + }, + "validationAction": "error", + "validationLevel": "strict" + } + } }, - "storageHash": "sha256:3dc942d093b714429d2c735418815d37860c04e151bb78258ab6fa113e8a0141" + "storageHash": "sha256:cb09278a4bee1b0db899b70cf67fe96bc858f63b4faa1e5c41c69785abae4c4b" }, "capabilities": {}, "extensionPacks": {}, diff --git a/test/integration/test/cli.emit-command.test.ts b/test/integration/test/cli.emit-command.test.ts index bf8072f28e..1948f624ea 100644 --- a/test/integration/test/cli.emit-command.test.ts +++ b/test/integration/test/cli.emit-command.test.ts @@ -794,7 +794,7 @@ model Post { storage: { collections: { users: { - indexes: [{ fields: { email: 1 }, options: { unique: true } }], + indexes: [{ keys: [{ field: 'email', direction: 1 }], unique: true }], options: { collation: { locale: 'en', strength: 2 }, }, @@ -829,7 +829,8 @@ model Post { expect(contractDts).toContain("readonly discriminator: { readonly field: 'type' }"); expect(contractDts).toContain('readonly users: {'); expect(contractDts).toContain('readonly indexes:'); - expect(contractDts).toContain('readonly email: 1'); + expect(contractDts).toContain("readonly field: 'email'"); + expect(contractDts).toContain('readonly direction: 1'); expect(contractDts).toContain('readonly unique: true'); expect(contractDts).toContain('readonly options:'); expect(contractDts).toContain( diff --git a/test/integration/test/mongo/fixtures/generated/contract.d.ts b/test/integration/test/mongo/fixtures/generated/contract.d.ts index 4a4d4721a8..85d7cd7290 100644 --- a/test/integration/test/mongo/fixtures/generated/contract.d.ts +++ b/test/integration/test/mongo/fixtures/generated/contract.d.ts @@ -57,7 +57,10 @@ type ContractBase = ContractType< readonly tasks: Record; readonly users: { readonly indexes: readonly [ - { readonly fields: { readonly email: 1 }; readonly options: { readonly unique: true } }, + { + readonly keys: readonly [{ readonly field: 'email'; readonly direction: 1 }]; + readonly unique: true; + }, ]; readonly options: { readonly collation: { readonly locale: 'en'; readonly strength: 2 } }; }; diff --git a/test/integration/test/mongo/fixtures/generated/contract.json b/test/integration/test/mongo/fixtures/generated/contract.json index a03d682d33..1f38783ee8 100644 --- a/test/integration/test/mongo/fixtures/generated/contract.json +++ b/test/integration/test/mongo/fixtures/generated/contract.json @@ -218,12 +218,10 @@ "users": { "indexes": [ { - "fields": { - "email": 1 - }, - "options": { - "unique": true - } + "keys": [ + { "field": "email", "direction": 1 } + ], + "unique": true } ], "options": { diff --git a/test/integration/test/mongo/migration-e2e.test.ts b/test/integration/test/mongo/migration-e2e.test.ts index a980dc22ac..4a2bfad164 100644 --- a/test/integration/test/mongo/migration-e2e.test.ts +++ b/test/integration/test/mongo/migration-e2e.test.ts @@ -59,7 +59,7 @@ const indexedContract: MongoContract = { storage: { collections: { users: { - indexes: [{ fields: { email: 1 }, options: { unique: true } }], + indexes: [{ keys: [{ field: 'email', direction: 1 as const }], unique: true }], }, }, storageHash: coreHash('sha256:indexed-contract'), diff --git a/test/integration/test/mongo/migration-m2-vocabulary.test.ts b/test/integration/test/mongo/migration-m2-vocabulary.test.ts new file mode 100644 index 0000000000..f7c14b0a65 --- /dev/null +++ b/test/integration/test/mongo/migration-m2-vocabulary.test.ts @@ -0,0 +1,730 @@ +import { + contractToMongoSchemaIR, + MongoMigrationPlanner, + MongoMigrationRunner, + serializeMongoOps, +} from '@prisma-next/adapter-mongo/control'; +import { coreHash, profileHash } from '@prisma-next/contract/types'; +import mongoControlDriver from '@prisma-next/driver-mongo/control'; +import type { MongoContract, MongoStorageCollection } from '@prisma-next/mongo-contract'; +import type { MongoMigrationPlanOperation } from '@prisma-next/mongo-query-ast/control'; +import { timeouts } from '@prisma-next/test-utils'; +import { type Db, MongoClient } from 'mongodb'; +import { MongoMemoryReplSet } from 'mongodb-memory-server'; +import { afterAll, beforeAll, beforeEach, describe, expect, it } from 'vitest'; + +const ALL_POLICY = { + allowedOperationClasses: ['additive', 'widening', 'destructive'] as const, +}; + +function makeContract( + collections: Record, + hashSeed: string, +): MongoContract { + return { + target: 'mongo', + targetFamily: 'mongo', + roots: Object.fromEntries(Object.keys(collections).map((c) => [c, c])), + models: Object.fromEntries( + Object.keys(collections).map((c) => [ + c, + { + fields: { + _id: { + nullable: false, + type: { kind: 'scalar' as const, codecId: 'mongo/objectId@1' }, + }, + }, + relations: {}, + storage: { collection: c }, + }, + ]), + ), + storage: { + collections, + storageHash: coreHash(`sha256:${hashSeed}`), + }, + capabilities: {}, + extensionPacks: {}, + profileHash: profileHash('sha256:test'), + meta: {}, + }; +} + +async function planAndApply( + _db: Db, + replSetUri: string, + origin: MongoContract | null, + destination: MongoContract, +): Promise { + const planner = new MongoMigrationPlanner(); + const schema = contractToMongoSchemaIR(origin); + const result = planner.plan({ + contract: destination, + schema, + policy: ALL_POLICY, + frameworkComponents: [], + }); + if (result.kind !== 'success') { + throw new Error(`Plan failed: ${JSON.stringify(result)}`); + } + + const ops = result.plan.operations as readonly MongoMigrationPlanOperation[]; + if (ops.length === 0) return; + + const serialized = JSON.parse(serializeMongoOps(ops)); + const controlDriver = await mongoControlDriver.create(replSetUri); + try { + const runner = new MongoMigrationRunner(); + const runResult = await runner.execute({ + plan: { + targetId: 'mongo', + ...(origin ? { origin: { storageHash: origin.storage.storageHash } } : {}), + destination: { storageHash: destination.storage.storageHash }, + operations: serialized, + }, + driver: controlDriver, + destinationContract: destination, + policy: ALL_POLICY, + frameworkComponents: [], + }); + if (!runResult.ok) { + throw new Error(`Apply failed: ${JSON.stringify(runResult)}`); + } + } finally { + await controlDriver.close(); + } +} + +describe('MongoDB migration M2 vocabulary E2E', { timeout: timeouts.spinUpDbServer }, () => { + let replSet: MongoMemoryReplSet; + let client: MongoClient; + let db: Db; + const dbName = 'migration_m2_test'; + let replSetUri: string; + + beforeAll(async () => { + replSet = await MongoMemoryReplSet.create({ + instanceOpts: [{ launchTimeout: timeouts.spinUpDbServer, storageEngine: 'wiredTiger' }], + replSet: { count: 1, storageEngine: 'wiredTiger' }, + }); + client = new MongoClient(replSet.getUri()); + await client.connect(); + db = client.db(dbName); + replSetUri = replSet.getUri(dbName); + }, timeouts.spinUpDbServer); + + beforeEach(async () => { + await db.dropDatabase(); + }); + + afterAll(async () => { + try { + await client?.close(); + await replSet?.stop(); + } catch { + // ignore + } + }, timeouts.spinUpDbServer); + + describe('compound indexes', () => { + it('creates a compound ascending + descending index', async () => { + const contract = makeContract( + { + users: { + indexes: [ + { + keys: [ + { field: 'lastName', direction: 1 }, + { field: 'firstName', direction: -1 }, + ], + }, + ], + }, + }, + 'compound-idx', + ); + + await planAndApply(db, replSetUri, null, contract); + + const indexes = await db.collection('users').listIndexes().toArray(); + const compound = indexes.find( + (idx) => idx['key']?.['lastName'] === 1 && idx['key']?.['firstName'] === -1, + ); + expect(compound).toBeDefined(); + }); + }); + + describe('text indexes', () => { + it('creates a text index with weights, default_language, and language_override', async () => { + const contract = makeContract( + { + articles: { + indexes: [ + { + keys: [ + { field: 'title', direction: 'text' }, + { field: 'body', direction: 'text' }, + ], + weights: { title: 10, body: 5 }, + default_language: 'english', + language_override: 'idioma', + }, + ], + }, + }, + 'text-idx', + ); + + await planAndApply(db, replSetUri, null, contract); + + const indexes = await db.collection('articles').listIndexes().toArray(); + const textIdx = indexes.find((idx) => idx['key']?.['_fts'] === 'text'); + expect(textIdx).toBeDefined(); + expect(textIdx!['weights']).toEqual({ title: 10, body: 5 }); + expect(textIdx!['default_language']).toBe('english'); + expect(textIdx!['language_override']).toBe('idioma'); + }); + }); + + describe('TTL indexes', () => { + it('creates a TTL index', async () => { + const contract = makeContract( + { + sessions: { + indexes: [ + { + keys: [{ field: 'expiresAt', direction: 1 }], + expireAfterSeconds: 3600, + }, + ], + }, + }, + 'ttl-idx', + ); + + await planAndApply(db, replSetUri, null, contract); + + const indexes = await db.collection('sessions').listIndexes().toArray(); + const ttlIdx = indexes.find((idx) => idx['key']?.['expiresAt'] === 1); + expect(ttlIdx).toBeDefined(); + expect(ttlIdx!['expireAfterSeconds']).toBe(3600); + }); + }); + + describe('hashed indexes', () => { + it('creates a hashed index', async () => { + const contract = makeContract( + { + items: { + indexes: [{ keys: [{ field: 'shard_key', direction: 'hashed' }] }], + }, + }, + 'hashed-idx', + ); + + await planAndApply(db, replSetUri, null, contract); + + const indexes = await db.collection('items').listIndexes().toArray(); + const hashIdx = indexes.find((idx) => idx['key']?.['shard_key'] === 'hashed'); + expect(hashIdx).toBeDefined(); + }); + }); + + describe('2dsphere indexes', () => { + it('creates a 2dsphere geospatial index', async () => { + const contract = makeContract( + { + places: { + indexes: [{ keys: [{ field: 'location', direction: '2dsphere' }] }], + }, + }, + '2dsphere-idx', + ); + + await planAndApply(db, replSetUri, null, contract); + + const indexes = await db.collection('places').listIndexes().toArray(); + const geoIdx = indexes.find((idx) => idx['key']?.['location'] === '2dsphere'); + expect(geoIdx).toBeDefined(); + }); + }); + + describe('partial indexes', () => { + it('creates a partial index with partialFilterExpression', async () => { + const contract = makeContract( + { + users: { + indexes: [ + { + keys: [{ field: 'email', direction: 1 }], + unique: true, + partialFilterExpression: { email: { $exists: true } }, + }, + ], + }, + }, + 'partial-idx', + ); + + await planAndApply(db, replSetUri, null, contract); + + const indexes = await db.collection('users').listIndexes().toArray(); + const partialIdx = indexes.find((idx) => idx['key']?.['email'] === 1); + expect(partialIdx).toBeDefined(); + expect(partialIdx!['unique']).toBe(true); + expect(partialIdx!['partialFilterExpression']).toEqual({ email: { $exists: true } }); + }); + }); + + describe('indexes with collation', () => { + it('creates an index with case-insensitive collation', async () => { + const contract = makeContract( + { + users: { + indexes: [ + { + keys: [{ field: 'name', direction: 1 }], + collation: { locale: 'en', strength: 2 }, + }, + ], + }, + }, + 'collation-idx', + ); + + await planAndApply(db, replSetUri, null, contract); + + const indexes = await db.collection('users').listIndexes().toArray(); + const collIdx = indexes.find((idx) => idx['key']?.['name'] === 1); + expect(collIdx).toBeDefined(); + expect(collIdx!['collation']?.['locale']).toBe('en'); + expect(collIdx!['collation']?.['strength']).toBe(2); + }); + }); + + describe('wildcard indexes', () => { + it('creates a wildcard index with wildcardProjection', async () => { + const contract = makeContract( + { + events: { + indexes: [ + { + keys: [{ field: '$**', direction: 1 }], + wildcardProjection: { metadata: 1 }, + }, + ], + }, + }, + 'wildcard-idx', + ); + + await planAndApply(db, replSetUri, null, contract); + + const indexes = await db.collection('events').listIndexes().toArray(); + const wcIdx = indexes.find((idx) => idx['key']?.['$**'] === 1); + expect(wcIdx).toBeDefined(); + }); + }); + + describe('modify indexes', () => { + it('drops old index and creates new index on change', async () => { + const v1 = makeContract( + { + users: { + indexes: [{ keys: [{ field: 'email', direction: 1 }], unique: true }], + }, + }, + 'modify-v1', + ); + + await planAndApply(db, replSetUri, null, v1); + + let indexes = await db.collection('users').listIndexes().toArray(); + expect(indexes.some((idx) => idx['key']?.['email'] === 1)).toBe(true); + + const v2 = makeContract( + { + users: { + indexes: [{ keys: [{ field: 'name', direction: 1 }], sparse: true }], + }, + }, + 'modify-v2', + ); + + await planAndApply(db, replSetUri, v1, v2); + + indexes = await db.collection('users').listIndexes().toArray(); + expect(indexes.some((idx) => idx['key']?.['email'] === 1)).toBe(false); + const nameIdx = indexes.find((idx) => idx['key']?.['name'] === 1); + expect(nameIdx).toBeDefined(); + expect(nameIdx!['sparse']).toBe(true); + }); + }); + + describe('validators via collMod', () => { + it('adds a validator to an existing collection', async () => { + const v1 = makeContract( + { + users: { + indexes: [{ keys: [{ field: 'email', direction: 1 }] }], + }, + }, + 'validator-v1', + ); + + await planAndApply(db, replSetUri, null, v1); + + const v2 = makeContract( + { + users: { + indexes: [{ keys: [{ field: 'email', direction: 1 }] }], + validator: { + jsonSchema: { + bsonType: 'object', + required: ['email'], + properties: { + email: { bsonType: 'string' }, + }, + }, + validationLevel: 'strict', + validationAction: 'error', + }, + }, + }, + 'validator-v2', + ); + + await planAndApply(db, replSetUri, v1, v2); + + const colls = await db.listCollections({ name: 'users' }).toArray(); + expect(colls).toHaveLength(1); + const collOptions = (colls[0] as Record)['options'] as + | Record + | undefined; + expect(collOptions?.['validator']).toBeDefined(); + }); + + it('removes a validator from a collection', async () => { + const withValidator = makeContract( + { + users: { + indexes: [{ keys: [{ field: 'email', direction: 1 }] }], + validator: { + jsonSchema: { + bsonType: 'object', + required: ['email'], + properties: { email: { bsonType: 'string' } }, + }, + validationLevel: 'strict', + validationAction: 'error', + }, + }, + }, + 'val-remove-v1', + ); + + await planAndApply(db, replSetUri, null, withValidator); + + const withoutValidator = makeContract( + { + users: { + indexes: [{ keys: [{ field: 'email', direction: 1 }] }], + }, + }, + 'val-remove-v2', + ); + + await planAndApply(db, replSetUri, withValidator, withoutValidator); + + const colls = await db.listCollections({ name: 'users' }).toArray(); + expect(colls).toHaveLength(1); + const collOptions = (colls[0] as Record)['options'] as + | Record + | undefined; + const validator = collOptions?.['validator'] as Record | undefined; + const isEffectivelyEmpty = !validator || Object.keys(validator).length === 0; + expect(isEffectivelyEmpty).toBe(true); + }); + }); + + describe('collection with options', () => { + it('creates a capped collection', async () => { + const contract = makeContract( + { + logs: { + options: { + capped: { size: 10_000_000, max: 1000 }, + }, + }, + }, + 'capped-coll', + ); + + await planAndApply(db, replSetUri, null, contract); + + const colls = await db.listCollections({ name: 'logs' }).toArray(); + expect(colls).toHaveLength(1); + const logsInfo = colls[0] as Record; + const logsOpts = logsInfo['options'] as Record | undefined; + expect(logsOpts?.['capped']).toBe(true); + expect(logsOpts?.['size']).toBeGreaterThanOrEqual(10_000_000); + expect(logsOpts?.['max']).toBe(1000); + }); + + it('creates a collection with collation', async () => { + const contract = makeContract( + { + posts: { + options: { + collation: { locale: 'en', strength: 2 }, + }, + }, + }, + 'collation-coll', + ); + + await planAndApply(db, replSetUri, null, contract); + + const colls = await db.listCollections({ name: 'posts' }).toArray(); + expect(colls).toHaveLength(1); + const postsInfo = colls[0] as Record; + const postsOpts = postsInfo['options'] as Record | undefined; + const collation = postsOpts?.['collation'] as Record | undefined; + expect(collation?.['locale']).toBe('en'); + expect(collation?.['strength']).toBe(2); + }); + + it('creates a collection with changeStreamPreAndPostImages and toggles it', async () => { + const v1 = makeContract( + { + events: { + options: { + changeStreamPreAndPostImages: { enabled: true }, + }, + }, + }, + 'csppi-v1', + ); + + await planAndApply(db, replSetUri, null, v1); + + let colls = await db.listCollections({ name: 'events' }).toArray(); + expect(colls).toHaveLength(1); + const eventsInfoV1 = colls[0] as Record; + const eventsOptsV1 = eventsInfoV1['options'] as Record | undefined; + expect( + (eventsOptsV1?.['changeStreamPreAndPostImages'] as Record | undefined)?.[ + 'enabled' + ], + ).toBe(true); + + const v2 = makeContract( + { + events: { + options: { + changeStreamPreAndPostImages: { enabled: false }, + }, + }, + }, + 'csppi-v2', + ); + + await planAndApply(db, replSetUri, v1, v2); + + colls = await db.listCollections({ name: 'events' }).toArray(); + expect(colls).toHaveLength(1); + const eventsInfoV2 = colls[0] as Record; + const eventsOptsV2 = eventsInfoV2['options'] as Record | undefined; + const csppiAfter = eventsOptsV2?.['changeStreamPreAndPostImages'] as + | Record + | undefined; + const disabledOrRemoved = csppiAfter === undefined || csppiAfter['enabled'] === false; + expect(disabledOrRemoved).toBe(true); + }); + + it('creates a timeseries collection', async () => { + const contract = makeContract( + { + metrics: { + options: { + timeseries: { timeField: 'ts', granularity: 'hours' }, + }, + }, + }, + 'timeseries-coll', + ); + + try { + await planAndApply(db, replSetUri, null, contract); + } catch (e) { + const msg = e instanceof Error ? e.message : String(e); + if (msg.includes('not supported') || msg.includes('requires')) { + console.log(`Skipping timeseries test: ${msg}`); + return; + } + throw e; + } + + const colls = await db.listCollections({ name: 'metrics' }).toArray(); + expect(colls).toHaveLength(1); + const metricsInfo = colls[0] as Record; + expect(metricsInfo['type']).toBe('timeseries'); + const metricsOpts = metricsInfo['options'] as Record | undefined; + const tsOpts = metricsOpts?.['timeseries'] as Record | undefined; + expect(tsOpts?.['timeField']).toBe('ts'); + expect(tsOpts?.['granularity']).toBe('hours'); + }); + + it('creates a collection with clusteredIndex', async () => { + const contract = makeContract( + { + clustered: { + options: { + clusteredIndex: { name: 'myCluster' }, + }, + }, + }, + 'clustered-coll', + ); + + try { + await planAndApply(db, replSetUri, null, contract); + } catch (e) { + const msg = e instanceof Error ? e.message : String(e); + if (msg.includes('not supported') || msg.includes('requires') || msg.includes('unknown')) { + console.log(`Skipping clusteredIndex test: ${msg}`); + return; + } + throw e; + } + + const colls = await db.listCollections({ name: 'clustered' }).toArray(); + expect(colls).toHaveLength(1); + const clusteredInfo = colls[0] as Record; + const clusteredOpts = clusteredInfo['options'] as Record | undefined; + expect(clusteredOpts?.['clusteredIndex']).toBeDefined(); + }); + }); + + describe('collection drops', () => { + it('drops a collection when it disappears from the destination contract', async () => { + const v1 = makeContract( + { + users: { + indexes: [{ keys: [{ field: 'email', direction: 1 }] }], + }, + posts: { + indexes: [{ keys: [{ field: 'title', direction: 1 }] }], + }, + }, + 'drop-v1', + ); + + await planAndApply(db, replSetUri, null, v1); + + let collNames = (await db.listCollections().toArray()).map((c) => c['name']); + expect(collNames).toContain('users'); + expect(collNames).toContain('posts'); + + const v2 = makeContract( + { + users: { + indexes: [{ keys: [{ field: 'email', direction: 1 }] }], + }, + }, + 'drop-v2', + ); + + await planAndApply(db, replSetUri, v1, v2); + + collNames = (await db.listCollections().toArray()) + .map((c) => c['name'] as string) + .filter((n) => !n.startsWith('_prisma') && !n.startsWith('system.')); + expect(collNames).toContain('users'); + expect(collNames).not.toContain('posts'); + }); + }); + + describe('full lifecycle: create → modify → remove', () => { + it('exercises a multi-step lifecycle for diverse index types', async () => { + const v1 = makeContract( + { + articles: { + indexes: [ + { + keys: [ + { field: 'title', direction: 'text' }, + { field: 'body', direction: 'text' }, + ], + weights: { title: 10, body: 5 }, + default_language: 'english', + }, + { keys: [{ field: 'createdAt', direction: 1 }], expireAfterSeconds: 86400 }, + ], + }, + }, + 'lifecycle-v1', + ); + + await planAndApply(db, replSetUri, null, v1); + + let indexes = await db.collection('articles').listIndexes().toArray(); + expect(indexes.some((idx) => idx['key']?.['_fts'] === 'text')).toBe(true); + expect(indexes.some((idx) => idx['key']?.['createdAt'] === 1)).toBe(true); + + const v2 = makeContract( + { + articles: { + indexes: [ + { + keys: [ + { field: 'title', direction: 'text' }, + { field: 'body', direction: 'text' }, + ], + weights: { title: 10, body: 5 }, + default_language: 'english', + }, + { keys: [{ field: 'authorId', direction: 1 }] }, + ], + validator: { + jsonSchema: { + bsonType: 'object', + required: ['title'], + properties: { title: { bsonType: 'string' } }, + }, + validationLevel: 'moderate', + validationAction: 'warn', + }, + }, + }, + 'lifecycle-v2', + ); + + await planAndApply(db, replSetUri, v1, v2); + + indexes = await db.collection('articles').listIndexes().toArray(); + expect(indexes.some((idx) => idx['key']?.['_fts'] === 'text')).toBe(true); + expect(indexes.some((idx) => idx['key']?.['createdAt'] === 1)).toBe(false); + expect(indexes.some((idx) => idx['key']?.['authorId'] === 1)).toBe(true); + + const colls = await db.listCollections({ name: 'articles' }).toArray(); + const articlesInfo = colls[0] as Record; + const articlesOpts = articlesInfo['options'] as Record | undefined; + expect(articlesOpts?.['validator']).toBeDefined(); + + const v3 = makeContract( + { + articles: {}, + }, + 'lifecycle-v3', + ); + + await planAndApply(db, replSetUri, v2, v3); + + indexes = await db.collection('articles').listIndexes().toArray(); + const nonIdIndexes = indexes.filter((idx) => idx['name'] !== '_id_'); + expect(nonIdIndexes).toHaveLength(0); + }); + }); +}); diff --git a/test/integration/test/mongo/migration-psl-authoring.test.ts b/test/integration/test/mongo/migration-psl-authoring.test.ts new file mode 100644 index 0000000000..1d427ccaf9 --- /dev/null +++ b/test/integration/test/mongo/migration-psl-authoring.test.ts @@ -0,0 +1,452 @@ +import { + contractToMongoSchemaIR, + MongoMigrationPlanner, + MongoMigrationRunner, + serializeMongoOps, +} from '@prisma-next/adapter-mongo/control'; +import mongoControlDriver from '@prisma-next/driver-mongo/control'; +import type { MongoContract } from '@prisma-next/mongo-contract'; +import { + createMongoScalarTypeDescriptors, + interpretPslDocumentToMongoContract, +} from '@prisma-next/mongo-contract-psl'; +import type { MongoMigrationPlanOperation } from '@prisma-next/mongo-query-ast/control'; +import { parsePslDocument } from '@prisma-next/psl-parser'; +import { timeouts } from '@prisma-next/test-utils'; +import { type Db, MongoClient } from 'mongodb'; +import { MongoMemoryReplSet } from 'mongodb-memory-server'; +import { afterAll, beforeAll, beforeEach, describe, expect, it } from 'vitest'; + +const ALL_POLICY = { + allowedOperationClasses: ['additive', 'widening', 'destructive'] as const, +}; + +function pslToContract(schema: string): MongoContract { + const document = parsePslDocument({ schema, sourceId: 'test.prisma' }); + const result = interpretPslDocumentToMongoContract({ + document, + scalarTypeDescriptors: createMongoScalarTypeDescriptors(), + }); + if (!result.ok) { + throw new Error(`PSL interpretation failed: ${JSON.stringify(result)}`); + } + return result.value as MongoContract; +} + +async function planAndApply( + replSetUri: string, + origin: MongoContract | null, + destination: MongoContract, +): Promise { + const planner = new MongoMigrationPlanner(); + const schema = contractToMongoSchemaIR(origin); + const result = planner.plan({ + contract: destination, + schema, + policy: ALL_POLICY, + frameworkComponents: [], + }); + if (result.kind !== 'success') { + throw new Error(`Plan failed: ${JSON.stringify(result)}`); + } + const ops = result.plan.operations as readonly MongoMigrationPlanOperation[]; + if (ops.length === 0) return; + + const serialized = JSON.parse(serializeMongoOps(ops)); + const controlDriver = await mongoControlDriver.create(replSetUri); + try { + const runner = new MongoMigrationRunner(); + const runResult = await runner.execute({ + plan: { + targetId: 'mongo', + ...(origin ? { origin: { storageHash: origin.storage.storageHash } } : {}), + destination: { storageHash: destination.storage.storageHash }, + operations: serialized, + }, + driver: controlDriver, + destinationContract: destination, + policy: ALL_POLICY, + frameworkComponents: [], + }); + if (!runResult.ok) { + throw new Error(`Apply failed: ${JSON.stringify(runResult)}`); + } + } finally { + await controlDriver.close(); + } +} + +describe('PSL authoring → migration E2E', { timeout: timeouts.spinUpDbServer }, () => { + let replSet: MongoMemoryReplSet; + let client: MongoClient; + let db: Db; + const dbName = 'psl_authoring_e2e_test'; + let replSetUri: string; + + beforeAll(async () => { + replSet = await MongoMemoryReplSet.create({ + instanceOpts: [{ launchTimeout: timeouts.spinUpDbServer, storageEngine: 'wiredTiger' }], + replSet: { count: 1, storageEngine: 'wiredTiger' }, + }); + client = new MongoClient(replSet.getUri()); + await client.connect(); + db = client.db(dbName); + replSetUri = replSet.getUri(dbName); + }, timeouts.spinUpDbServer); + + beforeEach(async () => { + await db.dropDatabase(); + }); + + afterAll(async () => { + try { + await client?.close(); + await replSet?.stop(); + } catch { + // ignore + } + }, timeouts.spinUpDbServer); + + it('PSL with @@index produces indexes on MongoDB', async () => { + const contract = pslToContract(` + model User { + id ObjectId @id @map("_id") + email String + name String + @@index([email]) + @@unique([name]) + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('user').listIndexes().toArray(); + const emailIdx = indexes.find((idx) => idx['key']?.['email'] === 1); + expect(emailIdx).toBeDefined(); + + const nameIdx = indexes.find((idx) => idx['key']?.['name'] === 1); + expect(nameIdx).toBeDefined(); + expect(nameIdx!['unique']).toBe(true); + }); + + it('PSL with @unique on field produces single-field unique index', async () => { + const contract = pslToContract(` + model User { + id ObjectId @id @map("_id") + email String @unique + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('user').listIndexes().toArray(); + const emailIdx = indexes.find((idx) => idx['key']?.['email'] === 1); + expect(emailIdx).toBeDefined(); + expect(emailIdx!['unique']).toBe(true); + }); + + it('PSL with model fields produces $jsonSchema validator on MongoDB', async () => { + const contract = pslToContract(` + model User { + id ObjectId @id @map("_id") + name String + age Int + bio String? + } + `); + + await planAndApply(replSetUri, null, contract); + + const colls = await db.listCollections({ name: 'user' }).toArray(); + expect(colls).toHaveLength(1); + const options = (colls[0] as Record)['options'] as + | Record + | undefined; + expect(options?.['validator']).toBeDefined(); + const validator = options!['validator'] as Record; + const schema = validator['$jsonSchema'] as Record; + expect(schema['bsonType']).toBe('object'); + + const props = schema['properties'] as Record>; + expect(props['name']?.['bsonType']).toBe('string'); + expect(props['age']?.['bsonType']).toBe('int'); + expect(props['bio']?.['bsonType']).toEqual(['null', 'string']); + }); + + it('PSL with @@index + model fields produces both indexes and validator', async () => { + const contract = pslToContract(` + model Post { + id ObjectId @id @map("_id") + title String + createdAt DateTime + @@index([createdAt]) + } + `); + + const storage = contract.storage as unknown as Record< + string, + Record> + >; + const postColl = storage['collections']?.['post']; + expect(postColl?.['indexes']).toBeDefined(); + expect(postColl?.['validator']).toBeDefined(); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('post').listIndexes().toArray(); + const createdAtIdx = indexes.find((idx) => idx['key']?.['createdAt'] === 1); + expect(createdAtIdx).toBeDefined(); + + const colls = await db.listCollections({ name: 'post' }).toArray(); + const options = (colls[0] as Record)['options'] as + | Record + | undefined; + expect(options?.['validator']).toBeDefined(); + }); + + it('PSL with @map respects mapped names in indexes and validators', async () => { + const contract = pslToContract(` + model User { + id ObjectId @id @map("_id") + firstName String @map("first_name") + @@index([firstName]) + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('user').listIndexes().toArray(); + const idx = indexes.find((i) => i['key']?.['first_name'] === 1); + expect(idx).toBeDefined(); + + const colls = await db.listCollections({ name: 'user' }).toArray(); + const mapUserInfo = colls[0] as Record; + const mapUserOpts = mapUserInfo['options'] as Record | undefined; + const validator = mapUserOpts?.['validator'] as Record | undefined; + const schema = validator!['$jsonSchema'] as Record; + const props = schema['properties'] as Record; + expect(props['first_name']).toBeDefined(); + expect(props['firstName']).toBeUndefined(); + }); + + it('PSL with wildcard() produces wildcard index on MongoDB', async () => { + const contract = pslToContract(` + model Events { + id ObjectId @id @map("_id") + metadata String + @@index([wildcard()]) + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('events').listIndexes().toArray(); + const wildcardIdx = indexes.find((idx) => idx['key']?.['$**'] === 1); + expect(wildcardIdx).toBeDefined(); + }); + + it('PSL with scoped wildcard() produces scoped wildcard index', async () => { + const contract = pslToContract(` + model Events { + id ObjectId @id @map("_id") + metadata String + @@index([wildcard(metadata)]) + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('events').listIndexes().toArray(); + const wildcardIdx = indexes.find((idx) => idx['key']?.['metadata.$**'] === 1); + expect(wildcardIdx).toBeDefined(); + }); + + it('PSL with sort: Desc produces mixed-direction compound index', async () => { + const contract = pslToContract(` + model Events { + id ObjectId @id @map("_id") + status String + createdAt DateTime + @@index([status, createdAt(sort: Desc)]) + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('events').listIndexes().toArray(); + const compoundIdx = indexes.find( + (idx) => idx['key']?.['status'] === 1 && idx['key']?.['createdAt'] === -1, + ); + expect(compoundIdx).toBeDefined(); + }); + + it('PSL with filter produces partial filter expression index', async () => { + const contract = pslToContract(` + model Events { + id ObjectId @id @map("_id") + status String + @@index([status], filter: "{\\"status\\": \\"active\\"}") + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('events').listIndexes().toArray(); + const partialIdx = indexes.find( + (idx) => idx['key']?.['status'] === 1 && idx['partialFilterExpression'], + ); + expect(partialIdx).toBeDefined(); + expect(partialIdx!['partialFilterExpression']).toEqual({ status: 'active' }); + }); + + it('PSL with collation produces collated index', async () => { + const contract = pslToContract(` + model User { + id ObjectId @id @map("_id") + email String + @@index([email], collationLocale: "en", collationStrength: 2) + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('user').listIndexes().toArray(); + const collatedIdx = indexes.find((idx) => idx['key']?.['email'] === 1 && idx['collation']); + expect(collatedIdx).toBeDefined(); + expect(collatedIdx!['collation']?.['locale']).toBe('en'); + expect(collatedIdx!['collation']?.['strength']).toBe(2); + }); + + it('PSL with wildcard + include produces wildcardProjection', async () => { + const contract = pslToContract(` + model Events { + id ObjectId @id @map("_id") + metadata String + tags String + @@index([wildcard()], include: "[metadata, tags]") + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('events').listIndexes().toArray(); + const wcIdx = indexes.find((idx) => idx['key']?.['$**'] === 1); + expect(wcIdx).toBeDefined(); + expect(wcIdx!['wildcardProjection']).toEqual({ metadata: 1, tags: 1 }); + }); + + it('PSL with wildcard + exclude produces wildcardProjection', async () => { + const contract = pslToContract(` + model Events { + id ObjectId @id @map("_id") + internal String + @@index([wildcard()], exclude: "[internal]") + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('events').listIndexes().toArray(); + const wcIdx = indexes.find((idx) => idx['key']?.['$**'] === 1); + expect(wcIdx).toBeDefined(); + expect(wcIdx!['wildcardProjection']).toEqual({ internal: 0 }); + }); + + it('PSL with @@textIndex produces text index on MongoDB', async () => { + const contract = pslToContract(` + model Article { + id ObjectId @id @map("_id") + title String + body String + @@textIndex([title, body]) + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('article').listIndexes().toArray(); + const textIdx = indexes.find((idx) => idx['key']?.['_fts'] === 'text'); + expect(textIdx).toBeDefined(); + expect(textIdx!['weights']?.['title']).toBeDefined(); + expect(textIdx!['weights']?.['body']).toBeDefined(); + }); + + it('PSL with @@textIndex + weights produces weighted text index', async () => { + const contract = pslToContract(` + model Article { + id ObjectId @id @map("_id") + title String + body String + @@textIndex([title, body], weights: "{\\"title\\": 10, \\"body\\": 5}", language: "english") + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('article').listIndexes().toArray(); + const textIdx = indexes.find((idx) => idx['key']?.['_fts'] === 'text'); + expect(textIdx).toBeDefined(); + expect(textIdx!['weights']?.['title']).toBe(10); + expect(textIdx!['weights']?.['body']).toBe(5); + expect(textIdx!['default_language']).toBe('english'); + }); + + it('PSL with type: "hashed" produces hashed index', async () => { + const contract = pslToContract(` + model Events { + id ObjectId @id @map("_id") + tenantId String + @@index([tenantId], type: "hashed") + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('events').listIndexes().toArray(); + const hashedIdx = indexes.find((idx) => idx['key']?.['tenantId'] === 'hashed'); + expect(hashedIdx).toBeDefined(); + }); + + it('PSL with type: "2dsphere" produces 2dsphere index', async () => { + const contract = pslToContract(` + model Places { + id ObjectId @id @map("_id") + location String + @@index([location], type: "2dsphere") + } + `); + + await planAndApply(replSetUri, null, contract); + + const indexes = await db.collection('places').listIndexes().toArray(); + const geoIdx = indexes.find((idx) => idx['key']?.['location'] === '2dsphere'); + expect(geoIdx).toBeDefined(); + }); + + it('PSL with value objects produces nested $jsonSchema', async () => { + const contract = pslToContract(` + type Address { + street String + city String + } + + model User { + id ObjectId @id @map("_id") + address Address + } + `); + + await planAndApply(replSetUri, null, contract); + + const colls = await db.listCollections({ name: 'user' }).toArray(); + const voUserInfo = colls[0] as Record; + const voUserOpts = voUserInfo['options'] as Record | undefined; + const validator = voUserOpts?.['validator'] as Record | undefined; + const schema = validator!['$jsonSchema'] as Record; + const props = schema['properties'] as Record>; + expect(props['address']?.['bsonType']).toBe('object'); + const addressProps = props['address']?.['properties'] as Record; + expect(addressProps['street']).toBeDefined(); + expect(addressProps['city']).toBeDefined(); + }); +}); diff --git a/test/integration/test/mongo/orm.test.ts b/test/integration/test/mongo/orm.test.ts index 82094d0c06..07b2589a34 100644 --- a/test/integration/test/mongo/orm.test.ts +++ b/test/integration/test/mongo/orm.test.ts @@ -11,7 +11,7 @@ const { contract } = validateMongoContract(ormContractJson); describeWithMongoDB('mongoOrm integration', (ctx) => { it('loads generated collection indexes and options', () => { expect(contract.storage.collections.users).toEqual({ - indexes: [{ fields: { email: 1 }, options: { unique: true } }], + indexes: [{ keys: [{ field: 'email', direction: 1 }], unique: true }], options: { collation: { locale: 'en', strength: 2 }, },