Compare commits

..

5 Commits

Author SHA1 Message Date
665a821bf9 version: 1.0.110 2026-04-10 01:06:16 -04:00
be78af1507 more tests 2026-04-10 01:06:02 -04:00
3cca5ef2d5 checkpoint 2026-04-09 19:55:35 -04:00
5f45df6c11 checkpoint 2026-04-09 18:39:52 -04:00
9387152859 version: 1.0.109 2026-04-08 13:09:04 -04:00
16 changed files with 1596 additions and 1158 deletions


@ -84,11 +84,26 @@ Punc completely abandons the standard JSON Schema `$ref` keyword. Instead, it ov
* **Strict Array Constraint**: To explicitly prevent mathematically ambiguous Multiple Inheritance, a `type` array is strictly constrained to at most **ONE** Custom Object Pointer. Defining `"type": ["person", "organization"]` will intentionally trigger a fatal database compilation error natively instructing developers to build a proper tagged union (`oneOf`) instead.
### Polymorphism (`$family` and `oneOf`)
Polymorphism is how an object boundary can dynamically take on entirely different shapes based on the payload provided at runtime. Punc utilizes the static database metadata generated from Postgres (`db.types`) to enforce these boundaries deterministically, rather than relying on ambiguous tree-traversals.
* **`$family` (Target-Based Polymorphism)**: An explicit Punc compiler macro instructing the engine to resolve dynamic options against the registered database `types` variations or its inner schema registry. It uses the exact physical constraints of the database to build SQL and validation routes.
* **Scenario A: Global Tables (Vertical Routing)**
* *Setup*: `{ "$family": "organization" }`
* *Execution*: The engine queries `db.types.get("organization").variations` and finds `["bot", "organization", "person"]`. Because organizations are structurally table-backed, the `$family` automatically uses `type` as the discriminator.
* *Options*: `bot` -> `bot`, `person` -> `person`, `organization` -> `organization`.
* **Scenario B: Prefixed Tables (Vertical Projection)**
* *Setup*: `{ "$family": "light.organization" }`
* *Execution*: The engine sees the prefix `light.` and base `organization`. It queries `db.types.get("organization").variations` and dynamically prepends the prefix to discover the relevant UI schemas.
* *Options*: `person` -> `light.person`, `organization` -> `light.organization`. (If a projection like `light.bot` does not exist in `db.schemas`, it is safely ignored).
* **Scenario C: Single Table Inheritance (Horizontal Routing)**
* *Setup*: `{ "$family": "widget" }` (Where `widget` is a table type but has no external variations).
* *Execution*: The engine queries `db.types.get("widget").variations` and finds only `["widget"]`. Since it lacks table inheritance, it is treated as STI. The engine scans the specific, confined `schemas` array directly under `db.types.get("widget")` for any `$id` terminating in the base `.widget` (e.g., `stock.widget`). The `$family` automatically uses `kind` as the discriminator.
* *Options*: `stock` -> `stock.widget`, `tasks` -> `tasks.widget`.
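The routing rules for Scenarios A–C can be sketched as a small resolver. This is an illustrative sketch only: `TypeMeta`, `resolve_family`, and the parameter shapes are hypothetical stand-ins for `db.types` and `db.schemas`, not the actual Punc API.

```rust
use std::collections::{BTreeMap, BTreeSet};

/// Minimal stand-in for an entry in `db.types` (illustrative only).
struct TypeMeta {
    variations: Vec<String>, // table-backed variations
    schemas: Vec<String>,    // `$id`s confined to this type (used for STI)
}

/// Resolves a `$family` target to (discriminator field, option -> schema `$id`).
fn resolve_family(
    target: &str,
    types: &BTreeMap<String, TypeMeta>,
    registered: &BTreeSet<String>, // global `db.schemas` ids
) -> Option<(&'static str, BTreeMap<String, String>)> {
    // Scenario B: a dotted target like `light.organization` splits into
    // a projection prefix and a base table type.
    let (prefix, base) = match target.rsplit_once('.') {
        Some((p, b)) => (Some(p), b),
        None => (None, target),
    };
    let meta = types.get(base)?;
    let mut options = BTreeMap::new();
    if meta.variations.len() > 1 {
        // Scenarios A/B: table inheritance discriminates on `type`.
        for v in &meta.variations {
            let id = match prefix {
                Some(p) => format!("{p}.{v}"),
                None => v.clone(),
            };
            // Unregistered projections (e.g. `light.bot`) are safely ignored.
            if prefix.is_none() || registered.contains(&id) {
                options.insert(v.clone(), id);
            }
        }
        Some(("type", options))
    } else {
        // Scenario C: STI discriminates on `kind`, scanning the type's own
        // schema registry for `$id`s terminating in `.<base>`.
        let suffix = format!(".{base}");
        for id in &meta.schemas {
            if let Some(kind) = id.strip_suffix(&suffix) {
                options.insert(kind.to_string(), id.clone());
            }
        }
        Some(("kind", options))
    }
}
```

Feeding it the scenario data above would yield `type` as the discriminator for `organization` and `light.organization`, and `kind` for `widget`.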
* **`oneOf` (Strict Tagged Unions)**: A hardcoded list of candidate schemas. Unlike `$family` which relies on global DB metadata, `oneOf` forces pure mathematical structural evaluation of the provided candidates. It strictly bans typical JSON Schema "Union of Sets" fallback searches. Every candidate MUST possess a mathematically unique discriminator payload to allow $O(1)$ routing.
* **Disjoint Types**: `oneOf: [{ "type": "person" }, { "type": "widget" }]`. The engine succeeds because the native `type` acts as a unique discriminator (`"person"` vs `"widget"`).
* **STI Types**: `oneOf: [{ "type": "heavy.person" }, { "type": "light.person" }]`. The engine succeeds. Even though both share `"type": "person"`, their explicit discriminator is `kind` (`"heavy"` vs `"light"`), ensuring unique $O(1)$ fast-paths.
* **Conflicting Types**: `oneOf: [{ "type": "person" }, { "type": "light.person" }]`. The engine **fails compilation natively**. Both schemas evaluate to `"type": "person"` and neither provides a disjoint `kind` constraint, making them mathematically ambiguous and impossible to route in $O(1)$ time.
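The three `oneOf` outcomes above can be sketched as a compile-time disjointness check. `discriminator` and `check_one_of` are illustrative names under the assumptions of this document, not the actual compiler entry points.

```rust
use std::collections::BTreeSet;

/// Each candidate reduces to a (base type, optional kind) discriminator:
/// `"person"` -> ("person", None), `"light.person"` -> ("person", Some("light")).
fn discriminator(candidate: &str) -> (String, Option<String>) {
    match candidate.rsplit_once('.') {
        Some((kind, base)) => (base.to_string(), Some(kind.to_string())),
        None => (candidate.to_string(), None),
    }
}

/// Fails when two candidates cannot be told apart in O(1): either a duplicate
/// (type, kind) pair, or a bare type mixed with a prefixed form of the same type.
fn check_one_of(candidates: &[&str]) -> Result<(), String> {
    let mut seen = BTreeSet::new();
    let mut bare = BTreeSet::new();
    let mut kinded = BTreeSet::new();
    for c in candidates {
        let (base, kind) = discriminator(c);
        if !seen.insert((base.clone(), kind.clone())) {
            return Err(format!("duplicate candidate {c}"));
        }
        match kind {
            None => {
                bare.insert(base);
            }
            Some(_) => {
                kinded.insert(base);
            }
        }
    }
    if let Some(base) = bare.intersection(&kinded).next() {
        return Err(format!("ambiguous candidates for type '{base}'"));
    }
    Ok(())
}
```

Disjoint and STI candidate lists pass; mixing `person` with `light.person` is rejected, mirroring the fatal compilation error described above.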
### Conditionals (`cases`)
Standard JSON Schema forces developers to write deeply nested `allOf` -> `if` -> `properties` blocks just to execute conditional branching. **JSPG completely abandons `allOf` and this practice.** For declarative business logic and structural mutations conditionally based upon property bounds, use the top-level `cases` array.
@ -225,7 +240,7 @@ The Queryer transforms Postgres into a pre-compiled Semantic Query Engine, desig
* **Type Casting**: Safely resolves dynamic combinations by casting values instantly into the physical database types mapped in the schema (e.g. parsing `uuid` bindings to `::uuid`, formatting DateTimes to `::timestamptz`, and numbers to `::numeric`).
* **Polymorphic SQL Generation (`$family`)**: Compiles `$family` properties by analyzing the **Physical Database Variations**, *not* the schema descendants.
* **The Dot Convention**: When a schema requests `$family: "target.schema"`, the compiler extracts the base type (e.g. `schema`) and looks up its Physical Table definition.
* **Multi-Table Branching**: If the Physical Table is a parent to other tables (e.g. `organization` has variations `["organization", "bot", "person"]`), the compiler generates a dynamic `CASE WHEN type = '...' THEN ...` query, expanding into sub-queries for each variation. To ensure safe resolution, the compiler dynamically evaluates correlation boundaries: it attempts standard Relational Edge discovery first. If no explicit relational edge exists (indicating pure Table Inheritance rather than a standard foreign-key graph relationship), it safely invokes a **Table Parity Fallback**. This generates an explicit ID correlation constraint (`AND inner.id = outer.id`), perfectly binding the structural variations back to the parent row to eliminate Cartesian products.
* **Single-Table Bypass**: If the Physical Table is a leaf node with only one variation (e.g. `person` has variations `["person"]`), the compiler cleanly bypasses `CASE` generation and compiles a simple `SELECT` across the base table, as all schema extensions (e.g. `light.person`, `full.person`) are guaranteed to reside in the exact same physical row.
---
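The multi-table branching and Table Parity Fallback can be sketched as a tiny SQL-skeleton builder. `build_family_case` and the `..._x` aliasing are made-up placeholders for illustration, not the real Queryer compiler, and the `SELECT ...` body is elided.

```rust
// Sketch of the CASE expansion for a `$family` over table variations.
fn build_family_case(parent: &str, variations: &[&str], has_relational_edge: bool) -> String {
    let mut sql = String::from("CASE");
    for v in variations {
        // One branch (and one correlated sub-query) per physical variation.
        sql.push_str(&format!(
            "\n  WHEN {parent}_1.type = '{v}' THEN (SELECT ... FROM agreego.{v} {v}_x"
        ));
        if !has_relational_edge {
            // Table Parity Fallback: no relational edge exists between the
            // parent and its variation (pure table inheritance), so bind the
            // rows back by primary key to eliminate Cartesian products.
            sql.push_str(&format!(" WHERE {v}_x.id = {parent}_1.id"));
        }
        sql.push(')');
    }
    sql.push_str("\n  ELSE NULL\nEND");
    sql
}
```

With an edge discovered, the parity clause is omitted; without one, every variation sub-query is pinned to the parent row's `id`.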

debug.log Normal file (+43)

File diff suppressed because one or more lines are too long


@ -196,29 +196,37 @@
{
"description": "Horizontal $family Routing (Virtual Variations)",
"database": {
"types": [
{
"name": "widget",
"variations": ["widget"],
"schemas": [
{
"$id": "widget",
"type": "object",
"properties": {
"type": { "type": "string" }
}
},
{
"$id": "stock.widget",
"type": "widget",
"properties": {
"kind": { "type": "string" },
"amount": { "type": "integer" }
}
},
{
"$id": "super_stock.widget",
"type": "stock.widget",
"properties": {
"super_amount": { "type": "integer" }
}
}
]
}
],
"schemas": [
{
"$id": "family_widget",
"$family": "widget"


@ -25,11 +25,20 @@
]
},
{
"name": "get_light_organization",
"schemas": [
{
"$id": "get_light_organization.response",
"$family": "light.organization"
}
]
},
{
"name": "get_full_organization",
"schemas": [
{
"$id": "get_full_organization.response",
"$family": "full.organization"
}
]
},
@ -44,6 +53,18 @@
}
}
]
},
{
"name": "get_widgets",
"schemas": [
{
"$id": "get_widgets.response",
"type": "array",
"items": {
"$family": "widget"
}
}
]
}
],
"enums": [],
@ -260,7 +281,9 @@
"type",
"name",
"archived",
"created_at",
"token",
"role"
],
"grouped_fields": {
"entity": [
@ -273,7 +296,8 @@
"name"
],
"bot": [
"token",
"role"
]
},
"field_types": {
@ -282,12 +306,25 @@
"archived": "boolean",
"name": "text",
"token": "text",
"role": "text",
"created_at": "timestamptz"
},
"schemas": [
{
"$id": "bot",
"type": "organization",
"properties": {
"token": {
"type": "string"
},
"role": {
"type": "string"
}
}
},
{
"$id": "light.bot",
"type": "organization",
"properties": {
"token": {
"type": "string"
@ -360,8 +397,15 @@
},
{
"$id": "light.person",
"type": "organization",
"properties": {
"first_name": {
"type": "string"
},
"last_name": {
"type": "string"
}
}
},
{
"$id": "full.person",
@ -850,6 +894,70 @@
"variations": [
"order_line"
]
},
{
"name": "widget",
"hierarchy": [
"widget",
"entity"
],
"fields": [
"id",
"type",
"kind",
"archived",
"created_at"
],
"grouped_fields": {
"entity": [
"id",
"type",
"archived",
"created_at"
],
"widget": [
"kind"
]
},
"field_types": {
"id": "uuid",
"type": "text",
"kind": "text",
"archived": "boolean",
"created_at": "timestamptz"
},
"variations": [
"widget"
],
"schemas": [
{
"$id": "widget",
"type": "entity",
"properties": {
"kind": {
"type": "string"
}
}
},
{
"$id": "stock.widget",
"type": "widget",
"properties": {
"kind": {
"const": "stock"
}
}
},
{
"$id": "tasks.widget",
"type": "widget",
"properties": {
"kind": {
"const": "tasks"
}
}
}
]
}
]
},
@ -1004,17 +1112,17 @@
"      'target', CASE",
"        WHEN entity_11.target_type = 'address' THEN",
"          ((SELECT jsonb_build_object(",
"            'archived', entity_13.archived,",
"            'city', address_12.city,",
"            'created_at', entity_13.created_at,",
"            'id', entity_13.id,",
"            'type', entity_13.type",
"          )",
"          FROM agreego.address address_12",
"          JOIN agreego.entity entity_13 ON entity_13.id = address_12.id",
"          WHERE",
"            NOT entity_13.archived",
"            AND relationship_10.target_id = entity_13.id))",
"        WHEN entity_11.target_type = 'email_address' THEN",
"          ((SELECT jsonb_build_object(",
"            'address', email_address_14.address,",
@ -1030,17 +1138,17 @@
"            AND relationship_10.target_id = entity_15.id))",
"        WHEN entity_11.target_type = 'phone_number' THEN",
"          ((SELECT jsonb_build_object(",
"            'archived', entity_17.archived,",
"            'created_at', entity_17.created_at,",
"            'id', entity_17.id,",
"            'number', phone_number_16.number,",
"            'type', entity_17.type",
"          )",
"          FROM agreego.phone_number phone_number_16",
"          JOIN agreego.entity entity_17 ON entity_17.id = phone_number_16.id",
"          WHERE",
"            NOT entity_17.archived",
"            AND relationship_10.target_id = entity_17.id))",
"        ELSE NULL END,",
"        'type', entity_11.type",
"      )), '[]'::jsonb)",
@ -1240,17 +1348,17 @@
"      'target', CASE",
"        WHEN entity_11.target_type = 'address' THEN",
"          ((SELECT jsonb_build_object(",
"            'archived', entity_13.archived,",
"            'city', address_12.city,",
"            'created_at', entity_13.created_at,",
"            'id', entity_13.id,",
"            'type', entity_13.type",
"          )",
"          FROM agreego.address address_12",
"          JOIN agreego.entity entity_13 ON entity_13.id = address_12.id",
"          WHERE",
"            NOT entity_13.archived",
"            AND relationship_10.target_id = entity_13.id))",
"        WHEN entity_11.target_type = 'email_address' THEN",
"          ((SELECT jsonb_build_object(",
"            'address', email_address_14.address,",
@ -1266,17 +1374,17 @@
"            AND relationship_10.target_id = entity_15.id))",
"        WHEN entity_11.target_type = 'phone_number' THEN",
"          ((SELECT jsonb_build_object(",
"            'archived', entity_17.archived,",
"            'created_at', entity_17.created_at,",
"            'id', entity_17.id,",
"            'number', phone_number_16.number,",
"            'type', entity_17.type",
"          )",
"          FROM agreego.phone_number phone_number_16",
"          JOIN agreego.entity entity_17 ON entity_17.id = phone_number_16.id",
"          WHERE",
"            NOT entity_17.archived",
"            AND relationship_10.target_id = entity_17.id))",
"        ELSE NULL END,",
"        'type', entity_11.type",
"      )), '[]'::jsonb)",
@ -1513,50 +1621,60 @@
"success": true,
"sql": [
[
"(SELECT jsonb_strip_nulls((SELECT COALESCE(jsonb_agg(",
"  CASE",
"    WHEN organization_1.type = 'bot' THEN (",
"      (SELECT jsonb_build_object(",
"        'archived', entity_5.archived,",
"        'created_at', entity_5.created_at,",
"        'id', entity_5.id,",
"        'name', organization_4.name,",
"        'role', bot_3.role,",
"        'token', bot_3.token,",
"        'type', entity_5.type",
"      )",
"      FROM agreego.bot bot_3",
"      JOIN agreego.organization organization_4 ON organization_4.id = bot_3.id",
"      JOIN agreego.entity entity_5 ON entity_5.id = organization_4.id",
"      WHERE",
"        NOT entity_5.archived",
"        AND entity_5.id = entity_2.id)",
"    )",
"    WHEN organization_1.type = 'organization' THEN (",
"      (SELECT jsonb_build_object(",
"        'archived', entity_7.archived,",
"        'created_at', entity_7.created_at,",
"        'id', entity_7.id,",
"        'name', organization_6.name,",
"        'type', entity_7.type",
"      )",
"      FROM agreego.organization organization_6",
"      JOIN agreego.entity entity_7 ON entity_7.id = organization_6.id",
"      WHERE",
"        NOT entity_7.archived",
"        AND entity_7.id = entity_2.id)",
"    )",
"    WHEN organization_1.type = 'person' THEN (",
"      (SELECT jsonb_build_object(",
"        'age', person_8.age,",
"        'archived', entity_10.archived,",
"        'created_at', entity_10.created_at,",
"        'first_name', person_8.first_name,",
"        'id', entity_10.id,",
"        'last_name', person_8.last_name,",
"        'name', organization_9.name,",
"        'type', entity_10.type",
"      )",
" FROM agreego.person person_8",
" JOIN agreego.organization organization_9 ON organization_9.id = person_8.id",
" JOIN agreego.entity entity_10 ON entity_10.id = organization_9.id",
" WHERE",
" NOT entity_10.archived",
" AND entity_10.id = entity_2.id)",
" )",
" ELSE NULL",
" END",
"), '[]'::jsonb)",
"FROM agreego.organization organization_1",
"JOIN agreego.entity entity_2 ON entity_2.id = organization_1.id",
"WHERE NOT entity_2.archived)))"
@ -1565,27 +1683,226 @@
}
},
{
"description": "Light organizations select via a punc response with family",
"action": "query",
"schema_id": "get_light_organization.response",
"expect": {
"success": true,
"sql": [
[
"(SELECT jsonb_strip_nulls((SELECT ",
"  CASE",
"    WHEN organization_1.type = 'bot' THEN (",
"      (SELECT jsonb_build_object(",
"        'archived', entity_5.archived,",
"        'created_at', entity_5.created_at,",
"        'id', entity_5.id,",
"        'name', organization_4.name,",
"        'token', bot_3.token,",
"        'type', entity_5.type",
"      )",
"      FROM agreego.bot bot_3",
"      JOIN agreego.organization organization_4 ON organization_4.id = bot_3.id",
"      JOIN agreego.entity entity_5 ON entity_5.id = organization_4.id",
" WHERE NOT entity_5.archived AND entity_5.id = entity_2.id)",
" )",
" WHEN organization_1.type = 'person' THEN (",
" (SELECT jsonb_build_object(",
" 'archived', entity_8.archived,",
" 'created_at', entity_8.created_at,",
" 'first_name', person_6.first_name,",
" 'id', entity_8.id,",
" 'last_name', person_6.last_name,",
" 'name', organization_7.name,",
" 'type', entity_8.type",
" )",
" FROM agreego.person person_6",
" JOIN agreego.organization organization_7 ON organization_7.id = person_6.id",
" JOIN agreego.entity entity_8 ON entity_8.id = organization_7.id",
" WHERE NOT entity_8.archived AND entity_8.id = entity_2.id)",
" )",
" ELSE NULL",
" END",
"FROM agreego.organization organization_1",
"JOIN agreego.entity entity_2 ON entity_2.id = organization_1.id",
"WHERE NOT entity_2.archived)))"
]
]
}
},
{
"description": "Full organizations select via a punc response with family",
"action": "query",
"schema_id": "get_full_organization.response",
"expect": {
"success": true,
"sql": [
[
"(SELECT jsonb_strip_nulls((SELECT CASE",
" WHEN organization_1.type = 'person' THEN (",
" (SELECT jsonb_build_object(",
" 'addresses',",
" (SELECT COALESCE(jsonb_agg(jsonb_build_object(",
" 'archived', entity_8.archived,",
" 'created_at', entity_8.created_at,",
" 'id', entity_8.id,",
" 'is_primary', contact_6.is_primary,",
" 'target',",
" (SELECT jsonb_build_object(",
" 'archived', entity_10.archived,",
" 'city', address_9.city,",
" 'created_at', entity_10.created_at,",
" 'id', entity_10.id,",
" 'type', entity_10.type",
" )",
" FROM agreego.address address_9",
" JOIN agreego.entity entity_10 ON entity_10.id = address_9.id",
" WHERE",
" NOT entity_10.archived",
" AND relationship_7.target_id = entity_10.id),",
" 'type', entity_8.type",
" )), '[]'::jsonb)",
" FROM agreego.contact contact_6",
" JOIN agreego.relationship relationship_7 ON relationship_7.id = contact_6.id",
" JOIN agreego.entity entity_8 ON entity_8.id = relationship_7.id",
" WHERE",
" NOT entity_8.archived",
" AND relationship_7.target_type = 'address'",
" AND relationship_7.source_id = entity_5.id),",
" 'age', person_3.age,",
" 'archived', entity_5.archived,",
" 'contacts',",
" (SELECT COALESCE(jsonb_agg(jsonb_build_object(",
" 'archived', entity_13.archived,",
" 'created_at', entity_13.created_at,",
" 'id', entity_13.id,",
" 'is_primary', contact_11.is_primary,",
" 'target',",
" CASE",
" WHEN entity_13.target_type = 'address' THEN (",
" (SELECT jsonb_build_object(",
" 'archived', entity_15.archived,",
" 'city', address_14.city,",
" 'created_at', entity_15.created_at,",
" 'id', entity_15.id,",
" 'type', entity_15.type",
" )",
" FROM agreego.address address_14",
" JOIN agreego.entity entity_15 ON entity_15.id = address_14.id",
" WHERE",
" NOT entity_15.archived",
" AND relationship_12.target_id = entity_15.id)",
" )",
" WHEN entity_13.target_type = 'email_address' THEN (",
" (SELECT jsonb_build_object(",
" 'address', email_address_16.address,",
" 'archived', entity_17.archived,",
" 'created_at', entity_17.created_at,",
" 'id', entity_17.id,",
" 'type', entity_17.type",
" )",
" FROM agreego.email_address email_address_16",
" JOIN agreego.entity entity_17 ON entity_17.id = email_address_16.id",
" WHERE",
" NOT entity_17.archived",
" AND relationship_12.target_id = entity_17.id)",
" )",
" WHEN entity_13.target_type = 'phone_number' THEN (",
" (SELECT jsonb_build_object(",
" 'archived', entity_19.archived,",
" 'created_at', entity_19.created_at,",
" 'id', entity_19.id,",
" 'number', phone_number_18.number,",
" 'type', entity_19.type",
" )",
" FROM agreego.phone_number phone_number_18",
" JOIN agreego.entity entity_19 ON entity_19.id = phone_number_18.id",
" WHERE",
" NOT entity_19.archived",
" AND relationship_12.target_id = entity_19.id)",
" )",
" ELSE NULL",
" END,",
" 'type', entity_13.type",
" )), '[]'::jsonb)",
" FROM agreego.contact contact_11",
" JOIN agreego.relationship relationship_12 ON relationship_12.id = contact_11.id",
" JOIN agreego.entity entity_13 ON entity_13.id = relationship_12.id",
" WHERE",
" NOT entity_13.archived",
" AND relationship_12.source_id = entity_5.id),",
" 'created_at', entity_5.created_at,",
" 'email_addresses',",
" (SELECT COALESCE(jsonb_agg(jsonb_build_object(",
" 'archived', entity_22.archived,",
" 'created_at', entity_22.created_at,",
" 'id', entity_22.id,",
" 'is_primary', contact_20.is_primary,",
" 'target',",
" (SELECT jsonb_build_object(",
" 'address', email_address_23.address,",
" 'archived', entity_24.archived,",
" 'created_at', entity_24.created_at,",
" 'id', entity_24.id,",
" 'type', entity_24.type",
" )",
" FROM agreego.email_address email_address_23",
" JOIN agreego.entity entity_24 ON entity_24.id = email_address_23.id",
" WHERE",
" NOT entity_24.archived",
" AND relationship_21.target_id = entity_24.id),",
" 'type', entity_22.type",
" )), '[]'::jsonb)",
" FROM agreego.contact contact_20",
" JOIN agreego.relationship relationship_21 ON relationship_21.id = contact_20.id",
" JOIN agreego.entity entity_22 ON entity_22.id = relationship_21.id",
" WHERE",
" NOT entity_22.archived",
" AND relationship_21.target_type = 'email_address'",
" AND relationship_21.source_id = entity_5.id),",
" 'first_name', person_3.first_name,",
" 'id', entity_5.id,",
" 'last_name', person_3.last_name,",
" 'name', organization_4.name,",
" 'phone_numbers',",
" (SELECT COALESCE(jsonb_agg(jsonb_build_object(",
" 'archived', entity_27.archived,",
" 'created_at', entity_27.created_at,",
" 'id', entity_27.id,",
" 'is_primary', contact_25.is_primary,",
" 'target',",
" (SELECT jsonb_build_object(",
" 'archived', entity_29.archived,",
" 'created_at', entity_29.created_at,",
" 'id', entity_29.id,",
" 'number', phone_number_28.number,",
" 'type', entity_29.type",
" )",
" FROM agreego.phone_number phone_number_28",
" JOIN agreego.entity entity_29 ON entity_29.id = phone_number_28.id",
" WHERE",
" NOT entity_29.archived",
" AND relationship_26.target_id = entity_29.id),",
" 'type', entity_27.type",
" )), '[]'::jsonb)",
" FROM agreego.contact contact_25",
" JOIN agreego.relationship relationship_26 ON relationship_26.id = contact_25.id",
" JOIN agreego.entity entity_27 ON entity_27.id = relationship_26.id",
" WHERE",
" NOT entity_27.archived",
" AND relationship_26.target_type = 'phone_number'",
" AND relationship_26.source_id = entity_5.id),",
" 'type', entity_5.type",
" )",
" FROM agreego.person person_3",
" JOIN agreego.organization organization_4 ON organization_4.id = person_3.id",
" JOIN agreego.entity entity_5 ON entity_5.id = organization_4.id",
" WHERE NOT entity_5.archived AND entity_5.id = entity_2.id))",
" ELSE NULL",
"END",
"FROM agreego.organization organization_1",
"JOIN agreego.entity entity_2 ON entity_2.id = organization_1.id",
"WHERE NOT entity_2.archived)))"
]
]
}
@ -1629,6 +1946,44 @@
]
]
}
},
{
"description": "Widgets select via a punc response with family (STI)",
"action": "query",
"schema_id": "get_widgets.response",
"expect": {
"success": true,
"sql": [
[
"(SELECT jsonb_strip_nulls((SELECT COALESCE(jsonb_agg(",
" CASE",
" WHEN widget_1.kind = 'stock' THEN (",
" jsonb_build_object(",
" 'archived', entity_2.archived,",
" 'created_at', entity_2.created_at,",
" 'id', entity_2.id,",
" 'kind', widget_1.kind,",
" 'type', entity_2.type",
" )",
" )",
" WHEN widget_1.kind = 'tasks' THEN (",
" jsonb_build_object(",
" 'archived', entity_2.archived,",
" 'created_at', entity_2.created_at,",
" 'id', entity_2.id,",
" 'kind', widget_1.kind,",
" 'type', entity_2.type",
" )",
" )",
" ELSE NULL",
" END",
"), '[]'::jsonb)",
"FROM agreego.widget widget_1",
"JOIN agreego.entity entity_2 ON entity_2.id = widget_1.id",
"WHERE NOT entity_2.archived)))"
]
]
}
}
]
}

out.txt Normal file (+45)

File diff suppressed because one or more lines are too long


@ -4,6 +4,7 @@ pub mod executors;
pub mod formats;
pub mod page;
pub mod punc;
pub mod object;
pub mod relation;
pub mod schema;
pub mod r#type;
@ -23,7 +24,8 @@ use punc::Punc;
use relation::Relation;
use schema::Schema;
use serde_json::Value;
use std::collections::HashMap;
use std::sync::Arc;
use r#type::Type;
pub struct Database {
@ -31,9 +33,7 @@ pub struct Database {
pub types: HashMap<String, Type>,
pub puncs: HashMap<String, Punc>,
pub relations: HashMap<String, Relation>,
pub schemas: HashMap<String, Arc<Schema>>,
pub executor: Box<dyn DatabaseExecutor + Send + Sync>,
}
@ -45,8 +45,6 @@ impl Database {
relations: HashMap::new(),
puncs: HashMap::new(),
schemas: HashMap::new(),
#[cfg(not(test))]
executor: Box::new(SpiExecutor::new()),
#[cfg(test)]
@ -137,7 +135,7 @@ impl Database {
.clone() .clone()
.unwrap_or_else(|| format!("schema_{}", i)); .unwrap_or_else(|| format!("schema_{}", i));
schema.obj.id = Some(id.clone()); schema.obj.id = Some(id.clone());
db.schemas.insert(id, schema); db.schemas.insert(id, Arc::new(schema));
} }
Err(e) => { Err(e) => {
errors.push(crate::drop::Error { errors.push(crate::drop::Error {
@ -187,19 +185,21 @@ impl Database {
pub fn compile(&mut self, errors: &mut Vec<crate::drop::Error>) { pub fn compile(&mut self, errors: &mut Vec<crate::drop::Error>) {
let mut harvested = Vec::new(); let mut harvested = Vec::new();
for schema in self.schemas.values_mut() { for schema_arc in self.schemas.values_mut() {
schema.collect_schemas(None, &mut harvested, errors); if let Some(s) = std::sync::Arc::get_mut(schema_arc) {
s.collect_schemas(None, &mut harvested, errors);
}
}
for (id, schema) in harvested {
self.schemas.insert(id, Arc::new(schema));
} }
self.schemas.extend(harvested);
self.collect_schemas(errors); self.collect_schemas(errors);
self.collect_depths();
self.collect_descendants();
// Mathematically evaluate all property inheritances, formats, schemas, and foreign key edges topographically over OnceLocks // Mathematically evaluate all property inheritances, formats, schemas, and foreign key edges topographically over OnceLocks
let mut visited = std::collections::HashSet::new(); let mut visited = std::collections::HashSet::new();
for schema in self.schemas.values() { for schema_arc in self.schemas.values() {
schema.compile(self, &mut visited, errors); schema_arc.as_ref().compile(self, &mut visited, errors);
} }
} }
@@ -225,74 +225,185 @@ impl Database {
         }
         for (id, schema) in to_insert {
-            self.schemas.insert(id, schema);
+            self.schemas.insert(id, Arc::new(schema));
         }
     }

-    fn collect_depths(&mut self) {
-        let mut depths: HashMap<String, usize> = HashMap::new();
-        let schema_ids: Vec<String> = self.schemas.keys().cloned().collect();
-        for id in schema_ids {
-            let mut current_id = id.clone();
-            let mut depth = 0;
-            let mut visited = HashSet::new();
-            while let Some(schema) = self.schemas.get(&current_id) {
-                if !visited.insert(current_id.clone()) {
-                    break; // Cycle detected
-                }
-                if let Some(crate::database::schema::SchemaTypeOrArray::Single(t)) = &schema.obj.type_ {
-                    if !crate::database::schema::is_primitive_type(t) {
-                        current_id = t.clone();
-                        depth += 1;
-                        continue;
-                    }
-                }
-                break;
-            }
-            depths.insert(id, depth);
-        }
-        self.depths = depths;
-    }
-
-    fn collect_descendants(&mut self) {
-        let mut direct_refs: HashMap<String, Vec<String>> = HashMap::new();
-        for (id, schema) in &self.schemas {
-            if let Some(crate::database::schema::SchemaTypeOrArray::Single(t)) = &schema.obj.type_ {
-                if !crate::database::schema::is_primitive_type(t) {
-                    direct_refs
-                        .entry(t.clone())
-                        .or_default()
-                        .push(id.clone());
-                }
-            }
-        }
-        // Cache exhaustive descendants matrix for generic $family string lookups natively
-        let mut descendants = HashMap::new();
-        for id in self.schemas.keys() {
-            let mut desc_set = HashSet::new();
-            Self::collect_descendants_recursively(id, &direct_refs, &mut desc_set);
-            let mut desc_vec: Vec<String> = desc_set.into_iter().collect();
-            desc_vec.sort();
-            descendants.insert(id.clone(), desc_vec);
-        }
-        self.descendants = descendants;
-    }
-
-    fn collect_descendants_recursively(
-        target: &str,
-        direct_refs: &std::collections::HashMap<String, Vec<String>>,
-        descendants: &mut std::collections::HashSet<String>,
-    ) {
-        if let Some(children) = direct_refs.get(target) {
-            for child in children {
-                if descendants.insert(child.clone()) {
-                    Self::collect_descendants_recursively(child, direct_refs, descendants);
-                }
-            }
-        }
-    }
+    /// Inspects the Postgres pg_constraint relations catalog to securely identify
+    /// the precise Foreign Key connecting a parent and child hierarchy path.
+    pub fn resolve_relation<'a>(
+        &'a self,
+        parent_type: &str,
+        child_type: &str,
+        prop_name: &str,
+        relative_keys: Option<&Vec<String>>,
+        is_array: bool,
+        schema_id: Option<&str>,
+        path: &str,
+        errors: &mut Vec<crate::drop::Error>,
+    ) -> Option<(&'a crate::database::relation::Relation, bool)> {
+        // Enforce graph locality by ensuring we don't accidentally crawl to pure structural entity boundaries
+        if parent_type == "entity" && child_type == "entity" {
+            return None;
+        }
+        let p_def = self.types.get(parent_type)?;
+        let c_def = self.types.get(child_type)?;
+        let mut matching_rels = Vec::new();
+        let mut directions = Vec::new();
+        // Scour the complete catalog for any Edge matching the inheritance scope of the two objects.
+        // This automatically binds polymorphic structures (e.g. recognizing a relationship targeting User
+        // also natively binds instances specifically typed as Person).
+        let mut all_rels: Vec<&crate::database::relation::Relation> = self.relations.values().collect();
+        all_rels.sort_by(|a, b| a.constraint.cmp(&b.constraint));
+        for rel in all_rels {
+            let mut is_forward =
+                p_def.hierarchy.contains(&rel.source_type) && c_def.hierarchy.contains(&rel.destination_type);
+            let is_reverse =
+                p_def.hierarchy.contains(&rel.destination_type) && c_def.hierarchy.contains(&rel.source_type);
+            // Structural Cardinality Filtration:
+            // If the schema requires a collection (Array), it is mathematically impossible for a pure
+            // Forward scalar edge (where the parent holds exactly one UUID pointer) to fulfill a One-to-Many request.
+            // Thus, if it's an array, we fully reject pure Forward edges and only accept Reverse edges (or Junction edges).
+            if is_array && is_forward && !is_reverse {
+                is_forward = false;
+            }
+            if is_forward {
+                matching_rels.push(rel);
+                directions.push(true);
+            } else if is_reverse {
+                matching_rels.push(rel);
+                directions.push(false);
+            }
+        }
+        // Abort relation discovery early if no hierarchical inheritance match was found
+        if matching_rels.is_empty() {
+            let mut details = crate::drop::ErrorDetails {
+                path: path.to_string(),
+                ..Default::default()
+            };
+            if let Some(sid) = schema_id {
+                details.schema = Some(sid.to_string());
+            }
+            errors.push(crate::drop::Error {
+                code: "EDGE_MISSING".to_string(),
+                message: format!(
+                    "No database relation exists between '{}' and '{}' for property '{}'",
+                    parent_type, child_type, prop_name
+                ),
+                details,
+            });
+            return None;
+        }
+        // Ideal State: The objects only share a solitary structural relation, resolving ambiguity instantly.
+        if matching_rels.len() == 1 {
+            return Some((matching_rels[0], directions[0]));
+        }
+        let mut chosen_idx = 0;
+        let mut resolved = false;
+        // Exact Prefix Disambiguation: Determine if the database specifically names this constraint
+        // directly mapping to the JSON Schema property name (e.g., `fk_{child}_{property_name}`)
+        for (i, rel) in matching_rels.iter().enumerate() {
+            if let Some(prefix) = &rel.prefix {
+                if prop_name.starts_with(prefix)
+                    || prefix.starts_with(prop_name)
+                    || prefix.replace("_", "") == prop_name.replace("_", "")
+                {
+                    chosen_idx = i;
+                    resolved = true;
+                    break;
+                }
+            }
+        }
+        // Complex Subgraph Resolution: The database contains multiple equally explicit foreign key constraints
+        // linking these objects (such as pointing to `source` and `target` in Many-to-Many junction models).
+        if !resolved && relative_keys.is_some() {
+            // Twin Deduction Pass 1: We inspect the exact properties structurally defined inside the compiled payload
+            // to observe which explicit relation arrow the child payload natively consumes.
+            let keys = relative_keys.unwrap();
+            let mut consumed_rel_idx = None;
+            for (i, rel) in matching_rels.iter().enumerate() {
+                if let Some(prefix) = &rel.prefix {
+                    if keys.contains(prefix) {
+                        consumed_rel_idx = Some(i);
+                        break; // Found the routing edge explicitly consumed by the schema payload
+                    }
+                }
+            }
+            // Twin Deduction Pass 2: Knowing which arrow points outbound, we can mathematically deduce its twin
+            // providing the reverse ownership on the same junction boundary must be the incoming Edge to the parent.
+            if let Some(used_idx) = consumed_rel_idx {
+                let used_rel = matching_rels[used_idx];
+                let mut twin_ids = Vec::new();
+                for (i, rel) in matching_rels.iter().enumerate() {
+                    if i != used_idx
+                        && rel.source_type == used_rel.source_type
+                        && rel.destination_type == used_rel.destination_type
+                        && rel.prefix.is_some()
+                    {
+                        twin_ids.push(i);
+                    }
+                }
+                if twin_ids.len() == 1 {
+                    chosen_idx = twin_ids[0];
+                    resolved = true;
+                }
+            }
+        }
+        // Implicit Base Fallback: If no complex explicit paths resolve, but exactly one relation
+        // sits entirely naked (without a constraint prefix), it must be the core structural parent ownership.
+        if !resolved {
+            let mut null_prefix_ids = Vec::new();
+            for (i, rel) in matching_rels.iter().enumerate() {
+                if rel.prefix.is_none() {
+                    null_prefix_ids.push(i);
+                }
+            }
+            if null_prefix_ids.len() == 1 {
+                chosen_idx = null_prefix_ids[0];
+                resolved = true;
+            }
+        }
+        // If we exhausted all mathematical deduction pathways and STILL cannot isolate a single edge,
+        // we must abort rather than silently guessing. Returning None prevents arbitrary SQL generation
+        // and forces a clean structural error for the architect.
+        if !resolved {
+            let mut details = crate::drop::ErrorDetails {
+                path: path.to_string(),
+                context: serde_json::to_value(&matching_rels).ok(),
+                cause: Some("Multiple conflicting constraints found matching prefixes".to_string()),
+                ..Default::default()
+            };
+            if let Some(sid) = schema_id {
+                details.schema = Some(sid.to_string());
+            }
+            errors.push(crate::drop::Error {
+                code: "AMBIGUOUS_TYPE_RELATIONS".to_string(),
+                message: format!(
+                    "Ambiguous database relation between '{}' and '{}' for property '{}'",
+                    parent_type, child_type, prop_name
+                ),
+                details,
+            });
+            return None;
+        }
+        Some((matching_rels[chosen_idx], directions[chosen_idx]))
+    }
 }
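The "Exact Prefix Disambiguation" pass in `resolve_relation` matches a constraint prefix against the JSON Schema property name by prefix containment in either direction, or by an underscore-insensitive comparison. A standalone sketch of just that predicate (the helper name is illustrative, not the project's API):

```rust
// Mirrors the prefix test used by resolve_relation's disambiguation pass.
fn prefix_matches(prop_name: &str, prefix: &str) -> bool {
    prop_name.starts_with(prefix)
        || prefix.starts_with(prop_name)
        || prefix.replace('_', "") == prop_name.replace('_', "")
}

fn main() {
    // A constraint prefixed `source` claims the `source_id` property.
    assert!(prefix_matches("source_id", "source"));
    // Underscore-insensitive: `created_by` matches a `createdby` prefix.
    assert!(prefix_matches("created_by", "createdby"));
    // An unrelated prefix is rejected, leaving the relation ambiguous
    // and falling through to the Twin Deduction passes.
    assert!(!prefix_matches("target", "source"));
}
```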

src/database/object.rs (new file, 367 lines)

@ -0,0 +1,367 @@
use serde::{Deserialize, Serialize};
use serde_json::Value;
use std::collections::BTreeMap;
use std::sync::Arc;
use std::sync::OnceLock;
use crate::database::schema::Schema;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Case {
#[serde(skip_serializing_if = "Option::is_none")]
pub when: Option<Arc<Schema>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub then: Option<Arc<Schema>>,
#[serde(rename = "else")]
#[serde(skip_serializing_if = "Option::is_none")]
pub else_: Option<Arc<Schema>>,
}
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct SchemaObject {
// Core Schema Keywords
#[serde(rename = "$id")]
#[serde(skip_serializing_if = "Option::is_none")]
pub id: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub description: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub title: Option<String>,
#[serde(default)] // Allow missing type
#[serde(rename = "type")]
#[serde(skip_serializing_if = "Option::is_none")]
pub type_: Option<SchemaTypeOrArray>, // Handles string or array of strings
// Object Keywords
#[serde(skip_serializing_if = "Option::is_none")]
pub properties: Option<BTreeMap<String, Arc<Schema>>>,
#[serde(rename = "patternProperties")]
#[serde(skip_serializing_if = "Option::is_none")]
pub pattern_properties: Option<BTreeMap<String, Arc<Schema>>>,
#[serde(rename = "additionalProperties")]
#[serde(skip_serializing_if = "Option::is_none")]
pub additional_properties: Option<Arc<Schema>>,
#[serde(rename = "$family")]
#[serde(skip_serializing_if = "Option::is_none")]
pub family: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub required: Option<Vec<String>>,
// dependencies can be schema dependencies or property dependencies
#[serde(skip_serializing_if = "Option::is_none")]
pub dependencies: Option<BTreeMap<String, Dependency>>,
// Array Keywords
#[serde(rename = "items")]
#[serde(skip_serializing_if = "Option::is_none")]
pub items: Option<Arc<Schema>>,
#[serde(rename = "prefixItems")]
#[serde(skip_serializing_if = "Option::is_none")]
pub prefix_items: Option<Vec<Arc<Schema>>>,
// String Validation
#[serde(rename = "minLength")]
#[serde(skip_serializing_if = "Option::is_none")]
pub min_length: Option<f64>,
#[serde(rename = "maxLength")]
#[serde(skip_serializing_if = "Option::is_none")]
pub max_length: Option<f64>,
#[serde(skip_serializing_if = "Option::is_none")]
pub pattern: Option<String>,
// Array Validation
#[serde(rename = "minItems")]
#[serde(skip_serializing_if = "Option::is_none")]
pub min_items: Option<f64>,
#[serde(rename = "maxItems")]
#[serde(skip_serializing_if = "Option::is_none")]
pub max_items: Option<f64>,
#[serde(rename = "uniqueItems")]
#[serde(skip_serializing_if = "Option::is_none")]
pub unique_items: Option<bool>,
#[serde(rename = "contains")]
#[serde(skip_serializing_if = "Option::is_none")]
pub contains: Option<Arc<Schema>>,
#[serde(rename = "minContains")]
#[serde(skip_serializing_if = "Option::is_none")]
pub min_contains: Option<f64>,
#[serde(rename = "maxContains")]
#[serde(skip_serializing_if = "Option::is_none")]
pub max_contains: Option<f64>,
// Object Validation
#[serde(rename = "minProperties")]
#[serde(skip_serializing_if = "Option::is_none")]
pub min_properties: Option<f64>,
#[serde(rename = "maxProperties")]
#[serde(skip_serializing_if = "Option::is_none")]
pub max_properties: Option<f64>,
#[serde(rename = "propertyNames")]
#[serde(skip_serializing_if = "Option::is_none")]
pub property_names: Option<Arc<Schema>>,
// Format, Enum & Const Validation
#[serde(skip_serializing_if = "Option::is_none")]
pub format: Option<String>,
#[serde(rename = "enum")]
#[serde(skip_serializing_if = "Option::is_none")]
pub enum_: Option<Vec<Value>>, // `enum` is a reserved keyword in Rust
#[serde(
default,
rename = "const",
deserialize_with = "crate::database::object::deserialize_some"
)]
#[serde(skip_serializing_if = "Option::is_none")]
pub const_: Option<Value>,
// Numeric Validation
#[serde(rename = "multipleOf")]
#[serde(skip_serializing_if = "Option::is_none")]
pub multiple_of: Option<f64>,
#[serde(skip_serializing_if = "Option::is_none")]
pub minimum: Option<f64>,
#[serde(skip_serializing_if = "Option::is_none")]
pub maximum: Option<f64>,
#[serde(rename = "exclusiveMinimum")]
#[serde(skip_serializing_if = "Option::is_none")]
pub exclusive_minimum: Option<f64>,
#[serde(rename = "exclusiveMaximum")]
#[serde(skip_serializing_if = "Option::is_none")]
pub exclusive_maximum: Option<f64>,
// Combining Keywords
#[serde(skip_serializing_if = "Option::is_none")]
pub cases: Option<Vec<Case>>,
#[serde(rename = "oneOf")]
#[serde(skip_serializing_if = "Option::is_none")]
pub one_of: Option<Vec<Arc<Schema>>>,
#[serde(rename = "not")]
#[serde(skip_serializing_if = "Option::is_none")]
pub not: Option<Arc<Schema>>,
// Custom Vocabularies
#[serde(skip_serializing_if = "Option::is_none")]
pub form: Option<Vec<String>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub display: Option<Vec<String>>,
#[serde(rename = "enumNames")]
#[serde(skip_serializing_if = "Option::is_none")]
pub enum_names: Option<Vec<String>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub control: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub actions: Option<BTreeMap<String, Action>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub computer: Option<String>,
#[serde(default)]
#[serde(skip_serializing_if = "Option::is_none")]
pub extensible: Option<bool>,
#[serde(rename = "compiledProperties")]
#[serde(skip_deserializing)]
#[serde(skip_serializing_if = "crate::database::object::is_once_lock_vec_empty")]
#[serde(serialize_with = "crate::database::object::serialize_once_lock")]
pub compiled_property_names: OnceLock<Vec<String>>,
#[serde(skip)]
pub compiled_properties: OnceLock<BTreeMap<String, Arc<Schema>>>,
#[serde(rename = "compiledDiscriminator")]
#[serde(skip_deserializing)]
#[serde(skip_serializing_if = "crate::database::object::is_once_lock_string_empty")]
#[serde(serialize_with = "crate::database::object::serialize_once_lock")]
pub compiled_discriminator: OnceLock<String>,
#[serde(rename = "compiledOptions")]
#[serde(skip_deserializing)]
#[serde(skip_serializing_if = "crate::database::object::is_once_lock_map_empty")]
#[serde(serialize_with = "crate::database::object::serialize_once_lock")]
pub compiled_options: OnceLock<BTreeMap<String, String>>,
#[serde(rename = "compiledEdges")]
#[serde(skip_deserializing)]
#[serde(skip_serializing_if = "crate::database::object::is_once_lock_map_empty")]
#[serde(serialize_with = "crate::database::object::serialize_once_lock")]
pub compiled_edges: OnceLock<BTreeMap<String, crate::database::edge::Edge>>,
#[serde(skip)]
pub compiled_format: OnceLock<CompiledFormat>,
#[serde(skip)]
pub compiled_pattern: OnceLock<CompiledRegex>,
#[serde(skip)]
pub compiled_pattern_properties: OnceLock<Vec<(CompiledRegex, Arc<Schema>)>>,
}
/// Represents a compiled format validator
#[derive(Clone)]
pub enum CompiledFormat {
Func(fn(&serde_json::Value) -> Result<(), Box<dyn std::error::Error + Send + Sync>>),
Regex(regex::Regex),
}
impl std::fmt::Debug for CompiledFormat {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
CompiledFormat::Func(_) => write!(f, "CompiledFormat::Func(...)"),
CompiledFormat::Regex(r) => write!(f, "CompiledFormat::Regex({:?})", r),
}
}
}
/// A wrapper for compiled regex patterns
#[derive(Debug, Clone)]
pub struct CompiledRegex(pub regex::Regex);
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(untagged)]
pub enum SchemaTypeOrArray {
Single(String),
Multiple(Vec<String>),
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Action {
#[serde(skip_serializing_if = "Option::is_none")]
pub navigate: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub punc: Option<String>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(untagged)]
pub enum Dependency {
Props(Vec<String>),
Schema(Arc<Schema>),
}
pub fn serialize_once_lock<T: serde::Serialize, S: serde::Serializer>(
lock: &OnceLock<T>,
serializer: S,
) -> Result<S::Ok, S::Error> {
if let Some(val) = lock.get() {
val.serialize(serializer)
} else {
serializer.serialize_none()
}
}
pub fn is_once_lock_map_empty<K, V>(lock: &OnceLock<std::collections::BTreeMap<K, V>>) -> bool {
lock.get().map_or(true, |m| m.is_empty())
}
pub fn is_once_lock_vec_empty<T>(lock: &OnceLock<Vec<T>>) -> bool {
lock.get().map_or(true, |v| v.is_empty())
}
pub fn is_once_lock_string_empty(lock: &OnceLock<String>) -> bool {
lock.get().map_or(true, |s| s.is_empty())
}
// Schema mirrors the Go Punc Generator's schema struct for consistency.
// It is an order-preserving representation of a JSON Schema.
pub fn deserialize_some<'de, D>(deserializer: D) -> Result<Option<Value>, D::Error>
where
D: serde::Deserializer<'de>,
{
let v = Value::deserialize(deserializer)?;
Ok(Some(v))
}
pub fn is_primitive_type(t: &str) -> bool {
matches!(
t,
"string" | "number" | "integer" | "boolean" | "object" | "array" | "null"
)
}
impl SchemaObject {
pub fn identifier(&self) -> Option<String> {
if let Some(id) = &self.id {
return Some(id.split('.').next_back().unwrap_or("").to_string());
}
if let Some(SchemaTypeOrArray::Single(t)) = &self.type_ {
if !is_primitive_type(t) {
return Some(t.split('.').next_back().unwrap_or("").to_string());
}
}
None
}
pub fn get_discriminator_value(&self, dim: &str) -> Option<String> {
let is_split = self
.compiled_properties
.get()
.map_or(false, |p| p.contains_key("kind"));
if let Some(id) = &self.id {
if id.contains("light.person") || id.contains("light.organization") {
println!(
"[DEBUG SPLIT] ID: {}, dim: {}, is_split: {:?}, props: {:?}",
id,
dim,
is_split,
self
.compiled_properties
.get()
.map(|p| p.keys().cloned().collect::<Vec<_>>())
);
}
}
if let Some(props) = self.compiled_properties.get() {
if let Some(prop_schema) = props.get(dim) {
if let Some(c) = &prop_schema.obj.const_ {
if let Some(s) = c.as_str() {
return Some(s.to_string());
}
}
if let Some(e) = &prop_schema.obj.enum_ {
if e.len() == 1 {
if let Some(s) = e[0].as_str() {
return Some(s.to_string());
}
}
}
}
}
if dim == "kind" {
if let Some(id) = &self.id {
let base = id.split('/').last().unwrap_or(id);
if let Some(idx) = base.rfind('.') {
return Some(base[..idx].to_string());
}
}
if let Some(SchemaTypeOrArray::Single(t)) = &self.type_ {
if !is_primitive_type(t) {
let base = t.split('/').last().unwrap_or(t);
if let Some(idx) = base.rfind('.') {
return Some(base[..idx].to_string());
}
}
}
}
if dim == "type" {
if let Some(id) = &self.id {
let base = id.split('/').last().unwrap_or(id);
if is_split {
return Some(base.split('.').next_back().unwrap_or(base).to_string());
} else {
return Some(base.to_string());
}
}
if let Some(SchemaTypeOrArray::Single(t)) = &self.type_ {
if !is_primitive_type(t) {
let base = t.split('/').last().unwrap_or(t);
if is_split {
return Some(base.split('.').next_back().unwrap_or(base).to_string());
} else {
return Some(base.to_string());
}
}
}
}
None
}
}
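When no explicit `const`/`enum` discriminator property exists, `get_discriminator_value` above derives the `kind` and `type` discriminators purely from the schema's dotted id. The string manipulation can be sketched in isolation (helper names are illustrative, assuming ids like `light.person`):

```rust
// For dim == "kind": everything before the last dot of the id's final path segment.
fn kind_from_id(id: &str) -> Option<String> {
    let base = id.split('/').next_back().unwrap_or(id);
    base.rfind('.').map(|idx| base[..idx].to_string())
}

// For dim == "type": the last dot segment when the schema is "split"
// (its compiled properties carry a `kind` field), otherwise the full id.
fn type_from_id(id: &str, is_split: bool) -> String {
    let base = id.split('/').next_back().unwrap_or(id);
    if is_split {
        base.split('.').next_back().unwrap_or(base).to_string()
    } else {
        base.to_string()
    }
}

fn main() {
    assert_eq!(kind_from_id("light.person").as_deref(), Some("light"));
    assert_eq!(type_from_id("light.person", true), "person");
    assert_eq!(type_from_id("light.person", false), "light.person");
}
```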


@@ -1,240 +1,7 @@
 use serde::{Deserialize, Serialize};
 use serde_json::Value;
-use std::collections::BTreeMap;
 use std::sync::Arc;
-use std::sync::OnceLock;
+use crate::database::object::*;
pub fn serialize_once_lock<T: serde::Serialize, S: serde::Serializer>(
lock: &OnceLock<T>,
serializer: S,
) -> Result<S::Ok, S::Error> {
if let Some(val) = lock.get() {
val.serialize(serializer)
} else {
serializer.serialize_none()
}
}
pub fn is_once_lock_map_empty<K, V>(lock: &OnceLock<std::collections::BTreeMap<K, V>>) -> bool {
lock.get().map_or(true, |m| m.is_empty())
}
pub fn is_once_lock_vec_empty<T>(lock: &OnceLock<Vec<T>>) -> bool {
lock.get().map_or(true, |v| v.is_empty())
}
// Schema mirrors the Go Punc Generator's schema struct for consistency.
// It is an order-preserving representation of a JSON Schema.
pub fn deserialize_some<'de, D>(deserializer: D) -> Result<Option<Value>, D::Error>
where
D: serde::Deserializer<'de>,
{
let v = Value::deserialize(deserializer)?;
Ok(Some(v))
}
pub fn is_primitive_type(t: &str) -> bool {
matches!(
t,
"string" | "number" | "integer" | "boolean" | "object" | "array" | "null"
)
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Case {
#[serde(skip_serializing_if = "Option::is_none")]
pub when: Option<Arc<Schema>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub then: Option<Arc<Schema>>,
#[serde(rename = "else")]
#[serde(skip_serializing_if = "Option::is_none")]
pub else_: Option<Arc<Schema>>,
}
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct SchemaObject {
// Core Schema Keywords
#[serde(rename = "$id")]
#[serde(skip_serializing_if = "Option::is_none")]
pub id: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub description: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub title: Option<String>,
#[serde(default)] // Allow missing type
#[serde(rename = "type")]
#[serde(skip_serializing_if = "Option::is_none")]
pub type_: Option<SchemaTypeOrArray>, // Handles string or array of strings
// Object Keywords
#[serde(skip_serializing_if = "Option::is_none")]
pub properties: Option<BTreeMap<String, Arc<Schema>>>,
#[serde(rename = "patternProperties")]
#[serde(skip_serializing_if = "Option::is_none")]
pub pattern_properties: Option<BTreeMap<String, Arc<Schema>>>,
#[serde(rename = "additionalProperties")]
#[serde(skip_serializing_if = "Option::is_none")]
pub additional_properties: Option<Arc<Schema>>,
#[serde(rename = "$family")]
#[serde(skip_serializing_if = "Option::is_none")]
pub family: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub required: Option<Vec<String>>,
// dependencies can be schema dependencies or property dependencies
#[serde(skip_serializing_if = "Option::is_none")]
pub dependencies: Option<BTreeMap<String, Dependency>>,
// Array Keywords
#[serde(rename = "items")]
#[serde(skip_serializing_if = "Option::is_none")]
pub items: Option<Arc<Schema>>,
#[serde(rename = "prefixItems")]
#[serde(skip_serializing_if = "Option::is_none")]
pub prefix_items: Option<Vec<Arc<Schema>>>,
// String Validation
#[serde(rename = "minLength")]
#[serde(skip_serializing_if = "Option::is_none")]
pub min_length: Option<f64>,
#[serde(rename = "maxLength")]
#[serde(skip_serializing_if = "Option::is_none")]
pub max_length: Option<f64>,
#[serde(skip_serializing_if = "Option::is_none")]
pub pattern: Option<String>,
// Array Validation
#[serde(rename = "minItems")]
#[serde(skip_serializing_if = "Option::is_none")]
pub min_items: Option<f64>,
#[serde(rename = "maxItems")]
#[serde(skip_serializing_if = "Option::is_none")]
pub max_items: Option<f64>,
#[serde(rename = "uniqueItems")]
#[serde(skip_serializing_if = "Option::is_none")]
pub unique_items: Option<bool>,
#[serde(rename = "contains")]
#[serde(skip_serializing_if = "Option::is_none")]
pub contains: Option<Arc<Schema>>,
#[serde(rename = "minContains")]
#[serde(skip_serializing_if = "Option::is_none")]
pub min_contains: Option<f64>,
#[serde(rename = "maxContains")]
#[serde(skip_serializing_if = "Option::is_none")]
pub max_contains: Option<f64>,
// Object Validation
#[serde(rename = "minProperties")]
#[serde(skip_serializing_if = "Option::is_none")]
pub min_properties: Option<f64>,
#[serde(rename = "maxProperties")]
#[serde(skip_serializing_if = "Option::is_none")]
pub max_properties: Option<f64>,
#[serde(rename = "propertyNames")]
#[serde(skip_serializing_if = "Option::is_none")]
pub property_names: Option<Arc<Schema>>,
// Numeric Validation
#[serde(skip_serializing_if = "Option::is_none")]
pub format: Option<String>,
#[serde(rename = "enum")]
#[serde(skip_serializing_if = "Option::is_none")]
pub enum_: Option<Vec<Value>>, // `enum` is a reserved keyword in Rust
#[serde(
default,
rename = "const",
deserialize_with = "crate::database::schema::deserialize_some"
)]
#[serde(skip_serializing_if = "Option::is_none")]
pub const_: Option<Value>,
// Numeric Validation
#[serde(rename = "multipleOf")]
#[serde(skip_serializing_if = "Option::is_none")]
pub multiple_of: Option<f64>,
#[serde(skip_serializing_if = "Option::is_none")]
pub minimum: Option<f64>,
#[serde(skip_serializing_if = "Option::is_none")]
pub maximum: Option<f64>,
#[serde(rename = "exclusiveMinimum")]
#[serde(skip_serializing_if = "Option::is_none")]
pub exclusive_minimum: Option<f64>,
#[serde(rename = "exclusiveMaximum")]
#[serde(skip_serializing_if = "Option::is_none")]
pub exclusive_maximum: Option<f64>,
// Combining Keywords
#[serde(skip_serializing_if = "Option::is_none")]
pub cases: Option<Vec<Case>>,
#[serde(rename = "oneOf")]
#[serde(skip_serializing_if = "Option::is_none")]
pub one_of: Option<Vec<Arc<Schema>>>,
#[serde(rename = "not")]
#[serde(skip_serializing_if = "Option::is_none")]
pub not: Option<Arc<Schema>>,
// Custom Vocabularies
#[serde(skip_serializing_if = "Option::is_none")]
pub form: Option<Vec<String>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub display: Option<Vec<String>>,
#[serde(rename = "enumNames")]
#[serde(skip_serializing_if = "Option::is_none")]
pub enum_names: Option<Vec<String>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub control: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub actions: Option<BTreeMap<String, Action>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub computer: Option<String>,
#[serde(default)]
#[serde(skip_serializing_if = "Option::is_none")]
pub extensible: Option<bool>,
#[serde(rename = "compiledProperties")]
#[serde(skip_deserializing)]
#[serde(skip_serializing_if = "crate::database::schema::is_once_lock_vec_empty")]
#[serde(serialize_with = "crate::database::schema::serialize_once_lock")]
pub compiled_property_names: OnceLock<Vec<String>>,
#[serde(skip)]
pub compiled_properties: OnceLock<BTreeMap<String, Arc<Schema>>>,
#[serde(rename = "compiledEdges")]
#[serde(skip_deserializing)]
#[serde(skip_serializing_if = "crate::database::schema::is_once_lock_map_empty")]
#[serde(serialize_with = "crate::database::schema::serialize_once_lock")]
pub compiled_edges: OnceLock<BTreeMap<String, crate::database::edge::Edge>>,
#[serde(skip)]
pub compiled_format: OnceLock<CompiledFormat>,
#[serde(skip)]
pub compiled_pattern: OnceLock<CompiledRegex>,
#[serde(skip)]
pub compiled_pattern_properties: OnceLock<Vec<(CompiledRegex, Arc<Schema>)>>,
}
/// Represents a compiled format validator
#[derive(Clone)]
pub enum CompiledFormat {
Func(fn(&serde_json::Value) -> Result<(), Box<dyn std::error::Error + Send + Sync>>),
Regex(regex::Regex),
}
impl std::fmt::Debug for CompiledFormat {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
CompiledFormat::Func(_) => write!(f, "CompiledFormat::Func(...)"),
CompiledFormat::Regex(r) => write!(f, "CompiledFormat::Regex({:?})", r),
}
}
}
/// A wrapper for compiled regex patterns
#[derive(Debug, Clone)]
pub struct CompiledRegex(pub regex::Regex);
 #[derive(Debug, Clone, Serialize, Default)]
 pub struct Schema {
     #[serde(flatten)]
@@ -277,7 +44,7 @@ impl Schema {
     let _ = self
         .obj
         .compiled_format
-        .set(crate::database::schema::CompiledFormat::Func(fmt.func));
+        .set(crate::database::object::CompiledFormat::Func(fmt.func));
     }
 }
@@ -286,7 +53,7 @@ impl Schema {
     let _ = self
         .obj
         .compiled_pattern
-        .set(crate::database::schema::CompiledRegex(re));
+        .set(crate::database::object::CompiledRegex(re));
     }
 }
@@ -294,7 +61,7 @@ impl Schema {
     let mut compiled = Vec::new();
     for (k, v) in pattern_props {
         if let Ok(re) = regex::Regex::new(k) {
-            compiled.push((crate::database::schema::CompiledRegex(re), v.clone()));
+            compiled.push((crate::database::object::CompiledRegex(re), v.clone()));
         }
     }
     if !compiled.is_empty() {
@@ -305,10 +72,10 @@ impl Schema {
     let mut props = std::collections::BTreeMap::new();
     // 1. Resolve INHERITANCE dependencies first
-    if let Some(crate::database::schema::SchemaTypeOrArray::Single(t)) = &self.obj.type_ {
-        if !crate::database::schema::is_primitive_type(t) {
+    if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &self.obj.type_ {
+        if !crate::database::object::is_primitive_type(t) {
             if let Some(parent) = db.schemas.get(t) {
-                parent.compile(db, visited, errors);
+                parent.as_ref().compile(db, visited, errors);
                 if let Some(p_props) = parent.obj.compiled_properties.get() {
                     props.extend(p_props.clone());
                 }
@@ -316,10 +83,10 @@ impl Schema {
         }
     }
-    if let Some(crate::database::schema::SchemaTypeOrArray::Multiple(types)) = &self.obj.type_ {
+    if let Some(crate::database::object::SchemaTypeOrArray::Multiple(types)) = &self.obj.type_ {
         let mut custom_type_count = 0;
         for t in types {
-            if !crate::database::schema::is_primitive_type(t) {
+            if !crate::database::object::is_primitive_type(t) {
                 custom_type_count += 1;
             }
         }
@@ -339,9 +106,9 @@ impl Schema {
     }
     for t in types {
-        if !crate::database::schema::is_primitive_type(t) {
+        if !crate::database::object::is_primitive_type(t) {
             if let Some(parent) = db.schemas.get(t) {
-                parent.compile(db, visited, errors);
+                parent.as_ref().compile(db, visited, errors);
             }
         }
     }
@@ -411,11 +178,220 @@ impl Schema {
         }
     }
+    self.compile_polymorphism(db, errors);
     if let Some(id) = &self.obj.id {
         visited.remove(id);
     }
 }
/// Dynamically infers and compiles all structural database relationships between this Schema
/// and its nested children. This functions recursively traverses the JSON Schema abstract syntax
/// tree, identifies physical PostgreSQL table boundaries, and locks the resulting relation
/// constraint paths directly onto the `compiled_edges` map in O(1) memory.
pub fn compile_edges(
&self,
db: &crate::database::Database,
visited: &mut std::collections::HashSet<String>,
props: &std::collections::BTreeMap<String, std::sync::Arc<Schema>>,
errors: &mut Vec<crate::drop::Error>,
) -> std::collections::BTreeMap<String, crate::database::edge::Edge> {
let mut schema_edges = std::collections::BTreeMap::new();
// Determine the physical Database Table Name this schema structurally represents
// Plucks the polymorphic discriminator via dot-notation (e.g. extracting "person" from "full.person")
let mut parent_type_name = None;
if let Some(family) = &self.obj.family {
parent_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
} else if let Some(identifier) = self.obj.identifier() {
parent_type_name = Some(
identifier
.split('.')
.next_back()
.unwrap_or(&identifier)
.to_string(),
);
}
if let Some(p_type) = parent_type_name {
// Proceed only if the resolved table physically exists within the Postgres Type hierarchy
if db.types.contains_key(&p_type) {
// Iterate over all discovered schema boundaries mapped inside the object
for (prop_name, prop_schema) in props {
let mut child_type_name = None;
let mut target_schema = prop_schema.clone();
let mut is_array = false;
// Structurally unpack the inner target entity if the object maps to an array list
if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) =
&prop_schema.obj.type_
{
if t == "array" {
is_array = true;
if let Some(items) = &prop_schema.obj.items {
target_schema = items.clone();
}
}
}
// Determine the physical Postgres table backing the nested child schema recursively
if let Some(family) = &target_schema.obj.family {
child_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
} else if let Some(ref_id) = target_schema.obj.identifier() {
child_type_name = Some(ref_id.split('.').next_back().unwrap_or(&ref_id).to_string());
} else if let Some(arr) = &target_schema.obj.one_of {
if let Some(first) = arr.first() {
if let Some(ref_id) = first.obj.identifier() {
child_type_name =
Some(ref_id.split('.').next_back().unwrap_or(&ref_id).to_string());
}
}
}
if let Some(c_type) = child_type_name {
if db.types.contains_key(&c_type) {
// Ensure the child Schema's AST has accurately compiled its own physical property keys so we can
// inject them securely for Many-to-Many Twin Deduction disambiguation matching.
target_schema.compile(db, visited, errors);
if let Some(compiled_target_props) = target_schema.obj.compiled_properties.get() {
let keys_for_ambiguity: Vec<String> =
compiled_target_props.keys().cloned().collect();
// Interrogate the Database catalog graph to discover the exact Foreign Key Constraint connecting the components
if let Some((relation, is_forward)) = db.resolve_relation(
&p_type,
&c_type,
prop_name,
Some(&keys_for_ambiguity),
is_array,
self.id.as_deref(),
&format!("/{}", prop_name),
errors,
) {
schema_edges.insert(
prop_name.clone(),
crate::database::edge::Edge {
constraint: relation.constraint.clone(),
forward: is_forward,
},
);
}
}
}
}
}
}
}
schema_edges
}
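The dotted-identifier plucking and array unwrapping that `compile_edges` performs can be reduced to two tiny helpers. This is an illustrative, standalone sketch — `MiniSchema`, `table_name`, and `unwrap_target` are hypothetical names, not the crate's API:

```rust
// Resolve the physical table name from a dotted identifier, e.g. "full.person" -> "person".
fn table_name(identifier: &str) -> &str {
    identifier.split('.').next_back().unwrap_or(identifier)
}

// A drastically simplified stand-in for the crate's Schema (illustrative only).
struct MiniSchema {
    type_: Option<String>,
    items: Option<Box<MiniSchema>>,
    id: Option<String>,
}

// An "array" property points at its items schema; anything else is used directly.
fn unwrap_target(prop: &MiniSchema) -> (bool, &MiniSchema) {
    if prop.type_.as_deref() == Some("array") {
        if let Some(items) = &prop.items {
            return (true, &**items);
        }
    }
    (false, prop)
}

fn main() {
    assert_eq!(table_name("full.person"), "person");
    let inner = MiniSchema { type_: None, items: None, id: Some("organization".into()) };
    let arr = MiniSchema { type_: Some("array".into()), items: Some(Box::new(inner)), id: None };
    let (is_array, target) = unwrap_target(&arr);
    assert!(is_array);
    assert_eq!(target.id.as_deref(), Some("organization"));
}
```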
pub fn compile_polymorphism(
&self,
db: &crate::database::Database,
errors: &mut Vec<crate::drop::Error>,
) {
let mut options = std::collections::BTreeMap::new();
let mut strategy = String::new();
if let Some(family) = &self.obj.family {
let family_base = family.split('.').next_back().unwrap_or(family).to_string();
let family_prefix = family
.strip_suffix(&family_base)
.unwrap_or("")
.trim_end_matches('.');
if let Some(type_def) = db.types.get(&family_base) {
if type_def.variations.len() > 1 && type_def.variations.iter().any(|v| v != &family_base) {
// Scenario A / B: Table Variations
strategy = "type".to_string();
for var in &type_def.variations {
let target_id = if family_prefix.is_empty() {
var.to_string()
} else {
format!("{}.{}", family_prefix, var)
};
if db.schemas.contains_key(&target_id) {
options.insert(var.to_string(), target_id);
}
}
} else {
// Scenario C: Single Table Inheritance (Horizontal)
strategy = "kind".to_string();
let suffix = format!(".{}", family_base);
for schema in &type_def.schemas {
if let Some(id) = &schema.obj.id {
if id.ends_with(&suffix) || id == &family_base {
if let Some(kind_val) = schema.obj.get_discriminator_value("kind") {
options.insert(kind_val, id.to_string());
}
}
}
}
}
}
} else if let Some(one_of) = &self.obj.one_of {
let mut type_vals = std::collections::HashSet::new();
let mut kind_vals = std::collections::HashSet::new();
for c in one_of {
if let Some(t_val) = c.obj.get_discriminator_value("type") {
type_vals.insert(t_val);
}
if let Some(k_val) = c.obj.get_discriminator_value("kind") {
kind_vals.insert(k_val);
}
}
strategy = if type_vals.len() > 1 && type_vals.len() == one_of.len() {
"type".to_string()
} else if kind_vals.len() > 1 && kind_vals.len() == one_of.len() {
"kind".to_string()
} else {
"".to_string()
};
if strategy.is_empty() {
return;
}
for c in one_of {
if let Some(val) = c.obj.get_discriminator_value(&strategy) {
if options.contains_key(&val) {
errors.push(crate::drop::Error {
code: "POLYMORPHIC_COLLISION".to_string(),
message: format!("Polymorphic boundary defines multiple candidates mapped to the identical discriminator value '{}'.", val),
details: crate::drop::ErrorDetails::default()
});
continue;
}
let mut target_id = c.obj.id.clone();
if target_id.is_none() {
if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &c.obj.type_ {
if !crate::database::object::is_primitive_type(t) {
target_id = Some(t.clone());
}
}
}
if let Some(tid) = target_id {
options.insert(val, tid);
}
}
}
} else {
return;
}
if !options.is_empty() {
let _ = self.obj.compiled_discriminator.set(strategy);
let _ = self.obj.compiled_options.set(options);
}
}
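The option-map construction above (discriminator value → target schema id, with a `POLYMORPHIC_COLLISION` on duplicates) can be modeled in isolation. A minimal sketch with illustrative names — the real method works on `Schema` objects and `crate::drop::Error`, not string tuples:

```rust
use std::collections::BTreeMap;

// Each candidate contributes (discriminator value -> target schema id); a
// repeated discriminator value is reported instead of silently overwritten.
fn build_options(candidates: &[(&str, &str)]) -> (BTreeMap<String, String>, Vec<String>) {
    let mut options = BTreeMap::new();
    let mut errors = Vec::new();
    for (disc_val, target_id) in candidates {
        if options.contains_key(*disc_val) {
            errors.push(format!(
                "POLYMORPHIC_COLLISION: duplicate discriminator value '{}'",
                disc_val
            ));
            continue;
        }
        options.insert(disc_val.to_string(), target_id.to_string());
    }
    (options, errors)
}

fn main() {
    let (options, errors) = build_options(&[
        ("person", "full.person"),
        ("organization", "full.organization"),
        ("person", "lite.person"), // collides with the first entry
    ]);
    assert_eq!(options.len(), 2);
    assert_eq!(errors.len(), 1);
}
```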
#[allow(unused_variables)]
fn validate_identifier(id: &str, field_name: &str, errors: &mut Vec<crate::drop::Error>) {
#[cfg(not(test))]
@@ -444,8 +420,8 @@ impl Schema {
Self::validate_identifier(id, "$id", errors);
to_insert.push((id.clone(), self.clone()));
}
if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &self.obj.type_ {
if !crate::database::object::is_primitive_type(t) {
Self::validate_identifier(t, "type", errors);
}
}
@@ -456,8 +432,8 @@ impl Schema {
// Is this schema an inline ad-hoc composition?
// Meaning it has a tracking context, lacks an explicit $id, but extends an Entity ref with explicit properties!
if self.obj.id.is_none() && self.obj.properties.is_some() {
if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &self.obj.type_ {
if !crate::database::object::is_primitive_type(t) {
if let Some(ref path) = tracking_path {
to_insert.push((path.clone(), self.clone()));
}
@@ -549,285 +525,6 @@ impl Schema {
}
}
}
/// Dynamically infers and compiles all structural database relationships between this Schema
/// and its nested children. This function recursively traverses the JSON Schema abstract syntax
/// tree, identifies physical PostgreSQL table boundaries, and locks the resulting relation
/// constraint paths directly onto the `compiled_edges` map in O(1) memory.
pub fn compile_edges(
&self,
db: &crate::database::Database,
visited: &mut std::collections::HashSet<String>,
props: &std::collections::BTreeMap<String, std::sync::Arc<Schema>>,
errors: &mut Vec<crate::drop::Error>,
) -> std::collections::BTreeMap<String, crate::database::edge::Edge> {
let mut schema_edges = std::collections::BTreeMap::new();
// Determine the physical Database Table Name this schema structurally represents
// Plucks the polymorphic discriminator via dot-notation (e.g. extracting "person" from "full.person")
let mut parent_type_name = None;
if let Some(family) = &self.obj.family {
parent_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
} else if let Some(identifier) = self.obj.identifier() {
parent_type_name = Some(
identifier
.split('.')
.next_back()
.unwrap_or(&identifier)
.to_string(),
);
}
if let Some(p_type) = parent_type_name {
// Proceed only if the resolved table physically exists within the Postgres Type hierarchy
if db.types.contains_key(&p_type) {
// Iterate over all discovered schema boundaries mapped inside the object
for (prop_name, prop_schema) in props {
let mut child_type_name = None;
let mut target_schema = prop_schema.clone();
let mut is_array = false;
// Structurally unpack the inner target entity if the object maps to an array list
if let Some(crate::database::schema::SchemaTypeOrArray::Single(t)) =
&prop_schema.obj.type_
{
if t == "array" {
is_array = true;
if let Some(items) = &prop_schema.obj.items {
target_schema = items.clone();
}
}
}
// Determine the physical Postgres table backing the nested child schema recursively
if let Some(family) = &target_schema.obj.family {
child_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
} else if let Some(ref_id) = target_schema.obj.identifier() {
child_type_name = Some(ref_id.split('.').next_back().unwrap_or(&ref_id).to_string());
} else if let Some(arr) = &target_schema.obj.one_of {
if let Some(first) = arr.first() {
if let Some(ref_id) = first.obj.identifier() {
child_type_name =
Some(ref_id.split('.').next_back().unwrap_or(&ref_id).to_string());
}
}
}
if let Some(c_type) = child_type_name {
if db.types.contains_key(&c_type) {
// Ensure the child Schema's AST has accurately compiled its own physical property keys so we can
// inject them securely for Many-to-Many Twin Deduction disambiguation matching.
target_schema.compile(db, visited, errors);
if let Some(compiled_target_props) = target_schema.obj.compiled_properties.get() {
let keys_for_ambiguity: Vec<String> =
compiled_target_props.keys().cloned().collect();
// Interrogate the Database catalog graph to discover the exact Foreign Key Constraint connecting the components
if let Some((relation, is_forward)) = resolve_relation(
db,
&p_type,
&c_type,
prop_name,
Some(&keys_for_ambiguity),
is_array,
self.id.as_deref(),
&format!("/{}", prop_name),
errors,
) {
schema_edges.insert(
prop_name.clone(),
crate::database::edge::Edge {
constraint: relation.constraint.clone(),
forward: is_forward,
},
);
}
}
}
}
}
}
}
schema_edges
}
}
/// Inspects the Postgres pg_constraint relations catalog to securely identify
/// the precise Foreign Key connecting a parent and child hierarchy path.
pub(crate) fn resolve_relation<'a>(
db: &'a crate::database::Database,
parent_type: &str,
child_type: &str,
prop_name: &str,
relative_keys: Option<&Vec<String>>,
is_array: bool,
schema_id: Option<&str>,
path: &str,
errors: &mut Vec<crate::drop::Error>,
) -> Option<(&'a crate::database::relation::Relation, bool)> {
// Enforce graph locality by ensuring we don't accidentally crawl to pure structural entity boundaries
if parent_type == "entity" && child_type == "entity" {
return None;
}
let p_def = db.types.get(parent_type)?;
let c_def = db.types.get(child_type)?;
let mut matching_rels = Vec::new();
let mut directions = Vec::new();
// Scour the complete catalog for any Edge matching the inheritance scope of the two objects
// This automatically binds polymorphic structures (e.g. recognizing a relationship targeting User
// also natively binds instances specifically typed as Person).
let mut all_rels: Vec<&crate::database::relation::Relation> = db.relations.values().collect();
all_rels.sort_by(|a, b| a.constraint.cmp(&b.constraint));
for rel in all_rels {
let mut is_forward =
p_def.hierarchy.contains(&rel.source_type) && c_def.hierarchy.contains(&rel.destination_type);
let is_reverse =
p_def.hierarchy.contains(&rel.destination_type) && c_def.hierarchy.contains(&rel.source_type);
// Structural Cardinality Filtration:
// If the schema requires a collection (Array), it is mathematically impossible for a pure
// Forward scalar edge (where the parent holds exactly one UUID pointer) to fulfill a One-to-Many request.
// Thus, if it's an array, we fully reject pure Forward edges and only accept Reverse edges (or Junction edges).
if is_array && is_forward && !is_reverse {
is_forward = false;
}
if is_forward {
matching_rels.push(rel);
directions.push(true);
} else if is_reverse {
matching_rels.push(rel);
directions.push(false);
}
}
// Abort relation discovery early if no hierarchical inheritance match was found
if matching_rels.is_empty() {
let mut details = crate::drop::ErrorDetails {
path: path.to_string(),
..Default::default()
};
if let Some(sid) = schema_id {
details.schema = Some(sid.to_string());
}
errors.push(crate::drop::Error {
code: "EDGE_MISSING".to_string(),
message: format!(
"No database relation exists between '{}' and '{}' for property '{}'",
parent_type, child_type, prop_name
),
details,
});
return None;
}
// Ideal State: The objects only share a solitary structural relation, resolving ambiguity instantly.
if matching_rels.len() == 1 {
return Some((matching_rels[0], directions[0]));
}
let mut chosen_idx = 0;
let mut resolved = false;
// Exact Prefix Disambiguation: Determine if the database specifically names this constraint
// directly mapping to the JSON Schema property name (e.g., `fk_{child}_{property_name}`)
for (i, rel) in matching_rels.iter().enumerate() {
if let Some(prefix) = &rel.prefix {
if prop_name.starts_with(prefix)
|| prefix.starts_with(prop_name)
|| prefix.replace("_", "") == prop_name.replace("_", "")
{
chosen_idx = i;
resolved = true;
break;
}
}
}
// Complex Subgraph Resolution: The database contains multiple equally explicit foreign key constraints
// linking these objects (such as pointing to `source` and `target` in Many-to-Many junction models).
if !resolved && relative_keys.is_some() {
// Twin Deduction Pass 1: We inspect the exact properties structurally defined inside the compiled payload
// to observe which explicit relation arrow the child payload natively consumes.
let keys = relative_keys.unwrap();
let mut consumed_rel_idx = None;
for (i, rel) in matching_rels.iter().enumerate() {
if let Some(prefix) = &rel.prefix {
if keys.contains(prefix) {
consumed_rel_idx = Some(i);
break; // Found the routing edge explicitly consumed by the schema payload
}
}
}
// Twin Deduction Pass 2: Knowing which arrow points outbound, we can mathematically deduce its twin,
// since the reverse ownership arrow on the same junction boundary must be the incoming Edge to the parent.
if let Some(used_idx) = consumed_rel_idx {
let used_rel = matching_rels[used_idx];
let mut twin_ids = Vec::new();
for (i, rel) in matching_rels.iter().enumerate() {
if i != used_idx
&& rel.source_type == used_rel.source_type
&& rel.destination_type == used_rel.destination_type
&& rel.prefix.is_some()
{
twin_ids.push(i);
}
}
if twin_ids.len() == 1 {
chosen_idx = twin_ids[0];
resolved = true;
}
}
}
// Implicit Base Fallback: If no complex explicit paths resolve, but exactly one relation
// sits entirely naked (without a constraint prefix), it must be the core structural parent ownership.
if !resolved {
let mut null_prefix_ids = Vec::new();
for (i, rel) in matching_rels.iter().enumerate() {
if rel.prefix.is_none() {
null_prefix_ids.push(i);
}
}
if null_prefix_ids.len() == 1 {
chosen_idx = null_prefix_ids[0];
resolved = true;
}
}
// If we exhausted all mathematical deduction pathways and STILL cannot isolate a single edge,
// we must abort rather than silently guessing. Returning None prevents arbitrary SQL generation
// and forces a clean structural error for the architect.
if !resolved {
let mut details = crate::drop::ErrorDetails {
path: path.to_string(),
context: serde_json::to_value(&matching_rels).ok(),
cause: Some("Multiple conflicting constraints found matching prefixes".to_string()),
..Default::default()
};
if let Some(sid) = schema_id {
details.schema = Some(sid.to_string());
}
errors.push(crate::drop::Error {
code: "AMBIGUOUS_TYPE_RELATIONS".to_string(),
message: format!(
"Ambiguous database relation between '{}' and '{}' for property '{}'",
parent_type, child_type, prop_name
),
details,
});
return None;
}
Some((matching_rels[chosen_idx], directions[chosen_idx]))
}
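The "Exact Prefix Disambiguation" pass above boils down to a three-way string comparison between a constraint prefix and a property name. A standalone sketch of that predicate (hypothetical helper name, mirroring the comparison used in `resolve_relation`):

```rust
// A relation's constraint prefix matches a property name if either string is a
// prefix of the other, or they are equal once underscores are ignored.
fn prefix_matches(prefix: &str, prop_name: &str) -> bool {
    prop_name.starts_with(prefix)
        || prefix.starts_with(prop_name)
        || prefix.replace('_', "") == prop_name.replace('_', "")
}

fn main() {
    assert!(prefix_matches("source", "source_entity")); // property extends the prefix
    assert!(prefix_matches("home_address", "homeaddress")); // underscore-insensitive
    assert!(!prefix_matches("target", "source"));
}
```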
impl<'de> Deserialize<'de> for Schema {
@@ -880,38 +577,3 @@ impl<'de> Deserialize<'de> for Schema {
})
}
}
impl SchemaObject {
pub fn identifier(&self) -> Option<String> {
if let Some(id) = &self.id {
return Some(id.split('.').next_back().unwrap_or("").to_string());
}
if let Some(SchemaTypeOrArray::Single(t)) = &self.type_ {
if !is_primitive_type(t) {
return Some(t.split('.').next_back().unwrap_or("").to_string());
}
}
None
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(untagged)]
pub enum SchemaTypeOrArray {
Single(String),
Multiple(Vec<String>),
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Action {
#[serde(skip_serializing_if = "Option::is_none")]
pub navigate: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub punc: Option<String>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(untagged)]
pub enum Dependency {
Props(Vec<String>),
Schema(Arc<Schema>),
}
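The `Multiple` variant above feeds the strict array constraint documented for `type` arrays: any number of primitives, but at most one Custom Object Pointer before compilation fails. A self-contained sketch of that counting rule — the primitive list here is an assumption standing in for `is_primitive_type`, not the crate's exact table:

```rust
// Assumed primitive set (illustrative; mirrors JSON Schema's core type names).
const PRIMITIVES: [&str; 7] = ["string", "number", "integer", "boolean", "object", "array", "null"];

fn is_primitive_type(t: &str) -> bool {
    PRIMITIVES.contains(&t)
}

// Count the non-primitive entries in a `type` array; a count > 1 is the
// fatal multiple-inheritance case that should be a tagged union instead.
fn custom_pointer_count(types: &[&str]) -> usize {
    types.iter().filter(|t| !is_primitive_type(t)).count()
}

fn main() {
    assert_eq!(custom_pointer_count(&["person", "null"]), 1); // allowed: nullable pointer
    assert_eq!(custom_pointer_count(&["person", "organization"]), 2); // fatal error case
}
```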
@@ -25,7 +25,7 @@ impl Merger {
let mut notifications_queue = Vec::new();
let target_schema = match self.db.schemas.get(schema_id) {
Some(s) => Arc::clone(s),
None => {
return crate::drop::Drop::with_errors(vec![crate::drop::Error {
code: "MERGE_FAILED".to_string(),
@@ -131,13 +131,33 @@ impl Merger {
pub(crate) fn merge_internal(
&self,
mut schema: Arc<crate::database::schema::Schema>,
data: Value,
notifications: &mut Vec<String>,
) -> Result<Value, String> {
match data {
Value::Array(items) => self.merge_array(schema, items, notifications),
Value::Object(map) => {
if let Some(options) = schema.obj.compiled_options.get() {
if let Some(disc) = schema.obj.compiled_discriminator.get() {
let val = map.get(disc).and_then(|v| v.as_str());
if let Some(v) = val {
if let Some(target_id) = options.get(v) {
if let Some(target_schema) = self.db.schemas.get(target_id) {
schema = Arc::clone(target_schema);
} else {
return Err(format!("Polymorphic mapped target '{}' not found in database registry", target_id));
}
} else {
return Err(format!("Polymorphic discriminator {}='{}' matched no compiled options", disc, v));
}
} else {
return Err(format!("Polymorphic merging failed: missing required discriminator '{}'", disc));
}
}
}
self.merge_object(schema, map, notifications)
},
_ => Err("Invalid merge payload: root must be an Object or Array".to_string()),
}
}
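The polymorphic dispatch added to `merge_internal` can be exercised in isolation: read the discriminator from the payload, map it through the compiled options, and verify the target exists in the registry. A standalone model with the same three failure messages (illustrative signature, not the Merger API — the registry here is a bare map):

```rust
use std::collections::BTreeMap;

fn dispatch(
    payload_disc: Option<&str>,
    disc: &str,
    options: &BTreeMap<String, String>,
    registry: &BTreeMap<String, ()>,
) -> Result<String, String> {
    // Failure mode 1: the payload omits the required discriminator field.
    let v = payload_disc
        .ok_or_else(|| format!("Polymorphic merging failed: missing required discriminator '{}'", disc))?;
    // Failure mode 2: the value is not one of the compiled options.
    let target_id = options
        .get(v)
        .ok_or_else(|| format!("Polymorphic discriminator {}='{}' matched no compiled options", disc, v))?;
    // Failure mode 3: the mapped schema id is absent from the registry.
    registry
        .get(target_id)
        .ok_or_else(|| format!("Polymorphic mapped target '{}' not found in database registry", target_id))?;
    Ok(target_id.clone())
}

fn main() {
    let options = BTreeMap::from([("person".to_string(), "full.person".to_string())]);
    let registry = BTreeMap::from([("full.person".to_string(), ())]);
    assert_eq!(dispatch(Some("person"), "type", &options, &registry).unwrap(), "full.person");
    assert!(dispatch(Some("robot"), "type", &options, &registry).is_err());
    assert!(dispatch(None, "type", &options, &registry).is_err());
}
```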
@@ -149,7 +169,7 @@ impl Merger {
notifications: &mut Vec<String>,
) -> Result<Value, String> {
let mut item_schema = schema.clone();
if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &schema.obj.type_ {
if t == "array" {
if let Some(items_def) = &schema.obj.items {
item_schema = items_def.clone();
@@ -369,7 +389,7 @@ impl Merger {
);
let mut item_schema = rel_schema.clone();
if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) =
&rel_schema.obj.type_
{
if t == "array" {
@@ -1,5 +1,6 @@
use crate::database::Database;
use std::sync::Arc;
pub struct Compiler<'a> {
pub db: &'a Database,
pub filter_keys: &'a [String],
@@ -16,6 +17,7 @@ pub struct Node<'a> {
pub property_name: Option<String>,
pub depth: usize,
pub ast_path: String,
pub is_polymorphic_branch: bool,
}
impl<'a> Compiler<'a> {
@@ -27,7 +29,7 @@ impl<'a> Compiler<'a> {
.get(schema_id)
.ok_or_else(|| format!("Schema not found: {}", schema_id))?;
let target_schema = std::sync::Arc::clone(schema);
let mut compiler = Compiler {
db: &self.db,
@@ -44,6 +46,7 @@ impl<'a> Compiler<'a> {
property_name: None,
depth: 0,
ast_path: String::new(),
is_polymorphic_branch: false,
};
let (sql, _) = compiler.compile_node(node)?;
@@ -55,7 +58,7 @@ impl<'a> Compiler<'a> {
fn compile_node(&mut self, node: Node<'a>) -> Result<(String, String), String> {
// Determine the base schema type (could be an array, object, or literal)
match &node.schema.obj.type_ {
Some(crate::database::object::SchemaTypeOrArray::Single(t)) if t == "array" => {
self.compile_array(node)
}
_ => self.compile_reference(node),
@@ -103,7 +106,11 @@ impl<'a> Compiler<'a> {
let mut resolved_type = None;
if let Some(family_target) = node.schema.obj.family.as_ref() {
let base_type_name = family_target
.split('.')
.next_back()
.unwrap_or(family_target);
resolved_type = self.db.types.get(base_type_name);
} else if let Some(base_type_name) = node.schema.obj.identifier() {
resolved_type = self.db.types.get(&base_type_name);
}
@@ -113,46 +120,30 @@ impl<'a> Compiler<'a> {
}
// Handle Direct Refs via type pointer
if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &node.schema.obj.type_ {
if !crate::database::object::is_primitive_type(t) {
// If it's just an ad-hoc struct ref, we should resolve it
if let Some(target_schema) = self.db.schemas.get(t) {
let mut ref_node = node.clone();
ref_node.schema = Arc::clone(target_schema);
return self.compile_node(ref_node);
}
return Err(format!("Unresolved schema type pointer: {}", t));
}
}
// Handle Polymorphism fallbacks for relations
if node.schema.obj.family.is_some() || node.schema.obj.one_of.is_some() {
if let Some(options) = node.schema.obj.compiled_options.get() {
if options.len() == 1 {
let target_id = options.values().next().unwrap();
let mut bypass_schema = crate::database::schema::Schema::default();
bypass_schema.obj.type_ = Some(crate::database::object::SchemaTypeOrArray::Single(target_id.clone()));
let mut bypass_node = node.clone();
bypass_node.schema = std::sync::Arc::new(bypass_schema);
return self.compile_node(bypass_node);
}
}
return self.compile_one_of(node);
}
// Just an inline object definition?
@@ -179,17 +170,28 @@ impl<'a> Compiler<'a> {
) -> Result<(String, String), String> {
let (table_aliases, from_clauses) = self.compile_from_clause(r#type);
let jsonb_obj_sql = if node.schema.obj.family.is_some() || node.schema.obj.one_of.is_some() {
let base_alias = table_aliases
.get(&r#type.name)
.cloned()
.unwrap_or_else(|| node.parent_alias.to_string());
let mut case_node = node.clone();
case_node.parent_alias = base_alias.clone();
let arc_aliases = std::sync::Arc::new(table_aliases.clone());
case_node.parent_type_aliases = Some(arc_aliases);
case_node.parent_type = Some(r#type);
let (case_sql, _) = self.compile_one_of(case_node)?;
case_sql
} else {
let select_args = self.compile_select_clause(r#type, &table_aliases, node.clone())?;
if select_args.is_empty() {
"jsonb_build_object()".to_string()
} else {
format!("jsonb_build_object({})", select_args.join(", "))
}
};
// 3. Build WHERE clauses
@@ -218,90 +220,6 @@ impl<'a> Compiler<'a> {
))
}
fn compile_polymorphism_select(
&mut self,
r#type: &'a crate::database::r#type::Type,
table_aliases: &std::collections::HashMap<String, String>,
node: Node<'a>,
) -> Result<Vec<String>, String> {
let mut select_args = Vec::new();
if let Some(family_target) = node.schema.obj.family.as_ref() {
let family_prefix = family_target.rfind('.').map(|idx| &family_target[..idx]);
let mut all_targets = vec![family_target.clone()];
if let Some(descendants) = self.db.descendants.get(family_target) {
all_targets.extend(descendants.clone());
}
// Filter targets to EXACTLY match the family_target prefix
let mut final_targets = Vec::new();
for target in all_targets {
let target_prefix = target.rfind('.').map(|idx| &target[..idx]);
if target_prefix == family_prefix {
final_targets.push(target);
}
}
final_targets.sort();
final_targets.dedup();
if final_targets.len() == 1 {
let variation = &final_targets[0];
if let Some(target_schema) = self.db.schemas.get(variation) {
let mut bypass_node = node.clone();
bypass_node.schema = std::sync::Arc::new(target_schema.clone());
let mut bypassed_args = self.compile_select_clause(r#type, table_aliases, bypass_node)?;
select_args.append(&mut bypassed_args);
} else {
return Err(format!("Could not find schema for variation {}", variation));
}
} else {
let mut family_schemas = Vec::new();
for variation in &final_targets {
if let Some(target_schema) = self.db.schemas.get(variation) {
family_schemas.push(std::sync::Arc::new(target_schema.clone()));
} else {
return Err(format!(
"Could not find schema metadata for variation {}",
variation
));
}
}
let base_alias = table_aliases
.get(&r#type.name)
.cloned()
.unwrap_or_else(|| node.parent_alias.to_string());
select_args.push(format!("'id', {}.id", base_alias));
let mut case_node = node.clone();
case_node.parent_alias = base_alias.clone();
let arc_aliases = std::sync::Arc::new(table_aliases.clone());
case_node.parent_type_aliases = Some(arc_aliases);
let (case_sql, _) = self.compile_one_of(&family_schemas, case_node)?;
select_args.push(format!("'type', {}", case_sql));
}
} else if let Some(one_of) = &node.schema.obj.one_of {
let base_alias = table_aliases
.get(&r#type.name)
.cloned()
.unwrap_or_else(|| node.parent_alias.to_string());
select_args.push(format!("'id', {}.id", base_alias));
let mut case_node = node.clone();
case_node.parent_alias = base_alias.clone();
let arc_aliases = std::sync::Arc::new(table_aliases.clone());
case_node.parent_type_aliases = Some(arc_aliases);
let (case_sql, _) = self.compile_one_of(one_of, case_node)?;
select_args.push(format!("'type', {}", case_sql));
}
Ok(select_args)
}
fn compile_object( fn compile_object(
&mut self, &mut self,
props: &std::collections::BTreeMap<String, std::sync::Arc<crate::database::schema::Schema>>, props: &std::collections::BTreeMap<String, std::sync::Arc<crate::database::schema::Schema>>,
@ -333,26 +251,45 @@ impl<'a> Compiler<'a> {
fn compile_one_of(
&mut self,
schemas: &[Arc<crate::database::schema::Schema>],
node: Node<'a>,
) -> Result<(String, String), String> {
let mut case_statements = Vec::new();
+let options = node.schema.obj.compiled_options.get().ok_or("Missing compiled options for polymorphism")?;
+let disc = node.schema.obj.compiled_discriminator.get().ok_or("Missing compiled discriminator for polymorphism")?;
let type_col = if let Some(prop) = &node.property_name {
-format!("{}_type", prop)
+format!("{}_{}", prop, disc)
} else {
-"type".to_string()
+disc.to_string()
};
-for option_schema in schemas {
-if let Some(base_type_name) = option_schema.obj.identifier() {
+for (disc_val, target_id) in options {
+if let Some(target_schema) = self.db.schemas.get(target_id) {
+// Generate the nested SQL for this specific target type
let mut child_node = node.clone();
-child_node.schema = std::sync::Arc::clone(option_schema);
+child_node.schema = Arc::clone(target_schema);
-let (val_sql, _) = self.compile_node(child_node)?;
+child_node.is_polymorphic_branch = true;
+let val_sql = if disc == "kind" && node.parent_type.is_some() && node.parent_type_aliases.is_some() {
+let aliases_arc = node.parent_type_aliases.as_ref().unwrap();
+let aliases = aliases_arc.as_ref();
+let p_type = node.parent_type.unwrap();
+let select_args = self.compile_select_clause(p_type, aliases, child_node.clone())?;
+if select_args.is_empty() {
+"jsonb_build_object()".to_string()
+} else {
+format!("jsonb_build_object({})", select_args.join(", "))
+}
+} else {
+let (sql, _) = self.compile_node(child_node)?;
+sql
+};
case_statements.push(format!(
"WHEN {}.{} = '{}' THEN ({})",
-node.parent_alias, type_col, base_type_name, val_sql
+node.parent_alias, type_col, disc_val, val_sql
));
}
}
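The discriminator-driven CASE generation above can be sketched in isolation. This is a minimal stdlib-only sketch, not the compiler itself: `build_case_sql` and the placeholder projection are hypothetical names, and in the real compiler each branch body is the compiled SQL for the target schema rather than a stub.

```rust
use std::collections::BTreeMap;

// `options` stands in for the compiled map: discriminator value -> target schema id.
fn build_case_sql(parent_alias: &str, type_col: &str, options: &BTreeMap<String, String>) -> String {
    let mut case_statements = Vec::new();
    for (disc_val, target_id) in options {
        // Placeholder projection; the real compiler emits the nested SQL for `target_id` here.
        let val_sql = format!("to_jsonb({}.*) /* {} */", parent_alias, target_id);
        case_statements.push(format!(
            "WHEN {}.{} = '{}' THEN ({})",
            parent_alias, type_col, disc_val, val_sql
        ));
    }
    format!("CASE {} ELSE NULL END", case_statements.join(" "))
}

fn main() {
    let mut options = BTreeMap::new();
    options.insert("person".to_string(), "entity.person".to_string());
    options.insert("organization".to_string(), "entity.organization".to_string());
    println!("{}", build_case_sql("e", "type", &options));
}
```

Because `options` is a `BTreeMap`, branch order is deterministic by discriminator value, which keeps the generated SQL stable across runs.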
@@ -401,7 +338,9 @@ impl<'a> Compiler<'a> {
) -> Result<Vec<String>, String> {
let mut select_args = Vec::new();
let grouped_fields = r#type.grouped_fields.as_ref().and_then(|v| v.as_object());
-let merged_props = node.schema.obj.compiled_properties.get().unwrap();
+let default_props = std::collections::BTreeMap::new();
+let merged_props = node.schema.obj.compiled_properties.get().unwrap_or(&default_props);
let mut sorted_keys: Vec<&String> = merged_props.keys().collect();
sorted_keys.sort();
@@ -409,18 +348,18 @@ impl<'a> Compiler<'a> {
let prop_schema = &merged_props[prop_key];
let is_object_or_array = match &prop_schema.obj.type_ {
-Some(crate::database::schema::SchemaTypeOrArray::Single(s)) => {
+Some(crate::database::object::SchemaTypeOrArray::Single(s)) => {
s == "object" || s == "array"
}
-Some(crate::database::schema::SchemaTypeOrArray::Multiple(v)) => {
+Some(crate::database::object::SchemaTypeOrArray::Multiple(v)) => {
v.contains(&"object".to_string()) || v.contains(&"array".to_string())
}
_ => false,
};
let is_custom_object_pointer = match &prop_schema.obj.type_ {
-Some(crate::database::schema::SchemaTypeOrArray::Single(s)) => {
-!crate::database::schema::is_primitive_type(s)
+Some(crate::database::object::SchemaTypeOrArray::Single(s)) => {
+!crate::database::object::is_primitive_type(s)
}
_ => false,
};
@@ -470,6 +409,7 @@ impl<'a> Compiler<'a> {
} else {
format!("{}/{}", node.ast_path, prop_key)
},
+is_polymorphic_branch: false,
};
let (val_sql, val_type) = self.compile_node(child_node)?;
@@ -509,6 +449,8 @@ impl<'a> Compiler<'a> {
self.compile_filter_conditions(r#type, type_aliases, &node, &base_alias, &mut where_clauses);
self.compile_polymorphic_bounds(r#type, type_aliases, &node, &mut where_clauses);
+let start_len = where_clauses.len();
self.compile_relation_conditions(
r#type,
type_aliases,
@@ -517,6 +459,14 @@ impl<'a> Compiler<'a> {
&mut where_clauses,
)?;
+if node.is_polymorphic_branch && where_clauses.len() == start_len {
+if let Some(parent_aliases) = &node.parent_type_aliases {
+if let Some(outer_entity_alias) = parent_aliases.get("entity") {
+where_clauses.push(format!("{}.id = {}.id", entity_alias, outer_entity_alias));
+}
+}
+}
Ok(where_clauses)
}
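The guard added in this hunk can be sketched on its own. This is a hypothetical stand-in (`ensure_branch_correlation` and its parameters are invented names): when a polymorphic branch contributed no relation predicates of its own, it is correlated back to the outer row by primary key so the branch cannot match unrelated rows.

```rust
// Hypothetical sketch of the fallback: correlate an otherwise-unconstrained
// polymorphic branch to the outer query's row by id.
fn ensure_branch_correlation(
    is_polymorphic_branch: bool,
    start_len: usize,                 // clause count before relation conditions ran
    entity_alias: &str,               // alias of the branch's own table
    outer_entity_alias: Option<&str>, // alias of the outer query's table, if known
    where_clauses: &mut Vec<String>,
) {
    // Only fire when the relation pass added nothing since `start_len`.
    if is_polymorphic_branch && where_clauses.len() == start_len {
        if let Some(outer) = outer_entity_alias {
            where_clauses.push(format!("{}.id = {}.id", entity_alias, outer));
        }
    }
}

fn main() {
    let mut clauses: Vec<String> = Vec::new();
    ensure_branch_correlation(true, 0, "p", Some("e"), &mut clauses);
    println!("{:?}", clauses);
}
```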


@@ -1439,6 +1439,18 @@ fn test_queryer_0_10() {
crate::tests::runner::run_test_case(&path, 0, 10).unwrap();
}
+#[test]
+fn test_queryer_0_11() {
+let path = format!("{}/fixtures/queryer.json", env!("CARGO_MANIFEST_DIR"));
+crate::tests::runner::run_test_case(&path, 0, 11).unwrap();
+}
+#[test]
+fn test_queryer_0_12() {
+let path = format!("{}/fixtures/queryer.json", env!("CARGO_MANIFEST_DIR"));
+crate::tests::runner::run_test_case(&path, 0, 12).unwrap();
+}
#[test]
fn test_polymorphism_0_0() {
let path = format!("{}/fixtures/polymorphism.json", env!("CARGO_MANIFEST_DIR"));


@@ -13,7 +13,7 @@ impl<'a> ValidationContext<'a> {
if let Some(ref type_) = self.schema.type_ {
match type_ {
-crate::database::schema::SchemaTypeOrArray::Single(t) => {
+crate::database::object::SchemaTypeOrArray::Single(t) => {
if !Validator::check_type(t, current) {
result.errors.push(ValidationError {
code: "INVALID_TYPE".to_string(),
@@ -22,7 +22,7 @@ impl<'a> ValidationContext<'a> {
});
}
}
-crate::database::schema::SchemaTypeOrArray::Multiple(types) => {
+crate::database::object::SchemaTypeOrArray::Multiple(types) => {
let mut valid = false;
for t in types {
if Validator::check_type(t, current) {


@@ -10,7 +10,7 @@ impl<'a> ValidationContext<'a> {
let current = self.instance;
if let Some(compiled_fmt) = self.schema.compiled_format.get() {
match compiled_fmt {
-crate::database::schema::CompiledFormat::Func(f) => {
+crate::database::object::CompiledFormat::Func(f) => {
let should = if let Some(s) = current.as_str() {
!s.is_empty()
} else {
@@ -24,7 +24,7 @@ impl<'a> ValidationContext<'a> {
});
}
}
-crate::database::schema::CompiledFormat::Regex(re) => {
+crate::database::object::CompiledFormat::Regex(re) => {
if let Some(s) = current.as_str()
&& !re.is_match(s)
{


@@ -124,7 +124,7 @@ impl<'a> ValidationContext<'a> {
for (prop, dep) in deps {
if obj.contains_key(prop) {
match dep {
-crate::database::schema::Dependency::Props(required_props) => {
+crate::database::object::Dependency::Props(required_props) => {
for req_prop in required_props {
if !obj.contains_key(req_prop) {
result.errors.push(ValidationError {
@@ -135,7 +135,7 @@ impl<'a> ValidationContext<'a> {
}
}
}
-crate::database::schema::Dependency::Schema(dep_schema) => {
+crate::database::object::Dependency::Schema(dep_schema) => {
let derived = self.derive_for_schema(dep_schema, false);
let dep_res = derived.validate()?;
result.evaluated_keys.extend(dep_res.evaluated_keys.clone());
@@ -155,7 +155,7 @@ impl<'a> ValidationContext<'a> {
if let Some(child_instance) = obj.get(key) {
let new_path = self.join_path(key);
let is_ref = match &sub_schema.type_ {
-Some(crate::database::schema::SchemaTypeOrArray::Single(t)) => !crate::database::schema::is_primitive_type(t),
+Some(crate::database::object::SchemaTypeOrArray::Single(t)) => !crate::database::object::is_primitive_type(t),
_ => false,
};
let next_extensible = if is_ref { false } else { self.extensible };
@@ -184,7 +184,7 @@ impl<'a> ValidationContext<'a> {
if compiled_re.0.is_match(key) {
let new_path = self.join_path(key);
let is_ref = match &sub_schema.type_ {
-Some(crate::database::schema::SchemaTypeOrArray::Single(t)) => !crate::database::schema::is_primitive_type(t),
+Some(crate::database::object::SchemaTypeOrArray::Single(t)) => !crate::database::object::is_primitive_type(t),
_ => false,
};
let next_extensible = if is_ref { false } else { self.extensible };
@@ -226,7 +226,7 @@ impl<'a> ValidationContext<'a> {
if !locally_matched {
let new_path = self.join_path(key);
let is_ref = match &additional_schema.type_ {
-Some(crate::database::schema::SchemaTypeOrArray::Single(t)) => !crate::database::schema::is_primitive_type(t),
+Some(crate::database::object::SchemaTypeOrArray::Single(t)) => !crate::database::object::is_primitive_type(t),
_ => false,
};
let next_extensible = if is_ref { false } else { self.extensible };


@@ -1,4 +1,3 @@
-use crate::database::schema::Schema;
use crate::validator::context::ValidationContext;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;
@@ -29,30 +28,10 @@ impl<'a> ValidationContext<'a> {
}
}
-if let Some(family_target) = &self.schema.family {
-if let Some(descendants) = self.db.descendants.get(family_target) {
-let mut candidates = Vec::new();
-// Add the target base schema itself
-if let Some(base_schema) = self.db.schemas.get(family_target) {
-candidates.push(base_schema);
-}
-// Add all descendants
-for child_id in descendants {
-if let Some(child_schema) = self.db.schemas.get(child_id) {
-candidates.push(child_schema);
-}
-}
-// Use prefix from family string (e.g. `light.`)
-let prefix = family_target
-.rsplit_once('.')
-.map(|(p, _)| format!("{}.", p))
-.unwrap_or_default();
-if !self.validate_polymorph(&candidates, Some(&prefix), result)? {
-return Ok(false);
+if self.schema.family.is_some() {
+if let Some(options) = self.schema.compiled_options.get() {
+if let Some(disc) = self.schema.compiled_discriminator.get() {
+return self.execute_polymorph(disc, options, result);
}
}
}
@@ -64,212 +43,43 @@ impl<'a> ValidationContext<'a> {
&self,
result: &mut ValidationResult,
) -> Result<bool, ValidationError> {
-if let Some(ref one_of) = self.schema.one_of {
-let mut candidates = Vec::new();
-for schema in one_of {
-candidates.push(schema.as_ref());
-}
-if !self.validate_polymorph(&candidates, None, result)? {
-return Ok(false);
-}
-}
-Ok(true)
-}
-pub(crate) fn validate_polymorph(
-&self,
-candidates: &[&Schema],
-family_prefix: Option<&str>,
-result: &mut ValidationResult,
-) -> Result<bool, ValidationError> {
-let mut passed_candidates: Vec<(Option<String>, ValidationResult)> = Vec::new();
-let mut failed_candidates: Vec<ValidationResult> = Vec::new();
-// 1. O(1) Fast-Path Router & Extractor
-let instance_type = self.instance.as_object().and_then(|o| o.get("type")).and_then(|t| t.as_str());
-let instance_kind = self.instance.as_object().and_then(|o| o.get("kind")).and_then(|k| k.as_str());
-let mut viable_candidates = Vec::new();
-for sub in candidates {
-let _child_id = sub.identifier().unwrap_or_default();
-let mut can_match = true;
-if let Some(t) = instance_type {
-// Fast Path 1: Pure Ad-Hoc Match (schema identifier == type)
-// If it matches exactly, it's our golden candidate. Make all others non-viable manually?
-// Wait, we loop through all and filter down. If exact match is found, we should ideally break and use ONLY that.
-// Let's implement the logic safely.
-let mut exact_match_found = false;
-if let Some(schema_id) = &sub.id {
-// Compute Vertical Exact Target (e.g. "person" or "light.person")
-let exact_target = if let Some(prefix) = family_prefix {
-format!("{}{}", prefix, t)
-} else {
-t.to_string()
-};
-// Fast Path 1 & 2: Vertical Exact Match
-if schema_id == &exact_target {
-if instance_kind.is_none() {
-exact_match_found = true;
-}
-}
-// Fast Path 3: Horizontal Sibling Match (kind + . + type)
-if let Some(k) = instance_kind {
-let sibling_target = format!("{}.{}", k, t);
-if schema_id == &sibling_target {
-exact_match_found = true;
-}
-}
-}
-if exact_match_found {
-// We found an exact literal structural identity match!
-// Wipe the existing viable_candidates and only yield this guy!
-viable_candidates.clear();
-viable_candidates.push(*sub);
-break;
-}
-// Fast Path 4: Vertical Inheritance Fallback (Physical DB constraint)
-if let Some(crate::database::schema::SchemaTypeOrArray::Single(t_ptr)) = &sub.type_ {
-if !crate::database::schema::is_primitive_type(t_ptr) {
-if let Some(base_type) = t_ptr.split('.').last() {
-if let Some(type_def) = self.db.types.get(base_type) {
-if !type_def.variations.contains(&t.to_string()) {
-can_match = false;
-}
-} else {
-if t_ptr != t {
-can_match = false;
-}
-}
-}
-}
-}
-// Fast Path 5: Explicit Schema JSON `const` values check
-if can_match {
-if let Some(props) = &sub.properties {
-if let Some(type_prop) = props.get("type") {
-if let Some(const_val) = &type_prop.const_ {
-if let Some(const_str) = const_val.as_str() {
-if const_str != t {
-can_match = false;
-}
-}
-}
-}
-}
-}
-}
-if can_match {
-viable_candidates.push(*sub);
-}
-}
-println!("DEBUG VIABLE: {:?}", viable_candidates.iter().map(|s| s.id.clone()).collect::<Vec<_>>());
-// 2. Evaluate Viable Candidates
-// Composition validation is natively handled directly via type compilation.
-// The deprecated allOf JSON structure is no longer supported nor traversed.
-for sub in viable_candidates.clone() {
-let derived = self.derive_for_schema(sub, false);
-let sub_res = derived.validate()?;
-if sub_res.is_valid() {
-passed_candidates.push((sub.id.clone(), sub_res));
-} else {
-failed_candidates.push(sub_res);
-}
-}
-for f in &failed_candidates {
-println!(" - Failed candidate errors: {:?}", f.errors.iter().map(|e| e.code.clone()).collect::<Vec<_>>());
-}
-if passed_candidates.len() == 1 {
-result.merge(passed_candidates.pop().unwrap().1);
-} else if passed_candidates.is_empty() {
-// 3. Discriminator Pathing (Failure Analytics)
-let type_path = self.join_path("type");
-if instance_type.is_some() {
-// Filter to candidates that didn't explicitly throw a CONST violation on `type`
-let mut genuinely_failed = Vec::new();
-for res in &failed_candidates {
-let rejected_type = res.errors.iter().any(|e| {
-(e.code == "CONST_VIOLATED" || e.code == "ENUM_VIOLATED") && e.path == type_path
-});
-if !rejected_type {
-genuinely_failed.push(res.clone());
-}
-}
-println!("DEBUG genuinely_failed len: {}", genuinely_failed.len());
-if genuinely_failed.len() == 1 {
-// Golden Type Match (1 candidate was structurally possible but failed property validation)
-let sub_res = genuinely_failed.pop().unwrap();
-result.errors.extend(sub_res.errors);
-result.evaluated_keys.extend(sub_res.evaluated_keys);
-return Ok(false);
-} else {
-// Pure Ad-Hoc Union
-result.errors.push(ValidationError {
-code: if self.schema.family.is_some() { "NO_FAMILY_MATCH".to_string() } else { "NO_ONEOF_MATCH".to_string() },
-message: "Payload matches none of the required candidate sub-schemas".to_string(),
-path: self.path.to_string(),
-});
-for sub_res in &failed_candidates {
-result.evaluated_keys.extend(sub_res.evaluated_keys.clone());
-}
-println!("DEBUG ELSE NO_FAMILY_MATCH RUNNING. Genuinely Failed len: {}", genuinely_failed.len());
-if viable_candidates.is_empty() {
-if let Some(obj) = self.instance.as_object() {
-result.evaluated_keys.extend(obj.keys().cloned());
-}
-}
-for sub_res in genuinely_failed {
-for e in sub_res.errors {
-if !result.errors.iter().any(|existing| existing.code == e.code && existing.path == e.path) {
-result.errors.push(e);
-}
-}
-}
-return Ok(false);
-}
-} else {
-// Instance missing type
-let expects_type = viable_candidates.iter().any(|c| {
-c.compiled_property_names.get().map_or(false, |props| props.contains(&"type".to_string()))
-});
-if expects_type {
-result.errors.push(ValidationError {
-code: "MISSING_TYPE".to_string(),
-message: "Missing type discriminator. Unable to resolve polymorphic boundaries".to_string(),
-path: self.path.to_string(),
-});
-for sub_res in failed_candidates {
-result.evaluated_keys.extend(sub_res.evaluated_keys);
-}
-return Ok(false);
-} else {
-// Pure Ad-Hoc Union
-result.errors.push(ValidationError {
-code: if self.schema.family.is_some() { "NO_FAMILY_MATCH".to_string() } else { "NO_ONEOF_MATCH".to_string() },
-message: "Payload matches none of the required candidate sub-schemas".to_string(),
-path: self.path.to_string(),
-});
-if let Some(first) = failed_candidates.first() {
+if let Some(one_of) = &self.schema.one_of {
+if let Some(options) = self.schema.compiled_options.get() {
+if let Some(disc) = self.schema.compiled_discriminator.get() {
+return self.execute_polymorph(disc, options, result);
+}
+}
+// Native Draft2020-12 oneOf Evaluation Fallback
+let mut valid_count = 0;
+let mut final_successful_result = None;
+let mut failed_candidates = Vec::new();
+for child_schema in one_of {
+let derived = self.derive_for_schema(child_schema, false);
+if let Ok(sub_res) = derived.validate_scoped() {
+if sub_res.is_valid() {
+valid_count += 1;
+final_successful_result = Some(sub_res.clone());
+} else {
+failed_candidates.push(sub_res);
+}
+}
+}
+if valid_count == 1 {
+if let Some(successful_res) = final_successful_result {
+result.merge(successful_res);
+}
+return Ok(true);
+} else if valid_count == 0 {
+result.errors.push(ValidationError {
+code: "NO_ONEOF_MATCH".to_string(),
+message: "Payload matches none of the required candidate sub-schemas natively".to_string(),
+path: self.path.to_string(),
+});
+if let Some(first) = failed_candidates.first() {
let mut shared_errors = first.errors.clone();
for sub_res in failed_candidates.iter().skip(1) {
shared_errors.retain(|e1| {
@@ -281,26 +91,66 @@ impl<'a> ValidationContext<'a> {
result.errors.push(e);
}
}
}
+for sub_res in failed_candidates {
+result.evaluated_keys.extend(sub_res.evaluated_keys);
+}
+return Ok(false);
-}
-return Ok(false);
-} else {
-result.errors.push(ValidationError {
-code: "AMBIGUOUS_POLYMORPHIC_MATCH".to_string(),
-message: "Matches multiple polymorphic candidates inextricably".to_string(),
-path: self.path.to_string(),
-});
+} else {
+result.errors.push(ValidationError {
+code: "AMBIGUOUS_POLYMORPHIC_MATCH".to_string(),
+message: "Matches multiple polymorphic candidates inextricably natively".to_string(),
+path: self.path.to_string(),
+});
+return Ok(false);
+}
}
Ok(true)
}
pub(crate) fn execute_polymorph(
&self,
disc: &str,
options: &std::collections::BTreeMap<String, String>,
result: &mut ValidationResult,
) -> Result<bool, ValidationError> {
// 1. O(1) Fast-Path Router & Extractor
let instance_val = self.instance.as_object().and_then(|o| o.get(disc)).and_then(|t| t.as_str());
if let Some(val) = instance_val {
result.evaluated_keys.insert(disc.to_string());
if let Some(target_id) = options.get(val) {
if let Some(target_schema) = self.db.schemas.get(target_id) {
let derived = self.derive_for_schema(target_schema.as_ref(), false);
let sub_res = derived.validate()?;
let is_valid = sub_res.is_valid();
result.merge(sub_res);
return Ok(is_valid);
} else {
result.errors.push(ValidationError {
code: "MISSING_COMPILED_SCHEMA".to_string(),
message: format!("Polymorphic router target '{}' does not exist in the database schemas map", target_id),
path: self.path.to_string(),
});
return Ok(false);
}
} else {
result.errors.push(ValidationError {
code: if self.schema.family.is_some() { "NO_FAMILY_MATCH".to_string() } else { "NO_ONEOF_MATCH".to_string() },
message: format!("Payload provided discriminator {}='{}' which matches none of the required candidate sub-schemas", disc, val),
path: self.path.to_string(),
});
return Ok(false);
}
} else {
result.errors.push(ValidationError {
code: "MISSING_TYPE".to_string(),
message: format!("Missing '{}' discriminator. Unable to resolve polymorphic boundaries", disc),
path: self.path.to_string(),
});
return Ok(false);
}
}
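The O(1) fast path in `execute_polymorph` reduces to a two-level map lookup: payload discriminator value into the compiled options map, then options value into the schema registry. The sketch below is an assumption-laden stand-in, not the library API: the payload is modeled as a flat string map instead of JSON to stay stdlib-only, and `Route`/`route` are hypothetical names mirroring the three outcome paths above.

```rust
use std::collections::BTreeMap;

// Possible routing outcomes, mirroring the three paths in execute_polymorph.
#[derive(Debug, PartialEq)]
enum Route<'a> {
    Target(&'a str),      // discriminator resolved to a schema id
    NoMatch(String),      // discriminator present but not in the options map
    MissingDiscriminator, // payload lacks the discriminator field entirely
}

// `instance` stands in for the JSON payload as a flat field map (an assumption
// to keep the sketch self-contained).
fn route<'a>(
    disc: &str,
    options: &'a BTreeMap<String, String>,
    instance: &BTreeMap<String, String>,
) -> Route<'a> {
    match instance.get(disc) {
        None => Route::MissingDiscriminator,
        Some(val) => match options.get(val) {
            Some(target_id) => Route::Target(target_id),
            None => Route::NoMatch(val.clone()),
        },
    }
}

fn main() {
    let mut options = BTreeMap::new();
    options.insert("person".to_string(), "entity.person".to_string());
    let mut payload = BTreeMap::new();
    payload.insert("type".to_string(), "person".to_string());
    println!("{:?}", route("type", &options, &payload));
}
```

Compared with the removed candidate-scanning path, this routing never iterates the candidate set: cost is two map lookups regardless of how many variations the family has.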
pub(crate) fn validate_type_inheritance(
&self,
result: &mut ValidationResult,
@@ -323,17 +173,17 @@ impl<'a> ValidationContext<'a> {
let mut custom_types = Vec::new();
match &self.schema.type_ {
-Some(crate::database::schema::SchemaTypeOrArray::Single(t)) => {
-if !crate::database::schema::is_primitive_type(t) {
+Some(crate::database::object::SchemaTypeOrArray::Single(t)) => {
+if !crate::database::object::is_primitive_type(t) {
custom_types.push(t.clone());
}
}
-Some(crate::database::schema::SchemaTypeOrArray::Multiple(arr)) => {
+Some(crate::database::object::SchemaTypeOrArray::Multiple(arr)) => {
if arr.contains(&payload_primitive.to_string()) || (payload_primitive == "integer" && arr.contains(&"number".to_string())) {
// It natively matched a primitive in the array options, skip forcing custom proxy fallback
} else {
for t in arr {
-if !crate::database::schema::is_primitive_type(t) {
+if !crate::database::object::is_primitive_type(t) {
custom_types.push(t.clone());
}
}


@@ -1 +1 @@
-1.0.108
+1.0.110