Compare commits
21 Commits

| SHA1 |
|---|
| 8ca9017cc4 |
| 10c57e59ec |
| ef4571767c |
| 29bd25eaff |
| 4d9b510819 |
| 3c4b1066df |
| 4c59d9ba7f |
| a1038490dd |
| 14707330a7 |
| 77bc92533c |
| 4060119b01 |
| 95546fe10c |
| 882bdc6271 |
| 9bdb767685 |
| bdd89fe695 |
| 8135d80045 |
| 9255439d53 |
| 9038607729 |
| 9f6c27c3b8 |
| 75aac41362 |
| dbcef42401 |
GEMINI.md
@@ -20,9 +20,16 @@ JSPG operates by deeply integrating the JSON Schema Draft 2020-12 specification

To support high-throughput operations while allowing for runtime updates (e.g., during hot-reloading), JSPG uses an **Atomic Swap** pattern:

1. **Parser Phase**: Schema JSONs are parsed into ordered `Schema` structs.

2. **Compiler Phase**: The database iterates all parsed schemas and pre-computes native optimization maps (Descendants Map, Depths Map, Variations Map).

-3. **Immutable Validator**: The `Validator` struct immutably owns the `Database` registry and all its global maps. Schemas themselves are completely frozen; `$ref` strings are resolved dynamically at runtime using pre-computed O(1) maps.
+3. **Immutable AST Caching**: The `Validator` struct immutably owns the `Database` registry. Schemas themselves are frozen structurally, but use `OnceLock` interior mutability during the Compilation Phase to permanently cache resolved `$ref` inheritances, properties, and `compiled_edges` directly onto their AST nodes. This guarantees strict `O(1)` relationship and property validation at runtime without locking or recursive DB polling.

4. **Lock-Free Reads**: Incoming operations acquire a read lock just long enough to clone the `Arc` inside an `RwLock<Option<Arc<Validator>>>`, ensuring zero blocking during schema updates.
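The read path in item 4 can be sketched with the standard library alone. The `Validator` stand-in and its `version` field below are illustrative, not the actual JSPG types; only the `RwLock<Option<Arc<...>>>` shape comes from the text above.

```rust
use std::sync::{Arc, RwLock};

// Hypothetical stand-in for the real Validator type.
struct Validator {
    version: u64,
}

// The engine slot: readers clone the Arc; writers swap the whole Option.
static GLOBAL: RwLock<Option<Arc<Validator>>> = RwLock::new(None);

// Reader: hold the read lock only long enough to clone the Arc out.
fn current() -> Option<Arc<Validator>> {
    GLOBAL.read().unwrap().as_ref().cloned()
}

// Writer (e.g. hot reload): build the new validator off-lock, then swap.
fn swap_in(v: Validator) {
    *GLOBAL.write().unwrap() = Some(Arc::new(v));
}

fn main() {
    assert!(current().is_none());
    swap_in(Validator { version: 1 });
    let held = current().unwrap();     // a snapshot taken before a swap...
    swap_in(Validator { version: 2 }); // ...stays alive after the swap
    assert_eq!(held.version, 1);
    assert_eq!(current().unwrap().version, 2);
}
```

Because readers only clone an `Arc`, in-flight operations keep validating against the old snapshot while a reload publishes the new one.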

### Global API Reference

These functions operate on the global `GLOBAL_JSPG` engine instance and provide administrative boundaries:

* `jspg_setup(database jsonb) -> jsonb`: Initializes the engine. Deserializes the full database schema registry (types, enums, puncs, relations) from Postgres and compiles them into memory atomically.
* `jspg_teardown() -> jsonb`: Clears the current session's engine instance from `GLOBAL_JSPG`, resetting the cache.
* `jspg_schemas() -> jsonb`: Exports the fully compiled AST snapshot (including all inherited dependencies) out of `GLOBAL_JSPG` into standard JSON Schema representations.

---

## 2. Validator

@@ -30,10 +37,7 @@ To support high-throughput operations while allowing for runtime updates (e.g.,

The Validator provides strict, schema-driven evaluation for the "Punc" architecture.

### API Reference

-* `jspg_setup(database jsonb) -> jsonb`: Loads and compiles the entire registry (types, enums, puncs, relations) atomically.
* `mask_json_schema(schema_id text, instance jsonb) -> jsonb`: Validates and prunes unknown properties dynamically, returning masked data.
-* `jspg_validate(schema_id text, instance jsonb) -> jsonb`: Returns boolean-like success or structured errors.
-* `jspg_teardown() -> jsonb`: Clears the current session's schema cache.
+* `jspg_validate(schema_id text, instance jsonb) -> jsonb`: Validates the `instance` JSON payload strictly against the constraints of the registered `schema_id`. Returns boolean-like success or structured error codes.

### Custom Features & Deviations

JSPG implements specific extensions to the Draft 2020-12 standard to support the Punc architecture's object-oriented needs while heavily optimizing for zero runtime lookups.
@@ -69,11 +73,14 @@ To simplify frontend form validation, format validators specifically for `uuid`,

## 3. Merger

-The Merger provides an automated, high-performance graph synchronization engine via the `jspg_merge(cue JSONB)` API. It orchestrates the complex mapping of nested JSON objects into normalized Postgres relational tables, honoring all inheritance and graph constraints.
+The Merger provides an automated, high-performance graph synchronization engine. It orchestrates the complex mapping of nested JSON objects into normalized Postgres relational tables, honoring all inheritance and graph constraints.

### API Reference

* `jspg_merge(schema_id text, data jsonb) -> jsonb`: Traverses the provided JSON payload according to the compiled relational map of `schema_id`. Dynamically builds and executes relational SQL UPSERT paths natively.

### Core Features

-* **Caching Strategy**: The Merger leverages the `Validator`'s in-memory `Database` registry to instantly resolve Foreign Key mapping graphs. It additionally utilizes the concurrent `GLOBAL_JSPG` application memory (`DashMap`) to cache statically constructed SQL `SELECT` strings used during deduplication (`lk_`) and difference tracking calculations.
+* **Caching Strategy**: The Merger leverages the native `compiled_edges` permanently cached onto the Schema AST via `OnceLock` to instantly resolve Foreign Key mapping graphs in `O(1)` time. It additionally uses the concurrent `GLOBAL_JSPG` application memory (`DashMap`) to cache statically constructed SQL `SELECT` strings used during deduplication (`lk_`) and difference tracking calculations.
* **Deep Graph Merging**: The Merger walks arbitrary levels of deeply nested JSON schemas (e.g. tracking an `order`, its `customer`, and an array of its `lines`). It discovers the correct parent-to-child or child-to-parent Foreign Keys stored in the registry and automatically maps the UUIDs across the relationships during UPSERT.
* **Prefix Foreign Key Matching**: Handles scenarios where multiple relations point to the same table by using database Foreign Key constraint prefixes (`fk_`). For example, if a schema has `shipping_address` and `billing_address`, the merger resolves against `fk_shipping_address_entity` vs `fk_billing_address_entity` automatically to correctly route object properties.
* **Dynamic Deduplication & Lookups**: If a nested object is provided without an `id`, the Merger uses Postgres `lk_` index constraints defined in the schema registry (e.g. `lk_person` mapped to `first_name` and `last_name`). It dynamically queries these unique matching constraints to discover the correct UUID to perform an UPDATE, preventing data duplication.
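The prefix-matching step above can be sketched as follows. `Relation` here is a simplified stand-in for the registry's real struct, and the fall-back-to-first behavior is an assumption for illustration, not the Merger's actual tie-breaking:

```rust
// Hypothetical, simplified Relation record from the schema registry.
#[derive(Clone)]
struct Relation {
    constraint: String,     // e.g. "fk_billing_address_entity"
    prefix: Option<String>, // e.g. "billing_address"
}

// Pick the relation whose prefix matches the property being routed,
// mirroring the ambiguity-reduction described in the bullet above.
fn pick<'a>(candidates: &'a [Relation], prop_name: &str) -> &'a Relation {
    candidates
        .iter()
        .find(|r| r.prefix.as_deref() == Some(prop_name))
        .unwrap_or(&candidates[0]) // illustrative fallback only
}

fn main() {
    let rels = vec![
        Relation {
            constraint: "fk_shipping_address_entity".into(),
            prefix: Some("shipping_address".into()),
        },
        Relation {
            constraint: "fk_billing_address_entity".into(),
            prefix: Some("billing_address".into()),
        },
    ];
    assert_eq!(pick(&rels, "billing_address").constraint, "fk_billing_address_entity");
    assert_eq!(pick(&rels, "shipping_address").constraint, "fk_shipping_address_entity");
}
```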
@@ -91,7 +98,10 @@ The Merger provides an automated, high-performance graph synchronization engine

## 4. Queryer

-The Queryer transforms Postgres into a pre-compiled Semantic Query Engine via the `jspg_query(schema_id text, cue jsonb)` API, designed to serve the exact shape of Punc responses directly via SQL.
+The Queryer transforms Postgres into a pre-compiled Semantic Query Engine, designed to serve the exact shape of Punc responses directly via SQL.

### API Reference

* `jspg_query(schema_id text, filters jsonb) -> jsonb`: Compiles the JSON Schema AST of `schema_id` directly into pre-planned, nested multi-JOIN SQL execution trees. Processes `filters` structurally.
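As a rough sketch of how structural filters like `{"last_name": {"$eq": "%Doe%"}}` might lower into SQL predicates. The `$eq`-with-wildcards-to-`LIKE` mapping and the helper names are assumptions for illustration, not the actual Queryer implementation, which emits pre-planned JOIN trees:

```rust
use std::collections::BTreeMap;

// Hypothetical mini-lowering of a filter operator into a SQL fragment.
// Values are assumed to arrive pre-quoted/escaped upstream.
fn predicate(column: &str, op: &str, value: &str) -> Option<String> {
    match op {
        "$eq" => Some(format!("{} LIKE '{}'", column, value)),
        "$ne" => Some(format!("{} NOT LIKE '{}'", column, value)),
        _ => None, // unknown operators are skipped in this sketch
    }
}

// Combine all filter entries into one WHERE clause body.
fn compile_where(filters: &BTreeMap<&str, Vec<(&str, &str)>>) -> String {
    let mut parts = Vec::new();
    for (col, ops) in filters {
        for (op, val) in ops {
            if let Some(p) = predicate(col, op, val) {
                parts.push(p);
            }
        }
    }
    parts.join(" AND ")
}

fn main() {
    let mut f = BTreeMap::new();
    f.insert("last_name", vec![("$eq", "%Doe%"), ("$ne", "%Smith%")]);
    assert_eq!(
        compile_where(&f),
        "last_name LIKE '%Doe%' AND last_name NOT LIKE '%Smith%'"
    );
}
```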

### Core Features
@@ -359,6 +359,15 @@
        },
        "customer_id": {
          "type": "string"
        },
+        "customer": {
+          "$ref": "person"
+        },
+        "lines": {
+          "type": "array",
+          "items": {
+            "$ref": "order_line"
+          }
+        }
      }
    }
@@ -719,6 +728,24 @@
    {
      "name": "attachment",
      "schemas": [
+        {
+          "$id": "type_metadata",
+          "type": "object",
+          "properties": {
+            "type": {
+              "type": "string"
+            }
+          }
+        },
+        {
+          "$id": "other_metadata",
+          "type": "object",
+          "properties": {
+            "other": {
+              "type": "string"
+            }
+          }
+        },
        {
          "$id": "attachment",
          "$ref": "entity",
@@ -729,9 +756,11 @@
              "type": "string"
            }
          },
-          "metadata": {
-            "type": "object",
-            "additionalProperties": true
+          "type_metadata": {
+            "$ref": "type_metadata"
          },
+          "other_metadata": {
+            "$ref": "other_metadata"
+          }
        }
      }
@@ -744,7 +773,8 @@
        "id",
        "type",
        "flags",
-        "metadata",
+        "type_metadata",
+        "other_metadata",
        "created_at",
        "created_by",
        "modified_at",
@@ -756,7 +786,8 @@
        "id",
        "type",
        "flags",
-        "metadata"
+        "type_metadata",
+        "other_metadata"
      ],
      "entity": [
        "id",
@@ -772,7 +803,8 @@
        "id": "uuid",
        "type": "text",
        "flags": "_text",
-        "metadata": "jsonb",
+        "type_metadata": "jsonb",
+        "other_metadata": "jsonb",
        "created_at": "timestamptz",
        "created_by": "uuid",
        "modified_at": "timestamptz",
@@ -806,6 +838,7 @@
          "contact_id": "old-contact"
        }
      ],
      "schema_id": "person",
      "expect": {
        "success": true,
        "sql": [
@@ -929,6 +962,7 @@
          "contact_id": "old-contact"
        }
      ],
      "schema_id": "person",
      "expect": {
        "success": true,
        "sql": [
@@ -1022,6 +1056,7 @@
          "last_name": "OldLast"
        }
      ],
      "schema_id": "person",
      "expect": {
        "success": true,
        "sql": [
@@ -1111,6 +1146,7 @@
        "date_of_birth": "1990-01-01T00:00:00Z",
        "pronouns": ""
      },
      "schema_id": "person",
      "expect": {
        "success": true,
        "sql": [
@@ -1243,6 +1279,7 @@
          "date_of_birth": "2000-01-01"
        }
      },
      "schema_id": "order",
      "expect": {
        "success": true,
        "sql": [
@@ -1439,6 +1476,7 @@
          }
        ]
      },
      "schema_id": "order",
      "expect": {
        "success": true,
        "sql": [
@@ -1634,6 +1672,7 @@
          }
        ]
      },
      "schema_id": "person",
      "expect": {
        "success": true,
        "sql": [
@@ -2197,6 +2236,7 @@
          "archived": false
        }
      ],
      "schema_id": "person",
      "expect": {
        "success": true,
        "sql": [
@@ -2260,7 +2300,7 @@
        }
      },
    {
-      "description": "Insert attachment displaying side-by-side array literal and jsonb formatting translations",
+      "description": "Attachment with text[] and jsonb metadata structures",
      "action": "merge",
      "data": {
        "type": "attachment",
@@ -2268,11 +2308,14 @@
          "urgent",
          "reviewed"
        ],
-        "metadata": {
-          "size": 1024,
-          "source": "upload"
+        "other_metadata": {
+          "other": "hello"
        },
+        "type_metadata": {
+          "type": "type_metadata"
+        }
      },
      "schema_id": "attachment",
      "expect": {
        "success": true,
        "sql": [
@@ -2298,14 +2341,16 @@
          "INSERT INTO agreego.\"attachment\" (",
          " \"flags\",",
          " \"id\",",
-          " \"metadata\",",
-          " \"type\"",
+          " \"other_metadata\",",
+          " \"type\",",
+          " \"type_metadata\"",
          ")",
          "VALUES (",
          " '{\"urgent\",\"reviewed\"}',",
          " '{{uuid:attachment_id}}',",
-          " '{\"size\":1024,\"source\":\"upload\"}',",
-          " 'attachment'"
+          " '{\"other\":\"hello\"}',",
+          " 'attachment',",
+          " '{\"type\":\"type_metadata\"}'",
          ")"
        ],
        [
@@ -2322,8 +2367,9 @@
          " NULL,",
          " '{",
          " \"flags\":[\"urgent\",\"reviewed\"],",
-          " \"metadata\":{\"size\":1024,\"source\":\"upload\"},",
-          " \"type\":\"attachment\"",
+          " \"other_metadata\":{\"other\":\"hello\"},",
+          " \"type\":\"attachment\",",
+          " \"type_metadata\":{\"type\":\"type_metadata\"}",
          " }',",
          " '{{uuid:attachment_id}}',",
          " '{{uuid}}',",
@@ -2339,15 +2385,226 @@
          " \"created_by\":\"00000000-0000-0000-0000-000000000000\",",
          " \"flags\":[\"urgent\",\"reviewed\"],",
          " \"id\":\"{{uuid:attachment_id}}\",",
-          " \"metadata\":{\"size\":1024,\"source\":\"upload\"},",
          " \"modified_at\":\"{{timestamp}}\",",
          " \"modified_by\":\"00000000-0000-0000-0000-000000000000\",",
-          " \"type\":\"attachment\"",
+          " \"other_metadata\":{\"other\":\"hello\"},",
+          " \"type\":\"attachment\",",
+          " \"type_metadata\":{\"type\":\"type_metadata\"}",
          " },",
          " \"new\":{",
          " \"flags\":[\"urgent\",\"reviewed\"],",
-          " \"metadata\":{\"size\":1024,\"source\":\"upload\"},",
-          " \"type\":\"attachment\"",
+          " \"other_metadata\":{\"other\":\"hello\"},",
+          " \"type\":\"attachment\",",
+          " \"type_metadata\":{\"type\":\"type_metadata\"}",
          " }",
          " }')"
        ]
      ]
    }
  },
    {
      "description": "Anchor order and insert new line (no line id)",
      "action": "merge",
      "data": {
        "id": "abc",
        "type": "order",
        "lines": [
          {
            "type": "order_line",
            "product": "Widget",
            "price": 99.0
          }
        ]
      },
      "schema_id": "order",
      "expect": {
        "success": true,
        "sql": [
          [
            "INSERT INTO agreego.\"entity\" (",
            " \"created_at\",",
            " \"created_by\",",
            " \"id\",",
            " \"modified_at\",",
            " \"modified_by\",",
            " \"type\"",
            ")",
            "VALUES (",
            " '{{timestamp}}',",
            " '00000000-0000-0000-0000-000000000000',",
            " '{{uuid:line_id}}',",
            " '{{timestamp}}',",
            " '00000000-0000-0000-0000-000000000000',",
            " 'order_line'",
            ")"
          ],
          [
            "INSERT INTO agreego.\"order_line\" (",
            " \"id\",",
            " \"order_id\",",
            " \"price\",",
            " \"product\",",
            " \"type\"",
            ")",
            "VALUES (",
            " '{{uuid:line_id}}',",
            " 'abc',",
            " 99,",
            " 'Widget',",
            " 'order_line'",
            ")"
          ],
          [
            "INSERT INTO agreego.change (",
            " \"old\",",
            " \"new\",",
            " entity_id,",
            " id,",
            " kind,",
            " modified_at,",
            " modified_by",
            ")",
            "VALUES (",
            " NULL,",
            " '{",
            " \"order_id\":\"abc\",",
            " \"price\":99.0,",
            " \"product\":\"Widget\",",
            " \"type\":\"order_line\"",
            " }',",
            " '{{uuid:line_id}}',",
            " '{{uuid}}',",
            " 'create',",
            " '{{timestamp}}',",
            " '00000000-0000-0000-0000-000000000000'",
            ")"
          ],
          [
            "SELECT pg_notify('entity', '{",
            " \"complete\":{",
            " \"created_at\":\"{{timestamp}}\",",
            " \"created_by\":\"00000000-0000-0000-0000-000000000000\",",
            " \"id\":\"{{uuid:line_id}}\",",
            " \"modified_at\":\"{{timestamp}}\",",
            " \"modified_by\":\"00000000-0000-0000-0000-000000000000\",",
            " \"order_id\":\"abc\",",
            " \"price\":99.0,",
            " \"product\":\"Widget\",",
            " \"type\":\"order_line\"",
            " },",
            " \"new\":{",
            " \"order_id\":\"abc\",",
            " \"price\":99.0,",
            " \"product\":\"Widget\",",
            " \"type\":\"order_line\"",
            " }",
            " }')"
          ]
        ]
      }
    },
    {
      "description": "Anchor order and insert new line (with line id)",
      "action": "merge",
      "data": {
        "id": "abc",
        "type": "order",
        "lines": [
          {
            "id": "11111111-2222-3333-4444-555555555555",
            "type": "order_line",
            "product": "Widget",
            "price": 99.0
          }
        ]
      },
      "schema_id": "order",
      "expect": {
        "success": true,
        "sql": [
          [
            "SELECT to_jsonb(t1.*) || to_jsonb(t2.*)",
            "FROM agreego.\"order_line\" t1",
            "LEFT JOIN agreego.\"entity\" t2 ON t2.id = t1.id",
            "WHERE t1.id = '11111111-2222-3333-4444-555555555555'"
          ],
          [
            "INSERT INTO agreego.\"entity\" (",
            " \"created_at\",",
            " \"created_by\",",
            " \"id\",",
            " \"modified_at\",",
            " \"modified_by\",",
            " \"type\"",
            ")",
            "VALUES (",
            " '{{timestamp}}',",
            " '00000000-0000-0000-0000-000000000000',",
            " '11111111-2222-3333-4444-555555555555',",
            " '{{timestamp}}',",
            " '00000000-0000-0000-0000-000000000000',",
            " 'order_line'",
            ")"
          ],
          [
            "INSERT INTO agreego.\"order_line\" (",
            " \"id\",",
            " \"order_id\",",
            " \"price\",",
            " \"product\",",
            " \"type\"",
            ")",
            "VALUES (",
            " '11111111-2222-3333-4444-555555555555',",
            " 'abc',",
            " 99,",
            " 'Widget',",
            " 'order_line'",
            ")"
          ],
          [
            "INSERT INTO agreego.change (",
            " \"old\",",
            " \"new\",",
            " entity_id,",
            " id,",
            " kind,",
            " modified_at,",
            " modified_by",
            ")",
            "VALUES (",
            " NULL,",
            " '{",
            " \"order_id\":\"abc\",",
            " \"price\":99.0,",
            " \"product\":\"Widget\",",
            " \"type\":\"order_line\"",
            " }',",
            " '11111111-2222-3333-4444-555555555555',",
            " '{{uuid}}',",
            " 'create',",
            " '{{timestamp}}',",
            " '00000000-0000-0000-0000-000000000000'",
            ")"
          ],
          [
            "SELECT pg_notify('entity', '{",
            " \"complete\":{",
            " \"created_at\":\"{{timestamp}}\",",
            " \"created_by\":\"00000000-0000-0000-0000-000000000000\",",
            " \"id\":\"11111111-2222-3333-4444-555555555555\",",
            " \"modified_at\":\"{{timestamp}}\",",
            " \"modified_by\":\"00000000-0000-0000-0000-000000000000\",",
            " \"order_id\":\"abc\",",
            " \"price\":99.0,",
            " \"product\":\"Widget\",",
            " \"type\":\"order_line\"",
            " },",
            " \"new\":{",
            " \"order_id\":\"abc\",",
            " \"price\":99.0,",
            " \"product\":\"Widget\",",
            " \"type\":\"order_line\"",
            " }",
            " }')"
          ]
@@ -1163,7 +1163,7 @@
        "$eq": true,
        "$ne": false
      },
-      "contacts.#.is_primary": {
+      "contacts/is_primary": {
        "$eq": true
      },
      "created_at": {
@@ -1203,7 +1203,7 @@
        "$eq": "%Doe%",
        "$ne": "%Smith%"
      },
-      "phone_numbers.#.target.number": {
+      "phone_numbers/target/number": {
        "$eq": "555-1234"
      }
    },
@@ -1408,6 +1408,44 @@
        ]
      }
    },
    {
      "description": "Person ad-hoc email addresses select",
      "action": "query",
      "schema_id": "full.person/email_addresses",
      "expect": {
        "success": true,
        "sql": [
          [
            "(SELECT jsonb_build_object(",
            " 'archived', entity_3.archived,",
            " 'created_at', entity_3.created_at,",
            " 'id', entity_3.id,",
            " 'is_primary', contact_1.is_primary,",
            " 'name', entity_3.name,",
            " 'target',",
            " (SELECT jsonb_build_object(",
            " 'address', email_address_4.address,",
            " 'archived', entity_5.archived,",
            " 'created_at', entity_5.created_at,",
            " 'id', entity_5.id,",
            " 'name', entity_5.name,",
            " 'type', entity_5.type",
            " )",
            " FROM agreego.email_address email_address_4",
            " JOIN agreego.entity entity_5 ON entity_5.id = email_address_4.id",
            " WHERE",
            " NOT entity_5.archived",
            " AND relationship_2.target_id = entity_5.id),",
            " 'type', entity_3.type",
            ")",
            "FROM agreego.contact contact_1",
            "JOIN agreego.relationship relationship_2 ON relationship_2.id = contact_1.id",
            "JOIN agreego.entity entity_3 ON entity_3.id = relationship_2.id",
            "WHERE NOT entity_3.archived)"
          ]
        ]
      }
    },
    {
      "description": "Order select with customer and lines",
      "action": "query",
7  src/database/edge.rs  Normal file

@@ -0,0 +1,7 @@
use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct Edge {
    pub constraint: String,
    pub forward: bool,
}
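A minimal illustration of how an `Edge` might be consumed. The serde derives are dropped here so the sketch stays dependency-free, and the directional reading of `forward` (true = follow the constraint source to destination, false = walk it in reverse) is an assumption inferred from the relation-matching code, not a documented contract:

```rust
// Local stand-in mirroring src/database/edge.rs minus the serde derives.
#[derive(Debug, Clone, Default, PartialEq)]
struct Edge {
    constraint: String, // FK constraint name, e.g. "fk_customer_person"
    forward: bool,      // assumed: true = source -> destination traversal
}

fn main() {
    let e = Edge { constraint: "fk_customer_person".into(), forward: true };
    assert!(e.forward);

    // Default derives give an empty constraint and a reverse (false) edge.
    assert_eq!(
        Edge::default(),
        Edge { constraint: String::new(), forward: false }
    );
}
```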
@@ -1,3 +1,4 @@
+pub mod edge;
pub mod r#enum;
pub mod executors;
pub mod formats;
@@ -29,7 +30,7 @@ pub struct Database {
    pub enums: HashMap<String, Enum>,
    pub types: HashMap<String, Type>,
    pub puncs: HashMap<String, Punc>,
-    pub relations: Vec<Relation>,
+    pub relations: HashMap<String, Relation>,
    pub schemas: HashMap<String, Schema>,
    pub descendants: HashMap<String, Vec<String>>,
    pub depths: HashMap<String, usize>,
@@ -41,7 +42,7 @@ impl Database {
        let mut db = Self {
            enums: HashMap::new(),
            types: HashMap::new(),
-            relations: Vec::new(),
+            relations: HashMap::new(),
            puncs: HashMap::new(),
            schemas: HashMap::new(),
            descendants: HashMap::new(),
@@ -75,10 +76,21 @@ impl Database {
                    if db.types.contains_key(&def.source_type)
                        && db.types.contains_key(&def.destination_type)
                    {
-                        db.relations.push(def);
+                        db.relations.insert(def.constraint.clone(), def);
                    }
                }
-                Err(e) => println!("DATABASE RELATION PARSE FAILED: {:?}", e),
+                Err(e) => {
+                    return Err(crate::drop::Drop::with_errors(vec![crate::drop::Error {
+                        code: "DATABASE_RELATION_PARSE_FAILED".to_string(),
+                        message: format!("Failed to parse database relation: {}", e),
+                        details: crate::drop::ErrorDetails {
+                            path: "".to_string(),
+                            cause: None,
+                            context: None,
+                            schema: None,
+                        },
+                    }]));
+                }
            }
        }
    }
}
@@ -136,37 +148,67 @@ impl Database {
    }

    pub fn compile(&mut self) -> Result<(), crate::drop::Drop> {
-        self.collect_schemas();
-        let mut harvested = Vec::new();
-        for schema in self.schemas.values_mut() {
-            if let Err(msg) = schema.collect_schemas(None, &mut harvested) {
-                return Err(crate::drop::Drop::with_errors(vec![crate::drop::Error {
-                    code: "SCHEMA_VALIDATION_FAILED".to_string(),
-                    message: msg,
-                    details: crate::drop::ErrorDetails { path: "".to_string(), cause: None, context: None, schema: None },
-                }]));
-            }
-        }
-        self.schemas.extend(harvested);
+        if let Err(msg) = self.collect_schemas() {
+            return Err(crate::drop::Drop::with_errors(vec![crate::drop::Error {
+                code: "SCHEMA_VALIDATION_FAILED".to_string(),
+                message: msg,
+                details: crate::drop::ErrorDetails {
+                    path: "".to_string(),
+                    cause: None,
+                    context: None,
+                    schema: None,
+                },
+            }]));
+        }
        self.collect_depths();
        self.collect_descendants();
-        self.compile_schemas();
+
+        // Mathematically evaluate all property inheritances, formats, schemas, and foreign key edges topologically over OnceLocks
+        let mut visited = std::collections::HashSet::new();
+        for schema in self.schemas.values() {
+            schema.compile(self, &mut visited);
+        }

        Ok(())
    }

-    fn collect_schemas(&mut self) {
+    fn collect_schemas(&mut self) -> Result<(), String> {
        let mut to_insert = Vec::new();

+        // Pass 1: Extract all Schemas structurally off top level definitions into the master registry.
+        // Validate every node recursively via string filters natively!
        for type_def in self.types.values() {
            for mut schema in type_def.schemas.clone() {
-                schema.harvest(&mut to_insert);
+                schema.collect_schemas(None, &mut to_insert)?;
            }
        }
        for punc_def in self.puncs.values() {
            for mut schema in punc_def.schemas.clone() {
-                schema.harvest(&mut to_insert);
+                schema.collect_schemas(None, &mut to_insert)?;
            }
        }
        for enum_def in self.enums.values() {
            for mut schema in enum_def.schemas.clone() {
-                schema.harvest(&mut to_insert);
+                schema.collect_schemas(None, &mut to_insert)?;
            }
        }

        for (id, schema) in to_insert {
            self.schemas.insert(id, schema);
        }
+        Ok(())
    }

    fn collect_depths(&mut self) {
@@ -222,83 +264,10 @@ impl Database {
        self.descendants = descendants;
    }

-    pub fn get_relation(
-        &self,
-        parent_type: &str,
-        child_type: &str,
-        prop_name: &str,
-        relative_keys: Option<&Vec<String>>,
-    ) -> Option<(&Relation, bool)> {
-        if parent_type == "entity" && child_type == "entity" {
-            return None; // Ignore entity <-> entity generic fallbacks, they aren't useful edges
-        }
-
-        let p_def = self.types.get(parent_type)?;
-        let c_def = self.types.get(child_type)?;
-
-        let mut matching_rels = Vec::new();
-        let mut directions = Vec::new();
-
-        for rel in &self.relations {
-            let is_forward = p_def.hierarchy.contains(&rel.source_type)
-                && c_def.hierarchy.contains(&rel.destination_type);
-            let is_reverse = p_def.hierarchy.contains(&rel.destination_type)
-                && c_def.hierarchy.contains(&rel.source_type);
-
-            if is_forward {
-                matching_rels.push(rel);
-                directions.push(true);
-            } else if is_reverse {
-                matching_rels.push(rel);
-                directions.push(false);
-            }
-        }
-
-        if matching_rels.is_empty() {
-            return None;
-        }
-
-        if matching_rels.len() == 1 {
-            return Some((matching_rels[0], directions[0]));
-        }
-
-        let mut chosen_idx = 0;
-        let mut resolved = false;
-
-        // Reduce ambiguity with prefix
-        for (i, rel) in matching_rels.iter().enumerate() {
-            if let Some(prefix) = &rel.prefix {
-                if prefix == prop_name {
-                    chosen_idx = i;
-                    resolved = true;
-                    break;
-                }
-            }
-        }
-
-        // Reduce ambiguity by checking if relative payload OMITS the prefix (M:M heuristic)
-        if !resolved && relative_keys.is_some() {
-            let keys = relative_keys.unwrap();
-            let mut missing_prefix_ids = Vec::new();
-            for (i, rel) in matching_rels.iter().enumerate() {
-                if let Some(prefix) = &rel.prefix {
-                    if !keys.contains(prefix) {
-                        missing_prefix_ids.push(i);
-                    }
-                }
-            }
-            if missing_prefix_ids.len() == 1 {
-                chosen_idx = missing_prefix_ids[0];
-            }
-        }
-
-        Some((matching_rels[chosen_idx], directions[chosen_idx]))
-    }
-
    fn collect_descendants_recursively(
        target: &str,
-        direct_refs: &HashMap<String, Vec<String>>,
-        descendants: &mut HashSet<String>,
+        direct_refs: &std::collections::HashMap<String, Vec<String>>,
+        descendants: &mut std::collections::HashSet<String>,
    ) {
        if let Some(children) = direct_refs.get(target) {
            for child in children {
@@ -308,14 +277,4 @@ impl Database {
            }
        }
    }
-
-    fn compile_schemas(&mut self) {
-        // Pass 3: compile_internals across pure structure
-        let schema_ids: Vec<String> = self.schemas.keys().cloned().collect();
-        for id in schema_ids {
-            if let Some(schema) = self.schemas.get_mut(&id) {
-                schema.compile_internals();
-            }
-        }
-    }
}
@@ -2,6 +2,26 @@ use serde::{Deserialize, Serialize};
use serde_json::Value;
+use std::collections::BTreeMap;
use std::sync::Arc;
+use std::sync::OnceLock;

+pub fn serialize_once_lock<T: serde::Serialize, S: serde::Serializer>(
+    lock: &OnceLock<T>,
+    serializer: S,
+) -> Result<S::Ok, S::Error> {
+    if let Some(val) = lock.get() {
+        val.serialize(serializer)
+    } else {
+        serializer.serialize_none()
+    }
+}
+
+pub fn is_once_lock_map_empty<K, V>(lock: &OnceLock<std::collections::BTreeMap<K, V>>) -> bool {
+    lock.get().map_or(true, |m| m.is_empty())
+}
+
+pub fn is_once_lock_vec_empty<T>(lock: &OnceLock<Vec<T>>) -> bool {
+    lock.get().map_or(true, |v| v.is_empty())
+}
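The `OnceLock` semantics these helpers rely on can be demonstrated with the standard library alone: an unset lock reads as empty, and the first successful `set` is permanent.

```rust
use std::collections::BTreeMap;
use std::sync::OnceLock;

// Mirrors is_once_lock_map_empty above: an unset lock counts as empty.
fn is_empty<K, V>(lock: &OnceLock<BTreeMap<K, V>>) -> bool {
    lock.get().map_or(true, |m| m.is_empty())
}

fn main() {
    let lock: OnceLock<BTreeMap<String, u32>> = OnceLock::new();
    assert!(is_empty(&lock)); // unset -> treated as empty

    let mut m = BTreeMap::new();
    m.insert("a".to_string(), 1);
    assert!(lock.set(m).is_ok()); // first set wins
    assert!(!is_empty(&lock));

    // A second set is rejected: the cached value is permanent.
    assert!(lock.set(BTreeMap::new()).is_err());
}
```

This write-once behavior is what lets the compiled caches live on otherwise frozen AST nodes without any locking on the read path.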

// Schema mirrors the Go Punc Generator's schema struct for consistency.
// It is an order-preserving representation of a JSON Schema.
@@ -167,12 +187,27 @@ pub struct SchemaObject {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub extensible: Option<bool>,

+    #[serde(rename = "compiledProperties")]
+    #[serde(skip_deserializing)]
+    #[serde(skip_serializing_if = "crate::database::schema::is_once_lock_vec_empty")]
+    #[serde(serialize_with = "crate::database::schema::serialize_once_lock")]
+    pub compiled_property_names: OnceLock<Vec<String>>,

    #[serde(skip)]
-    pub compiled_format: Option<CompiledFormat>,
+    pub compiled_properties: OnceLock<BTreeMap<String, Arc<Schema>>>,

+    #[serde(rename = "compiledEdges")]
+    #[serde(skip_deserializing)]
+    #[serde(skip_serializing_if = "crate::database::schema::is_once_lock_map_empty")]
+    #[serde(serialize_with = "crate::database::schema::serialize_once_lock")]
+    pub compiled_edges: OnceLock<BTreeMap<String, crate::database::edge::Edge>>,

    #[serde(skip)]
-    pub compiled_pattern: Option<CompiledRegex>,
+    pub compiled_format: OnceLock<CompiledFormat>,
    #[serde(skip)]
-    pub compiled_pattern_properties: Option<Vec<(CompiledRegex, Arc<Schema>)>>,
+    pub compiled_pattern: OnceLock<CompiledRegex>,
    #[serde(skip)]
+    pub compiled_pattern_properties: OnceLock<Vec<(CompiledRegex, Arc<Schema>)>>,
}

/// Represents a compiled format validator
@@ -216,19 +251,37 @@ impl std::ops::DerefMut for Schema {
}

impl Schema {
-    pub fn compile_internals(&mut self) {
-        self.map_children(|child| child.compile_internals());
-
-        if let Some(format_str) = &self.obj.format
-            && let Some(fmt) = crate::database::formats::FORMATS.get(format_str.as_str())
-        {
-            self.obj.compiled_format = Some(crate::database::schema::CompiledFormat::Func(fmt.func));
+    pub fn compile(
+        &self,
+        db: &crate::database::Database,
+        visited: &mut std::collections::HashSet<String>,
+    ) {
+        if self.obj.compiled_properties.get().is_some() {
+            return;
        }

-        if let Some(pattern_str) = &self.obj.pattern
-            && let Ok(re) = regex::Regex::new(pattern_str)
-        {
-            self.obj.compiled_pattern = Some(crate::database::schema::CompiledRegex(re));
+        if let Some(id) = &self.obj.id {
+            if !visited.insert(id.clone()) {
+                return; // Break cyclical resolution
+            }
        }

+        if let Some(format_str) = &self.obj.format {
+            if let Some(fmt) = crate::database::formats::FORMATS.get(format_str.as_str()) {
+                let _ = self
+                    .obj
+                    .compiled_format
+                    .set(crate::database::schema::CompiledFormat::Func(fmt.func));
+            }
+        }

+        if let Some(pattern_str) = &self.obj.pattern {
+            if let Ok(re) = regex::Regex::new(pattern_str) {
+                let _ = self
+                    .obj
+                    .compiled_pattern
+                    .set(crate::database::schema::CompiledRegex(re));
+            }
+        }

        if let Some(pattern_props) = &self.obj.pattern_properties {
@@ -239,73 +292,354 @@ impl Schema {
                }
            }
            if !compiled.is_empty() {
-                self.obj.compiled_pattern_properties = Some(compiled);
+                let _ = self.obj.compiled_pattern_properties.set(compiled);
            }
        }

+        let mut props = std::collections::BTreeMap::new();

+        // 1. Resolve INHERITANCE dependencies first
+        if let Some(ref_id) = &self.obj.r#ref {
+            if let Some(parent) = db.schemas.get(ref_id) {
+                parent.compile(db, visited);
+                if let Some(p_props) = parent.obj.compiled_properties.get() {
+                    props.extend(p_props.clone());
+                }
+            }
+        }

-    pub fn harvest(&mut self, to_insert: &mut Vec<(String, Schema)>) {
        if let Some(all_of) = &self.obj.all_of {
            for ao in all_of {
+                ao.compile(db, visited);
+                if let Some(ao_props) = ao.obj.compiled_properties.get() {
+                    props.extend(ao_props.clone());
+                }
            }
        }

+        if let Some(then_schema) = &self.obj.then_ {
+            then_schema.compile(db, visited);
+            if let Some(t_props) = then_schema.obj.compiled_properties.get() {
+                props.extend(t_props.clone());
+            }
+        }

+        if let Some(else_schema) = &self.obj.else_ {
+            else_schema.compile(db, visited);
+            if let Some(e_props) = else_schema.obj.compiled_properties.get() {
+                props.extend(e_props.clone());
+            }
+        }

+        // 2. Add local properties
+        if let Some(local_props) = &self.obj.properties {
+            for (k, v) in local_props {
+                props.insert(k.clone(), v.clone());
+            }
+        }

+        // 3. Set the OnceLock!
+        let _ = self.obj.compiled_properties.set(props.clone());
+        let mut names: Vec<String> = props.keys().cloned().collect();
+        names.sort();
+        let _ = self.obj.compiled_property_names.set(names);

+        // 4. Compute Edges natively
+        let schema_edges = self.compile_edges(db, visited, &props);
+        let _ = self.obj.compiled_edges.set(schema_edges);

+        // 5. Build our inline children properties recursively NOW! (Depth-first search)
+        if let Some(local_props) = &self.obj.properties {
+            for child in local_props.values() {
+                child.compile(db, visited);
+            }
+        }
        if let Some(items) = &self.obj.items {
|
||||
items.compile(db, visited);
|
||||
}
|
||||
if let Some(pattern_props) = &self.obj.pattern_properties {
|
||||
for child in pattern_props.values() {
|
||||
child.compile(db, visited);
|
||||
}
|
||||
}
|
||||
if let Some(additional_props) = &self.obj.additional_properties {
|
||||
additional_props.compile(db, visited);
|
||||
}
|
||||
if let Some(one_of) = &self.obj.one_of {
|
||||
for child in one_of {
|
||||
child.compile(db, visited);
|
||||
}
|
||||
}
|
||||
if let Some(arr) = &self.obj.prefix_items {
|
||||
for child in arr {
|
||||
child.compile(db, visited);
|
||||
}
|
||||
}
|
||||
if let Some(child) = &self.obj.not {
|
||||
child.compile(db, visited);
|
||||
}
|
||||
if let Some(child) = &self.obj.contains {
|
||||
child.compile(db, visited);
|
||||
}
|
||||
if let Some(child) = &self.obj.property_names {
|
||||
child.compile(db, visited);
|
||||
}
|
||||
if let Some(child) = &self.obj.if_ {
|
||||
child.compile(db, visited);
|
||||
}
|
||||
|
||||
if let Some(id) = &self.obj.id {
|
||||
visited.remove(id);
|
||||
}
|
||||
}
|
||||
|
||||
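The `compile` pass above writes its results into `OnceLock` cells, so a schema that is shared immutably can still be compiled exactly once and every later call becomes a cheap no-op. A minimal standalone sketch of that pattern (the `Node` struct and the uppercase "compilation" are illustrative stand-ins, not the crate's actual types):

```rust
use std::sync::OnceLock;

// Hypothetical node mirroring the compile-once idea: the struct is shared
// immutably, but each OnceLock cell can be written exactly once.
struct Node {
    pattern: Option<String>,
    compiled_pattern: OnceLock<String>, // stand-in for a compiled regex
}

impl Node {
    fn compile(&self) {
        // Idempotence guard: a second call returns immediately.
        if self.compiled_pattern.get().is_some() {
            return;
        }
        if let Some(p) = &self.pattern {
            // `set` returns Err if another caller won the race; either way the
            // cell ends up holding a single immutable value.
            let _ = self.compiled_pattern.set(p.to_uppercase());
        }
    }
}

fn main() {
    let n = Node { pattern: Some("abc".into()), compiled_pattern: OnceLock::new() };
    n.compile();
    n.compile(); // no-op: the cached value is reused
    assert_eq!(n.compiled_pattern.get().map(|s| s.as_str()), Some("ABC"));
}
```

Because `OnceLock::set` takes `&self`, the caches can be filled through a shared reference, which is what lets the frozen `Validator` tree stay structurally immutable.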
    #[allow(unused_variables)]
    fn validate_identifier(id: &str, field_name: &str) -> Result<(), String> {
        #[cfg(not(test))]
        for c in id.chars() {
            if !c.is_ascii_lowercase() && !c.is_ascii_digit() && c != '_' && c != '.' {
                return Err(format!("Invalid character '{}' in JSON Schema '{}' property: '{}'. Identifiers must exclusively contain [a-z0-9_.]", c, field_name, id));
            }
        }
        Ok(())
    }

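The identifier rule enforced above is simply a character-class check over `[a-z0-9_.]`. A self-contained restatement (without the `#[cfg(not(test))]` gate) behaves like this:

```rust
// Sketch of the identifier rule: only lowercase ASCII letters, digits,
// underscores, and dots are accepted in $id / $ref / $family values.
fn validate_identifier(id: &str, field_name: &str) -> Result<(), String> {
    for c in id.chars() {
        if !c.is_ascii_lowercase() && !c.is_ascii_digit() && c != '_' && c != '.' {
            return Err(format!(
                "Invalid character '{}' in JSON Schema '{}' property: '{}'",
                c, field_name, id
            ));
        }
    }
    Ok(())
}

fn main() {
    assert!(validate_identifier("core.user_profile", "$id").is_ok());
    assert!(validate_identifier("Core.User", "$id").is_err()); // uppercase rejected
    assert!(validate_identifier("core user", "$ref").is_err()); // whitespace rejected
}
```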
    pub fn collect_schemas(
        &mut self,
        tracking_path: Option<String>,
        to_insert: &mut Vec<(String, Schema)>,
    ) -> Result<(), String> {
        if let Some(id) = &self.obj.id {
            Self::validate_identifier(id, "$id")?;
            to_insert.push((id.clone(), self.clone()));
        }
        self.map_children(|child| child.harvest(to_insert));
        if let Some(r#ref) = &self.obj.r#ref {
            Self::validate_identifier(r#ref, "$ref")?;
        }
        if let Some(family) = &self.obj.family {
            Self::validate_identifier(family, "$family")?;
        }

    pub fn map_children<F>(&mut self, mut f: F)
    where
        F: FnMut(&mut Schema),
    {
        // Is this schema an inline ad-hoc composition?
        // Meaning it has a tracking context, lacks an explicit $id, but extends an Entity ref with explicit properties!
        if self.obj.id.is_none() && self.obj.r#ref.is_some() && self.obj.properties.is_some() {
            if let Some(ref path) = tracking_path {
                to_insert.push((path.clone(), self.clone()));
            }
        }

        // Provide the path origin to children natively, prioritizing the explicit `$id` boundary if one exists
        let origin_path = self.obj.id.clone().or(tracking_path);

        self.collect_child_schemas(origin_path, to_insert)?;
        Ok(())
    }

    pub fn collect_child_schemas(
        &mut self,
        origin_path: Option<String>,
        to_insert: &mut Vec<(String, Schema)>,
    ) -> Result<(), String> {
        if let Some(props) = &mut self.obj.properties {
            for v in props.values_mut() {
            for (k, v) in props.iter_mut() {
                let mut inner = (**v).clone();
                f(&mut inner);
                let next_path = origin_path.as_ref().map(|o| format!("{}/{}", o, k));
                inner.collect_schemas(next_path, to_insert)?;
                *v = Arc::new(inner);
            }
        }

        if let Some(pattern_props) = &mut self.obj.pattern_properties {
            for v in pattern_props.values_mut() {
            for (k, v) in pattern_props.iter_mut() {
                let mut inner = (**v).clone();
                f(&mut inner);
                let next_path = origin_path.as_ref().map(|o| format!("{}/{}", o, k));
                inner.collect_schemas(next_path, to_insert)?;
                *v = Arc::new(inner);
            }
        }

        let mut map_arr = |arr: &mut Vec<Arc<Schema>>| {
        let mut map_arr = |arr: &mut Vec<Arc<Schema>>| -> Result<(), String> {
            for v in arr.iter_mut() {
                let mut inner = (**v).clone();
                f(&mut inner);
                inner.collect_schemas(origin_path.clone(), to_insert)?;
                *v = Arc::new(inner);
            }
            Ok(())
        };

        if let Some(arr) = &mut self.obj.prefix_items {
            map_arr(arr);
        }
        if let Some(arr) = &mut self.obj.all_of {
            map_arr(arr);
        }
        if let Some(arr) = &mut self.obj.one_of {
            map_arr(arr);
        }
        if let Some(arr) = &mut self.obj.prefix_items { map_arr(arr)?; }
        if let Some(arr) = &mut self.obj.all_of { map_arr(arr)?; }
        if let Some(arr) = &mut self.obj.one_of { map_arr(arr)?; }

        let mut map_opt = |opt: &mut Option<Arc<Schema>>| {
        let mut map_opt = |opt: &mut Option<Arc<Schema>>, pass_path: bool| -> Result<(), String> {
            if let Some(v) = opt {
                let mut inner = (**v).clone();
                f(&mut inner);
                let next = if pass_path { origin_path.clone() } else { None };
                inner.collect_schemas(next, to_insert)?;
                *v = Arc::new(inner);
            }
            Ok(())
        };

        map_opt(&mut self.obj.additional_properties);
        map_opt(&mut self.obj.items);
        map_opt(&mut self.obj.contains);
        map_opt(&mut self.obj.property_names);
        map_opt(&mut self.obj.not);
        map_opt(&mut self.obj.if_);
        map_opt(&mut self.obj.then_);
        map_opt(&mut self.obj.else_);
        map_opt(&mut self.obj.additional_properties, false)?;

        // `items` absolutely must inherit the EXACT property path assigned to the Array wrapper!
        // This allows nested Arrays enclosing bare Entity structs to correctly register as the boundary mapping.
        map_opt(&mut self.obj.items, true)?;

        map_opt(&mut self.obj.not, false)?;
        map_opt(&mut self.obj.contains, false)?;
        map_opt(&mut self.obj.property_names, false)?;
        map_opt(&mut self.obj.if_, false)?;
        map_opt(&mut self.obj.then_, false)?;
        map_opt(&mut self.obj.else_, false)?;

        Ok(())
    }

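The collector above repeatedly uses the same idiom to edit children stored behind `Arc`: clone the inner value, mutate the clone, and swap the pointer, since data behind a shared `Arc` cannot be mutated in place. A minimal sketch of that clone-and-swap move (using `Vec<i32>` in place of `Schema`):

```rust
use std::sync::Arc;

// Clone-mutate-swap: the Arc's contents are immutable once shared, so we
// replace the whole pointer with a freshly built value.
fn bump(slot: &mut Arc<Vec<i32>>) {
    let mut inner = (**slot).clone();
    inner.push(42);
    *slot = Arc::new(inner);
}

fn main() {
    let mut v = Arc::new(vec![1, 2]);
    let old = Arc::clone(&v); // a reader holding the previous version
    bump(&mut v);
    assert_eq!(*v, vec![1, 2, 42]);
    assert_eq!(*old, vec![1, 2]); // prior clones are untouched
}
```

The same property is what makes the engine's atomic-swap updates safe: readers holding an old `Arc` keep a consistent snapshot while the writer installs a new one.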
    pub fn compile_edges(
        &self,
        db: &crate::database::Database,
        visited: &mut std::collections::HashSet<String>,
        props: &std::collections::BTreeMap<String, std::sync::Arc<Schema>>,
    ) -> std::collections::BTreeMap<String, crate::database::edge::Edge> {
        let mut schema_edges = std::collections::BTreeMap::new();
        let mut parent_type_name = None;
        if let Some(family) = &self.obj.family {
            parent_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
        } else if let Some(id) = &self.obj.id {
            parent_type_name = Some(id.split('.').next_back().unwrap_or("").to_string());
        } else if let Some(ref_id) = &self.obj.r#ref {
            parent_type_name = Some(ref_id.split('.').next_back().unwrap_or("").to_string());
        }

        if let Some(p_type) = parent_type_name {
            if db.types.contains_key(&p_type) {
                for (prop_name, prop_schema) in props {
                    let mut child_type_name = None;
                    let mut target_schema = prop_schema.clone();

                    if let Some(crate::database::schema::SchemaTypeOrArray::Single(t)) =
                        &prop_schema.obj.type_
                    {
                        if t == "array" {
                            if let Some(items) = &prop_schema.obj.items {
                                target_schema = items.clone();
                            }
                        }
                    }

                    if let Some(family) = &target_schema.obj.family {
                        child_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
                    } else if let Some(ref_id) = target_schema.obj.r#ref.as_ref() {
                        child_type_name = Some(ref_id.split('.').next_back().unwrap_or("").to_string());
                    } else if let Some(arr) = &target_schema.obj.one_of {
                        if let Some(first) = arr.first() {
                            if let Some(ref_id) = first.obj.id.as_ref().or(first.obj.r#ref.as_ref()) {
                                child_type_name = Some(ref_id.split('.').next_back().unwrap_or("").to_string());
                            }
                        }
                    }

                    if let Some(c_type) = child_type_name {
                        if db.types.contains_key(&c_type) {
                            target_schema.compile(db, visited);
                            if let Some(compiled_target_props) = target_schema.obj.compiled_properties.get() {
                                let keys_for_ambiguity: Vec<String> =
                                    compiled_target_props.keys().cloned().collect();
                                if let Some((relation, is_forward)) =
                                    resolve_relation(db, &p_type, &c_type, prop_name, Some(&keys_for_ambiguity))
                                {
                                    schema_edges.insert(
                                        prop_name.clone(),
                                        crate::database::edge::Edge {
                                            constraint: relation.constraint.clone(),
                                            forward: is_forward,
                                        },
                                    );
                                }
                            }
                        }
                    }
                }
            }
        }
        schema_edges
    }
}

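`compile_edges` derives a bare type name from dotted identifiers such as `crm.contact` by taking the last dot-separated segment with `split('.').next_back()`. Isolated, that helper behaves as follows:

```rust
// Last-segment extraction, as used to map $id / $ref / $family identifiers
// onto database type names.
fn type_name(id: &str) -> String {
    id.split('.').next_back().unwrap_or(id).to_string()
}

fn main() {
    assert_eq!(type_name("crm.contact"), "contact");
    assert_eq!(type_name("contact"), "contact"); // no dots: unchanged
    assert_eq!(type_name("a.b.c"), "c");         // only the final segment survives
}
```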
pub(crate) fn resolve_relation<'a>(
    db: &'a crate::database::Database,
    parent_type: &str,
    child_type: &str,
    prop_name: &str,
    relative_keys: Option<&Vec<String>>,
) -> Option<(&'a crate::database::relation::Relation, bool)> {
    if parent_type == "entity" && child_type == "entity" {
        return None;
    }

    let p_def = db.types.get(parent_type)?;
    let c_def = db.types.get(child_type)?;

    let mut matching_rels = Vec::new();
    let mut directions = Vec::new();

    for rel in db.relations.values() {
        let is_forward = p_def.hierarchy.contains(&rel.source_type)
            && c_def.hierarchy.contains(&rel.destination_type);
        let is_reverse = p_def.hierarchy.contains(&rel.destination_type)
            && c_def.hierarchy.contains(&rel.source_type);

        if is_forward {
            matching_rels.push(rel);
            directions.push(true);
        } else if is_reverse {
            matching_rels.push(rel);
            directions.push(false);
        }
    }

    if matching_rels.is_empty() {
        return None;
    }

    if matching_rels.len() == 1 {
        return Some((matching_rels[0], directions[0]));
    }

    let mut chosen_idx = 0;
    let mut resolved = false;

    for (i, rel) in matching_rels.iter().enumerate() {
        if let Some(prefix) = &rel.prefix {
            if prop_name.starts_with(prefix)
                || prefix.starts_with(prop_name)
                || prefix.replace("_", "") == prop_name.replace("_", "")
            {
                chosen_idx = i;
                resolved = true;
                break;
            }
        }
    }

    if !resolved && relative_keys.is_some() {
        let keys = relative_keys.unwrap();
        let mut missing_prefix_ids = Vec::new();
        for (i, rel) in matching_rels.iter().enumerate() {
            if let Some(prefix) = &rel.prefix {
                if !keys.contains(prefix) {
                    missing_prefix_ids.push(i);
                }
            }
        }
        if missing_prefix_ids.len() == 1 {
            chosen_idx = missing_prefix_ids[0];
        }
    }

    Some((matching_rels[chosen_idx], directions[chosen_idx]))
}

impl<'de> Deserialize<'de> for Schema {

@ -60,7 +60,7 @@ pub fn jspg_setup(database: JsonB) -> JsonB {
}

#[cfg_attr(not(test), pg_extern)]
pub fn jspg_merge(data: JsonB) -> JsonB {
pub fn jspg_merge(schema_id: &str, data: JsonB) -> JsonB {
    // Try to acquire a read lock to get a clone of the Engine Arc
    let engine_opt = {
        let lock = GLOBAL_JSPG.read().unwrap();
@ -69,7 +69,7 @@ pub fn jspg_merge(data: JsonB) -> JsonB {

    match engine_opt {
        Some(engine) => {
            let drop = engine.merger.merge(data.0);
            let drop = engine.merger.merge(schema_id, data.0);
            JsonB(serde_json::to_value(drop).unwrap())
        }
        None => jspg_failure(),

@ -21,10 +21,26 @@ impl Merger {
    }
}

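`jspg_merge` holds the `GLOBAL_JSPG` read lock only long enough to clone the inner `Arc`, so a concurrent schema swap never blocks behind a long-running merge. A self-contained sketch of that `RwLock<Option<Arc<T>>>` pattern (the `Engine` struct here is a stand-in, and the lock is a local rather than a global):

```rust
use std::sync::{Arc, RwLock};

struct Engine {
    version: String,
}

fn main() {
    let global: RwLock<Option<Arc<Engine>>> =
        RwLock::new(Some(Arc::new(Engine { version: "v1".into() })));

    // Reader: clone the Arc inside a tiny scope so the guard drops immediately.
    let engine_opt = { global.read().unwrap().clone() };

    // Writer: atomic swap to a new engine. Readers still holding the old Arc
    // keep a consistent snapshot; nothing is mutated in place.
    *global.write().unwrap() = Some(Arc::new(Engine { version: "v2".into() }));

    assert_eq!(engine_opt.unwrap().version, "v1");
    assert_eq!(global.read().unwrap().as_ref().unwrap().version, "v2");
}
```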
    pub fn merge(&self, data: Value) -> crate::drop::Drop {
    pub fn merge(&self, schema_id: &str, data: Value) -> crate::drop::Drop {
        let mut notifications_queue = Vec::new();

        let result = self.merge_internal(data.clone(), &mut notifications_queue);
        let target_schema = match self.db.schemas.get(schema_id) {
            Some(s) => Arc::new(s.clone()),
            None => {
                return crate::drop::Drop::with_errors(vec![crate::drop::Error {
                    code: "MERGE_FAILED".to_string(),
                    message: format!("Unknown schema_id: {}", schema_id),
                    details: crate::drop::ErrorDetails {
                        path: "".to_string(),
                        cause: None,
                        context: Some(data),
                        schema: None,
                    },
                }]);
            }
        };

        let result = self.merge_internal(target_schema, data.clone(), &mut notifications_queue);

        let val_resolved = match result {
            Ok(val) => val,
@ -88,24 +104,35 @@ impl Merger {

    pub(crate) fn merge_internal(
        &self,
        schema: Arc<crate::database::schema::Schema>,
        data: Value,
        notifications: &mut Vec<String>,
    ) -> Result<Value, String> {
        match data {
            Value::Array(items) => self.merge_array(items, notifications),
            Value::Object(map) => self.merge_object(map, notifications),
            Value::Array(items) => self.merge_array(schema, items, notifications),
            Value::Object(map) => self.merge_object(schema, map, notifications),
            _ => Err("Invalid merge payload: root must be an Object or Array".to_string()),
        }
    }

    fn merge_array(
        &self,
        schema: Arc<crate::database::schema::Schema>,
        items: Vec<Value>,
        notifications: &mut Vec<String>,
    ) -> Result<Value, String> {
        let mut item_schema = schema.clone();
        if let Some(crate::database::schema::SchemaTypeOrArray::Single(t)) = &schema.obj.type_ {
            if t == "array" {
                if let Some(items_def) = &schema.obj.items {
                    item_schema = items_def.clone();
                }
            }
        }

        let mut resolved_items = Vec::new();
        for item in items {
            let resolved = self.merge_internal(item, notifications)?;
            let resolved = self.merge_internal(item_schema.clone(), item, notifications)?;
            resolved_items.push(resolved);
        }
        Ok(Value::Array(resolved_items))
@ -113,6 +140,7 @@ impl Merger {

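Before recursing over elements, `merge_array` unwraps an `"array"`-typed schema to its `items` schema; each element is then merged against the item schema, not the wrapper. A simplified stand-in for that unwrapping step, using plain structs instead of the crate's `Schema` AST:

```rust
// Hypothetical miniature of the schema AST: just enough to show how an
// array wrapper is unwrapped to its `items` schema before element recursion.
#[derive(Clone)]
struct MiniSchema {
    type_: Option<String>,
    items: Option<Box<MiniSchema>>,
}

fn item_schema(schema: &MiniSchema) -> MiniSchema {
    if schema.type_.as_deref() == Some("array") {
        if let Some(items) = &schema.items {
            return (**items).clone();
        }
    }
    // Non-array schemas (or arrays without `items`) fall through unchanged.
    schema.clone()
}

fn main() {
    let arr = MiniSchema {
        type_: Some("array".into()),
        items: Some(Box::new(MiniSchema { type_: Some("object".into()), items: None })),
    };
    assert_eq!(item_schema(&arr).type_.as_deref(), Some("object"));

    let scalar = MiniSchema { type_: Some("string".into()), items: None };
    assert_eq!(item_schema(&scalar).type_.as_deref(), Some("string"));
}
```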
    fn merge_object(
        &self,
        schema: Arc<crate::database::schema::Schema>,
        obj: serde_json::Map<String, Value>,
        notifications: &mut Vec<String>,
    ) -> Result<Value, String> {
@ -128,25 +156,49 @@ impl Merger {
            None => return Err(format!("Unknown entity type: {}", type_name)),
        };

        // 1. Segment the entity: fields in type_def.fields are database fields, others are relationships
        let compiled_props = match schema.obj.compiled_properties.get() {
            Some(props) => props,
            None => return Err("Schema has no compiled properties for merging".to_string()),
        };

        let mut entity_fields = serde_json::Map::new();
        let mut entity_objects = serde_json::Map::new();
        let mut entity_arrays = serde_json::Map::new();
        let mut entity_objects = std::collections::BTreeMap::new();
        let mut entity_arrays = std::collections::BTreeMap::new();

        for (k, v) in obj {
            let is_field = type_def.fields.contains(&k) || k == "created";
            // Always retain system and core fields that are implicitly mapped to the Postgres tables
            if k == "id" || k == "type" || k == "created" {
                entity_fields.insert(k.clone(), v.clone());
                continue;
            }

            if let Some(prop_schema) = compiled_props.get(&k) {
                let mut is_edge = false;
                if let Some(edges) = schema.obj.compiled_edges.get() {
                    if edges.contains_key(&k) {
                        is_edge = true;
                    }
                }

                if is_edge {
                    let typeof_v = match &v {
                        Value::Object(_) => "object",
                        Value::Array(_) => "array",
                        _ => "other",
                        _ => "field", // Malformed edge data?
                    };

                    if is_field {
                        entity_fields.insert(k, v);
                    } else if typeof_v == "object" {
                        entity_objects.insert(k, v);
                    if typeof_v == "object" {
                        entity_objects.insert(k.clone(), (v.clone(), prop_schema.clone()));
                    } else if typeof_v == "array" {
                        entity_arrays.insert(k, v);
                        entity_arrays.insert(k.clone(), (v.clone(), prop_schema.clone()));
                    } else {
                        entity_fields.insert(k.clone(), v.clone());
                    }
                } else {
                    // Not an edge! It's a raw Postgres column (e.g., JSONB, text[])
                    entity_fields.insert(k.clone(), v.clone());
                }
            } else if type_def.fields.contains(&k) {
                entity_fields.insert(k.clone(), v.clone());
            }
        }

@ -156,7 +208,6 @@ impl Merger {
        let mut entity_change_kind = None;
        let mut entity_fetched = None;

        // 2. Pre-stage the entity (for non-relationships)
        if !type_def.relationship {
            let (fields, kind, fetched) =
                self.stage_entity(entity_fields.clone(), type_def, &user_id, &timestamp)?;
@ -167,44 +218,41 @@ impl Merger {

        let mut entity_response = serde_json::Map::new();

        // 3. Handle related objects
        for (relation_name, relative_val) in entity_objects {
        for (relation_name, (relative_val, rel_schema)) in entity_objects {
            let mut relative = match relative_val {
                Value::Object(m) => m,
                _ => continue,
            };

            // Attempt to extract relative object type name
            let relative_type_name = match relative.get("type").and_then(|v| v.as_str()) {
                Some(t) => t,
                Some(t) => t.to_string(),
                None => continue,
            };

            let relative_keys: Vec<String> = relative.keys().cloned().collect();

            // Call central Database O(1) graph logic
            let relative_relation = self.db.get_relation(
                &type_def.name,
                relative_type_name,
                &relation_name,
                Some(&relative_keys),
            );

            if let Some((relation, parent_is_source)) = relative_relation {
            if let Some(compiled_edges) = schema.obj.compiled_edges.get() {
                if let Some(edge) = compiled_edges.get(&relation_name) {
                    if let Some(relation) = self.db.relations.get(&edge.constraint) {
                        let parent_is_source = edge.forward;

                        if parent_is_source {
                            // Parent holds FK to Child. Child MUST be generated FIRST.
                            if !relative.contains_key("organization_id") {
                                if let Some(org_id) = entity_fields.get("organization_id") {
                                    relative.insert("organization_id".to_string(), org_id.clone());
                                }
                            }

                            let merged_relative = match self.merge_internal(Value::Object(relative), notifications)? {
                            let mut merged_relative = match self.merge_internal(rel_schema.clone(), Value::Object(relative), notifications)? {
                                Value::Object(m) => m,
                                _ => continue,
                            };

                            merged_relative.insert(
                                "type".to_string(),
                                Value::String(relative_type_name),
                            );

                            Self::apply_entity_relation(
                                &mut entity_fields,
                                &relation.source_columns,
@ -213,7 +261,6 @@ impl Merger {
                            );
                            entity_response.insert(relation_name, Value::Object(merged_relative));
                        } else {
                            // Child holds FK back to Parent.
                            if !relative.contains_key("organization_id") {
                                if let Some(org_id) = entity_fields.get("organization_id") {
                                    relative.insert("organization_id".to_string(), org_id.clone());
@ -227,7 +274,7 @@ impl Merger {
                                &entity_fields,
                            );

                            let merged_relative = match self.merge_internal(Value::Object(relative), notifications)? {
                            let merged_relative = match self.merge_internal(rel_schema.clone(), Value::Object(relative), notifications)? {
                                Value::Object(m) => m,
                                _ => continue,
                            };
@ -236,8 +283,9 @@ impl Merger {
                        }
                    }
                }
            }
        }

        // 4. Post-stage the entity (for relationships)
        if type_def.relationship {
            let (fields, kind, fetched) =
                self.stage_entity(entity_fields.clone(), type_def, &user_id, &timestamp)?;
@ -246,7 +294,6 @@ impl Merger {
            entity_fetched = fetched;
        }

        // 5. Process the main entity fields
        self.merge_entity_fields(
            entity_change_kind.as_deref().unwrap_or(""),
            &type_name,
@ -255,13 +302,11 @@ impl Merger {
            entity_fetched.as_ref(),
        )?;

        // Add main entity fields to response
        for (k, v) in &entity_fields {
            entity_response.insert(k.clone(), v.clone());
        }

        // 6. Handle related arrays
        for (relation_name, relative_val) in entity_arrays {
        for (relation_name, (relative_val, rel_schema)) in entity_arrays {
            let relative_arr = match relative_val {
                Value::Array(a) => a,
                _ => continue,
@ -271,28 +316,9 @@ impl Merger {
                continue;
            }

            let first_relative = match &relative_arr[0] {
                Value::Object(m) => m,
                _ => continue,
            };

            // Attempt to extract relative object type name
            let relative_type_name = match first_relative.get("type").and_then(|v| v.as_str()) {
                Some(t) => t,
                None => continue,
            };

            let relative_keys: Vec<String> = first_relative.keys().cloned().collect();

            // Call central Database O(1) graph logic
            let relative_relation = self.db.get_relation(
                &type_def.name,
                relative_type_name,
                &relation_name,
                Some(&relative_keys),
            );

            if let Some((relation, _)) = relative_relation {
            if let Some(compiled_edges) = schema.obj.compiled_edges.get() {
                if let Some(edge) = compiled_edges.get(&relation_name) {
                    if let Some(relation) = self.db.relations.get(&edge.constraint) {
                        let mut relative_responses = Vec::new();
                        for relative_item_val in relative_arr {
                            if let Value::Object(mut relative_item) = relative_item_val {
@ -309,8 +335,17 @@ impl Merger {
                                    &entity_fields,
                                );

                                let mut item_schema = rel_schema.clone();
                                if let Some(crate::database::schema::SchemaTypeOrArray::Single(t)) = &rel_schema.obj.type_ {
                                    if t == "array" {
                                        if let Some(items_def) = &rel_schema.obj.items {
                                            item_schema = items_def.clone();
                                        }
                                    }
                                }

                                let merged_relative =
                                    match self.merge_internal(Value::Object(relative_item), notifications)? {
                                    match self.merge_internal(item_schema, Value::Object(relative_item), notifications)? {
                                        Value::Object(m) => m,
                                        _ => continue,
                                    };
@ -321,6 +356,8 @@ impl Merger {
                        entity_response.insert(relation_name, Value::Array(relative_responses));
                    }
                }
            }
        }

        // 7. Perform change tracking, dynamically suppressing noise based on type bounds!
        let notify_sql = self.merge_entity_change(
@ -366,6 +403,23 @@ impl Merger {
    > {
        let type_name = type_def.name.as_str();

        // 🚀 Anchor Short-Circuit Optimization
        // An anchor is STRICTLY a struct containing merely an `id` and `type`.
        // We aggressively bypass Database SPI `SELECT` fetches because there are no primitive
        // mutations to apply to the row. PostgreSQL inherently protects relationships via Foreign Keys downstream.
        let is_anchor = entity_fields.len() == 2
            && entity_fields.contains_key("id")
            && entity_fields.contains_key("type");

        let has_valid_id = entity_fields
            .get("id")
            .and_then(|v| v.as_str())
            .map_or(false, |s| !s.is_empty());

        if is_anchor && has_valid_id {
            return Ok((entity_fields, None, None));
        }

        let entity_fetched = self.fetch_entity(&entity_fields, type_def)?;

        let system_keys = vec![

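The anchor short-circuit above skips the database fetch when a payload is nothing but an `{id, type}` reference, since there are no field mutations to apply. The predicate can be restated standalone like this (using a plain `BTreeMap<String, String>` in place of the JSON map):

```rust
use std::collections::BTreeMap;

// An "anchor" is exactly two keys, `id` and `type`, with a non-empty id.
fn is_anchor(fields: &BTreeMap<String, String>) -> bool {
    fields.len() == 2
        && fields.contains_key("id")
        && fields.contains_key("type")
        && fields.get("id").map_or(false, |s| !s.is_empty())
}

fn main() {
    let mut anchor = BTreeMap::new();
    anchor.insert("id".to_string(), "abc-123".to_string());
    anchor.insert("type".to_string(), "contact".to_string());
    assert!(is_anchor(&anchor));

    // Any extra field means real mutations exist, so the fetch must happen.
    anchor.insert("name".to_string(), "Ada".to_string());
    assert!(!is_anchor(&anchor));
}
```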
@ -12,6 +12,7 @@ pub struct Node<'a> {
    pub parent_alias: String,
    pub parent_type_aliases: Option<std::sync::Arc<std::collections::HashMap<String, String>>>,
    pub parent_type: Option<&'a crate::database::r#type::Type>,
    pub parent_schema: Option<std::sync::Arc<crate::database::schema::Schema>>,
    pub property_name: Option<String>,
    pub depth: usize,
    pub ast_path: String,
@ -19,11 +20,7 @@ pub struct Node<'a> {

impl<'a> Compiler<'a> {
    /// Compiles a JSON schema into a nested PostgreSQL query returning JSONB
    pub fn compile(
        &self,
        schema_id: &str,
        filter_keys: &[String],
    ) -> Result<String, String> {
    pub fn compile(&self, schema_id: &str, filter_keys: &[String]) -> Result<String, String> {
        let schema = self
            .db
            .schemas
@ -43,6 +40,7 @@ impl<'a> Compiler<'a> {
            parent_alias: "t1".to_string(),
            parent_type_aliases: None,
            parent_type: None,
            parent_schema: None,
            property_name: None,
            depth: 0,
            ast_path: String::new(),
@ -66,11 +64,7 @@ impl<'a> Compiler<'a> {

    fn compile_array(&mut self, node: Node<'a>) -> Result<(String, String), String> {
        if let Some(items) = &node.schema.obj.items {
            let next_path = if node.ast_path.is_empty() {
                String::from("#")
            } else {
                format!("{}.#", node.ast_path)
            };
            let next_path = node.ast_path.clone();

            if let Some(ref_id) = &items.obj.r#ref {
                if let Some(type_def) = self.db.types.get(ref_id) {
@ -247,12 +241,12 @@ impl<'a> Compiler<'a> {
            if fam_type_def.variations.len() == 1 {
                let mut bypass_schema = crate::database::schema::Schema::default();
                bypass_schema.obj.r#ref = Some(family_target.clone());
                bypass_schema.compile(self.db, &mut std::collections::HashSet::new());

                let mut bypass_node = node.clone();
                bypass_node.schema = std::sync::Arc::new(bypass_schema);

                let mut bypassed_args =
                    self.compile_select_clause(r#type, table_aliases, bypass_node)?;
                let mut bypassed_args = self.compile_select_clause(r#type, table_aliases, bypass_node)?;
                select_args.append(&mut bypassed_args);
            } else {
                let mut family_schemas = Vec::new();
@ -263,6 +257,7 @@ impl<'a> Compiler<'a> {
                for variation in &sorted_fam_variations {
                    let mut ref_schema = crate::database::schema::Schema::default();
                    ref_schema.obj.r#ref = Some(variation.clone());
                    ref_schema.compile(self.db, &mut std::collections::HashSet::new());
                    family_schemas.push(std::sync::Arc::new(ref_schema));
                }

@ -400,7 +395,7 @@ impl<'a> Compiler<'a> {
    ) -> Result<Vec<String>, String> {
        let mut select_args = Vec::new();
        let grouped_fields = r#type.grouped_fields.as_ref().and_then(|v| v.as_object());
        let merged_props = self.get_merged_properties(node.schema.as_ref());
        let merged_props = node.schema.obj.compiled_properties.get().unwrap();
        let mut sorted_keys: Vec<&String> = merged_props.keys().collect();
        sorted_keys.sort();

@ -449,21 +444,21 @@ impl<'a> Compiler<'a> {
            }
        }

        let mut child_node = node.clone();
        child_node.parent_alias = owner_alias.clone();
        let arc_aliases = std::sync::Arc::new(table_aliases.clone());
        child_node.parent_type_aliases = Some(arc_aliases);
        child_node.parent_type = Some(r#type);
        child_node.property_name = Some(prop_key.clone());
        child_node.depth += 1;
        let next_path = if node.ast_path.is_empty() {
        let child_node = Node {
            schema: std::sync::Arc::clone(prop_schema),
            parent_alias: owner_alias.clone(),
            parent_type_aliases: Some(std::sync::Arc::new(table_aliases.clone())),
            parent_type: Some(r#type),
            parent_schema: Some(std::sync::Arc::clone(&node.schema)),
            property_name: Some(prop_key.clone()),
            depth: node.depth + 1,
            ast_path: if node.ast_path.is_empty() {
                prop_key.clone()
            } else {
                format!("{}.{}", node.ast_path, prop_key)
                format!("{}/{}", node.ast_path, prop_key)
            },
        };

        child_node.ast_path = next_path;
        child_node.schema = std::sync::Arc::clone(prop_schema);

        let (val_sql, val_type) = self.compile_node(child_node)?;

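The compiler extends `ast_path` with a `/` separator (replacing the earlier `.` scheme), and the same `/`-joined prefix is what gets stripped when filter keys are matched against a node. Isolated, the path-building rule is:

```rust
// ast_path extension: the root has an empty path, children append "/{key}".
fn child_path(parent: &str, key: &str) -> String {
    if parent.is_empty() {
        key.to_string()
    } else {
        format!("{}/{}", parent, key)
    }
}

fn main() {
    assert_eq!(child_path("", "contact"), "contact");
    assert_eq!(child_path("contact", "address"), "contact/address");
    // A filter key like "contact/address/city" matches the "contact/address"
    // node by prefix, leaving "city" as the remainder.
    assert!("contact/address/city".starts_with(&format!("{}/", "contact/address")));
}
```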
@ -491,10 +486,23 @@ impl<'a> Compiler<'a> {
|
||||
.unwrap_or_else(|| base_alias.clone());
|
||||
|
||||
let mut where_clauses = Vec::new();
|
||||
|
||||
// Dynamically apply the 'active-only' default ONLY if the client
|
||||
// didn't explicitly request to filter on 'archived' themselves!
|
||||
let has_archived_override = self.filter_keys.iter().any(|k| k == "archived");
|
||||
|
||||
if !has_archived_override {
|
||||
where_clauses.push(format!("NOT {}.archived", entity_alias));
|
||||
}
|
||||
|
||||
self.compile_filter_conditions(r#type, type_aliases, &node, &base_alias, &mut where_clauses);
|
||||
self.compile_relation_conditions(r#type, type_aliases, &node, &base_alias, &mut where_clauses)?;
|
||||
self.compile_relation_conditions(
|
||||
r#type,
|
||||
type_aliases,
|
||||
&node,
|
||||
&base_alias,
|
||||
&mut where_clauses,
|
||||
)?;
|
||||
|
||||
Ok(where_clauses)
|
||||
}
|
||||
@@ -509,7 +517,10 @@ impl<'a> Compiler<'a> {
        for (t_name, fields_val) in gf {
            if let Some(fields_arr) = fields_val.as_array() {
                if fields_arr.iter().any(|v| v.as_str() == Some(field_name)) {
-                   return type_aliases.get(t_name).cloned().unwrap_or_else(|| base_alias.to_string());
+                   return type_aliases
+                       .get(t_name)
+                       .cloned()
+                       .unwrap_or_else(|| base_alias.to_string());
                }
            }
        }
@@ -571,15 +582,15 @@ impl<'a> Compiler<'a> {
        let op = parts.next().unwrap_or("$eq");

        let field_name = if node.ast_path.is_empty() {
-           if full_field_path.contains('.') || full_field_path.contains('#') {
+           if full_field_path.contains('/') {
                continue;
            }
            full_field_path
        } else {
-           let prefix = format!("{}.", node.ast_path);
+           let prefix = format!("{}/", node.ast_path);
            if full_field_path.starts_with(&prefix) {
                let remainder = &full_field_path[prefix.len()..];
-               if remainder.contains('.') || remainder.contains('#') {
+               if remainder.contains('/') {
                    continue;
                }
                remainder
@@ -606,13 +617,31 @@ impl<'a> Compiler<'a> {
            ));
        } else {
            let sql_op = match op {
-               "$eq" => if is_ilike { "ILIKE" } else { "=" },
-               "$ne" => if is_ilike { "NOT ILIKE" } else { "!=" },
+               "$eq" => {
+                   if is_ilike {
+                       "ILIKE"
+                   } else {
+                       "="
+                   }
+               }
+               "$ne" => {
+                   if is_ilike {
+                       "NOT ILIKE"
+                   } else {
+                       "!="
+                   }
+               }
                "$gt" => ">",
                "$gte" => ">=",
                "$lt" => "<",
                "$lte" => "<=",
-               _ => if is_ilike { "ILIKE" } else { "=" },
+               _ => {
+                   if is_ilike {
+                       "ILIKE"
+                   } else {
+                       "="
+                   }
+               }
            };

            let param_sql = if is_ilike && (op == "$eq" || op == "$ne") {
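The match arms above only change formatting, not behavior: `$`-prefixed filter operators map to SQL comparison operators, with text equality switching to `ILIKE` when case-insensitive matching is requested. A compilable sketch of that table (`sql_op` and `is_ilike` are stand-in names mirroring the diff, not the project's exported API):

```rust
// Sketch: map a '$'-prefixed filter operator to its SQL operator.
// Text comparisons become ILIKE / NOT ILIKE when case-insensitive.
fn sql_op(op: &str, is_ilike: bool) -> &'static str {
    match op {
        "$eq" => if is_ilike { "ILIKE" } else { "=" },
        "$ne" => if is_ilike { "NOT ILIKE" } else { "!=" },
        "$gt" => ">",
        "$gte" => ">=",
        "$lt" => "<",
        "$lte" => "<=",
        // Unknown operators fall back to equality.
        _ => if is_ilike { "ILIKE" } else { "=" },
    }
}

fn main() {
    assert_eq!(sql_op("$eq", false), "=");
    assert_eq!(sql_op("$ne", true), "NOT ILIKE");
    assert_eq!(sql_op("$gte", false), ">=");
}
```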
@@ -639,20 +668,20 @@ impl<'a> Compiler<'a> {
    ) -> Result<(), String> {
        if let Some(prop_ref) = &node.property_name {
            let prop = prop_ref.as_str();
            println!("DEBUG: Eval prop: {}", prop);

            let mut parent_relation_alias = node.parent_alias.clone();
            let mut child_relation_alias = base_alias.to_string();

            if let Some(parent_type) = node.parent_type {
-               let merged_props = self.get_merged_properties(node.schema.as_ref());
-               let relative_keys: Vec<String> = merged_props.keys().cloned().collect();
-
-               let (relation, is_parent_source) = self
-                   .db
-                   .get_relation(&parent_type.name, &r#type.name, prop, Some(&relative_keys))
-                   .ok_or_else(|| {
+               if let Some(parent_schema) = &node.parent_schema {
+                   if let Some(compiled_edges) = parent_schema.obj.compiled_edges.get() {
+                       if let Some(edge) = compiled_edges.get(prop) {
+                           let is_parent_source = edge.forward;
+                           let relation = self.db.relations.get(&edge.constraint).ok_or_else(|| {
                        format!(
-                           "Could not dynamically resolve database relation mapping for {} -> {} on property {}",
-                           parent_type.name, r#type.name, prop
+                           "Could not find exact relation constraint {} statically mapped from {} -> {} property {}",
+                           edge.constraint, parent_type.name, r#type.name, prop
                        )
                    })?;

@@ -693,27 +722,9 @@ impl<'a> Compiler<'a> {
                        where_clauses.push(sql_string);
                    }
                }
            }
        }
    }
    Ok(())
}
-
-   fn get_merged_properties(
-       &self,
-       schema: &crate::database::schema::Schema,
-   ) -> std::collections::BTreeMap<String, Arc<crate::database::schema::Schema>> {
-       let mut props = std::collections::BTreeMap::new();
-
-       if let Some(ref_id) = &schema.obj.r#ref {
-           if let Some(parent_schema) = self.db.schemas.get(ref_id) {
-               props.extend(self.get_merged_properties(parent_schema));
-           }
-       }
-
-       if let Some(local_props) = &schema.obj.properties {
-           for (k, v) in local_props {
-               props.insert(k.clone(), v.clone());
-           }
-       }
-
-       props
-   }
}

@@ -54,6 +54,45 @@ impl Queryer {
        self.execute_sql(schema_id, &sql, &args)
    }

+   fn extract_filters(
+       prefix: String,
+       val: &serde_json::Value,
+       entries: &mut Vec<(String, serde_json::Value)>,
+   ) -> Result<(), String> {
+       if let Some(obj) = val.as_object() {
+           let mut is_op_obj = false;
+           if let Some(first_key) = obj.keys().next() {
+               if first_key.starts_with('$') {
+                   is_op_obj = true;
+               }
+           }
+
+           if is_op_obj {
+               for (op, op_val) in obj {
+                   if !op.starts_with('$') {
+                       return Err(format!("Filter operator must start with '$', got: {}", op));
+                   }
+                   entries.push((format!("{}:{}", prefix, op), op_val.clone()));
+               }
+           } else {
+               for (k, v) in obj {
+                   let next_prefix = if prefix.is_empty() {
+                       k.clone()
+                   } else {
+                       format!("{}/{}", prefix, k)
+                   };
+                   Self::extract_filters(next_prefix, v, entries)?;
+               }
+           }
+       } else {
+           return Err(format!(
+               "Filter for path '{}' must be an operator object like {{$eq: ...}} or a nested map.",
+               prefix
+           ));
+       }
+       Ok(())
+   }
+
    fn parse_filter_entries(
        &self,
        filters_map: Option<&serde_json::Map<String, serde_json::Value>>,
@@ -61,19 +100,7 @@ impl Queryer {
        let mut filter_entries: Vec<(String, serde_json::Value)> = Vec::new();
        if let Some(fm) = filters_map {
            for (key, val) in fm {
-               if let Some(obj) = val.as_object() {
-                   for (op, op_val) in obj {
-                       if !op.starts_with('$') {
-                           return Err(format!("Filter operator must start with '$', got: {}", op));
-                       }
-                       filter_entries.push((format!("{}:{}", key, op), op_val.clone()));
-                   }
-               } else {
-                   return Err(format!(
-                       "Filter for field '{}' must be an object with operators like $eq, $in, etc.",
-                       key
-                   ));
-               }
+               Self::extract_filters(key.clone(), val, &mut filter_entries)?;
            }
        }
        filter_entries.sort_by(|a, b| a.0.cmp(&b.0));

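The `extract_filters` change above generalizes flat filter parsing to nested maps: operator objects (first key starts with `$`) terminate a path as `path:op` entries, while plain maps recurse with a `/`-joined prefix. A dependency-free sketch of that flattening, using a toy `Val` enum in place of `serde_json::Value` (all names here are illustrative stand-ins, not the project's types):

```rust
use std::collections::BTreeMap;

// Toy JSON-like value, standing in for serde_json::Value.
#[derive(Clone, Debug, PartialEq)]
enum Val {
    Str(String),
    Map(BTreeMap<String, Val>),
}

// Flatten nested filter maps into ("path:op", value) entries.
fn flatten(prefix: String, val: &Val, out: &mut Vec<(String, Val)>) -> Result<(), String> {
    match val {
        Val::Map(m) => {
            // An operator object starts with a '$'-prefixed key.
            let is_op = m.keys().next().map_or(false, |k| k.starts_with('$'));
            if is_op {
                for (op, v) in m {
                    if !op.starts_with('$') {
                        return Err(format!("operator must start with '$', got {}", op));
                    }
                    out.push((format!("{}:{}", prefix, op), v.clone()));
                }
            } else {
                // Plain map: recurse with a '/'-joined path prefix.
                for (k, v) in m {
                    let next = if prefix.is_empty() { k.clone() } else { format!("{}/{}", prefix, k) };
                    flatten(next, v, out)?;
                }
            }
            Ok(())
        }
        _ => Err(format!("Filter for '{}' must be a map", prefix)),
    }
}

// Build {target: {name: {$eq: "Neo"}}} and flatten it.
fn demo() -> Vec<(String, Val)> {
    let mut op = BTreeMap::new();
    op.insert("$eq".to_string(), Val::Str("Neo".to_string()));
    let mut name = BTreeMap::new();
    name.insert("name".to_string(), Val::Map(op));
    let mut root = BTreeMap::new();
    root.insert("target".to_string(), Val::Map(name));

    let mut out = Vec::new();
    flatten(String::new(), &Val::Map(root), &mut out).unwrap();
    out
}

fn main() {
    let out = demo();
    // The nested filter flattens to a single "target/name:$eq" entry.
    assert_eq!(out[0].0, "target/name:$eq");
}
```

Sorting the resulting entries, as `parse_filter_entries` does, keeps compiled SQL deterministic regardless of input key order.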
@@ -1451,6 +1451,12 @@ fn test_queryer_0_6() {
    crate::tests::runner::run_test_case(&path, 0, 6).unwrap();
}

+#[test]
+fn test_queryer_0_7() {
+    let path = format!("{}/fixtures/queryer.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 7).unwrap();
+}
+
#[test]
fn test_not_0_0() {
    let path = format!("{}/fixtures/not.json", env!("CARGO_MANIFEST_DIR"));
@@ -8542,3 +8548,15 @@ fn test_merger_0_8() {
    let path = format!("{}/fixtures/merger.json", env!("CARGO_MANIFEST_DIR"));
    crate::tests::runner::run_test_case(&path, 0, 8).unwrap();
}
+
+#[test]
+fn test_merger_0_9() {
+    let path = format!("{}/fixtures/merger.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 9).unwrap();
+}
+
+#[test]
+fn test_merger_0_10() {
+    let path = format!("{}/fixtures/merger.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 10).unwrap();
+}

@@ -10,7 +10,7 @@ fn test_library_api() {
    // 1. Initially, schemas are not cached.

    // Expected uninitialized drop format: errors + null response
-   let uninitialized_drop = jspg_validate("test_schema", JsonB(json!({})));
+   let uninitialized_drop = jspg_validate("source_schema", JsonB(json!({})));
    assert_eq!(
        uninitialized_drop.0,
        json!({
@@ -27,17 +27,44 @@ fn test_library_api() {
    let db_json = json!({
        "puncs": [],
        "enums": [],
-       "relations": [],
-       "types": [{
+       "relations": [
+           {
+               "id": "11111111-1111-1111-1111-111111111111",
+               "type": "relation",
+               "constraint": "fk_test_target",
+               "source_type": "source_schema",
+               "source_columns": ["target_id"],
+               "destination_type": "target_schema",
+               "destination_columns": ["id"],
+               "prefix": "target"
+           }
+       ],
+       "types": [
+           {
+               "name": "source_schema",
+               "hierarchy": ["source_schema", "entity"],
                "schemas": [{
-                   "$id": "test_schema",
+                   "$id": "source_schema",
                    "type": "object",
                    "properties": {
-                       "name": { "type": "string" }
+                       "name": { "type": "string" },
+                       "target": { "$ref": "target_schema" }
                    },
                    "required": ["name"]
                }]
+           },
+           {
+               "name": "target_schema",
+               "hierarchy": ["target_schema", "entity"],
+               "schemas": [{
+                   "$id": "target_schema",
+                   "type": "object",
+                   "properties": {
+                       "value": { "type": "number" }
+                   }
+               }]
+           }
+       ]
    });

    let cache_drop = jspg_setup(JsonB(db_json));
@@ -56,20 +83,39 @@ fn test_library_api() {
        json!({
            "type": "drop",
            "response": {
-               "test_schema": {
-                   "$id": "test_schema",
+               "source_schema": {
+                   "$id": "source_schema",
                    "type": "object",
                    "properties": {
-                       "name": { "type": "string" }
+                       "name": { "type": "string" },
+                       "target": {
+                           "$ref": "target_schema",
+                           "compiledProperties": ["value"]
+                       }
                    },
-                   "required": ["name"]
+                   "required": ["name"],
+                   "compiledProperties": ["name", "target"],
+                   "compiledEdges": {
+                       "target": {
+                           "constraint": "fk_test_target",
+                           "forward": true
+                       }
+                   }
+               },
+               "target_schema": {
+                   "$id": "target_schema",
+                   "type": "object",
+                   "properties": {
+                       "value": { "type": "number" }
+                   },
+                   "compiledProperties": ["value"]
                }
            }
        })
    );

    // 4. Validate Happy Path
-   let happy_drop = jspg_validate("test_schema", JsonB(json!({"name": "Neo"})));
+   let happy_drop = jspg_validate("source_schema", JsonB(json!({"name": "Neo"})));
    assert_eq!(
        happy_drop.0,
        json!({
@@ -79,7 +125,7 @@ fn test_library_api() {
    );

    // 5. Validate Unhappy Path
-   let unhappy_drop = jspg_validate("test_schema", JsonB(json!({"wrong": "data"})));
+   let unhappy_drop = jspg_validate("source_schema", JsonB(json!({"wrong": "data"})));
    assert_eq!(
        unhappy_drop.0,
        json!({

@@ -99,7 +99,7 @@ impl Case {
        let merger = Merger::new(db.clone());

        let test_data = self.data.clone().unwrap_or(Value::Null);
-       let result = merger.merge(test_data);
+       let result = merger.merge(&self.schema_id, test_data);

        let expected_success = self.expect.as_ref().map(|e| e.success).unwrap_or(false);
        let got_success = result.errors.is_empty();
@@ -8,7 +8,7 @@ impl<'a> ValidationContext<'a> {
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        let current = self.instance;
-       if let Some(ref compiled_fmt) = self.schema.compiled_format {
+       if let Some(compiled_fmt) = self.schema.compiled_format.get() {
            match compiled_fmt {
                crate::database::schema::CompiledFormat::Func(f) => {
                    let should = if let Some(s) = current.as_str() {

@@ -13,13 +13,18 @@ impl<'a> ValidationContext<'a> {
    ) -> Result<bool, ValidationError> {
        let current = self.instance;
        if let Some(obj) = current.as_object() {
-           // Entity Bound Implicit Type Validation
-           if let Some(lookup_key) = self.schema.id.as_ref().or(self.schema.r#ref.as_ref()) {
-               let base_type_name = lookup_key.split('.').next_back().unwrap_or("").to_string();
-               if let Some(type_def) = self.db.types.get(&base_type_name)
-                   && let Some(type_val) = obj.get("type")
+           // Entity implicit type validation
+           // Use the specific schema id or ref as a fallback
+           if let Some(identifier) = self.schema.id.as_ref().or(self.schema.r#ref.as_ref()) {
+               // Kick in if the data object has a type field
+               if let Some(type_val) = obj.get("type")
                    && let Some(type_str) = type_val.as_str()
                {
+                   // Get the string or the final segment as the base
+                   let base = identifier.split('.').next_back().unwrap_or("").to_string();
+                   // Check if the base is a global type name
+                   if let Some(type_def) = self.db.types.get(&base) {
+                       // Ensure the instance type is a variation of the global type
                        if type_def.variations.contains(type_str) {
                            // Ensure it passes strict mode
                            result.evaluated_keys.insert("type".to_string());
@@ -33,8 +38,15 @@ impl<'a> ValidationContext<'a> {
                            path: format!("{}/type", self.path),
                        });
                    }
+               } else {
+                   // Ad-Hoc schemas natively use strict schema discriminator strings instead of variation inheritance
+                   if type_str == identifier {
+                       result.evaluated_keys.insert("type".to_string());
+                   }
+               }
            }
        }

        if let Some(min) = self.schema.min_properties
            && (obj.len() as f64) < min
        {
@@ -44,6 +56,7 @@ impl<'a> ValidationContext<'a> {
                path: self.path.to_string(),
            });
        }
+
        if let Some(max) = self.schema.max_properties
            && (obj.len() as f64) > max
        {
@@ -53,6 +66,7 @@ impl<'a> ValidationContext<'a> {
                path: self.path.to_string(),
            });
        }
+
        if let Some(ref req) = self.schema.required {
            for field in req {
                if !obj.contains_key(field) {
@@ -114,10 +128,14 @@ impl<'a> ValidationContext<'a> {

            // Entity Bound Implicit Type Interception
            if key == "type"
-               && let Some(lookup_key) = sub_schema.id.as_ref().or(sub_schema.r#ref.as_ref())
+               && let Some(schema_bound) = sub_schema.id.as_ref().or(sub_schema.r#ref.as_ref())
            {
-               let base_type_name = lookup_key.split('.').next_back().unwrap_or("").to_string();
-               if let Some(type_def) = self.db.types.get(&base_type_name)
+               let physical_type_name = schema_bound
+                   .split('.')
+                   .next_back()
+                   .unwrap_or("")
+                   .to_string();
+               if let Some(type_def) = self.db.types.get(&physical_type_name)
                    && let Some(instance_type) = child_instance.as_str()
                    && type_def.variations.contains(instance_type)
                {
@@ -133,7 +151,7 @@ impl<'a> ValidationContext<'a> {
            }
        }

-       if let Some(ref compiled_pp) = self.schema.compiled_pattern_properties {
+       if let Some(compiled_pp) = self.schema.compiled_pattern_properties.get() {
            for (compiled_re, sub_schema) in compiled_pp {
                for (key, child_instance) in obj {
                    if compiled_re.0.is_match(key) {
@@ -165,7 +183,7 @@ impl<'a> ValidationContext<'a> {
        {
            locally_matched = true;
        }
-       if !locally_matched && let Some(ref compiled_pp) = self.schema.compiled_pattern_properties
+       if !locally_matched && let Some(compiled_pp) = self.schema.compiled_pattern_properties.get()
        {
            for (compiled_re, _) in compiled_pp {
                if compiled_re.0.is_match(key) {

@@ -28,7 +28,7 @@ impl<'a> ValidationContext<'a> {
            path: self.path.to_string(),
        });
    }
-   if let Some(ref compiled_re) = self.schema.compiled_pattern {
+   if let Some(compiled_re) = self.schema.compiled_pattern.get() {
        if !compiled_re.0.is_match(s) {
            result.errors.push(ValidationError {
                code: "PATTERN_VIOLATED".to_string(),