Compare commits
9 Commits

| Author | SHA1 | Date |
|---|---|---|
| | 8730a828c6 | |
| | 776a442374 | |
| | 5c1779651c | |
| | 6c047e326d | |
| | 7876567ae7 | |
| | 06f6a587de | |
| | 29d8dfb608 | |
| | 5b36ecf06c | |
| | 76467a6fed | |

GEMINI.md — 18 changes
@@ -24,10 +24,14 @@ To support high-throughput operations while allowing for runtime updates (e.g.,
 4. **Lock-Free Reads**: Incoming operations acquire a read lock just long enough to clone the `Arc` inside an `RwLock<Option<Arc<Validator>>>`, ensuring zero blocking during schema updates.
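The hot-swap pattern in the Lock-Free Reads bullet can be sketched as follows. This is a minimal illustration, not JSPG's actual code: `Validator` and `Engine` are hypothetical stand-ins, assuming only the `RwLock<Option<Arc<Validator>>>` shape named above.

```rust
use std::sync::{Arc, RwLock};

// Hypothetical stand-in for a compiled schema validator.
struct Validator { version: u32 }

// The guarded slot: writers swap the Option, readers clone the Arc.
struct Engine { slot: RwLock<Option<Arc<Validator>>> }

impl Engine {
    // Reader: hold the read lock only long enough to clone the Arc,
    // then validate against that snapshot without blocking writers.
    fn snapshot(&self) -> Option<Arc<Validator>> {
        self.slot.read().unwrap().as_ref().cloned()
    }

    // Writer: atomically install a freshly compiled validator.
    fn install(&self, v: Validator) {
        *self.slot.write().unwrap() = Some(Arc::new(v));
    }
}

fn main() {
    let engine = Engine { slot: RwLock::new(None) };
    assert!(engine.snapshot().is_none());
    engine.install(Validator { version: 1 });
    let snap = engine.snapshot().unwrap();
    // A schema update does not invalidate snapshots already handed out.
    engine.install(Validator { version: 2 });
    assert_eq!(snap.version, 1);
    assert_eq!(engine.snapshot().unwrap().version, 2);
}
```

In-flight requests keep validating against their cloned `Arc` even while a new schema is installed, which is what makes updates non-blocking.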
 ### Relational Edge Resolution

-When compiling nested object graphs or arrays, the JSPG engine must dynamically infer which Postgres Foreign Key constraint correctly bridges the parent to the nested schema. It utilizes a strict 3-step hierarchical resolution:
+When compiling nested object graphs or arrays, the JSPG engine must dynamically infer which Postgres Foreign Key constraint correctly bridges the parent to the nested schema. To guarantee deterministic SQL generation, it utilizes a strict, multi-step algebraic resolution process applied during the `OnceLock` Compilation phase:

-1. **Direct Prefix Match**: If an explicitly prefixed Foreign Key (e.g. `fk_invoice_counterparty_entity` -> `prefix: "counterparty"`) matches the exact name of the requested schema property (e.g. `{"counterparty": {...}}`), it is instantly selected.
-2. **Base Edge Fallback (1:M)**: If no explicit prefix directly matches the property name, the compiler filters for exactly one remaining relation with a `null` prefix (e.g. `fk_invoice_line_invoice` -> `prefix: null`). A `null` prefix mathematically denotes the standard structural parent-child ownership edge (bypassing any M:M ambiguity) and is safely picked over explicit (but unmatched) property edges.
-3. **Ambiguity Elimination (M:M)**: If multiple explicitly prefixed relations remain (which happens by design in Many-to-Many junction tables like `contact` utilizing `fk_relationship_source` and `fk_relationship_target`), the compiler uses a process of elimination. It checks which of the prefix names the child schema *natively consumes* as an outbound property (e.g. `contact` defines `{ "target": ... }`). It considers that prefix "used up" and mathematically deduces the *remaining* explicitly prefixed relation (`"source"`) must be the inbound link from the parent.
+1. **Graph Locality Boundary**: Before evaluating constraints, the engine ensures the parent and child types do not belong strictly to the same inheritance lineage (e.g., `invoice` -> `activity`). Structural inheritance edges are handled natively by the payload merger, so relational edge discovery is intentionally bypassed.
+2. **Structural Cardinality Filtration**: If the JSON Schema requires an Array collection (`{"type": "array"}`), JSPG mathematically rejects pure scalar Forward constraints (where the parent holds a single UUID pointer), logically narrowing the possibilities to Reverse (1:N) or Junction (M:M) constraints.
+3. **Exact Prefix Match**: If an explicitly prefixed Foreign Key (e.g. `fk_invoice_counterparty_entity` -> `prefix: "counterparty"`) directly matches the name of the requested schema property (e.g. `{"counterparty": {...}}`), it is instantly selected.
+4. **Ambiguity Elimination (M:M Twin Deduction)**: If multiple explicitly prefixed relations remain (which happens by design in Many-to-Many junction tables like `contact` or `role`), the compiler inspects the actual compiled child JSON schema AST. If it observes the child natively consumes one of the prefixes as an explicit outbound property (e.g. `contact` explicitly defining `{ "target": ... }`), it considers that arrow "used up". It mathematically deduces that its exact twin providing reverse ownership (`"source"`) MUST be the inbound link mapping from the parent.
+5. **Implicit Base Fallback (1:M)**: If no explicit prefix matches, and M:M deduction fails, the compiler filters for exactly one remaining relation with a `null` prefix (e.g. `fk_invoice_line_invoice` -> `prefix: null`). A `null` prefix mathematically denotes the core structural parent-child ownership edge and is used safely as a fallback.
+6. **Deterministic Abort**: If the engine exhausts all deduction pathways and the edge remains ambiguous, it explicitly aborts schema compilation (returns `None`) rather than silently generating unpredictable SQL.
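Condensed into code, the resolution order added above might look like the following sketch. The structs, field names, and the `resolve` signature are hypothetical; steps 1-2 (lineage boundary and cardinality filtration) are assumed to have already narrowed the candidate list, so only steps 3-6 are shown.

```rust
// Hypothetical relation metadata; the real compiler has richer fields.
struct Relation { prefix: Option<&'static str> }

/// Returns the index of the winning relation, or None (Deterministic Abort).
fn resolve(
    property: &str,
    relations: &[Relation],              // candidates surviving steps 1-2
    child_consumes: &[&str],             // prefixes the child schema uses as outbound props
) -> Option<usize> {
    // 3. Exact Prefix Match: a prefix equal to the property name wins instantly.
    if let Some(i) = relations.iter().position(|r| r.prefix == Some(property)) {
        return Some(i);
    }

    // 4. M:M Twin Deduction: if the child natively consumes one prefixed twin,
    //    the single remaining prefixed relation must be the inbound link.
    let prefixed: Vec<usize> = relations.iter().enumerate()
        .filter(|(_, r)| r.prefix.is_some())
        .map(|(i, _)| i)
        .collect();
    if prefixed.len() == 2 {
        let remaining: Vec<usize> = prefixed.into_iter()
            .filter(|&i| !child_consumes.contains(&relations[i].prefix.unwrap()))
            .collect();
        if remaining.len() == 1 {
            return Some(remaining[0]);
        }
    }

    // 5. Implicit Base Fallback: exactly one null-prefix ownership edge.
    let naked: Vec<usize> = relations.iter().enumerate()
        .filter(|(_, r)| r.prefix.is_none())
        .map(|(i, _)| i)
        .collect();
    if naked.len() == 1 {
        return Some(naked[0]);
    }

    // 6. Deterministic Abort: still ambiguous, refuse to guess.
    None
}

fn main() {
    // Exact prefix match wins immediately.
    let r = vec![Relation { prefix: Some("counterparty") }, Relation { prefix: None }];
    assert_eq!(resolve("counterparty", &r, &[]), Some(0));

    // M:M twin deduction: child consumes "target", so "source" is inbound.
    let mm = vec![Relation { prefix: Some("source") }, Relation { prefix: Some("target") }];
    assert_eq!(resolve("contacts", &mm, &["target"]), Some(0));

    // Deterministic abort: two naked constraints, no way to pick one.
    let naked = vec![Relation { prefix: None }, Relation { prefix: None }];
    assert_eq!(resolve("activities", &naked, &[]), None);
}
```

The three `main` cases mirror the fixtures added in this PR: exact prefix selection, twin deduction, and the `AMBIGUOUS_TYPE_RELATIONS` abort when multiple naked constraints remain.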
 ### Global API Reference

 These functions operate on the global `GLOBAL_JSPG` engine instance and provide administrative boundaries:

@@ -52,8 +56,8 @@ JSPG implements specific extensions to the Draft 2020-12 standard to support the

 #### A. Polymorphism & Referencing (`$ref`, `$family`, and Native Types)

 * **Native Type Discrimination (`variations`)**: Schemas defined inside a Postgres `type` are Entities. The validator securely and implicitly manages their `"type"` property. If an entity inherits from `user`, incoming JSON can safely define `{"type": "person"}` without errors, thanks to `compiled_variations` inheritance.

-* **Structural Inheritance & Viral Infection (`$ref`)**: `$ref` is used exclusively for structural inheritance, *never* for union creation. A Punc request schema that `$ref`s an Entity virally inherits all physical database polymorphism rules for that target.
+* **Structural Inheritance & Viral Infection (`$ref`)**: `$ref` is used exclusively for structural inheritance and explicit composition, *never* for union creation. A `$ref` ALWAYS targets a specific, *single* schema struct (e.g., `full.person`). It represents an explicit, known structural shape. A Punc request schema that `$ref`s an Entity virally inherits all physical database polymorphism rules for that target.

-* **Shape Polymorphism (`$family`)**: Auto-expands polymorphic API lists based on an abstract **Descendants Graph**. If `{"$family": "widget"}` is used, the Validator dynamically identifies *every* schema in the registry that `$ref`s `widget` (e.g., `stock.widget`, `task.widget`) and evaluates the JSON against all of them.
+* **Shape Polymorphism (`$family`)**: Unlike `$ref`, `$family` ALWAYS targets an abstract *table lineage* (e.g., `organization` or `widget`). It instructs the engine to dynamically expand the response payload into multiple possible schema shapes based on the row's physical database `type`. If `{"$family": "widget"}` is used, the Validator dynamically identifies *every* schema in the registry that `$ref`s `widget` (e.g., `stock.widget`, `task.widget`) and recursively evaluates the JSON against all of them.

 * **Strict Matches & Depth Heuristic**: Polymorphic structures MUST match exactly **one** schema permutation. If multiple inherited struct permutations pass, JSPG applies the **Depth Heuristic Tie-Breaker**, selecting the candidate deepest in the inheritance tree.
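The `$family` Descendants Graph expansion described above can be modeled, as a rough sketch, by walking each registered schema's `$ref` chain up toward the family root. The registry layout (a flat map from `$id` to its `$ref` parent) is an assumption for illustration only; JSPG's actual registry structure is not shown in this diff.

```rust
use std::collections::BTreeMap;

// Illustrative registry: schema $id -> the $ref target it inherits from.
// A schema belongs to a family if its $ref chain reaches the family root.
// Assumes the inheritance graph is acyclic.
fn expand_family(registry: &BTreeMap<&str, Option<&str>>, family: &str) -> Vec<String> {
    registry.iter()
        .filter(|(id, _)| {
            let mut cur: &str = *id;
            loop {
                if cur == family {
                    return true; // reached the family root
                }
                match registry.get(cur).and_then(|parent| *parent) {
                    Some(parent) => cur = parent, // climb one $ref edge
                    None => return false,         // chain ended elsewhere
                }
            }
        })
        .map(|(id, _)| id.to_string())
        .collect()
}

fn main() {
    let mut reg: BTreeMap<&str, Option<&str>> = BTreeMap::new();
    reg.insert("widget", None);
    reg.insert("stock.widget", Some("widget"));
    reg.insert("task.widget", Some("widget"));
    reg.insert("unrelated", None);

    let fam = expand_family(&reg, "widget");
    assert!(fam.contains(&"stock.widget".to_string()));
    assert!(fam.contains(&"task.widget".to_string()));
    assert!(!fam.contains(&"unrelated".to_string()));
}
```

Each candidate shape found this way would then be evaluated against the payload, with the strict single-match rule and Depth Heuristic applied afterwards.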
 #### B. Dot-Notation Schema Resolution & Database Mapping

@@ -113,7 +117,7 @@ The Queryer transforms Postgres into a pre-compiled Semantic Query Engine, desig

 * **Caching Strategy (DashMap SQL Caching)**: The Queryer securely caches its compiled, static SQL string templates per schema permutation inside the `GLOBAL_JSPG` concurrent `DashMap`. This eliminates recursive AST schema crawling on consecutive requests. Furthermore, it evaluates the strings via Postgres SPI (Server Programming Interface) Prepared Statements, leveraging native database caching of execution plans for extreme performance.

 * **Schema-to-SQL Compilation**: Compiles JSON Schema ASTs spanning deep arrays directly into static, pre-planned SQL multi-JOIN queries. This explicitly features the `Smart Merge` evaluation engine which natively translates properties through `allOf` and `$ref` inheritances, mapping JSON fields specifically to their physical database table aliases during translation.

-* **Dynamic Filtering**: Binds parameters natively through `cue.filters` objects. The queryer enforces a strict, structured, MongoDB-style operator syntax to map incoming JSON request paths directly to their originating structural table columns.
+* **Dynamic Filtering**: Binds parameters natively through `cue.filters` objects. The queryer enforces a strict, structured, MongoDB-style operator syntax to map incoming JSON request constraints directly to their originating structural table columns. Filters support both flat path notation (e.g., `"contacts/is_primary": {...}`) and deeply nested recursive JSON structures (e.g., `{"contacts": {"is_primary": {...}}}`). The queryer recursively traverses and flattens these structures at AST compilation time.

 * **Equality / Inequality**: `{"$eq": value}`, `{"$ne": value}` automatically map to `=` and `!=`.

 * **Comparison**: `{"$gt": ...}`, `{"$gte": ...}`, `{"$lt": ...}`, `{"$lte": ...}` directly compile to Postgres comparison operators (`>`, `>=`, `<`, `<=`).

 * **Array Inclusion**: `{"$in": [values]}`, `{"$nin": [values]}` use native `jsonb_array_elements_text()` bindings to enforce `IN` and `NOT IN` logic without runtime SQL injection risks.
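As a minimal sketch, the operator bullets above reduce to a lookup table like this. The function name is illustrative, and real JSPG binds values through SPI prepared statements rather than splicing operator strings into SQL.

```rust
// Maps a MongoDB-style filter operator to its SQL counterpart.
// Unknown operators return None so the compiler can reject them explicitly.
fn sql_operator(op: &str) -> Option<&'static str> {
    match op {
        "$eq" => Some("="),
        "$ne" => Some("!="),
        "$gt" => Some(">"),
        "$gte" => Some(">="),
        "$lt" => Some("<"),
        "$lte" => Some("<="),
        "$in" => Some("IN"),       // bound via jsonb_array_elements_text()
        "$nin" => Some("NOT IN"),  // bound via jsonb_array_elements_text()
        _ => None,
    }
}

fn main() {
    assert_eq!(sql_operator("$gte"), Some(">="));
    assert_eq!(sql_operator("$nin"), Some("NOT IN"));
    // Rejecting unknown operators keeps compilation deterministic.
    assert_eq!(sql_operator("$regex"), None);
}
```

Returning `None` for unrecognized operators matches the engine's general stance of aborting deterministically instead of emitting unpredictable SQL.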

agreego.sql — 0 changes (new file)
fixtures/database.json — 388 changes (new file)

@@ -0,0 +1,388 @@
[
  {
    "description": "Edge missing - 0 relations",
    "database": {
      "types": [
        {
          "id": "11111111-1111-1111-1111-111111111111",
          "type": "type",
          "name": "org",
          "module": "test",
          "source": "test",
          "hierarchy": [
            "org"
          ],
          "variations": [
            "org"
          ],
          "schemas": [
            {
              "$id": "full.org",
              "type": "object",
              "properties": {
                "missing_users": {
                  "type": "array",
                  "items": {
                    "$ref": "full.user"
                  }
                }
              }
            }
          ]
        },
        {
          "id": "22222222-2222-2222-2222-222222222222",
          "type": "type",
          "name": "user",
          "module": "test",
          "source": "test",
          "hierarchy": [
            "user"
          ],
          "variations": [
            "user"
          ],
          "schemas": [
            {
              "$id": "full.user",
              "type": "object",
              "properties": {}
            }
          ]
        }
      ],
      "relations": []
    },
    "tests": [
      {
        "description": "throws EDGE_MISSING when 0 relations exist between org and user",
        "action": "compile",
        "expect": {
          "success": false,
          "errors": [
            {
              "code": "EDGE_MISSING"
            }
          ]
        }
      }
    ]
  },
  {
    "description": "Edge missing - array cardinality rejection",
    "database": {
      "types": [
        {
          "id": "11111111-1111-1111-1111-111111111111",
          "type": "type",
          "name": "parent",
          "module": "test",
          "source": "test",
          "hierarchy": [
            "parent"
          ],
          "variations": [
            "parent"
          ],
          "schemas": [
            {
              "$id": "full.parent",
              "type": "object",
              "properties": {
                "children": {
                  "type": "array",
                  "items": {
                    "$ref": "full.child"
                  }
                }
              }
            }
          ]
        },
        {
          "id": "22222222-2222-2222-2222-222222222222",
          "type": "type",
          "name": "child",
          "module": "test",
          "source": "test",
          "hierarchy": [
            "child"
          ],
          "variations": [
            "child"
          ],
          "schemas": [
            {
              "$id": "full.child",
              "type": "object",
              "properties": {}
            }
          ]
        }
      ],
      "relations": [
        {
          "id": "33333333-3333-3333-3333-333333333333",
          "type": "relation",
          "constraint": "fk_parent_child",
          "source_type": "parent",
          "source_columns": [
            "child_id"
          ],
          "destination_type": "child",
          "destination_columns": [
            "id"
          ]
        }
      ]
    },
    "tests": [
      {
        "description": "throws EDGE_MISSING because a Forward scalar edge cannot mathematically fulfill an Array collection",
        "action": "compile",
        "expect": {
          "success": false,
          "errors": [
            {
              "code": "EDGE_MISSING"
            }
          ]
        }
      }
    ]
  },
  {
    "description": "Ambiguous type relations - multiple unprefixed relations",
    "database": {
      "types": [
        {
          "id": "11111111-1111-1111-1111-111111111111",
          "type": "type",
          "name": "invoice",
          "module": "test",
          "source": "test",
          "hierarchy": [
            "invoice"
          ],
          "variations": [
            "invoice"
          ],
          "schemas": [
            {
              "$id": "full.invoice",
              "type": "object",
              "properties": {
                "activities": {
                  "type": "array",
                  "items": {
                    "$ref": "full.activity"
                  }
                }
              }
            }
          ]
        },
        {
          "id": "22222222-2222-2222-2222-222222222222",
          "type": "type",
          "name": "activity",
          "module": "test",
          "source": "test",
          "hierarchy": [
            "activity"
          ],
          "variations": [
            "activity"
          ],
          "schemas": [
            {
              "$id": "full.activity",
              "type": "object",
              "properties": {}
            }
          ]
        }
      ],
      "relations": [
        {
          "id": "33333333-3333-3333-3333-333333333333",
          "type": "relation",
          "constraint": "fk_activity_invoice_1",
          "source_type": "activity",
          "source_columns": [
            "invoice_id_1"
          ],
          "destination_type": "invoice",
          "destination_columns": [
            "id"
          ]
        },
        {
          "id": "44444444-4444-4444-4444-444444444444",
          "type": "relation",
          "constraint": "fk_activity_invoice_2",
          "source_type": "activity",
          "source_columns": [
            "invoice_id_2"
          ],
          "destination_type": "invoice",
          "destination_columns": [
            "id"
          ]
        }
      ]
    },
    "tests": [
      {
        "description": "throws AMBIGUOUS_TYPE_RELATIONS when fallback encounters multiple naked constraints",
        "action": "compile",
        "expect": {
          "success": false,
          "errors": [
            {
              "code": "AMBIGUOUS_TYPE_RELATIONS"
            }
          ]
        }
      }
    ]
  },
  {
    "description": "Ambiguous type relations - M:M twin deduction failure",
    "database": {
      "types": [
        {
          "id": "11111111-1111-1111-1111-111111111111",
          "type": "type",
          "name": "actor",
          "module": "test",
          "source": "test",
          "hierarchy": [
            "actor"
          ],
          "variations": [
            "actor"
          ],
          "schemas": [
            {
              "$id": "full.actor",
              "type": "object",
              "properties": {
                "ambiguous_edge": {
                  "type": "array",
                  "items": {
                    "$ref": "empty.junction"
                  }
                }
              }
            }
          ]
        },
        {
          "id": "22222222-2222-2222-2222-222222222222",
          "type": "type",
          "name": "junction",
          "module": "test",
          "source": "test",
          "hierarchy": [
            "junction"
          ],
          "variations": [
            "junction"
          ],
          "schemas": [
            {
              "$id": "empty.junction",
              "type": "object",
              "properties": {}
            }
          ]
        }
      ],
      "relations": [
        {
          "id": "33333333-3333-3333-3333-333333333333",
          "type": "relation",
          "constraint": "fk_junction_source_actor",
          "source_type": "junction",
          "source_columns": [
            "source_id"
          ],
          "destination_type": "actor",
          "destination_columns": [
            "id"
          ],
          "prefix": "source"
        },
        {
          "id": "44444444-4444-4444-4444-444444444444",
          "type": "relation",
          "constraint": "fk_junction_target_actor",
          "source_type": "junction",
          "source_columns": [
            "target_id"
          ],
          "destination_type": "actor",
          "destination_columns": [
            "id"
          ],
          "prefix": "target"
        }
      ]
    },
    "tests": [
      {
        "description": "throws AMBIGUOUS_TYPE_RELATIONS because child doesn't explicitly expose 'source' or 'target' for twin deduction",
        "action": "compile",
        "expect": {
          "success": false,
          "errors": [
            {
              "code": "AMBIGUOUS_TYPE_RELATIONS"
            }
          ]
        }
      }
    ]
  },
  {
    "description": "Database type parse failed",
    "database": {
      "types": [
        {
          "id": [
            "must",
            "be",
            "string",
            "to",
            "fail"
          ],
          "type": "type",
          "name": "failure",
          "module": "test",
          "source": "test",
          "hierarchy": [
            "failure"
          ],
          "variations": [
            "failure"
          ]
        }
      ]
    },
    "tests": [
      {
        "description": "throws DATABASE_TYPE_PARSE_FAILED when metadata completely fails Serde typing",
        "action": "compile",
        "expect": {
          "success": false,
          "errors": [
            {
              "code": "DATABASE_TYPE_PARSE_FAILED"
            }
          ]
        }
      }
    ]
  }
]
@@ -142,7 +142,7 @@
       "errors": [
         {
           "code": "CONST_VIOLATED",
-          "path": "con"
+          "details": { "path": "con" }
         }
       ]
     }

@@ -154,8 +154,8 @@
       "success": false,
       "errors": [
         {
-          "code": "FAMILY_MISMATCH",
-          "path": ""
+          "code": "NO_FAMILY_MATCH",
+          "details": { "path": "" }
         }
       ]
     }

@@ -47,8 +47,8 @@
       "success": false,
       "errors": [
         {
-          "code": "TYPE_MISMATCH",
-          "path": "base_prop"
+          "code": "INVALID_TYPE",
+          "details": { "path": "base_prop" }
         }
       ]
     }

@@ -109,7 +109,7 @@
       "errors": [
         {
           "code": "REQUIRED_FIELD_MISSING",
-          "path": "a"
+          "details": { "path": "a" }
         }
       ]
     }

@@ -126,7 +126,7 @@
       "errors": [
         {
           "code": "REQUIRED_FIELD_MISSING",
-          "path": "b"
+          "details": { "path": "b" }
         }
       ]
     }

@@ -195,8 +195,8 @@
       "success": false,
       "errors": [
         {
-          "code": "DEPENDENCY_FAILED",
-          "path": "base_dep"
+          "code": "DEPENDENCY_MISSING",
+          "details": { "path": "" }
         }
       ]
     }

@@ -213,8 +213,8 @@
       "success": false,
       "errors": [
         {
-          "code": "DEPENDENCY_FAILED",
-          "path": "child_dep"
+          "code": "DEPENDENCY_MISSING",
+          "details": { "path": "" }
         }
       ]
     }
@@ -123,7 +123,7 @@
       "errors": [
         {
           "code": "INVALID_TYPE",
-          "path": "primitives/1"
+          "details": { "path": "primitives/1" }
         }
       ]
     }

@@ -147,7 +147,7 @@
       "errors": [
         {
           "code": "REQUIRED_FIELD_MISSING",
-          "path": "ad_hoc_objects/1/name"
+          "details": { "path": "ad_hoc_objects/1/name" }
         }
       ]
     }

@@ -173,7 +173,7 @@
       "errors": [
         {
           "code": "MINIMUM_VIOLATED",
-          "path": "entities/entity-beta/value"
+          "details": { "path": "entities/entity-beta/value" }
         }
       ]
     }

@@ -204,7 +204,7 @@
       "errors": [
         {
           "code": "INVALID_TYPE",
-          "path": "deep_entities/parent-omega/nested/child-beta/flag"
+          "details": { "path": "deep_entities/parent-omega/nested/child-beta/flag" }
         }
       ]
     }
@@ -4,20 +4,32 @@
   "database": {
     "puncs": [
       {
-        "name": "get_entities",
+        "name": "get_organization",
         "schemas": [
           {
-            "$id": "get_entities.response",
-            "$family": "organization"
+            "$id": "get_organization.response",
+            "$ref": "organization"
           }
         ]
       },
       {
-        "name": "get_persons",
+        "name": "get_organizations",
         "schemas": [
           {
-            "$id": "get_persons.response",
-            "$family": "base.person"
+            "$id": "get_organizations.response",
+            "type": "array",
+            "items": {
+              "$family": "organization"
+            }
+          }
+        ]
+      },
+      {
+        "name": "get_person",
+        "schemas": [
+          {
+            "$id": "get_person.response",
+            "$family": "person"
           }
         ]
       },
@@ -122,14 +134,12 @@
       "entity": [
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ]
     },
     "field_types": {
       "id": "uuid",
-      "name": "text",
       "archived": "boolean",
       "created_at": "timestamptz",
       "type": "text"

@@ -146,9 +156,6 @@
       "type": {
         "type": "string"
       },
-      "name": {
-        "type": "string"
-      },
       "archived": {
         "type": "boolean"
       },

@@ -165,7 +172,6 @@
       "fields": [
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],

@@ -200,11 +206,12 @@
       "entity": [
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],
-      "organization": []
+      "organization": [
+        "name"
+      ]
     },
     "field_types": {
       "id": "uuid",

@@ -227,6 +234,17 @@
       "bot",
       "organization",
       "person"
+      ],
+      "schemas": [
+        {
+          "$id": "organization",
+          "$ref": "entity",
+          "properties": {
+            "name": {
+              "type": "string"
+            }
+          }
+        }
       ]
     },
     {
@@ -248,11 +266,12 @@
       "entity": [
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],
-      "organization": [],
+      "organization": [
+        "name"
+      ],
       "bot": [
         "token"
       ]

@@ -301,11 +320,12 @@
       "entity": [
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],
-      "organization": [],
+      "organization": [
+        "name"
+      ],
       "person": [
         "first_name",
         "last_name",

@@ -324,7 +344,7 @@
     },
     "schemas": [
       {
-        "$id": "base.person",
+        "$id": "person",
         "$ref": "organization",
         "properties": {
           "first_name": {

@@ -340,12 +360,12 @@
       },
       {
         "$id": "light.person",
-        "$ref": "base.person",
+        "$ref": "person",
         "properties": {}
       },
       {
         "$id": "full.person",
-        "$ref": "base.person",
+        "$ref": "person",
         "properties": {
           "phone_numbers": {
             "type": "array",
@@ -422,7 +442,6 @@
         "target_type",
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],

@@ -430,7 +449,6 @@
       "entity": [
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],

@@ -449,7 +467,6 @@
       "source_type": "text",
       "target_id": "uuid",
       "target_type": "text",
-      "name": "text",
       "created_at": "timestamptz"
     },
     "schemas": [

@@ -480,7 +497,6 @@
         "target_type",
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],

@@ -488,7 +504,6 @@
       "entity": [
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],

@@ -511,7 +526,6 @@
       "target_id": "uuid",
       "target_type": "text",
       "is_primary": "boolean",
-      "name": "text",
       "created_at": "timestamptz"
     },
     "schemas": [

@@ -539,7 +553,6 @@
         "number",
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],

@@ -547,7 +560,6 @@
       "entity": [
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],

@@ -560,7 +572,6 @@
       "type": "text",
       "archived": "boolean",
       "number": "text",
-      "name": "text",
       "created_at": "timestamptz"
     },
     "schemas": [

@@ -588,7 +599,6 @@
         "address",
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],

@@ -596,7 +606,6 @@
       "entity": [
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],

@@ -609,7 +618,6 @@
       "type": "text",
       "archived": "boolean",
       "address": "text",
-      "name": "text",
       "created_at": "timestamptz"
     },
     "schemas": [

@@ -637,7 +645,6 @@
         "city",
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],

@@ -645,7 +652,6 @@
       "entity": [
         "id",
         "type",
-        "name",
         "archived",
         "created_at"
       ],

@@ -658,7 +664,6 @@
       "type": "text",
       "archived": "boolean",
       "city": "text",
-      "name": "text",
       "created_at": "timestamptz"
     },
     "schemas": [
```diff
@@ -696,7 +701,7 @@
 "$ref": "order",
 "properties": {
 "customer": {
-"$ref": "base.person"
+"$ref": "person"
 }
 }
 },
@@ -705,7 +710,7 @@
 "$ref": "order",
 "properties": {
 "customer": {
-"$ref": "base.person"
+"$ref": "person"
 },
 "lines": {
 "type": "array",
@@ -723,7 +728,6 @@
 "fields": [
 "id",
 "type",
-"name",
 "total",
 "customer_id",
 "created_at",
@@ -746,7 +750,6 @@
 "entity": [
 "id",
 "type",
-"name",
 "created_at",
 "created_by",
 "modified_at",
@@ -762,7 +765,6 @@
 "field_types": {
 "id": "uuid",
 "type": "text",
-"name": "text",
 "archived": "boolean",
 "total": "numeric",
 "customer_id": "uuid",
@@ -803,7 +805,6 @@
 "fields": [
 "id",
 "type",
-"name",
 "order_id",
 "product",
 "price",
@@ -824,7 +825,6 @@
 "entity": [
 "id",
 "type",
-"name",
 "created_at",
 "created_by",
 "modified_at",
@@ -838,7 +838,6 @@
 "field_types": {
 "id": "uuid",
 "type": "text",
-"name": "text",
 "archived": "boolean",
 "order_id": "uuid",
 "product": "text",
@@ -852,31 +851,6 @@
 "order_line"
 ]
 }
-],
-"schemas": [
-{
-"$id": "entity",
-"type": "object",
-"properties": {}
-},
-{
-"$id": "organization",
-"type": "object",
-"$ref": "entity",
-"properties": {}
-},
-{
-"$id": "bot",
-"type": "object",
-"$ref": "bot",
-"properties": {}
-},
-{
-"$id": "person",
-"type": "object",
-"$ref": "base.person",
-"properties": {}
-}
 ]
 },
 "tests": [
```
```diff
@@ -892,7 +866,6 @@
 " 'archived', entity_1.archived,",
 " 'created_at', entity_1.created_at,",
 " 'id', entity_1.id,",
-" 'name', entity_1.name,",
 " 'type', entity_1.type)",
 "FROM agreego.entity entity_1",
 "WHERE NOT entity_1.archived)"
@@ -915,22 +888,6 @@
 "123e4567-e89b-12d3-a456-426614174001"
 ]
 },
-"name": {
-"$eq": "Jane%",
-"$ne": "John%",
-"$gt": "A",
-"$gte": "B",
-"$lt": "Z",
-"$lte": "Y",
-"$in": [
-"Jane",
-"John"
-],
-"$nin": [
-"Bob",
-"Alice"
-]
-},
 "created_at": {
 "$eq": "2023-01-01T00:00:00Z",
 "$ne": "2023-01-02T00:00:00Z",
@@ -952,7 +909,6 @@
 " 'archived', entity_1.archived,",
 " 'created_at', entity_1.created_at,",
 " 'id', entity_1.id,",
-" 'name', entity_1.name,",
 " 'type', entity_1.type",
 ")",
 "FROM agreego.entity entity_1",
@@ -970,14 +926,6 @@
 " AND entity_1.id IN (SELECT value::uuid FROM jsonb_array_elements_text(($10#>>'{}')::jsonb))",
 " AND entity_1.id != ($11#>>'{}')::uuid",
 " AND entity_1.id NOT IN (SELECT value::uuid FROM jsonb_array_elements_text(($12#>>'{}')::jsonb))",
-" AND entity_1.name ILIKE $13#>>'{}'",
-" AND entity_1.name > ($14#>>'{}')",
-" AND entity_1.name >= ($15#>>'{}')",
-" AND entity_1.name IN (SELECT value FROM jsonb_array_elements_text(($16#>>'{}')::jsonb))",
-" AND entity_1.name < ($17#>>'{}')",
-" AND entity_1.name <= ($18#>>'{}')",
-" AND entity_1.name NOT ILIKE $19#>>'{}'",
-" AND entity_1.name NOT IN (SELECT value FROM jsonb_array_elements_text(($20#>>'{}')::jsonb))",
 ")"
 ]
 ]
```
```diff
@@ -986,7 +934,7 @@
 {
 "description": "Person select on base schema",
 "action": "query",
-"schema_id": "base.person",
+"schema_id": "person",
 "expect": {
 "success": true,
 "sql": [
@@ -998,7 +946,7 @@
 " 'first_name', person_1.first_name,",
 " 'id', entity_3.id,",
 " 'last_name', person_1.last_name,",
-" 'name', entity_3.name,",
+" 'name', organization_2.name,",
 " 'type', entity_3.type)",
 "FROM agreego.person person_1",
 "JOIN agreego.organization organization_2 ON organization_2.id = person_1.id",
@@ -1023,14 +971,12 @@
 " 'created_at', entity_6.created_at,",
 " 'id', entity_6.id,",
 " 'is_primary', contact_4.is_primary,",
-" 'name', entity_6.name,",
 " 'target',",
 " (SELECT jsonb_build_object(",
 " 'archived', entity_8.archived,",
 " 'city', address_7.city,",
 " 'created_at', entity_8.created_at,",
 " 'id', entity_8.id,",
-" 'name', entity_8.name,",
 " 'type', entity_8.type",
 " )",
 " FROM agreego.address address_7",
@@ -1055,7 +1001,6 @@
 " 'created_at', entity_11.created_at,",
 " 'id', entity_11.id,",
 " 'is_primary', contact_9.is_primary,",
-" 'name', entity_11.name,",
 " 'target', CASE",
 " WHEN entity_11.target_type = 'address' THEN",
 " ((SELECT jsonb_build_object(",
@@ -1063,7 +1008,6 @@
 " 'city', address_16.city,",
 " 'created_at', entity_17.created_at,",
 " 'id', entity_17.id,",
-" 'name', entity_17.name,",
 " 'type', entity_17.type",
 " )",
 " FROM agreego.address address_16",
@@ -1077,7 +1021,6 @@
 " 'archived', entity_15.archived,",
 " 'created_at', entity_15.created_at,",
 " 'id', entity_15.id,",
-" 'name', entity_15.name,",
 " 'type', entity_15.type",
 " )",
 " FROM agreego.email_address email_address_14",
@@ -1090,7 +1033,6 @@
 " 'archived', entity_13.archived,",
 " 'created_at', entity_13.created_at,",
 " 'id', entity_13.id,",
-" 'name', entity_13.name,",
 " 'number', phone_number_12.number,",
 " 'type', entity_13.type",
 " )",
@@ -1115,14 +1057,12 @@
 " 'created_at', entity_20.created_at,",
 " 'id', entity_20.id,",
 " 'is_primary', contact_18.is_primary,",
-" 'name', entity_20.name,",
 " 'target',",
 " (SELECT jsonb_build_object(",
 " 'address', email_address_21.address,",
 " 'archived', entity_22.archived,",
 " 'created_at', entity_22.created_at,",
 " 'id', entity_22.id,",
-" 'name', entity_22.name,",
 " 'type', entity_22.type",
 " )",
 " FROM agreego.email_address email_address_21",
@@ -1142,20 +1082,18 @@
 " 'first_name', person_1.first_name,",
 " 'id', entity_3.id,",
 " 'last_name', person_1.last_name,",
-" 'name', entity_3.name,",
+" 'name', organization_2.name,",
 " 'phone_numbers',",
 " (SELECT COALESCE(jsonb_agg(jsonb_build_object(",
 " 'archived', entity_25.archived,",
 " 'created_at', entity_25.created_at,",
 " 'id', entity_25.id,",
 " 'is_primary', contact_23.is_primary,",
-" 'name', entity_25.name,",
 " 'target',",
 " (SELECT jsonb_build_object(",
 " 'archived', entity_27.archived,",
 " 'created_at', entity_27.created_at,",
 " 'id', entity_27.id,",
-" 'name', entity_27.name,",
 " 'number', phone_number_26.number,",
 " 'type', entity_27.type",
 " )",
```
```diff
@@ -1208,8 +1146,10 @@
 "$eq": true,
 "$ne": false
 },
-"contacts/is_primary": {
-"$eq": true
+"contacts": {
+"is_primary": {
+"$eq": true
+}
 },
 "created_at": {
 "$eq": "2020-01-01T00:00:00Z",
@@ -1248,8 +1188,12 @@
 "$eq": "%Doe%",
 "$ne": "%Smith%"
 },
-"phone_numbers/target/number": {
-"$eq": "555-1234"
+"phone_numbers": {
+"target": {
+"number": {
+"$eq": "555-1234"
+}
+}
 }
 },
 "expect": {
```
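The fixture edits above replace slash-delimited filter paths such as `"contacts/is_primary"` with nested objects. The rewrite is mechanical — each `/` segment becomes one level of nesting. A dependency-free sketch (the `Node` type and `nest` function below are illustrative only, not the engine's API):

```rust
use std::collections::BTreeMap;

// Minimal JSON-like tree: just enough structure to show the rewrite.
#[derive(Debug, PartialEq)]
enum Node {
    Leaf(String),
    Map(BTreeMap<String, Node>),
}

// Turn "contacts/is_primary" plus a leaf into {"contacts": {"is_primary": leaf}}.
// Fold from the right so the innermost segment wraps the leaf first.
fn nest(path: &str, leaf: &str) -> Node {
    path.rsplit('/').fold(Node::Leaf(leaf.to_string()), |inner, seg| {
        let mut m = BTreeMap::new();
        m.insert(seg.to_string(), inner);
        Node::Map(m)
    })
}

fn main() {
    // Produces the nested shape used by the updated fixtures.
    println!("{:?}", nest("contacts/is_primary", "true"));
}
```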
```diff
@@ -1263,14 +1207,12 @@
 " 'created_at', entity_6.created_at,",
 " 'id', entity_6.id,",
 " 'is_primary', contact_4.is_primary,",
-" 'name', entity_6.name,",
 " 'target',",
 " (SELECT jsonb_build_object(",
 " 'archived', entity_8.archived,",
 " 'city', address_7.city,",
 " 'created_at', entity_8.created_at,",
 " 'id', entity_8.id,",
-" 'name', entity_8.name,",
 " 'type', entity_8.type",
 " )",
 " FROM agreego.address address_7",
@@ -1295,7 +1237,6 @@
 " 'created_at', entity_11.created_at,",
 " 'id', entity_11.id,",
 " 'is_primary', contact_9.is_primary,",
-" 'name', entity_11.name,",
 " 'target', CASE",
 " WHEN entity_11.target_type = 'address' THEN",
 " ((SELECT jsonb_build_object(",
@@ -1303,7 +1244,6 @@
 " 'city', address_16.city,",
 " 'created_at', entity_17.created_at,",
 " 'id', entity_17.id,",
-" 'name', entity_17.name,",
 " 'type', entity_17.type",
 " )",
 " FROM agreego.address address_16",
@@ -1317,7 +1257,6 @@
 " 'archived', entity_15.archived,",
 " 'created_at', entity_15.created_at,",
 " 'id', entity_15.id,",
-" 'name', entity_15.name,",
 " 'type', entity_15.type",
 " )",
 " FROM agreego.email_address email_address_14",
@@ -1330,7 +1269,6 @@
 " 'archived', entity_13.archived,",
 " 'created_at', entity_13.created_at,",
 " 'id', entity_13.id,",
-" 'name', entity_13.name,",
 " 'number', phone_number_12.number,",
 " 'type', entity_13.type",
 " )",
@@ -1356,14 +1294,12 @@
 " 'created_at', entity_20.created_at,",
 " 'id', entity_20.id,",
 " 'is_primary', contact_18.is_primary,",
-" 'name', entity_20.name,",
 " 'target',",
 " (SELECT jsonb_build_object(",
 " 'address', email_address_21.address,",
 " 'archived', entity_22.archived,",
 " 'created_at', entity_22.created_at,",
 " 'id', entity_22.id,",
-" 'name', entity_22.name,",
 " 'type', entity_22.type",
 " )",
 " FROM agreego.email_address email_address_21",
@@ -1383,20 +1319,18 @@
 " 'first_name', person_1.first_name,",
 " 'id', entity_3.id,",
 " 'last_name', person_1.last_name,",
-" 'name', entity_3.name,",
+" 'name', organization_2.name,",
 " 'phone_numbers',",
 " (SELECT COALESCE(jsonb_agg(jsonb_build_object(",
 " 'archived', entity_25.archived,",
 " 'created_at', entity_25.created_at,",
 " 'id', entity_25.id,",
 " 'is_primary', contact_23.is_primary,",
-" 'name', entity_25.name,",
 " 'target',",
 " (SELECT jsonb_build_object(",
 " 'archived', entity_27.archived,",
 " 'created_at', entity_27.created_at,",
 " 'id', entity_27.id,",
-" 'name', entity_27.name,",
 " 'number', phone_number_26.number,",
 " 'type', entity_27.type",
 " )",
@@ -1469,14 +1403,12 @@
 " 'created_at', entity_3.created_at,",
 " 'id', entity_3.id,",
 " 'is_primary', contact_1.is_primary,",
-" 'name', entity_3.name,",
 " 'target',",
 " (SELECT jsonb_build_object(",
 " 'address', email_address_4.address,",
 " 'archived', entity_5.archived,",
 " 'created_at', entity_5.created_at,",
 " 'id', entity_5.id,",
-" 'name', entity_5.name,",
 " 'type', entity_5.type",
 " )",
 " FROM agreego.email_address email_address_4",
```
```diff
@@ -1515,7 +1447,7 @@
 " 'first_name', person_3.first_name,",
 " 'id', entity_5.id,",
 " 'last_name', person_3.last_name,",
-" 'name', entity_5.name,",
+" 'name', organization_4.name,",
 " 'type', entity_5.type",
 " )",
 " FROM agreego.person person_3",
@@ -1531,7 +1463,6 @@
 " 'archived', entity_7.archived,",
 " 'created_at', entity_7.created_at,",
 " 'id', entity_7.id,",
-" 'name', entity_7.name,",
 " 'order_id', order_line_6.order_id,",
 " 'price', order_line_6.price,",
 " 'product', order_line_6.product,",
@@ -1542,7 +1473,6 @@
 " WHERE",
 " NOT entity_7.archived",
 " AND order_line_6.order_id = order_1.id),",
-" 'name', entity_2.name,",
 " 'total', order_1.total,",
 " 'type', entity_2.type",
 ")",
@@ -1554,14 +1484,36 @@
 }
 },
 {
-"description": "Base entity family select on polymorphic tree",
+"description": "Organization select via a punc response with ref",
 "action": "query",
-"schema_id": "get_entities.response",
+"schema_id": "get_organization.response",
 "expect": {
 "success": true,
 "sql": [
 [
 "(SELECT jsonb_build_object(",
+" 'archived', entity_2.archived,",
+" 'created_at', entity_2.created_at,",
+" 'id', entity_2.id,",
+" 'name', organization_1.name,",
+" 'type', entity_2.type",
+")",
+"FROM agreego.organization organization_1",
+"JOIN agreego.entity entity_2 ON entity_2.id = organization_1.id",
+"WHERE NOT entity_2.archived)"
+]
+]
+}
+},
+{
+"description": "Organizations select via a punc response with family",
+"action": "query",
+"schema_id": "get_organizations.response",
+"expect": {
+"success": true,
+"sql": [
+[
+"(SELECT COALESCE(jsonb_agg(jsonb_build_object(",
 " 'id', organization_1.id,",
 " 'type', CASE",
 " WHEN organization_1.type = 'bot' THEN",
```
```diff
@@ -1569,7 +1521,7 @@
 " 'archived', entity_5.archived,",
 " 'created_at', entity_5.created_at,",
 " 'id', entity_5.id,",
-" 'name', entity_5.name,",
+" 'name', organization_4.name,",
 " 'token', bot_3.token,",
 " 'type', entity_5.type",
 " )",
@@ -1582,7 +1534,7 @@
 " 'archived', entity_7.archived,",
 " 'created_at', entity_7.created_at,",
 " 'id', entity_7.id,",
-" 'name', entity_7.name,",
+" 'name', organization_6.name,",
 " 'type', entity_7.type",
 " )",
 " FROM agreego.organization organization_6",
@@ -1596,7 +1548,7 @@
 " 'first_name', person_8.first_name,",
 " 'id', entity_10.id,",
 " 'last_name', person_8.last_name,",
-" 'name', entity_10.name,",
+" 'name', organization_9.name,",
 " 'type', entity_10.type",
 " )",
 " FROM agreego.person person_8",
@@ -1604,7 +1556,7 @@
 " JOIN agreego.entity entity_10 ON entity_10.id = organization_9.id",
 " WHERE NOT entity_10.archived))",
 " ELSE NULL END",
-")",
+")), '[]'::jsonb)",
 "FROM agreego.organization organization_1",
 "JOIN agreego.entity entity_2 ON entity_2.id = organization_1.id",
 "WHERE NOT entity_2.archived)"
@@ -1613,7 +1565,33 @@
 }
 },
 {
-"description": "Root Array SQL evaluation for Order fetching Light Order",
+"description": "Person select via a punc response with family",
+"action": "query",
+"schema_id": "get_person.response",
+"expect": {
+"success": true,
+"sql": [
+[
+"(SELECT jsonb_build_object(",
+" 'age', person_1.age,",
+" 'archived', entity_3.archived,",
+" 'created_at', entity_3.created_at,",
+" 'first_name', person_1.first_name,",
+" 'id', entity_3.id,",
+" 'last_name', person_1.last_name,",
+" 'name', organization_2.name,",
+" 'type', entity_3.type",
+")",
+"FROM agreego.person person_1",
+"JOIN agreego.organization organization_2 ON organization_2.id = person_1.id",
+"JOIN agreego.entity entity_3 ON entity_3.id = organization_2.id",
+"WHERE NOT entity_3.archived)"
+]
+]
+}
+},
+{
+"description": "Orders select via a punc with items",
 "action": "query",
 "schema_id": "get_orders.response",
 "expect": {
@@ -1631,7 +1609,7 @@
 " 'first_name', person_3.first_name,",
 " 'id', entity_5.id,",
 " 'last_name', person_3.last_name,",
-" 'name', entity_5.name,",
+" 'name', organization_4.name,",
 " 'type', entity_5.type",
 " )",
 " FROM agreego.person person_3",
@@ -1642,7 +1620,6 @@
 " AND order_1.customer_id = person_3.id),",
 " 'customer_id', order_1.customer_id,",
 " 'id', entity_2.id,",
-" 'name', entity_2.name,",
 " 'total', order_1.total,",
 " 'type', entity_2.type",
 ")), '[]'::jsonb)",
```
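Several expected root-array queries in the fixtures now close with `")), '[]'::jsonb)"` rather than a bare `")"`: the `jsonb_agg` aggregate is wrapped in `COALESCE` so an empty result set serializes as `[]` instead of SQL `NULL`. A hedged sketch of that wrapper (the helper name below is hypothetical, not the engine's actual codegen):

```rust
// Hypothetical helper mirroring the fixtures' root-array shape: aggregate the
// per-row object and fall back to an empty JSON array when no rows match.
fn wrap_root_array(object_expr: &str, from_and_where: &str) -> String {
    format!(
        "(SELECT COALESCE(jsonb_agg({}), '[]'::jsonb) {})",
        object_expr, from_and_where
    )
}

fn main() {
    let sql = wrap_root_array(
        "jsonb_build_object('id', o.id)",
        "FROM orders o WHERE NOT o.archived",
    );
    println!("{}", sql);
}
```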
```diff
@@ -676,8 +676,8 @@
 "success": false,
 "errors": [
 {
-"code": "TYPE_MISMATCH",
-"path": "type"
+"code": "CONST_VIOLATED",
+"details": { "path": "type" }
 }
 ]
 }
@@ -781,8 +781,8 @@
 "success": false,
 "errors": [
 {
-"code": "TYPE_MISMATCH",
-"path": "type"
+"code": "CONST_VIOLATED",
+"details": { "path": "type" }
 }
 ]
 }
```
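Both expectation blocks above change the error code from `TYPE_MISMATCH` to `CONST_VIOLATED` and move the offending path under a `details` object. A sketch of the implied payload shape (the struct below is illustrative, not the crate's actual error type):

```rust
// Illustrative stand-in for the validation-error payload the fixtures expect:
// {"code":"CONST_VIOLATED","details":{"path":"type"}}
struct ValidationError {
    code: &'static str,
    detail_path: String,
}

impl ValidationError {
    fn to_json(&self) -> String {
        // Hand-rolled serialization to keep the sketch dependency-free.
        format!(
            r#"{{"code":"{}","details":{{"path":"{}"}}}}"#,
            self.code, self.detail_path
        )
    }
}

fn main() {
    let err = ValidationError { code: "CONST_VIOLATED", detail_path: "type".into() };
    println!("{}", err.to_json());
}
```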
Submodule flows updated: a7b0f5dc4d...4d61e13e00
```diff
@@ -44,8 +44,8 @@ impl MockExecutor {
 
 #[cfg(test)]
 impl DatabaseExecutor for MockExecutor {
-    fn query(&self, sql: &str, _args: Option<&[Value]>) -> Result<Value, String> {
-        println!("DEBUG SQL QUERY: {}", sql);
+    fn query(&self, sql: &str, _args: Option<Vec<Value>>) -> Result<Value, String> {
+        println!("JSPG_SQL: {}", sql);
         MOCK_STATE.with(|state| {
             let mut s = state.borrow_mut();
             s.captured_queries.push(sql.to_string());
@@ -65,8 +65,8 @@ impl DatabaseExecutor for MockExecutor {
         })
     }
 
-    fn execute(&self, sql: &str, _args: Option<&[Value]>) -> Result<(), String> {
-        println!("DEBUG SQL EXECUTE: {}", sql);
+    fn execute(&self, sql: &str, _args: Option<Vec<Value>>) -> Result<(), String> {
+        println!("JSPG_SQL: {}", sql);
         MOCK_STATE.with(|state| {
             let mut s = state.borrow_mut();
             s.captured_queries.push(sql.to_string());
@@ -170,7 +170,7 @@ fn parse_and_match_mocks(sql: &str, mocks: &[Value]) -> Option<Vec<Value>> {
             .unwrap_or("")
             .trim_matches('"');
         let right = part[eq_idx + 1..].trim().trim_matches('\'');
 
         let mock_val_str = match mock_obj.get(left) {
             Some(Value::String(s)) => s.clone(),
             Some(Value::Number(n)) => n.to_string(),
@@ -189,12 +189,12 @@ fn parse_and_match_mocks(sql: &str, mocks: &[Value]) -> Option<Vec<Value>> {
             .last()
             .unwrap_or("")
             .trim_matches('"');
 
         let mock_val_str = match mock_obj.get(left) {
             Some(Value::Null) => "null".to_string(),
             _ => "".to_string(),
         };
 
         if mock_val_str != "null" {
             branch_matches = false;
             break;
```
```diff
@@ -9,10 +9,10 @@ use serde_json::Value;
 /// without a live Postgres SPI connection.
 pub trait DatabaseExecutor: Send + Sync {
     /// Executes a query expecting a single JSONB return, representing rows.
-    fn query(&self, sql: &str, args: Option<&[Value]>) -> Result<Value, String>;
+    fn query(&self, sql: &str, args: Option<Vec<Value>>) -> Result<Value, String>;
 
     /// Executes an operation (INSERT, UPDATE, DELETE, or pg_notify) that does not return rows.
-    fn execute(&self, sql: &str, args: Option<&[Value]>) -> Result<(), String>;
+    fn execute(&self, sql: &str, args: Option<Vec<Value>>) -> Result<(), String>;
 
     /// Returns the current authenticated user's ID
     fn auth_user_id(&self) -> Result<String, String>;
```
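The trait change above — `Option<&[Value]>` to `Option<Vec<Value>>` — hands the executor owned arguments, which is what lets the `SpiExecutor` hunks further down drop the per-value `clone()` and the intermediate `json_args` buffer. A dependency-free sketch of that ownership pattern (the `Value` and `Wrapped` types are stand-ins, not `serde_json` or `pgrx`):

```rust
// Stand-in for serde_json::Value, just enough to demonstrate ownership.
#[derive(Debug, PartialEq)]
enum Value {
    Text(String),
    Number(i64),
}

// Stand-in for the pgrx wrapper the real executor converts into.
struct Wrapped(Value);

// Taking the Vec by value means each element is already owned here,
// so it can be moved into the wrapper with no clone and no staging buffer.
fn bind_args(args: Option<Vec<Value>>) -> Vec<Wrapped> {
    let mut bound = Vec::new();
    if let Some(params) = args {
        for val in params {
            bound.push(Wrapped(val));
        }
    }
    bound
}

fn main() {
    let bound = bind_args(Some(vec![Value::Text("jane".into()), Value::Number(7)]));
    println!("bound {} args", bound.len());
}
```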
@@ -67,15 +67,11 @@ impl SpiExecutor {
 }

 impl DatabaseExecutor for SpiExecutor {
-    fn query(&self, sql: &str, args: Option<&[Value]>) -> Result<Value, String> {
-        let mut json_args = Vec::new();
+    fn query(&self, sql: &str, args: Option<Vec<Value>>) -> Result<Value, String> {
         let mut args_with_oid: Vec<pgrx::datum::DatumWithOid> = Vec::new();
         if let Some(params) = args {
             for val in params {
-                json_args.push(pgrx::JsonB(val.clone()));
-            }
-            for j_val in json_args.into_iter() {
-                args_with_oid.push(pgrx::datum::DatumWithOid::from(j_val));
+                args_with_oid.push(pgrx::datum::DatumWithOid::from(pgrx::JsonB(val)));
             }
         }

@@ -98,15 +94,11 @@ impl DatabaseExecutor for SpiExecutor {
         })
     }

-    fn execute(&self, sql: &str, args: Option<&[Value]>) -> Result<(), String> {
-        let mut json_args = Vec::new();
+    fn execute(&self, sql: &str, args: Option<Vec<Value>>) -> Result<(), String> {
         let mut args_with_oid: Vec<pgrx::datum::DatumWithOid> = Vec::new();
         if let Some(params) = args {
             for val in params {
-                json_args.push(pgrx::JsonB(val.clone()));
-            }
-            for j_val in json_args.into_iter() {
-                args_with_oid.push(pgrx::datum::DatumWithOid::from(j_val));
+                args_with_oid.push(pgrx::datum::DatumWithOid::from(pgrx::JsonB(val)));
             }
         }

@@ -38,7 +38,7 @@ pub struct Database {
 }

 impl Database {
-    pub fn new(val: &serde_json::Value) -> Result<Self, crate::drop::Drop> {
+    pub fn new(val: &serde_json::Value) -> (Self, crate::drop::Drop) {
         let mut db = Self {
             enums: HashMap::new(),
             types: HashMap::new(),
@@ -53,18 +53,38 @@ impl Database {
             executor: Box::new(MockExecutor::new()),
         };

+        let mut errors = Vec::new();
+
         if let Some(arr) = val.get("enums").and_then(|v| v.as_array()) {
             for item in arr {
-                if let Ok(def) = serde_json::from_value::<Enum>(item.clone()) {
-                    db.enums.insert(def.name.clone(), def);
+                match serde_json::from_value::<Enum>(item.clone()) {
+                    Ok(def) => {
+                        db.enums.insert(def.name.clone(), def);
+                    }
+                    Err(e) => {
+                        errors.push(crate::drop::Error {
+                            code: "DATABASE_ENUM_PARSE_FAILED".to_string(),
+                            message: format!("Failed to parse database enum: {}", e),
+                            details: crate::drop::ErrorDetails::default(),
+                        });
+                    }
                 }
             }
         }

         if let Some(arr) = val.get("types").and_then(|v| v.as_array()) {
             for item in arr {
-                if let Ok(def) = serde_json::from_value::<Type>(item.clone()) {
-                    db.types.insert(def.name.clone(), def);
+                match serde_json::from_value::<Type>(item.clone()) {
+                    Ok(def) => {
+                        db.types.insert(def.name.clone(), def);
+                    }
+                    Err(e) => {
+                        errors.push(crate::drop::Error {
+                            code: "DATABASE_TYPE_PARSE_FAILED".to_string(),
+                            message: format!("Failed to parse database type: {}", e),
+                            details: crate::drop::ErrorDetails::default(),
+                        });
+                    }
                 }
             }
         }
@@ -80,16 +100,11 @@ impl Database {
                     }
                 }
                 Err(e) => {
-                    return Err(crate::drop::Drop::with_errors(vec![crate::drop::Error {
+                    errors.push(crate::drop::Error {
                         code: "DATABASE_RELATION_PARSE_FAILED".to_string(),
                         message: format!("Failed to parse database relation: {}", e),
-                        details: crate::drop::ErrorDetails {
-                            path: "".to_string(),
-                            cause: None,
-                            context: None,
-                            schema: None,
-                        },
-                    }]));
+                        details: crate::drop::ErrorDetails::default(),
+                    });
                 }
             }
         }
@@ -97,28 +112,51 @@ impl Database {

         if let Some(arr) = val.get("puncs").and_then(|v| v.as_array()) {
             for item in arr {
-                if let Ok(def) = serde_json::from_value::<Punc>(item.clone()) {
-                    db.puncs.insert(def.name.clone(), def);
+                match serde_json::from_value::<Punc>(item.clone()) {
+                    Ok(def) => {
+                        db.puncs.insert(def.name.clone(), def);
+                    }
+                    Err(e) => {
+                        errors.push(crate::drop::Error {
+                            code: "DATABASE_PUNC_PARSE_FAILED".to_string(),
+                            message: format!("Failed to parse database punc: {}", e),
+                            details: crate::drop::ErrorDetails::default(),
+                        });
+                    }
                 }
             }
         }

         if let Some(arr) = val.get("schemas").and_then(|v| v.as_array()) {
             for (i, item) in arr.iter().enumerate() {
-                if let Ok(mut schema) = serde_json::from_value::<Schema>(item.clone()) {
-                    let id = schema
-                        .obj
-                        .id
-                        .clone()
-                        .unwrap_or_else(|| format!("schema_{}", i));
-                    schema.obj.id = Some(id.clone());
-                    db.schemas.insert(id, schema);
+                match serde_json::from_value::<Schema>(item.clone()) {
+                    Ok(mut schema) => {
+                        let id = schema
+                            .obj
+                            .id
+                            .clone()
+                            .unwrap_or_else(|| format!("schema_{}", i));
+                        schema.obj.id = Some(id.clone());
+                        db.schemas.insert(id, schema);
+                    }
+                    Err(e) => {
+                        errors.push(crate::drop::Error {
+                            code: "DATABASE_SCHEMA_PARSE_FAILED".to_string(),
+                            message: format!("Failed to parse database schema: {}", e),
+                            details: crate::drop::ErrorDetails::default(),
+                        });
+                    }
                 }
             }
         }

-        db.compile()?;
-        Ok(db)
+        db.compile(&mut errors);
+        let drop = if errors.is_empty() {
+            crate::drop::Drop::success()
+        } else {
+            crate::drop::Drop::with_errors(errors)
+        };
+        (db, drop)
     }

     /// Override the default executor for unit testing
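The constructor above now returns a `(Self, crate::drop::Drop)` pair and accumulates parse failures instead of bailing out on the first one, so a single call surfaces every bad entry at once. A minimal sketch of that accumulate-errors pattern, with hypothetical `Config` and `Report` types standing in for `Database` and `Drop`:

```rust
// Hypothetical stand-ins: Report plays the role of Drop, Config the role of Database.
struct Report {
    errors: Vec<String>,
}

struct Config {
    ports: Vec<u16>,
}

// Instead of `Result<Config, Report>` with an early return, parse everything,
// keep the entries that succeed, and hand back both the value and the report.
fn new_config(raw: &[&str]) -> (Config, Report) {
    let mut ports = Vec::new();
    let mut errors = Vec::new();
    for item in raw {
        match item.parse::<u16>() {
            Ok(p) => ports.push(p), // valid entries survive a bad neighbor
            Err(e) => errors.push(format!("bad port {:?}: {}", item, e)),
        }
    }
    (Config { ports }, Report { errors })
}

fn main() {
    let (cfg, report) = new_config(&["80", "oops", "443"]);
    assert_eq!(cfg.ports, vec![80, 443]); // both valid entries kept
    assert_eq!(report.errors.len(), 1);   // the failure is still reported
}
```

The trade-off is that callers must now inspect the report explicitly, which is exactly what the `errors.is_empty()` branch above does before choosing `success()` or `with_errors(errors)`.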
@@ -128,12 +166,12 @@ impl Database {
     }

     /// Executes a query expecting a single JSONB array return, representing rows.
-    pub fn query(&self, sql: &str, args: Option<&[Value]>) -> Result<Value, String> {
+    pub fn query(&self, sql: &str, args: Option<Vec<Value>>) -> Result<Value, String> {
         self.executor.query(sql, args)
     }

     /// Executes an operation (INSERT, UPDATE, DELETE, or pg_notify) that does not return rows.
-    pub fn execute(&self, sql: &str, args: Option<&[Value]>) -> Result<(), String> {
+    pub fn execute(&self, sql: &str, args: Option<Vec<Value>>) -> Result<(), String> {
         self.executor.execute(sql, args)
     }

@@ -147,68 +185,48 @@ impl Database {
         self.executor.timestamp()
     }

-    pub fn compile(&mut self) -> Result<(), crate::drop::Drop> {
+    pub fn compile(&mut self, errors: &mut Vec<crate::drop::Error>) {
         let mut harvested = Vec::new();
         for schema in self.schemas.values_mut() {
-            if let Err(msg) = schema.collect_schemas(None, &mut harvested) {
-                return Err(crate::drop::Drop::with_errors(vec![crate::drop::Error {
-                    code: "SCHEMA_VALIDATION_FAILED".to_string(),
-                    message: msg,
-                    details: crate::drop::ErrorDetails { path: "".to_string(), cause: None, context: None, schema: None },
-                }]));
-            }
+            schema.collect_schemas(None, &mut harvested, errors);
         }
         self.schemas.extend(harvested);

-        if let Err(msg) = self.collect_schemas() {
-            return Err(crate::drop::Drop::with_errors(vec![crate::drop::Error {
-                code: "SCHEMA_VALIDATION_FAILED".to_string(),
-                message: msg,
-                details: crate::drop::ErrorDetails {
-                    path: "".to_string(),
-                    cause: None,
-                    context: None,
-                    schema: None,
-                },
-            }]));
-        }
+        self.collect_schemas(errors);
         self.collect_depths();
         self.collect_descendants();

         // Mathematically evaluate all property inheritances, formats, schemas, and foreign key edges topographically over OnceLocks
         let mut visited = std::collections::HashSet::new();
         for schema in self.schemas.values() {
-            schema.compile(self, &mut visited);
+            schema.compile(self, &mut visited, errors);
         }
-
-        Ok(())
     }

-    fn collect_schemas(&mut self) -> Result<(), String> {
+    fn collect_schemas(&mut self, errors: &mut Vec<crate::drop::Error>) {
         let mut to_insert = Vec::new();

         // Pass 1: Extract all Schemas structurally off top level definitions into the master registry.
         // Validate every node recursively via string filters natively!
         for type_def in self.types.values() {
             for mut schema in type_def.schemas.clone() {
-                schema.collect_schemas(None, &mut to_insert)?;
+                schema.collect_schemas(None, &mut to_insert, errors);
             }
         }
         for punc_def in self.puncs.values() {
             for mut schema in punc_def.schemas.clone() {
-                schema.collect_schemas(None, &mut to_insert)?;
+                schema.collect_schemas(None, &mut to_insert, errors);
             }
         }
         for enum_def in self.enums.values() {
             for mut schema in enum_def.schemas.clone() {
-                schema.collect_schemas(None, &mut to_insert)?;
+                schema.collect_schemas(None, &mut to_insert, errors);
             }
         }

         for (id, schema) in to_insert {
             self.schemas.insert(id, schema);
         }
-        Ok(())
     }

     fn collect_depths(&mut self) {
@@ -247,19 +265,15 @@ impl Database {
             }
         }

-        // Cache generic descendants for $family runtime lookups
+        // Cache exhaustive descendants matrix for generic $family string lookups natively
         let mut descendants = HashMap::new();
-        for (id, schema) in &self.schemas {
-            if let Some(family_target) = &schema.obj.family {
-                let mut desc_set = HashSet::new();
-                Self::collect_descendants_recursively(family_target, &direct_refs, &mut desc_set);
-                let mut desc_vec: Vec<String> = desc_set.into_iter().collect();
-                desc_vec.sort();
-
-                // By placing all descendants directly onto the ID mapped location of the Family declaration,
-                // we can lookup descendants natively in ValidationContext without AST replacement overrides.
-                descendants.insert(id.clone(), desc_vec);
-            }
+        for id in self.schemas.keys() {
+            let mut desc_set = HashSet::new();
+            Self::collect_descendants_recursively(id, &direct_refs, &mut desc_set);
+            let mut desc_vec: Vec<String> = desc_set.into_iter().collect();
+            desc_vec.sort();
+
+            descendants.insert(id.clone(), desc_vec);
         }
         self.descendants = descendants;
     }
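The rewritten loop above now builds the descendants cache for every schema id rather than only those carrying a `$family` target. A standalone sketch of the recursive collection it relies on; the `direct_refs` adjacency map and the shape of `collect_descendants_recursively` are assumed from the call site, not taken from the actual module:

```rust
use std::collections::{HashMap, HashSet};

// Assumed helper shape: walk the direct-reference adjacency map from `id`
// and collect every transitively reachable schema id into `out`.
fn collect_descendants_recursively(
    id: &str,
    direct_refs: &HashMap<String, Vec<String>>,
    out: &mut HashSet<String>,
) {
    if let Some(children) = direct_refs.get(id) {
        for child in children {
            // HashSet::insert returns false on duplicates, which also breaks cycles.
            if out.insert(child.clone()) {
                collect_descendants_recursively(child, direct_refs, out);
            }
        }
    }
}

fn main() {
    let mut refs = HashMap::new();
    refs.insert("entity".to_string(), vec!["person".to_string()]);
    refs.insert("person".to_string(), vec!["employee".to_string()]);

    let mut out = HashSet::new();
    collect_descendants_recursively("entity", &refs, &mut out);

    // Sort for a deterministic cache value, mirroring the desc_vec.sort() above.
    let mut v: Vec<String> = out.into_iter().collect();
    v.sort();
    assert_eq!(v, vec!["employee".to_string(), "person".to_string()]);
}
```

Sorting the collected set before caching, as the diff does, keeps downstream SQL generation deterministic regardless of hash iteration order.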
@@ -255,6 +255,7 @@ impl Schema {
         &self,
         db: &crate::database::Database,
         visited: &mut std::collections::HashSet<String>,
+        errors: &mut Vec<crate::drop::Error>,
     ) {
         if self.obj.compiled_properties.get().is_some() {
             return;
@@ -301,7 +302,7 @@ impl Schema {
         // 1. Resolve INHERITANCE dependencies first
         if let Some(ref_id) = &self.obj.r#ref {
             if let Some(parent) = db.schemas.get(ref_id) {
-                parent.compile(db, visited);
+                parent.compile(db, visited, errors);
                 if let Some(p_props) = parent.obj.compiled_properties.get() {
                     props.extend(p_props.clone());
                 }
|
|||||||
|
|
||||||
if let Some(all_of) = &self.obj.all_of {
|
if let Some(all_of) = &self.obj.all_of {
|
||||||
for ao in all_of {
|
for ao in all_of {
|
||||||
ao.compile(db, visited);
|
ao.compile(db, visited, errors);
|
||||||
if let Some(ao_props) = ao.obj.compiled_properties.get() {
|
if let Some(ao_props) = ao.obj.compiled_properties.get() {
|
||||||
props.extend(ao_props.clone());
|
props.extend(ao_props.clone());
|
||||||
}
|
}
|
||||||
@ -318,14 +319,14 @@ impl Schema {
|
|||||||
}
|
}
|
||||||
|
|
||||||
if let Some(then_schema) = &self.obj.then_ {
|
if let Some(then_schema) = &self.obj.then_ {
|
||||||
then_schema.compile(db, visited);
|
then_schema.compile(db, visited, errors);
|
||||||
if let Some(t_props) = then_schema.obj.compiled_properties.get() {
|
if let Some(t_props) = then_schema.obj.compiled_properties.get() {
|
||||||
props.extend(t_props.clone());
|
props.extend(t_props.clone());
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
if let Some(else_schema) = &self.obj.else_ {
|
if let Some(else_schema) = &self.obj.else_ {
|
||||||
else_schema.compile(db, visited);
|
else_schema.compile(db, visited, errors);
|
||||||
if let Some(e_props) = else_schema.obj.compiled_properties.get() {
|
if let Some(e_props) = else_schema.obj.compiled_properties.get() {
|
||||||
props.extend(e_props.clone());
|
props.extend(e_props.clone());
|
||||||
}
|
}
|
||||||
@@ -345,47 +346,47 @@ impl Schema {
         let _ = self.obj.compiled_property_names.set(names);

         // 4. Compute Edges natively
-        let schema_edges = self.compile_edges(db, visited, &props);
+        let schema_edges = self.compile_edges(db, visited, &props, errors);
         let _ = self.obj.compiled_edges.set(schema_edges);

         // 5. Build our inline children properties recursively NOW! (Depth-first search)
         if let Some(local_props) = &self.obj.properties {
             for child in local_props.values() {
-                child.compile(db, visited);
+                child.compile(db, visited, errors);
             }
         }
         if let Some(items) = &self.obj.items {
-            items.compile(db, visited);
+            items.compile(db, visited, errors);
         }
         if let Some(pattern_props) = &self.obj.pattern_properties {
             for child in pattern_props.values() {
-                child.compile(db, visited);
+                child.compile(db, visited, errors);
             }
         }
         if let Some(additional_props) = &self.obj.additional_properties {
-            additional_props.compile(db, visited);
+            additional_props.compile(db, visited, errors);
         }
         if let Some(one_of) = &self.obj.one_of {
             for child in one_of {
-                child.compile(db, visited);
+                child.compile(db, visited, errors);
             }
         }
         if let Some(arr) = &self.obj.prefix_items {
             for child in arr {
-                child.compile(db, visited);
+                child.compile(db, visited, errors);
             }
         }
         if let Some(child) = &self.obj.not {
-            child.compile(db, visited);
+            child.compile(db, visited, errors);
         }
         if let Some(child) = &self.obj.contains {
-            child.compile(db, visited);
+            child.compile(db, visited, errors);
         }
         if let Some(child) = &self.obj.property_names {
-            child.compile(db, visited);
+            child.compile(db, visited, errors);
         }
         if let Some(child) = &self.obj.if_ {
-            child.compile(db, visited);
+            child.compile(db, visited, errors);
         }

         if let Some(id) = &self.obj.id {
|
|||||||
}
|
}
|
||||||
|
|
||||||
#[allow(unused_variables)]
|
#[allow(unused_variables)]
|
||||||
fn validate_identifier(id: &str, field_name: &str) -> Result<(), String> {
|
fn validate_identifier(id: &str, field_name: &str, errors: &mut Vec<crate::drop::Error>) {
|
||||||
#[cfg(not(test))]
|
#[cfg(not(test))]
|
||||||
for c in id.chars() {
|
for c in id.chars() {
|
||||||
if !c.is_ascii_lowercase() && !c.is_ascii_digit() && c != '_' && c != '.' {
|
if !c.is_ascii_lowercase() && !c.is_ascii_digit() && c != '_' && c != '.' {
|
||||||
return Err(format!("Invalid character '{}' in JSON Schema '{}' property: '{}'. Identifiers must exclusively contain [a-z0-9_.]", c, field_name, id));
|
errors.push(crate::drop::Error {
|
||||||
|
code: "INVALID_IDENTIFIER".to_string(),
|
||||||
|
message: format!(
|
||||||
|
"Invalid character '{}' in JSON Schema '{}' property: '{}'. Identifiers must exclusively contain [a-z0-9_.]",
|
||||||
|
c, field_name, id
|
||||||
|
),
|
||||||
|
details: crate::drop::ErrorDetails::default(),
|
||||||
|
});
|
||||||
|
return;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
Ok(())
|
|
||||||
}
|
}
|
||||||
|
|
||||||
pub fn collect_schemas(
|
pub fn collect_schemas(
|
||||||
&mut self,
|
&mut self,
|
||||||
tracking_path: Option<String>,
|
tracking_path: Option<String>,
|
||||||
to_insert: &mut Vec<(String, Schema)>,
|
to_insert: &mut Vec<(String, Schema)>,
|
||||||
) -> Result<(), String> {
|
errors: &mut Vec<crate::drop::Error>,
|
||||||
|
) {
|
||||||
if let Some(id) = &self.obj.id {
|
if let Some(id) = &self.obj.id {
|
||||||
Self::validate_identifier(id, "$id")?;
|
Self::validate_identifier(id, "$id", errors);
|
||||||
to_insert.push((id.clone(), self.clone()));
|
to_insert.push((id.clone(), self.clone()));
|
||||||
}
|
}
|
||||||
if let Some(r#ref) = &self.obj.r#ref {
|
if let Some(r#ref) = &self.obj.r#ref {
|
||||||
Self::validate_identifier(r#ref, "$ref")?;
|
Self::validate_identifier(r#ref, "$ref", errors);
|
||||||
}
|
}
|
||||||
if let Some(family) = &self.obj.family {
|
if let Some(family) = &self.obj.family {
|
||||||
Self::validate_identifier(family, "$family")?;
|
Self::validate_identifier(family, "$family", errors);
|
||||||
}
|
}
|
||||||
|
|
||||||
// Is this schema an inline ad-hoc composition?
|
// Is this schema an inline ad-hoc composition?
|
||||||
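The hunk above turns `validate_identifier` from a fail-fast `Result` into a collector that records the first bad character and returns. A standalone sketch of the same `[a-z0-9_.]` rule, simplified to push plain `String` messages rather than the crate's `Error` struct:

```rust
// Simplified version of the identifier rule enforced above: every character
// must be ASCII lowercase, an ASCII digit, '_' or '.'; the first violation is
// recorded in `errors` instead of aborting the whole compilation pass.
fn validate_identifier(id: &str, errors: &mut Vec<String>) {
    for c in id.chars() {
        if !c.is_ascii_lowercase() && !c.is_ascii_digit() && c != '_' && c != '.' {
            errors.push(format!(
                "Invalid character '{}' in '{}'. Identifiers must exclusively contain [a-z0-9_.]",
                c, id
            ));
            return; // report only the first bad character per identifier
        }
    }
}

fn main() {
    let mut errors = Vec::new();
    validate_identifier("full.person_v2", &mut errors);
    assert!(errors.is_empty()); // lowercase, digits, '.' and '_' all pass
    validate_identifier("Full.Person", &mut errors); // uppercase is rejected
    assert_eq!(errors.len(), 1);
}
```

Because validation no longer returns `Result`, the `?` operators disappear from every `collect_schemas` call site, which is what the remaining hunks below propagate.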
@@ -431,20 +440,20 @@ impl Schema {
         // Provide the path origin to children natively, prioritizing the explicit `$id` boundary if one exists
         let origin_path = self.obj.id.clone().or(tracking_path);

-        self.collect_child_schemas(origin_path, to_insert)?;
-        Ok(())
+        self.collect_child_schemas(origin_path, to_insert, errors);
     }

     pub fn collect_child_schemas(
         &mut self,
         origin_path: Option<String>,
         to_insert: &mut Vec<(String, Schema)>,
-    ) -> Result<(), String> {
+        errors: &mut Vec<crate::drop::Error>,
+    ) {
         if let Some(props) = &mut self.obj.properties {
             for (k, v) in props.iter_mut() {
                 let mut inner = (**v).clone();
                 let next_path = origin_path.as_ref().map(|o| format!("{}/{}", o, k));
-                inner.collect_schemas(next_path, to_insert)?;
+                inner.collect_schemas(next_path, to_insert, errors);
                 *v = Arc::new(inner);
             }
         }
@@ -453,80 +462,102 @@ impl Schema {
             for (k, v) in pattern_props.iter_mut() {
                 let mut inner = (**v).clone();
                 let next_path = origin_path.as_ref().map(|o| format!("{}/{}", o, k));
-                inner.collect_schemas(next_path, to_insert)?;
+                inner.collect_schemas(next_path, to_insert, errors);
                 *v = Arc::new(inner);
             }
         }

-        let mut map_arr = |arr: &mut Vec<Arc<Schema>>| -> Result<(), String> {
+        let mut map_arr = |arr: &mut Vec<Arc<Schema>>| {
             for v in arr.iter_mut() {
                 let mut inner = (**v).clone();
-                inner.collect_schemas(origin_path.clone(), to_insert)?;
+                inner.collect_schemas(origin_path.clone(), to_insert, errors);
                 *v = Arc::new(inner);
             }
-            Ok(())
         };

-        if let Some(arr) = &mut self.obj.prefix_items { map_arr(arr)?; }
-        if let Some(arr) = &mut self.obj.all_of { map_arr(arr)?; }
-        if let Some(arr) = &mut self.obj.one_of { map_arr(arr)?; }
+        if let Some(arr) = &mut self.obj.prefix_items {
+            map_arr(arr);
+        }
+        if let Some(arr) = &mut self.obj.all_of {
+            map_arr(arr);
+        }
+        if let Some(arr) = &mut self.obj.one_of {
+            map_arr(arr);
+        }

-        let mut map_opt = |opt: &mut Option<Arc<Schema>>, pass_path: bool| -> Result<(), String> {
+        let mut map_opt = |opt: &mut Option<Arc<Schema>>, pass_path: bool| {
             if let Some(v) = opt {
                 let mut inner = (**v).clone();
                 let next = if pass_path { origin_path.clone() } else { None };
-                inner.collect_schemas(next, to_insert)?;
+                inner.collect_schemas(next, to_insert, errors);
                 *v = Arc::new(inner);
             }
-            Ok(())
         };

-        map_opt(&mut self.obj.additional_properties, false)?;
+        map_opt(&mut self.obj.additional_properties, false);

         // `items` absolutely must inherit the EXACT property path assigned to the Array wrapper!
         // This allows nested Arrays enclosing bare Entity structs to correctly register as the boundary mapping.
-        map_opt(&mut self.obj.items, true)?;
+        map_opt(&mut self.obj.items, true);

-        map_opt(&mut self.obj.not, false)?;
-        map_opt(&mut self.obj.contains, false)?;
-        map_opt(&mut self.obj.property_names, false)?;
-        map_opt(&mut self.obj.if_, false)?;
-        map_opt(&mut self.obj.then_, false)?;
-        map_opt(&mut self.obj.else_, false)?;
-
-        Ok(())
+        map_opt(&mut self.obj.not, false);
+        map_opt(&mut self.obj.contains, false);
+        map_opt(&mut self.obj.property_names, false);
+        map_opt(&mut self.obj.if_, false);
+        map_opt(&mut self.obj.then_, false);
+        map_opt(&mut self.obj.else_, false);
     }

+    /// Dynamically infers and compiles all structural database relationships between this Schema
+    /// and its nested children. This function recursively traverses the JSON Schema abstract syntax
+    /// tree, identifies physical PostgreSQL table boundaries, and locks the resulting relation
+    /// constraint paths directly onto the `compiled_edges` map in O(1) memory.
     pub fn compile_edges(
         &self,
         db: &crate::database::Database,
         visited: &mut std::collections::HashSet<String>,
         props: &std::collections::BTreeMap<String, std::sync::Arc<Schema>>,
+        errors: &mut Vec<crate::drop::Error>,
     ) -> std::collections::BTreeMap<String, crate::database::edge::Edge> {
         let mut schema_edges = std::collections::BTreeMap::new();

+        // Determine the physical Database Table Name this schema structurally represents
+        // Plucks the polymorphic discriminator via dot-notation (e.g. extracting "person" from "full.person")
         let mut parent_type_name = None;
         if let Some(family) = &self.obj.family {
             parent_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
         } else if let Some(identifier) = self.obj.identifier() {
-            parent_type_name = Some(identifier.split('.').next_back().unwrap_or(&identifier).to_string());
+            parent_type_name = Some(
+                identifier
+                    .split('.')
+                    .next_back()
+                    .unwrap_or(&identifier)
+                    .to_string(),
+            );
         }

         if let Some(p_type) = parent_type_name {
+            // Proceed only if the resolved table physically exists within the Postgres Type hierarchy
             if db.types.contains_key(&p_type) {
+                // Iterate over all discovered schema boundaries mapped inside the object
                 for (prop_name, prop_schema) in props {
                     let mut child_type_name = None;
                     let mut target_schema = prop_schema.clone();
+                    let mut is_array = false;
+
+                    // Structurally unpack the inner target entity if the object maps to an array list
                     if let Some(crate::database::schema::SchemaTypeOrArray::Single(t)) =
                         &prop_schema.obj.type_
                     {
                         if t == "array" {
+                            is_array = true;
                             if let Some(items) = &prop_schema.obj.items {
                                 target_schema = items.clone();
                             }
                         }
                     }

+                    // Determine the physical Postgres table backing the nested child schema recursively
                     if let Some(family) = &target_schema.obj.family {
                         child_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
                     } else if let Some(ref_id) = target_schema.obj.identifier() {
@@ -534,20 +565,33 @@ impl Schema {
                     } else if let Some(arr) = &target_schema.obj.one_of {
                         if let Some(first) = arr.first() {
                             if let Some(ref_id) = first.obj.identifier() {
-                                child_type_name = Some(ref_id.split('.').next_back().unwrap_or(&ref_id).to_string());
+                                child_type_name =
+                                    Some(ref_id.split('.').next_back().unwrap_or(&ref_id).to_string());
                             }
                         }
                     }

                     if let Some(c_type) = child_type_name {
                         if db.types.contains_key(&c_type) {
-                            target_schema.compile(db, visited);
+                            // Ensure the child Schema's AST has accurately compiled its own physical property keys so we can
+                            // inject them securely for Many-to-Many Twin Deduction disambiguation matching.
+                            target_schema.compile(db, visited, errors);
                             if let Some(compiled_target_props) = target_schema.obj.compiled_properties.get() {
                                 let keys_for_ambiguity: Vec<String> =
                                     compiled_target_props.keys().cloned().collect();
-                                if let Some((relation, is_forward)) =
-                                    resolve_relation(db, &p_type, &c_type, prop_name, Some(&keys_for_ambiguity))
-                                {
+                                // Interrogate the Database catalog graph to discover the exact Foreign Key Constraint connecting the components
+                                if let Some((relation, is_forward)) = resolve_relation(
+                                    db,
+                                    &p_type,
+                                    &c_type,
+                                    prop_name,
+                                    Some(&keys_for_ambiguity),
+                                    is_array,
+                                    self.id.as_deref(),
+                                    &format!("/{}", prop_name),
+                                    errors,
+                                ) {
                                     schema_edges.insert(
                                         prop_name.clone(),
                                         crate::database::edge::Edge {
@ -566,15 +610,22 @@ impl Schema {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/// Inspects the Postgres pg_constraint relations catalog to securely identify
|
||||||
|
/// the precise Foreign Key connecting a parent and child hierarchy path.
|
||||||
pub(crate) fn resolve_relation<'a>(
|
pub(crate) fn resolve_relation<'a>(
|
||||||
db: &'a crate::database::Database,
|
db: &'a crate::database::Database,
|
||||||
parent_type: &str,
|
parent_type: &str,
|
||||||
child_type: &str,
|
child_type: &str,
|
||||||
prop_name: &str,
|
prop_name: &str,
|
||||||
relative_keys: Option<&Vec<String>>,
|
relative_keys: Option<&Vec<String>>,
|
||||||
|
is_array: bool,
|
||||||
|
schema_id: Option<&str>,
|
||||||
|
path: &str,
|
||||||
|
errors: &mut Vec<crate::drop::Error>,
|
||||||
) -> Option<(&'a crate::database::relation::Relation, bool)> {
|
) -> Option<(&'a crate::database::relation::Relation, bool)> {
|
||||||
|
// Enforce graph locality by ensuring we don't accidentally crawl to pure structural entity boundaries
|
||||||
if parent_type == "entity" && child_type == "entity" {
|
if parent_type == "entity" && child_type == "entity" {
|
||||||
return None;
|
return None;
|
||||||
}
|
}
|
||||||
|
|
||||||
let p_def = db.types.get(parent_type)?;
|
let p_def = db.types.get(parent_type)?;
|
||||||
@ -583,11 +634,25 @@ pub(crate) fn resolve_relation<'a>(
|
|||||||
let mut matching_rels = Vec::new();
|
let mut matching_rels = Vec::new();
|
||||||
let mut directions = Vec::new();
|
let mut directions = Vec::new();
|
||||||
|
|
||||||
for rel in db.relations.values() {
|
// Scour the complete catalog for any Edge matching the inheritance scope of the two objects
|
||||||
let is_forward = p_def.hierarchy.contains(&rel.source_type)
|
// This automatically binds polymorphic structures (e.g. recognizing a relationship targeting User
|
||||||
&& c_def.hierarchy.contains(&rel.destination_type);
|
// also natively binds instances specifically typed as Person).
|
||||||
let is_reverse = p_def.hierarchy.contains(&rel.destination_type)
|
let mut all_rels: Vec<&crate::database::relation::Relation> = db.relations.values().collect();
|
||||||
&& c_def.hierarchy.contains(&rel.source_type);
|
all_rels.sort_by(|a, b| a.constraint.cmp(&b.constraint));
|
||||||
|
|
||||||
|
for rel in all_rels {
|
||||||
|
let mut is_forward =
|
||||||
|
p_def.hierarchy.contains(&rel.source_type) && c_def.hierarchy.contains(&rel.destination_type);
|
||||||
|
let is_reverse =
|
||||||
|
p_def.hierarchy.contains(&rel.destination_type) && c_def.hierarchy.contains(&rel.source_type);
|
||||||
|
|
||||||
|
// Structural Cardinality Filtration:
|
||||||
|
// If the schema requires a collection (Array), it is mathematically impossible for a pure
|
||||||
|
// Forward scalar edge (where the parent holds exactly one UUID pointer) to fulfill a One-to-Many request.
|
||||||
|
// Thus, if it's an array, we fully reject pure Forward edges and only accept Reverse edges (or Junction edges).
|
||||||
|
if is_array && is_forward && !is_reverse {
|
||||||
|
is_forward = false;
|
||||||
|
}
|
||||||
|
|
||||||
if is_forward {
|
if is_forward {
|
||||||
matching_rels.push(rel);
|
matching_rels.push(rel);
|
||||||
@ -598,10 +663,28 @@ pub(crate) fn resolve_relation<'a>(
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Abort relation discovery early if no hierarchical inheritance match was found
|
||||||
if matching_rels.is_empty() {
|
if matching_rels.is_empty() {
|
||||||
|
let mut details = crate::drop::ErrorDetails {
|
||||||
|
path: path.to_string(),
|
||||||
|
..Default::default()
|
||||||
|
};
|
||||||
|
if let Some(sid) = schema_id {
|
||||||
|
details.schema = Some(sid.to_string());
|
||||||
|
}
|
||||||
|
|
||||||
|
errors.push(crate::drop::Error {
|
||||||
|
code: "EDGE_MISSING".to_string(),
|
||||||
|
message: format!(
|
||||||
|
"No database relation exists between '{}' and '{}' for property '{}'",
|
||||||
|
parent_type, child_type, prop_name
|
||||||
|
),
|
||||||
|
details,
|
||||||
|
});
|
||||||
return None;
|
return None;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Ideal State: The objects only share a solitary structural relation, resolving ambiguity instantly.
|
||||||
if matching_rels.len() == 1 {
|
if matching_rels.len() == 1 {
|
||||||
return Some((matching_rels[0], directions[0]));
|
return Some((matching_rels[0], directions[0]));
|
||||||
}
|
}
|
||||||
@ -609,6 +692,8 @@ pub(crate) fn resolve_relation<'a>(
|
|||||||
let mut chosen_idx = 0;
|
let mut chosen_idx = 0;
|
||||||
let mut resolved = false;
|
let mut resolved = false;
|
||||||
|
|
||||||
|
// Exact Prefix Disambiguation: Determine if the database specifically names this constraint
|
||||||
|
// directly mapping to the JSON Schema property name (e.g., `fk_{child}_{property_name}`)
|
||||||
for (i, rel) in matching_rels.iter().enumerate() {
|
for (i, rel) in matching_rels.iter().enumerate() {
|
||||||
if let Some(prefix) = &rel.prefix {
|
if let Some(prefix) = &rel.prefix {
|
||||||
if prop_name.starts_with(prefix)
|
if prop_name.starts_with(prefix)
|
||||||
@ -622,9 +707,11 @@ pub(crate) fn resolve_relation<'a>(
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Complex Subgraph Resolution: The database contains multiple equally explicit foreign key constraints
|
||||||
|
// linking these objects (such as pointing to `source` and `target` in Many-to-Many junction models).
|
||||||
if !resolved && relative_keys.is_some() {
|
if !resolved && relative_keys.is_some() {
|
||||||
// 1. M:M Disambiguation: The child schema explicitly defines an outbound property
|
// Twin Deduction Pass 1: We inspect the exact properties structurally defined inside the compiled payload
|
||||||
// matching one of the relational prefixes (e.g. "target"). We first identify that consumed relation.
|
// to observe which explicit relation arrow the child payload natively consumes.
|
||||||
let keys = relative_keys.unwrap();
|
let keys = relative_keys.unwrap();
|
||||||
let mut consumed_rel_idx = None;
|
let mut consumed_rel_idx = None;
|
||||||
for (i, rel) in matching_rels.iter().enumerate() {
|
for (i, rel) in matching_rels.iter().enumerate() {
|
||||||
@ -636,7 +723,8 @@ pub(crate) fn resolve_relation<'a>(
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
// Then, we find its exact Twin on the same junction boundary that provides the reverse ownership.
|
// Twin Deduction Pass 2: Knowing which arrow points outbound, we can mathematically deduce its twin
|
||||||
|
// providing the reverse ownership on the same junction boundary must be the incoming Edge to the parent.
|
||||||
if let Some(used_idx) = consumed_rel_idx {
|
if let Some(used_idx) = consumed_rel_idx {
|
||||||
let used_rel = matching_rels[used_idx];
|
let used_rel = matching_rels[used_idx];
|
||||||
let mut twin_ids = Vec::new();
|
let mut twin_ids = Vec::new();
|
||||||
@ -657,8 +745,9 @@ pub(crate) fn resolve_relation<'a>(
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Implicit Base Fallback: If no complex explicit paths resolve, but exactly one relation
|
||||||
|
// sits entirely naked (without a constraint prefix), it must be the core structural parent ownership.
|
||||||
if !resolved {
|
if !resolved {
|
||||||
// 2. Base 1:M Fallback. If there's EXACTLY ONE relation with a null prefix, it's the base structural edge.
|
|
||||||
let mut null_prefix_ids = Vec::new();
|
let mut null_prefix_ids = Vec::new();
|
||||||
for (i, rel) in matching_rels.iter().enumerate() {
|
for (i, rel) in matching_rels.iter().enumerate() {
|
||||||
if rel.prefix.is_none() {
|
if rel.prefix.is_none() {
|
||||||
@ -667,10 +756,33 @@ pub(crate) fn resolve_relation<'a>(
|
|||||||
}
|
}
|
||||||
if null_prefix_ids.len() == 1 {
|
if null_prefix_ids.len() == 1 {
|
||||||
chosen_idx = null_prefix_ids[0];
|
chosen_idx = null_prefix_ids[0];
|
||||||
// resolved = true;
|
resolved = true;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// If we exhausted all mathematical deduction pathways and STILL cannot isolate a single edge,
|
||||||
|
// we must abort rather than silently guessing. Returning None prevents arbitrary SQL generation
|
||||||
|
// and forces a clean structural error for the architect.
|
||||||
|
if !resolved {
|
||||||
|
let mut details = crate::drop::ErrorDetails {
|
||||||
|
path: path.to_string(),
|
||||||
|
..Default::default()
|
||||||
|
};
|
||||||
|
if let Some(sid) = schema_id {
|
||||||
|
details.schema = Some(sid.to_string());
|
||||||
|
}
|
||||||
|
|
||||||
|
errors.push(crate::drop::Error {
|
||||||
|
code: "AMBIGUOUS_TYPE_RELATIONS".to_string(),
|
||||||
|
message: format!(
|
||||||
|
"Ambiguous database relation between '{}' and '{}' for property '{}'",
|
||||||
|
parent_type, child_type, prop_name
|
||||||
|
),
|
||||||
|
details,
|
||||||
|
});
|
||||||
|
return None;
|
||||||
|
}
|
||||||
|
|
||||||
Some((matching_rels[chosen_idx], directions[chosen_idx]))
|
Some((matching_rels[chosen_idx], directions[chosen_idx]))
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|||||||
@ -64,7 +64,7 @@ pub struct Error {
|
|||||||
pub details: ErrorDetails,
|
pub details: ErrorDetails,
|
||||||
}
|
}
|
||||||
|
|
||||||
#[derive(Debug, Serialize, Deserialize, Clone)]
|
#[derive(Debug, Serialize, Deserialize, Clone, Default)]
|
||||||
pub struct ErrorDetails {
|
pub struct ErrorDetails {
|
||||||
pub path: String,
|
pub path: String,
|
||||||
#[serde(skip_serializing_if = "Option::is_none")]
|
#[serde(skip_serializing_if = "Option::is_none")]
|
||||||
|
|||||||
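Read as prose, the hunk above encodes a three-stage disambiguation order: exact prefix match against the property name, Many-to-Many twin deduction from the keys the child payload consumes, then the single un-prefixed base edge, with `None` (and an `AMBIGUOUS_TYPE_RELATIONS` error) when nothing isolates one edge. A minimal standalone sketch of that ordering, using an illustrative `Rel` type rather than the crate's actual `Relation`:

```rust
// Illustrative stand-in for the catalog's Relation entry.
#[derive(Debug)]
struct Rel {
    constraint: &'static str,
    prefix: Option<&'static str>,
}

// Sketch of the disambiguation order used by resolve_relation above.
fn pick<'a>(rels: &'a [Rel], prop: &str, child_keys: &[&str]) -> Option<&'a Rel> {
    if rels.len() == 1 {
        return Some(&rels[0]); // solitary relation: no ambiguity
    }
    // 1. Exact prefix match against the requested property name.
    if let Some(r) = rels.iter().find(|r| r.prefix.map_or(false, |p| prop.starts_with(p))) {
        return Some(r);
    }
    // 2. Twin deduction: the child payload consumes one prefixed edge; its twin is ours.
    let consumed = rels.iter().position(|r| {
        r.prefix.map_or(false, |p| child_keys.iter().any(|k| k.starts_with(p)))
    });
    if let Some(used) = consumed {
        let twins: Vec<&Rel> = rels.iter().enumerate()
            .filter(|&(i, _)| i != used)
            .map(|(_, r)| r)
            .collect();
        if twins.len() == 1 {
            return Some(twins[0]);
        }
    }
    // 3. Base fallback: exactly one un-prefixed relation is the structural parent edge.
    let naked: Vec<&Rel> = rels.iter().filter(|r| r.prefix.is_none()).collect();
    if naked.len() == 1 {
        return Some(naked[0]);
    }
    None // still ambiguous: the real code reports AMBIGUOUS_TYPE_RELATIONS here
}
```

Note stage 3 now sets `resolved` (the previously commented-out line), which is what arms the new ambiguity error when even the fallback cannot decide.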
19
src/jspg.rs
@@ -12,18 +12,21 @@ pub struct Jspg {
 }

 impl Jspg {
-    pub fn new(database_val: &serde_json::Value) -> Result<Self, crate::drop::Drop> {
-        let database_instance = Database::new(database_val)?;
+    pub fn new(database_val: &serde_json::Value) -> (Self, crate::drop::Drop) {
+        let (database_instance, drop) = Database::new(database_val);
         let database = Arc::new(database_instance);
         let validator = Validator::new(database.clone());
         let queryer = Queryer::new(database.clone());
         let merger = Merger::new(database.clone());

-        Ok(Self {
-            database,
-            validator,
-            queryer,
-            merger,
-        })
+        (
+            Self {
+                database,
+                validator,
+                queryer,
+                merger,
+            },
+            drop,
+        )
     }
 }
21
src/lib.rs
@@ -42,21 +42,16 @@ fn jspg_failure() -> JsonB {

 #[cfg_attr(not(test), pg_extern(strict))]
 pub fn jspg_setup(database: JsonB) -> JsonB {
-    match crate::jspg::Jspg::new(&database.0) {
-        Ok(new_jspg) => {
-            let new_arc = Arc::new(new_jspg);
+    let (new_jspg, drop) = crate::jspg::Jspg::new(&database.0);
+    let new_arc = Arc::new(new_jspg);

     // 3. ATOMIC SWAP
     {
         let mut lock = GLOBAL_JSPG.write().unwrap();
         *lock = Some(new_arc);
-            }
-
-            let drop = crate::drop::Drop::success();
-            JsonB(serde_json::to_value(drop).unwrap())
-        }
-        Err(drop) => JsonB(serde_json::to_value(drop).unwrap()),
     }

+    JsonB(serde_json::to_value(drop).unwrap())
 }

 #[cfg_attr(not(test), pg_extern)]
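The `jspg_setup` change above moves from fail-fast (`Result`, nothing installed on error) to fail-soft: the engine is always built and atomically swapped in, and the accumulated error report is returned to the caller either way. A compact sketch of that pattern, with hypothetical `Engine`/`Report` types standing in for `Jspg`/`Drop`:

```rust
use std::sync::{Arc, OnceLock, RwLock};

// Hypothetical stand-ins for Jspg and Drop, to show the swap-plus-report shape.
struct Engine { schema_count: usize }
struct Report { errors: Vec<String> }

static GLOBAL: OnceLock<RwLock<Option<Arc<Engine>>>> = OnceLock::new();

// Toy compiler: always yields an engine plus an error report (fail-soft).
fn compile(raw: &str) -> (Engine, Report) {
    let errors = if raw.is_empty() { vec!["EMPTY_DATABASE".to_string()] } else { Vec::new() };
    (Engine { schema_count: raw.lines().count() }, Report { errors })
}

fn setup(raw: &str) -> Report {
    let (engine, report) = compile(raw);
    let arc = Arc::new(engine);
    // Atomic swap: the write lock is held only for the pointer assignment,
    // so concurrent readers cloning the Arc are barely blocked.
    {
        let mut lock = GLOBAL.get_or_init(|| RwLock::new(None)).write().unwrap();
        *lock = Some(arc);
    }
    report // serialized back to the caller, success or not
}
```

The trade-off of fail-soft setup is that a partially broken catalog still replaces the previous engine; the per-error report is what lets the caller decide whether that is acceptable.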
@@ -40,7 +40,7 @@ impl Merger {
         }
     };

-    let result = self.merge_internal(target_schema, data.clone(), &mut notifications_queue);
+    let result = self.merge_internal(target_schema, data, &mut notifications_queue);

     let val_resolved = match result {
         Ok(val) => val,
@@ -78,7 +78,7 @@ impl Merger {
         details: crate::drop::ErrorDetails {
             path: "".to_string(),
             cause: final_cause,
-            context: Some(data),
+            context: None,
             schema: None,
         },
     }]);
@@ -238,7 +238,7 @@ impl Merger {

     if !type_def.relationship {
         let (fields, kind, fetched, replaces) =
-            self.stage_entity(entity_fields.clone(), type_def, &user_id, &timestamp)?;
+            self.stage_entity(entity_fields, type_def, &user_id, &timestamp)?;
         entity_fields = fields;
         entity_change_kind = kind;
         entity_fetched = fetched;
@@ -259,13 +259,7 @@ impl Merger {
     };

     if let Some(compiled_edges) = schema.obj.compiled_edges.get() {
-        println!(
-            "Compiled Edges keys for relation {}: {:?}",
-            relation_name,
-            compiled_edges.keys().collect::<Vec<_>>()
-        );
         if let Some(edge) = compiled_edges.get(&relation_name) {
-            println!("FOUND EDGE {} -> {:?}", relation_name, edge.constraint);
             if let Some(relation) = self.db.relations.get(&edge.constraint) {
                 let parent_is_source = edge.forward;

@@ -326,7 +320,7 @@ impl Merger {

     if type_def.relationship {
         let (fields, kind, fetched, replaces) =
-            self.stage_entity(entity_fields.clone(), type_def, &user_id, &timestamp)?;
+            self.stage_entity(entity_fields, type_def, &user_id, &timestamp)?;
         entity_fields = fields;
         entity_change_kind = kind;
         entity_fetched = fetched;
@@ -67,7 +67,10 @@ impl<'a> Compiler<'a> {
     if let Some(items) = &node.schema.obj.items {
         let mut resolved_type = None;
         if let Some(family_target) = items.obj.family.as_ref() {
-            let base_type_name = family_target.split('.').next_back().unwrap_or(family_target);
+            let base_type_name = family_target
+                .split('.')
+                .next_back()
+                .unwrap_or(family_target);
             resolved_type = self.db.types.get(base_type_name);
         } else if let Some(base_type_name) = items.obj.identifier() {
             resolved_type = self.db.types.get(&base_type_name);
@@ -89,7 +92,10 @@ impl<'a> Compiler<'a> {
     }

     // 3. Fallback for root execution of standalone non-entity arrays
-    Err("Cannot compile a root array without a valid entity reference or table mapped via `items`.".to_string())
+    Err(
+        "Cannot compile a root array without a valid entity reference or table mapped via `items`."
+            .to_string(),
+    )
 }

 fn compile_reference(&mut self, node: Node<'a>) -> Result<(String, String), String> {
@@ -118,33 +124,28 @@ impl<'a> Compiler<'a> {
     }
     // Handle $family Polymorphism fallbacks for relations
     if let Some(family_target) = &node.schema.obj.family {
-        let base_type_name = family_target
-            .split('.')
-            .next_back()
-            .unwrap_or(family_target)
-            .to_string();
-
-        if let Some(type_def) = self.db.types.get(&base_type_name) {
-            if type_def.variations.len() == 1 {
-                let mut bypass_schema = crate::database::schema::Schema::default();
-                bypass_schema.obj.r#ref = Some(family_target.clone());
-                let mut bypass_node = node.clone();
-                bypass_node.schema = std::sync::Arc::new(bypass_schema);
-                return self.compile_node(bypass_node);
-            }
-
-            let mut sorted_variations: Vec<String> = type_def.variations.iter().cloned().collect();
-            sorted_variations.sort();
-
-            let mut family_schemas = Vec::new();
-            for variation in &sorted_variations {
-                let mut ref_schema = crate::database::schema::Schema::default();
-                ref_schema.obj.r#ref = Some(variation.clone());
-                family_schemas.push(std::sync::Arc::new(ref_schema));
-            }
-
-            return self.compile_one_of(&family_schemas, node);
+        let mut all_targets = vec![family_target.clone()];
+        if let Some(descendants) = self.db.descendants.get(family_target) {
+            all_targets.extend(descendants.clone());
         }

+        if all_targets.len() == 1 {
+            let mut bypass_schema = crate::database::schema::Schema::default();
+            bypass_schema.obj.r#ref = Some(all_targets[0].clone());
+            let mut bypass_node = node.clone();
+            bypass_node.schema = std::sync::Arc::new(bypass_schema);
+            return self.compile_node(bypass_node);
+        }
+
+        all_targets.sort();
+        let mut family_schemas = Vec::new();
+        for variation in &all_targets {
+            let mut ref_schema = crate::database::schema::Schema::default();
+            ref_schema.obj.r#ref = Some(variation.clone());
+            family_schemas.push(std::sync::Arc::new(ref_schema));
+        }
+
+        return self.compile_one_of(&family_schemas, node);
     }

     // Handle oneOf Polymorphism fallbacks for relations
@@ -224,49 +225,62 @@ impl<'a> Compiler<'a> {
     let mut select_args = Vec::new();

     if let Some(family_target) = node.schema.obj.family.as_ref() {
-        let base_type_name = family_target
-            .split('.')
-            .next_back()
-            .unwrap_or(family_target)
-            .to_string();
-
-        if let Some(fam_type_def) = self.db.types.get(&base_type_name) {
-            if fam_type_def.variations.len() == 1 {
-                let mut bypass_schema = crate::database::schema::Schema::default();
-                bypass_schema.obj.r#ref = Some(family_target.clone());
-                bypass_schema.compile(self.db, &mut std::collections::HashSet::new());
+        let family_prefix = family_target.rfind('.').map(|idx| &family_target[..idx]);
+
+        let mut all_targets = vec![family_target.clone()];
+        if let Some(descendants) = self.db.descendants.get(family_target) {
+            all_targets.extend(descendants.clone());
+        }
+
+        // Filter targets to EXACTLY match the family_target prefix
+        let mut final_targets = Vec::new();
+        for target in all_targets {
+            let target_prefix = target.rfind('.').map(|idx| &target[..idx]);
+            if target_prefix == family_prefix {
+                final_targets.push(target);
+            }
+        }
+
+        final_targets.sort();
+        final_targets.dedup();
+
+        if final_targets.len() == 1 {
+            let variation = &final_targets[0];
+            if let Some(target_schema) = self.db.schemas.get(variation) {
                 let mut bypass_node = node.clone();
-                bypass_node.schema = std::sync::Arc::new(bypass_schema);
+                bypass_node.schema = std::sync::Arc::new(target_schema.clone());

                 let mut bypassed_args = self.compile_select_clause(r#type, table_aliases, bypass_node)?;
                 select_args.append(&mut bypassed_args);
             } else {
-                let mut family_schemas = Vec::new();
-                let mut sorted_fam_variations: Vec<String> =
-                    fam_type_def.variations.iter().cloned().collect();
-                sorted_fam_variations.sort();
-
-                for variation in &sorted_fam_variations {
-                    let mut ref_schema = crate::database::schema::Schema::default();
-                    ref_schema.obj.r#ref = Some(variation.clone());
-                    ref_schema.compile(self.db, &mut std::collections::HashSet::new());
-                    family_schemas.push(std::sync::Arc::new(ref_schema));
-                }
-
-                let base_alias = table_aliases
-                    .get(&r#type.name)
-                    .cloned()
-                    .unwrap_or_else(|| node.parent_alias.to_string());
-                select_args.push(format!("'id', {}.id", base_alias));
-                let mut case_node = node.clone();
-                case_node.parent_alias = base_alias.clone();
-                let arc_aliases = std::sync::Arc::new(table_aliases.clone());
-                case_node.parent_type_aliases = Some(arc_aliases);
-
-                let (case_sql, _) = self.compile_one_of(&family_schemas, case_node)?;
-                select_args.push(format!("'type', {}", case_sql));
+                return Err(format!("Could not find schema for variation {}", variation));
             }
+        } else {
+            let mut family_schemas = Vec::new();
+
+            for variation in &final_targets {
+                if let Some(target_schema) = self.db.schemas.get(variation) {
+                    family_schemas.push(std::sync::Arc::new(target_schema.clone()));
+                } else {
+                    return Err(format!(
+                        "Could not find schema metadata for variation {}",
+                        variation
+                    ));
+                }
+            }
+
+            let base_alias = table_aliases
+                .get(&r#type.name)
+                .cloned()
+                .unwrap_or_else(|| node.parent_alias.to_string());
+            select_args.push(format!("'id', {}.id", base_alias));
+            let mut case_node = node.clone();
+            case_node.parent_alias = base_alias.clone();
+            let arc_aliases = std::sync::Arc::new(table_aliases.clone());
+            case_node.parent_type_aliases = Some(arc_aliases);

+            let (case_sql, _) = self.compile_one_of(&family_schemas, case_node)?;
+            select_args.push(format!("'type', {}", case_sql));
         }
     } else if let Some(one_of) = &node.schema.obj.one_of {
         let base_alias = table_aliases
@@ -328,10 +342,7 @@ impl<'a> Compiler<'a> {
     };

     for option_schema in schemas {
-        if let Some(ref_id) = &option_schema.obj.r#ref {
-            // Find the physical type this ref maps to
-            let base_type_name = ref_id.split('.').next_back().unwrap_or("").to_string();
-
+        if let Some(base_type_name) = option_schema.obj.identifier() {
             // Generate the nested SQL for this specific target type
             let mut child_node = node.clone();
             child_node.schema = std::sync::Arc::clone(option_schema);
@@ -452,7 +463,6 @@ impl<'a> Compiler<'a> {
             },
         };
-
         let (val_sql, val_type) = self.compile_node(child_node)?;

         if val_type != "abort" {
@@ -515,7 +525,13 @@ impl<'a> Compiler<'a> {
     // Determine if the property schema resolves to a physical Database Entity
     let mut bound_type_name = None;
     if let Some(family_target) = prop_schema.obj.family.as_ref() {
-        bound_type_name = Some(family_target.split('.').next_back().unwrap_or(family_target).to_string());
+        bound_type_name = Some(
+            family_target
+                .split('.')
+                .next_back()
+                .unwrap_or(family_target)
+                .to_string(),
+        );
     } else if let Some(lookup_key) = prop_schema.obj.identifier() {
         bound_type_name = Some(lookup_key);
     }
@@ -536,7 +552,10 @@ impl<'a> Compiler<'a> {
     }

     if let Some(col) = poly_col {
-        if let Some(alias) = type_aliases.get(table_to_alias).or_else(|| type_aliases.get(&node.parent_alias)) {
+        if let Some(alias) = type_aliases
+            .get(table_to_alias)
+            .or_else(|| type_aliases.get(&node.parent_alias))
+        {
             where_clauses.push(format!("{}.{} = '{}'", alias, col, type_name));
         }
     }
@@ -710,8 +729,6 @@ impl<'a> Compiler<'a> {
 ) -> Result<(), String> {
     if let Some(prop_ref) = &node.property_name {
         let prop = prop_ref.as_str();
-        println!("DEBUG: Eval prop: {}", prop);
-
         let mut parent_relation_alias = node.parent_alias.clone();
         let mut child_relation_alias = base_alias.to_string();
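The select-clause hunk above narrows polymorphic `$family` targets: it gathers the family plus its descendants, then keeps only those sharing the requested family's dotted namespace prefix (computed with `rfind('.')`) before choosing between the single-variation bypass and the `oneOf` CASE expansion. The prefix filter in isolation, with illustrative names:

```rust
// Sketch of the namespace filter above: keep only candidates whose dotted
// prefix matches the requested family's prefix. Names are illustrative.
fn filter_family(family_target: &str, descendants: &[&str]) -> Vec<String> {
    // "app.entity" -> Some("app"); "entity" -> None (root namespace).
    let family_prefix = family_target.rfind('.').map(|idx| &family_target[..idx]);

    let mut all: Vec<&str> = std::iter::once(family_target)
        .chain(descendants.iter().copied())
        .collect();

    // Keep only targets living in exactly the same namespace.
    all.retain(|t| t.rfind('.').map(|idx| &t[..idx]) == family_prefix);

    // Deterministic order plus dedup, mirroring final_targets.sort()/dedup().
    all.sort();
    all.dedup();
    all.into_iter().map(str::to_string).collect()
}
```

Sorting before the length check is what makes the "exactly one target → bypass, otherwise CASE over all targets" decision deterministic across runs.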
@@ -51,7 +51,7 @@ impl Queryer {
     };

     // 3. Execute via Database Executor
-    self.execute_sql(schema_id, &sql, &args)
+    self.execute_sql(schema_id, &sql, args)
 }

 fn extract_filters(
@@ -151,7 +151,7 @@ impl Queryer {
     &self,
     schema_id: &str,
     sql: &str,
-    args: &[serde_json::Value],
+    args: Vec<serde_json::Value>,
 ) -> crate::drop::Drop {
     match self.db.query(sql, Some(args)) {
         Ok(serde_json::Value::Array(table)) => {
@@ -1463,6 +1463,18 @@ fn test_queryer_0_8() {
     crate::tests::runner::run_test_case(&path, 0, 8).unwrap();
 }

+#[test]
+fn test_queryer_0_9() {
+    let path = format!("{}/fixtures/queryer.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 9).unwrap();
+}
+
+#[test]
+fn test_queryer_0_10() {
+    let path = format!("{}/fixtures/queryer.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 10).unwrap();
+}
+
 #[test]
 fn test_not_0_0() {
     let path = format!("{}/fixtures/not.json", env!("CARGO_MANIFEST_DIR"));
@@ -3467,6 +3479,36 @@ fn test_if_then_else_13_1() {
     crate::tests::runner::run_test_case(&path, 13, 1).unwrap();
 }

+#[test]
+fn test_database_0_0() {
+    let path = format!("{}/fixtures/database.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 0).unwrap();
+}
+
+#[test]
+fn test_database_1_0() {
+    let path = format!("{}/fixtures/database.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 1, 0).unwrap();
+}
+
+#[test]
+fn test_database_2_0() {
+    let path = format!("{}/fixtures/database.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 2, 0).unwrap();
+}
+
+#[test]
+fn test_database_3_0() {
+    let path = format!("{}/fixtures/database.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 3, 0).unwrap();
+}
+
+#[test]
+fn test_database_4_0() {
+    let path = format!("{}/fixtures/database.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 4, 0).unwrap();
+}
+
 #[test]
 fn test_empty_string_0_0() {
     let path = format!("{}/fixtures/emptyString.json", env!("CARGO_MANIFEST_DIR"));
@ -14,7 +14,7 @@ where
|
|||||||
}
|
}
|
||||||
|
|
||||||
// Type alias for easier reading
|
// Type alias for easier reading
|
||||||
type CompiledSuite = Arc<Vec<(Suite, Arc<crate::database::Database>)>>;
|
type CompiledSuite = Arc<Vec<(Suite, Arc<Result<Arc<crate::database::Database>, crate::drop::Drop>>)>>;
|
||||||
|
|
||||||
// Global cache mapping filename -> Vector of (Parsed JSON suite, Compiled Database)
|
// Global cache mapping filename -> Vector of (Parsed JSON suite, Compiled Database)
|
||||||
static CACHE: OnceLock<RwLock<HashMap<String, CompiledSuite>>> = OnceLock::new();
|
static CACHE: OnceLock<RwLock<HashMap<String, CompiledSuite>>> = OnceLock::new();
|
||||||
@@ -42,20 +42,13 @@ fn get_cached_file(path: &str) -> CompiledSuite {
 
     let mut compiled_suites = Vec::new();
     for suite in suites {
-        let db_result = crate::database::Database::new(&suite.database);
-        if let Err(drop) = db_result {
-            let error_messages: Vec<String> = drop
-                .errors
-                .into_iter()
-                .map(|e| format!("Error {} at path {}: {}", e.code, e.details.path, e.message))
-                .collect();
-            panic!(
-                "System Setup Compilation failed for {}:\n{}",
-                path,
-                error_messages.join("\n")
-            );
-        }
-        compiled_suites.push((suite, Arc::new(db_result.unwrap())));
+        let (db, drop) = crate::database::Database::new(&suite.database);
+        let compiled_db = if drop.errors.is_empty() {
+            Ok(Arc::new(db))
+        } else {
+            Err(drop)
+        };
+        compiled_suites.push((suite, Arc::new(compiled_db)));
     }
 
     let new_data = Arc::new(compiled_suites);
@@ -85,11 +78,36 @@ pub fn run_test_case(path: &str, suite_idx: usize, case_idx: usize) -> Result<()
     let test = &group.tests[case_idx];
     let mut failures = Vec::<String>::new();
 
+    // For validate/merge/query, if setup failed we must structurally fail this test
+    let db_unwrapped = if test.action.as_str() != "compile" {
+        match &**db {
+            Ok(valid_db) => Some(valid_db.clone()),
+            Err(drop) => {
+                let error_messages: Vec<String> = drop
+                    .errors
+                    .iter()
+                    .map(|e| format!("Error {} at path {}: {}", e.code, e.details.path, e.message))
+                    .collect();
+                failures.push(format!(
+                    "[{}] Cannot run '{}' test '{}': System Setup Compilation structurally failed:\n{}",
+                    group.description, test.action, test.description, error_messages.join("\n")
+                ));
+                None
+            }
+        }
+    } else {
+        None
+    };
+
+    if !failures.is_empty() {
+        return Err(failures.join("\n"));
+    }
+
     // 4. Run Tests
     match test.action.as_str() {
         "compile" => {
-            let result = test.run_compile(db.clone());
+            let result = test.run_compile(db);
             if let Err(e) = result {
                 println!("TEST COMPILE ERROR FOR '{}': {}", test.description, e);
                 failures.push(format!(
@@ -99,7 +117,7 @@ pub fn run_test_case(path: &str, suite_idx: usize, case_idx: usize) -> Result<()
             }
         }
         "validate" => {
-            let result = test.run_validate(db.clone());
+            let result = test.run_validate(db_unwrapped.unwrap());
             if let Err(e) = result {
                 println!("TEST VALIDATE ERROR FOR '{}': {}", test.description, e);
                 failures.push(format!(
@@ -109,7 +127,7 @@ pub fn run_test_case(path: &str, suite_idx: usize, case_idx: usize) -> Result<()
             }
         }
         "merge" => {
-            let result = test.run_merge(db.clone());
+            let result = test.run_merge(db_unwrapped.unwrap());
             if let Err(e) = result {
                 println!("TEST MERGE ERROR FOR '{}': {}", test.description, e);
                 failures.push(format!(
@@ -119,7 +137,7 @@ pub fn run_test_case(path: &str, suite_idx: usize, case_idx: usize) -> Result<()
             }
         }
         "query" => {
-            let result = test.run_query(db.clone());
+            let result = test.run_query(db_unwrapped.unwrap());
             if let Err(e) = result {
                 println!("TEST QUERY ERROR FOR '{}': {}", test.description, e);
                 failures.push(format!(
@@ -35,21 +35,21 @@ fn default_action() -> String {
 }
 
 impl Case {
-    pub fn run_compile(&self, _db: Arc<Database>) -> Result<(), String> {
-        let expected_success = self.expect.as_ref().map(|e| e.success).unwrap_or(false);
-
-        // We assume db has already been setup and compiled successfully by runner.rs's `jspg_setup`
-        // We just need to check if there are compilation errors vs expected success
-        let got_success = true; // Setup ensures success unless setup fails, which runner handles
-
-        if expected_success != got_success {
-            return Err(format!(
-                "Expected success: {}, Got: {}",
-                expected_success, got_success
-            ));
-        }
-
-        Ok(())
+    pub fn run_compile(
+        &self,
+        db_res: &Result<Arc<Database>, crate::drop::Drop>,
+    ) -> Result<(), String> {
+        let expect = match &self.expect {
+            Some(e) => e,
+            None => return Ok(()),
+        };
+
+        let result = match db_res {
+            Ok(_) => crate::drop::Drop::success(),
+            Err(d) => d.clone(),
+        };
+
+        expect.assert_drop(&result)
     }
 
     pub fn run_validate(&self, db: Arc<Database>) -> Result<(), String> {
@@ -57,8 +57,6 @@ impl Case {
 
         let validator = Validator::new(db);
 
-        let expected_success = self.expect.as_ref().map(|e| e.success).unwrap_or(false);
-
         let schema_id = &self.schema_id;
         if !validator.db.schemas.contains_key(schema_id) {
             return Err(format!(
@@ -70,19 +68,8 @@ impl Case {
         let test_data = self.data.clone().unwrap_or(Value::Null);
         let result = validator.validate(schema_id, &test_data);
 
-        let got_valid = result.errors.is_empty();
-
-        if got_valid != expected_success {
-            let error_msg = if result.errors.is_empty() {
-                "None".to_string()
-            } else {
-                format!("{:?}", result.errors)
-            };
-
-            return Err(format!(
-                "Expected: {}, Got: {}. Errors: {}",
-                expected_success, got_valid, error_msg
-            ));
+        if let Some(expect) = &self.expect {
+            expect.assert_drop(&result)?;
         }
 
         Ok(())
@@ -101,24 +88,16 @@ impl Case {
         let test_data = self.data.clone().unwrap_or(Value::Null);
         let result = merger.merge(&self.schema_id, test_data);
 
-        let expected_success = self.expect.as_ref().map(|e| e.success).unwrap_or(false);
-        let got_success = result.errors.is_empty();
-
-        let error_msg = if result.errors.is_empty() {
-            "None".to_string()
-        } else {
-            format!("{:?}", result.errors)
-        };
-
-        let return_val = if expected_success != got_success {
-            Err(format!(
-                "Merge Expected: {}, Got: {}. Errors: {}",
-                expected_success, got_success, error_msg
-            ))
-        } else if let Some(expect) = &self.expect {
-            let queries = db.executor.get_queries();
-            expect.assert_pattern(&queries)?;
-            expect.assert_sql(&queries)
+        let return_val = if let Some(expect) = &self.expect {
+            if let Err(e) = expect.assert_drop(&result) {
+                Err(format!("Merge {}", e))
+            } else if result.errors.is_empty() {
+                // Only assert SQL if merge succeeded
+                let queries = db.executor.get_queries();
+                expect.assert_pattern(&queries).and_then(|_| expect.assert_sql(&queries))
+            } else {
+                Ok(())
+            }
         } else {
             Ok(())
         };
@@ -139,24 +118,15 @@ impl Case {
 
         let result = queryer.query(&self.schema_id, self.filters.as_ref());
 
-        let expected_success = self.expect.as_ref().map(|e| e.success).unwrap_or(false);
-        let got_success = result.errors.is_empty();
-
-        let error_msg = if result.errors.is_empty() {
-            "None".to_string()
-        } else {
-            format!("{:?}", result.errors)
-        };
-
-        let return_val = if expected_success != got_success {
-            Err(format!(
-                "Query Expected: {}, Got: {}. Errors: {}",
-                expected_success, got_success, error_msg
-            ))
-        } else if let Some(expect) = &self.expect {
-            let queries = db.executor.get_queries();
-            expect.assert_pattern(&queries)?;
-            expect.assert_sql(&queries)
+        let return_val = if let Some(expect) = &self.expect {
+            if let Err(e) = expect.assert_drop(&result) {
+                Err(format!("Query {}", e))
+            } else if result.errors.is_empty() {
+                let queries = db.executor.get_queries();
+                expect.assert_pattern(&queries).and_then(|_| expect.assert_sql(&queries))
+            } else {
+                Ok(())
+            }
         } else {
             Ok(())
         };
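Both refactored arms reduce to the same shape: check the drop expectation first, then chain the SQL assertions only when the operation produced no errors. The `Result::and_then` chaining used above can be shown in miniature; `assert_pattern` and `assert_sql` here are illustrative stand-ins, not the crate's real functions.

```rust
// Each check returns Ok(()) or a message describing the first failure,
// mirroring the Result<(), String> style used by the test assertions.
fn assert_pattern(queries: &[&str]) -> Result<(), String> {
    if queries.iter().all(|q| q.starts_with("INSERT") || q.starts_with("SELECT")) {
        Ok(())
    } else {
        Err("unexpected statement kind".to_string())
    }
}

fn assert_sql(queries: &[&str]) -> Result<(), String> {
    if queries.is_empty() {
        Err("no SQL captured".to_string())
    } else {
        Ok(())
    }
}

fn main() {
    let queries = ["INSERT INTO invoice VALUES (1)", "SELECT * FROM invoice"];
    // and_then short-circuits: assert_sql only runs if assert_pattern passed.
    let outcome = assert_pattern(&queries).and_then(|_| assert_sql(&queries));
    assert!(outcome.is_ok());

    // An empty capture passes the pattern check trivially but fails the SQL check.
    let bad: [&str; 0] = [];
    assert!(assert_pattern(&bad).and_then(|_| assert_sql(&bad)).is_err());
}
```

Chaining with `and_then` keeps the first failure as the arm's return value without nested `match` blocks, which is why the refactor could delete the hand-rolled `expected_success`/`got_success` bookkeeping.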
78  src/tests/types/expect/drop.rs  Normal file
@@ -0,0 +1,78 @@
+use super::Expect;
+
+impl Expect {
+    pub fn assert_drop(&self, drop: &crate::drop::Drop) -> Result<(), String> {
+        let got_success = drop.errors.is_empty();
+
+        if self.success != got_success {
+            let mut err_msg = format!("Expected success: {}, Got: {}.", self.success, got_success);
+            if !drop.errors.is_empty() {
+                err_msg.push_str(&format!(" Actual Errors: {:?}", drop.errors));
+            }
+            return Err(err_msg);
+        }
+
+        if !self.success {
+            if let Some(expected_errors) = &self.errors {
+                let actual_values: Vec<serde_json::Value> = drop.errors
+                    .iter()
+                    .map(|e| serde_json::to_value(e).unwrap())
+                    .collect();
+
+                for (i, expected_val) in expected_errors.iter().enumerate() {
+                    let mut matched = false;
+
+                    for actual_val in &actual_values {
+                        if subset_match(expected_val, actual_val) {
+                            matched = true;
+                            break;
+                        }
+                    }
+
+                    if !matched {
+                        return Err(format!(
+                            "Expected error {} was not found in actual errors.\nExpected subset: {}\nActual full errors: {:?}",
+                            i,
+                            serde_json::to_string_pretty(expected_val).unwrap(),
+                            drop.errors,
+                        ));
+                    }
+                }
+            }
+        }
+
+        Ok(())
+    }
+}
+
+// Helper to check if `expected` is a structural subset of `actual`
+fn subset_match(expected: &serde_json::Value, actual: &serde_json::Value) -> bool {
+    match (expected, actual) {
+        (serde_json::Value::Object(exp_map), serde_json::Value::Object(act_map)) => {
+            for (k, v) in exp_map {
+                if let Some(act_v) = act_map.get(k) {
+                    if !subset_match(v, act_v) {
+                        return false;
+                    }
+                } else {
+                    return false;
+                }
+            }
+            true
+        }
+        (serde_json::Value::Array(exp_arr), serde_json::Value::Array(act_arr)) => {
+            // Basic check: array sizes and elements must match exactly in order
+            if exp_arr.len() != act_arr.len() {
+                return false;
+            }
+            for (e, a) in exp_arr.iter().zip(act_arr.iter()) {
+                if !subset_match(e, a) {
+                    return false;
+                }
+            }
+            true
+        }
+        // For primitives, exact match
+        (e, a) => e == a,
+    }
+}
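The `subset_match` helper added above implements a structural-subset rule: every key the expected object mentions must exist and match (recursively) in the actual object, extra actual fields are ignored, arrays must match element-for-element in order, and primitives must be equal. The same rule can be sketched stand-alone; to keep it runnable without external crates, this sketch uses a small hand-rolled value type in place of `serde_json::Value`.

```rust
use std::collections::BTreeMap;

// Tiny stand-in for serde_json::Value, just enough to show the rule.
#[derive(Clone, PartialEq, Debug)]
enum Val {
    Str(String),
    Arr(Vec<Val>),
    Obj(BTreeMap<String, Val>),
}

// `expected` matches if every field it mentions is present and matching in
// `actual`; extra fields in `actual` are ignored. Arrays must agree in
// length and order; leaves must be equal.
fn subset_match(expected: &Val, actual: &Val) -> bool {
    match (expected, actual) {
        (Val::Obj(exp), Val::Obj(act)) => exp
            .iter()
            .all(|(k, v)| act.get(k).is_some_and(|a| subset_match(v, a))),
        (Val::Arr(exp), Val::Arr(act)) => {
            exp.len() == act.len()
                && exp.iter().zip(act).all(|(e, a)| subset_match(e, a))
        }
        (e, a) => e == a,
    }
}

fn main() {
    let obj = |pairs: &[(&str, Val)]| {
        Val::Obj(pairs.iter().map(|(k, v)| (k.to_string(), v.clone())).collect())
    };
    let s = |x: &str| Val::Str(x.to_string());

    // The actual error carries more detail than the fixture asserts on.
    let actual = obj(&[("code", s("E001")), ("path", s("/a/b")), ("message", s("boom"))]);

    assert!(subset_match(&obj(&[("code", s("E001"))]), &actual)); // subset: passes
    assert!(!subset_match(&obj(&[("code", s("E999"))]), &actual)); // wrong value: fails
}
```

This is why fixtures only need to pin the error fields they care about (say, `code` and `path`) while the engine remains free to enrich its error payloads without breaking every test.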
@@ -1,5 +1,6 @@
 pub mod pattern;
 pub mod sql;
+pub mod drop;
 
 use serde::Deserialize;
 
@@ -31,10 +31,7 @@ impl<'a> ValidationContext<'a> {
         }
 
         if let Some(family_target) = &self.schema.family {
-            // The descendants map is keyed by the schema's own $id, not the target string.
-            if let Some(schema_id) = &self.schema.id
-                && let Some(descendants) = self.db.descendants.get(schema_id)
-            {
+            if let Some(descendants) = self.db.descendants.get(family_target) {
                 // Validate against all descendants simulating strict oneOf logic
                 let mut passed_candidates: Vec<(String, usize, ValidationResult)> = Vec::new();
 