Compare commits
20 Commits

| SHA1 |
|---|
| 93b0a70718 |
| 9c24f1af8f |
| f9cf1f837a |
| 796df7763c |
| 4a10833f50 |
| 46fc032026 |
| 7ec06b81cc |
| c4e8e0309f |
| eb91b65e65 |
| 8bf3649465 |
| 9fe5a34163 |
| f5bf21eb58 |
| 9dcafed406 |
| ffd6c27da3 |
| 4941dc6069 |
| a8a15a82ef |
| 8dcc714963 |
| f87ac81f3b |
| 8ca9017cc4 |
| 10c57e59ec |
GEMINI.md (23 changes)
@@ -7,14 +7,14 @@

 JSPG operates by deeply integrating the JSON Schema Draft 2020-12 specification directly into the Postgres session lifecycle. It is built around three core pillars:

 * **Validator**: In-memory, near-instant JSON structural validation and type polymorphism routing.

 * **Merger**: Automatically traverse and UPSERT deeply nested JSON graphs into normalized relational tables.

-* **Queryer**: Compile JSON Schemas into static, cached SQL SPI `SELECT` plans for fetching full entities or isolated "Stems".
+* **Queryer**: Compile JSON Schemas into static, cached SQL SPI `SELECT` plans for fetching full entities or isolated ad-hoc object boundaries.

 ### 🎯 Goals

 1. **Draft 2020-12 Compliance**: Attempt to adhere to the official JSON Schema Draft 2020-12 specification.

 2. **Ultra-Fast Execution**: Compile schemas into optimized in-memory validation trees and cached SQL SPIs to bypass Postgres Query Builder overheads.

 3. **Connection-Bound Caching**: Leverage the PostgreSQL session lifecycle using an **Atomic Swap** pattern. Schemas are 100% frozen, completely eliminating locks during read access.

 4. **Structural Inheritance**: Support object-oriented schema design via Implicit Keyword Shadowing and virtual `$family` references natively mapped to Postgres table constraints.

-5. **Reactive Beats**: Provide natively generated "Stems" (isolated payload fragments) for dynamic websocket reactivity.
+5. **Reactive Beats**: Provide ultra-fast natively generated flat payloads mapping directly to the Dart topological state for dynamic websocket reactivity.

 ### Concurrency & Threading ("Immutable Graphs")

 To support high-throughput operations while allowing for runtime updates (e.g., during hot-reloading), JSPG uses an **Atomic Swap** pattern:
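The Atomic Swap pattern referenced above can be sketched in a few lines; this is an illustrative Python model only (`SchemaRegistry` and its methods are hypothetical names, not the extension's actual API). Readers take a snapshot of a frozen graph; a writer rebuilds the graph off to the side and swaps the reference in one assignment, so no read-side lock is needed:

```python
import threading
from types import MappingProxyType

class SchemaRegistry:
    """Illustrative Atomic Swap: readers never lock; writers rebuild and swap."""

    def __init__(self):
        self._schemas = MappingProxyType({})   # frozen, read-only view
        self._write_lock = threading.Lock()    # serializes writers only

    def snapshot(self):
        # Readers grab the current frozen mapping; a concurrent swap
        # cannot mutate it out from under them.
        return self._schemas

    def reload(self, new_schemas: dict):
        # Build the replacement graph on the side, then swap the
        # reference in a single atomic assignment.
        frozen = MappingProxyType(dict(new_schemas))
        with self._write_lock:
            self._schemas = frozen

registry = SchemaRegistry()
registry.reload({"person": {"type": "object"}})
snap = registry.snapshot()
registry.reload({"person": {"type": "object"}, "order": {"type": "object"}})
# The earlier snapshot still sees the old graph:
print(len(snap), len(registry.snapshot()))  # 1 2
```

The point of the pattern is that a hot reload never blocks in-flight reads: they finish against the snapshot they already hold.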
@@ -118,22 +118,11 @@ The Queryer transforms Postgres into a pre-compiled Semantic Query Engine, desig

 * **Multi-Table Branching**: If the Physical Table is a parent to other tables (e.g. `organization` has variations `["organization", "bot", "person"]`), the compiler generates a dynamic `CASE WHEN type = '...' THEN ...` query, expanding into `JOIN`s for each variation.

 * **Single-Table Bypass**: If the Physical Table is a leaf node with only one variation (e.g. `person` has variations `["person"]`), the compiler cleanly bypasses `CASE` generation and compiles a simple `SELECT` across the base table, as all schema extensions (e.g. `light.person`, `full.person`) are guaranteed to reside in the exact same physical row.

-### The Stem Engine
+### Ad-Hoc Schema Promotion

-Rather than over-fetching heavy Entity payloads and trimming them, Punc Framework Websockets depend on isolated subgraphs defined as **Stems**.
-A `Stem` is a declaration of an **Entity Type boundary** that exists somewhere within the compiled JSON Schema graph, expressed using **`gjson` multipath syntax** (e.g., `contacts.#.phone_numbers.#`).
-Because `pg_notify` (Beats) fire rigidly from physical Postgres tables (e.g. `{"type": "phone_number"}`), the Go Framework only ever needs to know: "Does the schema `with_contacts.person` contain the `phone_number` Entity anywhere inside its tree, and if so, what is the gjson path to iterate its payload?"
-* **Initialization:** During startup (`jspg_stems()`), the database crawls all Schemas and maps out every physical Entity Type it references. It builds a highly optimized `HashMap<String, HashMap<String, Arc<Stem>>>` providing strictly `O(1)` memory lookups mapping `Schema ID -> { Stem Path -> Entity Type }`.
-* **GJSON Pathing:** Unlike standard JSON Pointers, stems utilize `.#` array iterator syntax. The Go web server consumes this native path (e.g. `lines.#`) across the raw Postgres JSON byte payload, extracting all active UUIDs in one massive sub-millisecond sweep without unmarshaling Go ASTs.
-* **Polymorphic Condition Selectors:** When trailing paths would otherwise collide because of abstract polymorphic type definitions (e.g., a `target` property bounded by a `oneOf` taking either `phone_number` or `email_address`), JSPG natively appends evaluated `gjson` type conditions into the path (e.g. `contacts.#.target#(type=="phone_number")`). This guarantees `O(1)` key uniqueness in the HashMap while retaining extreme array extraction speeds natively without runtime AST evaluation.
-* **Identifier Prioritization:** When determining if a nested object boundary is an Entity, JSPG natively prioritizes defined `$id` tags over `$ref` inheritance pointers to prevent polymorphic boundaries from devolving into their generic base classes.
-* **Cyclical Deduplication:** Because Punc relationships often reference back on themselves via deeply nested classes, the Stem Engine applies intelligent path deduplication. If the active `current_path` already ends with the target entity string, it traverses the inheritance properties without appending the entity to the stem path again, eliminating infinite powerset loops.
-* **Relationship Path Squashing:** When calculating string paths structurally, JSPG intentionally **omits** properties natively named `target` or `source` if they belong to a native database `relationship` table override.
-* **The Go Router**: The Golang Punc framework uses this exact mapping to register WebSocket Beat frequencies exclusively on the Entity types discovered.
-* **The Queryer Execution**: When the Go framework asks JSPG to hydrate a partial `phone_number` stem for the `with_contacts.person` schema, instead of jumping through string paths, the SQL Compiler simply reaches into the Schema's AST using the `phone_number` Type string, pulls out exactly that entity's mapping rules, and returns a fully correlated `SELECT` block! This natively handles nested array properties injected via `oneOf` or array references efficiently bypassing runtime powerset expansion.
-* **Performance:** These Stem execution structures are fully statically compiled via SPI and map perfectly to `O(1)` real-time routing logic on the application tier.
+To seamlessly support deeply nested, inline Object definitions that don't declare an explicit `$id`, JSPG aggressively promotes them to standalone topological entities during the database compilation phase.
+
+* **Hash Generation:** While evaluating the unified graph, if the compiler enters an `Object` or `Array` structure completely lacking an `$id`, it dynamically calculates a localized hash alias representing exactly its structural constraints.
+* **Promotion:** This inline chunk is elevated to its own `$id` in the `db.schemas` cache registry. This guarantees that `O(1)` WebSockets or isolated queries can natively target any arbitrary sub-object of a massive database topology directly, without recursively re-parsing its parent's AST block on every read.

 ## 5. Testing & Execution Architecture
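The hash-promotion step added above can be sketched as follows. This is an illustrative Python model: `promote_inline_schema`, the `adhoc.` prefix, and the registry shape are assumptions for demonstration, not JSPG's actual API. The key property is that structurally identical inline chunks collapse to the same promoted `$id`:

```python
import hashlib
import json

def promote_inline_schema(schema: dict, registry: dict) -> str:
    """Hypothetical sketch: derive a stable hash alias for an inline
    object/array schema lacking an explicit `$id`, and register it as
    a standalone entry so it can be targeted directly later."""
    if "$id" in schema:
        return schema["$id"]
    # Canonical serialization so structurally identical inline chunks
    # always hash to the same alias.
    canonical = json.dumps(schema, sort_keys=True, separators=(",", ":"))
    alias = "adhoc." + hashlib.sha256(canonical.encode()).hexdigest()[:12]
    registry[alias] = schema
    return alias

registry = {}
inline = {"type": "object", "properties": {"name": {"type": "string"}}}
a1 = promote_inline_schema(inline, registry)
a2 = promote_inline_schema(dict(inline), registry)
print(a1 == a2)  # True: identical structures share one promoted $id
```

Once promoted, a query or websocket route can address the chunk by its alias without walking the parent schema's AST.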
LOOKUP_VERIFICATION.md (new file, 58 lines)
@@ -0,0 +1,58 @@
# The Postgres Partial Index Claiming Pattern

This document outlines the architectural strategy for securely handling the deduplication, claiming, and verification of sensitive unique identifiers (such as email addresses or phone numbers) strictly through PostgreSQL, without requiring "magical" logic in the JSPG `Merger`.

## The Denial-of-Service (DoS) Squatter Problem

If you enforce a standard `UNIQUE` constraint on an email address table:

1. Malicious User A signs up and adds `jeff.bezos@amazon.com` to their account but never verifies it.
2. The real Jeff Bezos signs up.
3. The database blocks Jeff because the unique string already exists.

The squatter has effectively locked the legitimate owner out of the system.

## The Anti-Patterns

1. **Global Entity Flags**: Adding a global `verified` boolean to the root `entity` table forces unrelated objects (like Widgets, Invoices, and Orders) to carry verification logic that doesn't belong to them.

2. **Magical Merger Logic**: Making JSPG's `Merger` aware of a specific `verified` field breaks its pure structural translation model. The Merger shouldn't need hardcoded conditional logic to know whether it's allowed to update an unverified row.

## The Solution: Postgres Partial Unique Indexes

The cleanest approach is to defer all claiming logic natively to the database engine using a **Partial Unique Index**.

```sql
-- Remove any existing global unique constraint on address first
CREATE UNIQUE INDEX lk_email_address_verified
ON email_address (address)
WHERE verified_at IS NOT NULL;
```

### How the Lifecycle Works Natively

1. **Unverified Squatters (Isolated Rows):**
   A hundred different users can send `{ "address": "jeff.bezos@amazon.com" }` through the `save_person` Punc. Because the Punc isolates them and doesn't allow setting the `verified_at` property (enforced by the JSON schema), the JSPG Merger inserts `NULL`.
   Postgres permits all 100 `INSERT` commands to succeed because the Partial Index **ignores** rows where `verified_at IS NULL`. Every user gets their own isolated, unverified row acting as a placeholder on their contact edge.

2. **The Verification Race (The Claim):**
   The real Jeff clicks his magic verification link. The backend securely executes a specific verification Punc that runs:
   `UPDATE email_address SET verified_at = now() WHERE id = <jeff's-real-uuid>`

3. **The Lockout:**
   Because Jeff's row now satisfies `verified_at IS NOT NULL`, that exact row enters the Partial Unique Index.
   If any of the other 99 squatters *ever* click their fake verification links (or if a new user tries to verify the same email), PostgreSQL hits the index and throws a **Unique Constraint Violation**, blocking them. The winner has permanently claimed the slot across the entire environment.

### Periodic Cleanup

Since unverified rows are allowed to accumulate without colliding, a simple `pg_cron` job or backend worker can sweep the table nightly to prune abandoned claims and reclaim storage:

```sql
DELETE FROM email_address
WHERE verified_at IS NULL
AND created_at < NOW() - INTERVAL '24 hours';
```

### Why This Is the Right Architecture

* The **JSPG Merger** remains structurally pure. It doesn't know what `verified_at` is; it simply respects the database's structural limits (pure `O(1)` translation).
* **Row-Level Security (RLS)** naturally blocks users from seeing or claiming each other's unverified rows.
* Complex race-condition tracking is offloaded entirely to PostgreSQL's B-Tree indexing engine, which guarantees cluster-wide atomicity.
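Because SQLite also supports partial unique indexes, the claim lifecycle above can be exercised end-to-end in a self-contained script. The semantics shown mirror the Postgres index for this case, though the table shape here is deliberately simplified:

```python
import sqlite3

# SQLite supports partial unique indexes too, so the claiming lifecycle
# can be demonstrated without a running Postgres cluster.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE email_address (
    id INTEGER PRIMARY KEY,
    address TEXT NOT NULL,
    verified_at TEXT
);
CREATE UNIQUE INDEX lk_email_address_verified
    ON email_address (address)
    WHERE verified_at IS NOT NULL;
""")

# 1. Many unverified squatters insert the same address: all succeed,
#    because rows with verified_at IS NULL are ignored by the index.
for _ in range(3):
    db.execute("INSERT INTO email_address (address) VALUES (?)",
               ("jeff.bezos@amazon.com",))

# 2. The real owner verifies row 2: that row enters the partial index.
db.execute("UPDATE email_address SET verified_at = datetime('now') WHERE id = 2")

# 3. A squatter tries to verify the same address: unique violation.
try:
    db.execute("UPDATE email_address SET verified_at = datetime('now') WHERE id = 1")
    claimed_twice = True
except sqlite3.IntegrityError:
    claimed_twice = False

print(claimed_twice)  # False: the slot is locked to the verified row
```

The failed `UPDATE` in step 3 is exactly the lockout described above: the constraint only bites at verification time, never at insert time.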
@@ -142,7 +142,7 @@
 "errors": [
 {
 "code": "CONST_VIOLATED",
-"path": "/con"
+"path": "con"
 }
 ]
 }
@@ -48,7 +48,7 @@
 "errors": [
 {
 "code": "TYPE_MISMATCH",
-"path": "/base_prop"
+"path": "base_prop"
 }
 ]
 }
@@ -109,7 +109,7 @@
 "errors": [
 {
 "code": "REQUIRED_FIELD_MISSING",
-"path": "/a"
+"path": "a"
 }
 ]
 }
@@ -126,7 +126,7 @@
 "errors": [
 {
 "code": "REQUIRED_FIELD_MISSING",
-"path": "/b"
+"path": "b"
 }
 ]
 }
@@ -196,7 +196,7 @@
 "errors": [
 {
 "code": "DEPENDENCY_FAILED",
-"path": "/base_dep"
+"path": "base_dep"
 }
 ]
 }
@@ -214,7 +214,7 @@
 "errors": [
 {
 "code": "DEPENDENCY_FAILED",
-"path": "/child_dep"
+"path": "child_dep"
 }
 ]
 }
@@ -972,7 +972,12 @@
 "LEFT JOIN agreego.\"user\" t2 ON t2.id = t1.id",
 "LEFT JOIN agreego.\"organization\" t3 ON t3.id = t1.id",
 "LEFT JOIN agreego.\"entity\" t4 ON t4.id = t1.id",
-"WHERE \"first_name\" = 'LookupFirst' AND \"last_name\" = 'LookupLast' AND \"date_of_birth\" = '1990-01-01T00:00:00Z' AND \"pronouns\" = 'they/them'"
+"WHERE (",
+" \"first_name\" = 'LookupFirst'",
+" AND \"last_name\" = 'LookupLast'",
+" AND \"date_of_birth\" = '1990-01-01T00:00:00Z'",
+" AND \"pronouns\" = 'they/them'",
+")"
 ],
 [
 "UPDATE agreego.\"person\"",
@@ -1039,6 +1044,177 @@
 ]
 }
 },
+{
+"description": "Update existing person with id (lookup)",
+"action": "merge",
+"data": {
+"id": "33333333-3333-3333-3333-333333333333",
+"type": "person",
+"first_name": "LookupFirst",
+"last_name": "LookupLast",
+"date_of_birth": "1990-01-01T00:00:00Z",
+"pronouns": "they/them",
+"contact_id": "abc-contact"
+},
+"mocks": [
+{
+"id": "22222222-2222-2222-2222-222222222222",
+"type": "person",
+"first_name": "LookupFirst",
+"last_name": "LookupLast",
+"date_of_birth": "1990-01-01T00:00:00Z",
+"pronouns": "they/them",
+"contact_id": "old-contact"
+}
+],
+"schema_id": "person",
+"expect": {
+"success": true,
+"sql": [
+[
+"SELECT to_jsonb(t1.*) || to_jsonb(t2.*) || to_jsonb(t3.*) || to_jsonb(t4.*)",
+"FROM agreego.\"person\" t1",
+"LEFT JOIN agreego.\"user\" t2 ON t2.id = t1.id",
+"LEFT JOIN agreego.\"organization\" t3 ON t3.id = t1.id",
+"LEFT JOIN agreego.\"entity\" t4 ON t4.id = t1.id",
+"WHERE",
+" t1.id = '33333333-3333-3333-3333-333333333333'",
+" OR (",
+" \"first_name\" = 'LookupFirst'",
+" AND \"last_name\" = 'LookupLast'",
+" AND \"date_of_birth\" = '1990-01-01T00:00:00Z'",
+" AND \"pronouns\" = 'they/them'",
+" )"
+],
+[
+"UPDATE agreego.\"person\"",
+"SET",
+" \"contact_id\" = 'abc-contact'",
+"WHERE",
+" id = '22222222-2222-2222-2222-222222222222'"
+],
+[
+"UPDATE agreego.\"entity\"",
+"SET",
+" \"modified_at\" = '2026-03-10T00:00:00Z',",
+" \"modified_by\" = '00000000-0000-0000-0000-000000000000'",
+"WHERE",
+" id = '22222222-2222-2222-2222-222222222222'"
+],
+[
+"INSERT INTO agreego.change (",
+" \"old\",",
+" \"new\",",
+" entity_id,",
+" id,",
+" kind,",
+" modified_at,",
+" modified_by",
+")",
+"VALUES (",
+" '{",
+" \"contact_id\":\"old-contact\"",
+" }',",
+" '{",
+" \"contact_id\":\"abc-contact\",",
+" \"type\":\"person\"",
+" }',",
+" '22222222-2222-2222-2222-222222222222',",
+" '{{uuid}}',",
+" 'update',",
+" '{{timestamp}}',",
+" '00000000-0000-0000-0000-000000000000'",
+")"
+],
+[
+"SELECT pg_notify('entity', '{",
+" \"complete\":{",
+" \"contact_id\":\"abc-contact\",",
+" \"date_of_birth\":\"1990-01-01T00:00:00Z\",",
+" \"first_name\":\"LookupFirst\",",
+" \"id\":\"22222222-2222-2222-2222-222222222222\",",
+" \"last_name\":\"LookupLast\",",
+" \"modified_at\":\"2026-03-10T00:00:00Z\",",
+" \"modified_by\":\"00000000-0000-0000-0000-000000000000\",",
+" \"pronouns\":\"they/them\",",
+" \"type\":\"person\"",
+" },",
+" \"new\":{",
+" \"contact_id\":\"abc-contact\",",
+" \"type\":\"person\"",
+" },",
+" \"old\":{",
+" \"contact_id\":\"old-contact\"",
+" },",
+" \"replaces\":\"33333333-3333-3333-3333-333333333333\"",
+" }')"
+]
+]
+}
+},
+{
+"description": "Replace existing person with id and no changes (lookup)",
+"action": "merge",
+"data": {
+"id": "33333333-3333-3333-3333-333333333333",
+"type": "person",
+"first_name": "LookupFirst",
+"last_name": "LookupLast",
+"date_of_birth": "1990-01-01T00:00:00Z",
+"pronouns": "they/them"
+},
+"mocks": [
+{
+"id": "22222222-2222-2222-2222-222222222222",
+"type": "person",
+"first_name": "LookupFirst",
+"last_name": "LookupLast",
+"date_of_birth": "1990-01-01T00:00:00Z",
+"pronouns": "they/them",
+"contact_id": "old-contact"
+}
+],
+"schema_id": "person",
+"expect": {
+"success": true,
+"sql": [
+[
+"SELECT to_jsonb(t1.*) || to_jsonb(t2.*) || to_jsonb(t3.*) || to_jsonb(t4.*)",
+"FROM agreego.\"person\" t1",
+"LEFT JOIN agreego.\"user\" t2 ON t2.id = t1.id",
+"LEFT JOIN agreego.\"organization\" t3 ON t3.id = t1.id",
+"LEFT JOIN agreego.\"entity\" t4 ON t4.id = t1.id",
+"WHERE",
+" t1.id = '33333333-3333-3333-3333-333333333333'",
+" OR (",
+" \"first_name\" = 'LookupFirst'",
+" AND \"last_name\" = 'LookupLast'",
+" AND \"date_of_birth\" = '1990-01-01T00:00:00Z'",
+" AND \"pronouns\" = 'they/them'",
+" )"
+],
+[
+"SELECT pg_notify('entity', '{",
+" \"complete\":{",
+" \"contact_id\":\"old-contact\",",
+" \"date_of_birth\":\"1990-01-01T00:00:00Z\",",
+" \"first_name\":\"LookupFirst\",",
+" \"id\":\"22222222-2222-2222-2222-222222222222\",",
+" \"last_name\":\"LookupLast\",",
+" \"modified_at\":\"2026-03-10T00:00:00Z\",",
+" \"modified_by\":\"00000000-0000-0000-0000-000000000000\",",
+" \"pronouns\":\"they/them\",",
+" \"type\":\"person\"",
+" },",
+" \"new\":{",
+" \"type\":\"person\"",
+" },",
+" \"replaces\":\"33333333-3333-3333-3333-333333333333\"",
+" }')"
+]
+]
+}
+},
 {
 "description": "Update existing person with id (no lookup)",
 "action": "merge",
@@ -1484,7 +1660,7 @@
 "SELECT to_jsonb(t1.*) || to_jsonb(t2.*)",
 "FROM agreego.\"order\" t1",
 "LEFT JOIN agreego.\"entity\" t2 ON t2.id = t1.id",
-"WHERE t1.id = 'abc'"
+"WHERE t1.id = 'abc' OR (\"id\" = 'abc')"
 ],
 [
 "INSERT INTO agreego.\"entity\" (",
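The id-or-natural-key lookup these fixtures expect can be sketched as a small clause builder. `build_lookup_where` is a hypothetical helper for illustration only; note that the fixtures embed literals, but real code should bind parameters rather than interpolate values:

```python
def build_lookup_where(entity_id, lookup):
    """Hypothetical sketch of the fixtures' lookup WHERE clause:
    match the incoming id, OR fall back to the schema's natural-key
    lookup columns (quoting/formatting mirrors the fixture output)."""
    lines = ["WHERE", f" t1.id = '{entity_id}'"]
    if lookup:
        lines.append(" OR (")
        for i, (col, val) in enumerate(lookup.items()):
            prefix = "   " if i == 0 else "   AND "
            lines.append(f"{prefix}\"{col}\" = '{val}'")
        lines.append(" )")
    return lines

clause = build_lookup_where(
    "33333333-3333-3333-3333-333333333333",
    {"first_name": "LookupFirst", "last_name": "LookupLast"},
)
```

The `OR (...)` branch is what lets a merge adopt an existing row whose natural key matches even when the caller supplied a different id.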
fixtures/paths.json (new file, 214 lines)
@@ -0,0 +1,214 @@
[
  {
    "description": "Hybrid Array Pathing",
    "database": {
      "schemas": [
        {
          "$id": "hybrid_pathing",
          "type": "object",
          "properties": {
            "primitives": {
              "type": "array",
              "items": { "type": "string" }
            },
            "ad_hoc_objects": {
              "type": "array",
              "items": {
                "type": "object",
                "properties": { "name": { "type": "string" } },
                "required": ["name"]
              }
            },
            "entities": {
              "type": "array",
              "items": {
                "type": "object",
                "properties": {
                  "id": { "type": "string" },
                  "value": { "type": "number", "minimum": 10 }
                }
              }
            },
            "deep_entities": {
              "type": "array",
              "items": {
                "type": "object",
                "properties": {
                  "id": { "type": "string" },
                  "nested": {
                    "type": "array",
                    "items": {
                      "type": "object",
                      "properties": {
                        "id": { "type": "string" },
                        "flag": { "type": "boolean" }
                      }
                    }
                  }
                }
              }
            }
          }
        }
      ]
    },
    "tests": [
      {
        "description": "happy path passes structural validation",
        "data": {
          "primitives": ["a", "b"],
          "ad_hoc_objects": [{ "name": "obj1" }],
          "entities": [{ "id": "entity-1", "value": 15 }],
          "deep_entities": [
            { "id": "parent-1", "nested": [{ "id": "child-1", "flag": true }] }
          ]
        },
        "schema_id": "hybrid_pathing",
        "action": "validate",
        "expect": { "success": true }
      },
      {
        "description": "primitive arrays use numeric indexing",
        "data": { "primitives": ["a", 123] },
        "schema_id": "hybrid_pathing",
        "action": "validate",
        "expect": {
          "success": false,
          "errors": [{ "code": "INVALID_TYPE", "path": "primitives/1" }]
        }
      },
      {
        "description": "ad-hoc objects without ids use numeric indexing",
        "data": { "ad_hoc_objects": [{ "name": "valid" }, { "age": 30 }] },
        "schema_id": "hybrid_pathing",
        "action": "validate",
        "expect": {
          "success": false,
          "errors": [{ "code": "REQUIRED_FIELD_MISSING", "path": "ad_hoc_objects/1/name" }]
        }
      },
      {
        "description": "arrays of objects with ids use topological uuid indexing",
        "data": {
          "entities": [
            { "id": "entity-alpha", "value": 20 },
            { "id": "entity-beta", "value": 5 }
          ]
        },
        "schema_id": "hybrid_pathing",
        "action": "validate",
        "expect": {
          "success": false,
          "errors": [{ "code": "MINIMUM_VIOLATED", "path": "entities/entity-beta/value" }]
        }
      },
      {
        "description": "deeply nested entity arrays retain full topological paths",
        "data": {
          "deep_entities": [
            {
              "id": "parent-omega",
              "nested": [
                { "id": "child-alpha", "flag": true },
                { "id": "child-beta", "flag": "invalid-string" }
              ]
            }
          ]
        },
        "schema_id": "hybrid_pathing",
        "action": "validate",
        "expect": {
          "success": false,
          "errors": [{ "code": "INVALID_TYPE", "path": "deep_entities/parent-omega/nested/child-beta/flag" }]
        }
      }
    ]
  }
]
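The hybrid pathing rule these fixtures exercise can be modeled in a few lines. `error_path` is an illustrative helper, not part of JSPG: array items that carry an `id` are addressed by that id, while primitives and ad-hoc objects fall back to their numeric index:

```python
def error_path(segments) -> str:
    """Illustrative model of hybrid pathing: each segment is a
    (key, value) pair, where key is the property name or array index
    and value is the node being entered (None for scalar hops)."""
    parts = []
    for key, value in segments:
        if isinstance(value, dict) and "id" in value:
            parts.append(str(value["id"]))  # topological id indexing
        else:
            parts.append(str(key))          # numeric / property indexing
    return "/".join(parts)

# A failing boolean deep inside an entity array keeps the full
# topological path instead of positional indexes:
path = error_path([
    ("deep_entities", None),
    (0, {"id": "parent-omega"}),
    ("nested", None),
    (1, {"id": "child-beta"}),
    ("flag", None),
])
print(path)  # deep_entities/parent-omega/nested/child-beta/flag
```

The same walk over an id-less item degrades gracefully to numeric indexing, matching the `primitives/1` and `ad_hoc_objects/1/name` fixtures.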
@@ -20,6 +20,16 @@
 "$family": "base.person"
 }
 ]
+},
+{
+"name": "get_orders",
+"schemas": [
+{
+"$id": "get_orders.response",
+"type": "array",
+"items": { "$ref": "light.order" }
+}
+]
 }
 ],
 "enums": [],
@@ -664,6 +674,15 @@
 }
 }
 },
+{
+"$id": "light.order",
+"$ref": "order",
+"properties": {
+"customer": {
+"$ref": "base.person"
+}
+}
+},
 {
 "$id": "full.order",
 "$ref": "order",
@@ -1003,6 +1022,7 @@
 " JOIN agreego.entity entity_6 ON entity_6.id = relationship_5.id",
 " WHERE",
 " NOT entity_6.archived",
+" AND relationship_5.target_type = 'address'",
 " AND relationship_5.source_id = entity_3.id),",
 " 'age', person_1.age,",
 " 'archived', entity_3.archived,",
@@ -1094,6 +1114,7 @@
 " JOIN agreego.entity entity_20 ON entity_20.id = relationship_19.id",
 " WHERE",
 " NOT entity_20.archived",
+" AND relationship_19.target_type = 'email_address'",
 " AND relationship_19.source_id = entity_3.id),",
 " 'first_name', person_1.first_name,",
 " 'id', entity_3.id,",
@@ -1127,6 +1148,7 @@
 " JOIN agreego.entity entity_25 ON entity_25.id = relationship_24.id",
 " WHERE",
 " NOT entity_25.archived",
+" AND relationship_24.target_type = 'phone_number'",
 " AND relationship_24.source_id = entity_3.id),",
 " 'type', entity_3.type",
 ")",
@@ -1163,7 +1185,7 @@
 "$eq": true,
 "$ne": false
 },
-"contacts.#.is_primary": {
+"contacts/is_primary": {
 "$eq": true
 },
 "created_at": {
@@ -1203,7 +1225,7 @@
 "$eq": "%Doe%",
 "$ne": "%Smith%"
 },
-"phone_numbers.#.target.number": {
+"phone_numbers/target/number": {
 "$eq": "555-1234"
 }
 },
@@ -1240,6 +1262,7 @@
 " JOIN agreego.entity entity_6 ON entity_6.id = relationship_5.id",
 " WHERE",
 " NOT entity_6.archived",
+" AND relationship_5.target_type = 'address'",
 " AND relationship_5.source_id = entity_3.id),",
 " 'age', person_1.age,",
 " 'archived', entity_3.archived,",
@@ -1332,6 +1355,7 @@
 " JOIN agreego.entity entity_20 ON entity_20.id = relationship_19.id",
 " WHERE",
 " NOT entity_20.archived",
+" AND relationship_19.target_type = 'email_address'",
 " AND relationship_19.source_id = entity_3.id),",
 " 'first_name', person_1.first_name,",
 " 'id', entity_3.id,",
@ -1366,6 +1390,7 @@
|
|||||||
" JOIN agreego.entity entity_25 ON entity_25.id = relationship_24.id",
|
" JOIN agreego.entity entity_25 ON entity_25.id = relationship_24.id",
|
||||||
" WHERE",
|
" WHERE",
|
||||||
" NOT entity_25.archived",
|
" NOT entity_25.archived",
|
||||||
|
" AND relationship_24.target_type = 'phone_number'",
|
||||||
" AND relationship_24.source_id = entity_3.id),",
|
" AND relationship_24.source_id = entity_3.id),",
|
||||||
" 'type', entity_3.type",
|
" 'type', entity_3.type",
|
||||||
")",
|
")",
|
||||||
@ -1441,7 +1466,9 @@
|
|||||||
"FROM agreego.contact contact_1",
|
"FROM agreego.contact contact_1",
|
||||||
"JOIN agreego.relationship relationship_2 ON relationship_2.id = contact_1.id",
|
"JOIN agreego.relationship relationship_2 ON relationship_2.id = contact_1.id",
|
||||||
"JOIN agreego.entity entity_3 ON entity_3.id = relationship_2.id",
|
"JOIN agreego.entity entity_3 ON entity_3.id = relationship_2.id",
|
||||||
"WHERE NOT entity_3.archived)"
|
"WHERE",
|
||||||
|
" NOT entity_3.archived",
|
||||||
|
" AND relationship_2.target_type = 'email_address')"
|
||||||
]
|
]
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
@ -1561,6 +1588,47 @@
|
|||||||
]
|
]
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Root Array SQL evaluation for Order fetching Light Order",
|
||||||
|
"action": "query",
|
||||||
|
"schema_id": "get_orders.response",
|
||||||
|
"expect": {
|
||||||
|
"success": true,
|
||||||
|
"sql": [
|
||||||
|
[
|
||||||
|
"(SELECT COALESCE(jsonb_agg(jsonb_build_object(",
|
||||||
|
" 'archived', entity_2.archived,",
|
||||||
|
" 'created_at', entity_2.created_at,",
|
||||||
|
" 'customer',",
|
||||||
|
" (SELECT jsonb_build_object(",
|
||||||
|
" 'age', person_3.age,",
|
||||||
|
" 'archived', entity_5.archived,",
|
||||||
|
" 'created_at', entity_5.created_at,",
|
||||||
|
" 'first_name', person_3.first_name,",
|
||||||
|
" 'id', entity_5.id,",
|
||||||
|
" 'last_name', person_3.last_name,",
|
||||||
|
" 'name', entity_5.name,",
|
||||||
|
" 'type', entity_5.type",
|
||||||
|
" )",
|
||||||
|
" FROM agreego.person person_3",
|
||||||
|
" JOIN agreego.organization organization_4 ON organization_4.id = person_3.id",
|
||||||
|
" JOIN agreego.entity entity_5 ON entity_5.id = organization_4.id",
|
||||||
|
" WHERE",
|
||||||
|
" NOT entity_5.archived",
|
||||||
|
" AND order_1.customer_id = person_3.id),",
|
||||||
|
" 'customer_id', order_1.customer_id,",
|
||||||
|
" 'id', entity_2.id,",
|
||||||
|
" 'name', entity_2.name,",
|
||||||
|
" 'total', order_1.total,",
|
||||||
|
" 'type', entity_2.type",
|
||||||
|
")), '[]'::jsonb)",
|
||||||
|
"FROM agreego.order order_1",
|
||||||
|
"JOIN agreego.entity entity_2 ON entity_2.id = order_1.id",
|
||||||
|
"WHERE NOT entity_2.archived)"
|
||||||
|
]
|
||||||
|
]
|
||||||
|
}
|
||||||
}
|
}
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -677,7 +677,7 @@
 "errors": [
 {
 "code": "TYPE_MISMATCH",
-"path": "/type"
+"path": "type"
 }
 ]
 }
@@ -782,7 +782,7 @@
 "errors": [
 {
 "code": "TYPE_MISMATCH",
-"path": "/type"
+"path": "type"
 }
 ]
 }
@@ -124,42 +124,23 @@ fn parse_and_match_mocks(sql: &str, mocks: &[Value]) -> Option<Vec<Value>> {
         return None;
     };

-    // 2. Extract WHERE conditions
-    let mut conditions = Vec::new();
+    // 2. Extract WHERE conditions string
+    let mut where_clause = String::new();
     if let Some(where_idx) = sql_upper.find(" WHERE ") {
-        let mut where_end = sql_upper.find(" ORDER BY ").unwrap_or(sql.len());
+        let mut where_end = sql_upper.find(" ORDER BY ").unwrap_or(sql_upper.len());
         if let Some(limit_idx) = sql_upper.find(" LIMIT ") {
             if limit_idx < where_end {
                 where_end = limit_idx;
             }
         }
-        let where_clause = &sql[where_idx + 7..where_end];
-        let and_regex = Regex::new(r"(?i)\s+AND\s+").ok()?;
-        let parts = and_regex.split(where_clause);
-        for part in parts {
-            if let Some(eq_idx) = part.find('=') {
-                let left = part[..eq_idx]
-                    .trim()
-                    .split('.')
-                    .last()
-                    .unwrap_or("")
-                    .trim_matches('"');
-                let right = part[eq_idx + 1..].trim().trim_matches('\'');
-                conditions.push((left.to_string(), right.to_string()));
-            } else if part.to_uppercase().contains(" IS NULL") {
-                let left = part[..part.to_uppercase().find(" IS NULL").unwrap()]
-                    .trim()
-                    .split('.')
-                    .last()
-                    .unwrap_or("")
-                    .replace('"', ""); // Remove quotes explicitly
-                conditions.push((left, "null".to_string()));
-            }
-        }
+        where_clause = sql[where_idx + 7..where_end].to_string();
     }

     // 3. Find matching mocks
     let mut matches = Vec::new();
+    let or_regex = Regex::new(r"(?i)\s+OR\s+").ok()?;
+    let and_regex = Regex::new(r"(?i)\s+AND\s+").ok()?;

     for mock in mocks {
         if let Some(mock_obj) = mock.as_object() {
             if let Some(t) = mock_obj.get("type") {
@@ -168,25 +149,66 @@ fn parse_and_match_mocks(sql: &str, mocks: &[Value]) -> Option<Vec<Value>> {
             }
         }

-            let mut matches_all = true;
-            for (k, v) in &conditions {
-                let mock_val_str = match mock_obj.get(k) {
-                    Some(Value::String(s)) => s.clone(),
-                    Some(Value::Number(n)) => n.to_string(),
-                    Some(Value::Bool(b)) => b.to_string(),
-                    Some(Value::Null) => "null".to_string(),
-                    _ => {
-                        matches_all = false;
-                        break;
-                    }
-                };
-                if mock_val_str != *v {
-                    matches_all = false;
-                    break;
-                }
-            }
+            if where_clause.is_empty() {
+                matches.push(mock.clone());
+                continue;
+            }
+
+            let or_parts = or_regex.split(&where_clause);
+            let mut any_branch_matched = false;
+
+            for or_part in or_parts {
+                let branch_str = or_part.replace('(', "").replace(')', "");
+                let mut branch_matches = true;
+
+                for part in and_regex.split(&branch_str) {
+                    if let Some(eq_idx) = part.find('=') {
+                        let left = part[..eq_idx]
+                            .trim()
+                            .split('.')
+                            .last()
+                            .unwrap_or("")
+                            .trim_matches('"');
+                        let right = part[eq_idx + 1..].trim().trim_matches('\'');
+
+                        let mock_val_str = match mock_obj.get(left) {
+                            Some(Value::String(s)) => s.clone(),
+                            Some(Value::Number(n)) => n.to_string(),
+                            Some(Value::Bool(b)) => b.to_string(),
+                            Some(Value::Null) => "null".to_string(),
+                            _ => "".to_string(),
+                        };
+                        if mock_val_str != right {
+                            branch_matches = false;
+                            break;
+                        }
+                    } else if part.to_uppercase().contains(" IS NULL") {
+                        let left = part[..part.to_uppercase().find(" IS NULL").unwrap()]
+                            .trim()
+                            .split('.')
+                            .last()
+                            .unwrap_or("")
+                            .trim_matches('"');
+
+                        let mock_val_str = match mock_obj.get(left) {
+                            Some(Value::Null) => "null".to_string(),
+                            _ => "".to_string(),
+                        };
+
+                        if mock_val_str != "null" {
+                            branch_matches = false;
+                            break;
+                        }
+                    }
+                }
+
+                if branch_matches {
+                    any_branch_matched = true;
+                    break;
+                }
+            }

-            if matches_all {
+            if any_branch_matched {
                 matches.push(mock.clone());
             }
         }
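The diff above upgrades the test harness's WHERE-clause matcher from a flat AND list to OR branches of AND conditions. A minimal stdlib-only sketch of that matching logic (plain string splits instead of the `regex` crate, and `HashMap<String, String>` mocks instead of `serde_json` values — both simplifications, not the real harness API):

```rust
use std::collections::HashMap;

/// True if `mock` satisfies any OR branch of `where_clause`, where each
/// branch is a list of `alias.col = 'value'` conditions joined by AND.
fn mock_matches(where_clause: &str, mock: &HashMap<String, String>) -> bool {
    if where_clause.trim().is_empty() {
        return true; // no filter: every mock matches
    }
    where_clause.split(" OR ").any(|branch| {
        let branch = branch.replace('(', "").replace(')', "");
        branch.split(" AND ").all(|part| match part.split_once('=') {
            Some((left, right)) => {
                // keep only the column name: strip the table alias and quotes
                let col = left.trim().rsplit('.').next().unwrap_or("").trim_matches('"');
                let want = right.trim().trim_matches('\'');
                mock.get(col).map(|v| v == want).unwrap_or(false)
            }
            None => false, // unsupported predicate: fail the branch
        })
    })
}

fn main() {
    let mut mock = HashMap::new();
    mock.insert("id".to_string(), "42".to_string());
    mock.insert("type".to_string(), "person".to_string());

    assert!(mock_matches("t1.id = '42' AND t1.type = 'person'", &mock));
    assert!(mock_matches("t1.id = '7' OR t1.type = 'person'", &mock));
    assert!(!mock_matches("t1.id = '7' AND t1.type = 'person'", &mock));
    println!("ok");
}
```

A whole branch fails on its first unmatched condition, and the mock is kept as soon as one branch succeeds — the same short-circuit shape as the `branch_matches` / `any_branch_matched` flags in the patched function.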
@@ -9,6 +9,61 @@ impl SpiExecutor {
     pub fn new() -> Self {
         Self {}
     }
+
+    fn transact<F, R>(&self, f: F) -> Result<R, String>
+    where
+        F: FnOnce() -> Result<R, String>,
+    {
+        unsafe {
+            let oldcontext = pgrx::pg_sys::CurrentMemoryContext;
+            let oldowner = pgrx::pg_sys::CurrentResourceOwner;
+            pgrx::pg_sys::BeginInternalSubTransaction(std::ptr::null());
+            pgrx::pg_sys::MemoryContextSwitchTo(oldcontext);
+
+            let runner = std::panic::AssertUnwindSafe(move || {
+                let res = f();
+
+                pgrx::pg_sys::ReleaseCurrentSubTransaction();
+                pgrx::pg_sys::MemoryContextSwitchTo(oldcontext);
+                pgrx::pg_sys::CurrentResourceOwner = oldowner;
+
+                res
+            });
+
+            pgrx::PgTryBuilder::new(runner)
+                .catch_rust_panic(|cause| {
+                    pgrx::pg_sys::RollbackAndReleaseCurrentSubTransaction();
+                    pgrx::pg_sys::MemoryContextSwitchTo(oldcontext);
+                    pgrx::pg_sys::CurrentResourceOwner = oldowner;
+
+                    // Rust panics are fatal bugs, not validation errors. Rethrow so they bubble up.
+                    cause.rethrow()
+                })
+                .catch_others(|cause| {
+                    pgrx::pg_sys::RollbackAndReleaseCurrentSubTransaction();
+                    pgrx::pg_sys::MemoryContextSwitchTo(oldcontext);
+                    pgrx::pg_sys::CurrentResourceOwner = oldowner;
+
+                    let error_msg = match &cause {
+                        pgrx::pg_sys::panic::CaughtError::PostgresError(e)
+                        | pgrx::pg_sys::panic::CaughtError::ErrorReport(e) => {
+                            let json_err = serde_json::json!({
+                                "error": e.message(),
+                                "code": format!("{:?}", e.sql_error_code()),
+                                "detail": e.detail(),
+                                "hint": e.hint()
+                            });
+                            json_err.to_string()
+                        }
+                        _ => format!("{:?}", cause),
+                    };
+
+                    pgrx::warning!("JSPG Caught Native Postgres Error: {}", error_msg);
+                    Err(error_msg)
+                })
+                .execute()
+        }
+    }
 }

 impl DatabaseExecutor for SpiExecutor {
@@ -24,7 +79,7 @@ impl DatabaseExecutor for SpiExecutor {
         }
     }

-        pgrx::PgTryBuilder::new(|| {
+        self.transact(|| {
             Spi::connect(|client| {
                 pgrx::notice!("JSPG_SQL: {}", sql);
                 match client.select(sql, Some(args_with_oid.len() as i64), &args_with_oid) {
@@ -41,11 +96,6 @@ impl DatabaseExecutor for SpiExecutor {
                 }
             })
         })
-        .catch_others(|cause| {
-            pgrx::warning!("JSPG Caught Native Postgres Error: {:?}", cause);
-            Err(format!("{:?}", cause))
-        })
-        .execute()
     }

     fn execute(&self, sql: &str, args: Option<&[Value]>) -> Result<(), String> {
@@ -60,7 +110,7 @@ impl DatabaseExecutor for SpiExecutor {
         }
     }

-        pgrx::PgTryBuilder::new(|| {
+        self.transact(|| {
             Spi::connect_mut(|client| {
                 pgrx::notice!("JSPG_SQL: {}", sql);
                 match client.update(sql, Some(args_with_oid.len() as i64), &args_with_oid) {
@@ -69,44 +119,43 @@ impl DatabaseExecutor for SpiExecutor {
                 }
             })
         })
-        .catch_others(|cause| {
-            pgrx::warning!("JSPG Caught Native Postgres Error: {:?}", cause);
-            Err(format!("{:?}", cause))
-        })
-        .execute()
     }

     fn auth_user_id(&self) -> Result<String, String> {
-        Spi::connect(|client| {
-            let mut tup_table = client
-                .select(
-                    "SELECT COALESCE(current_setting('auth.user_id', true), 'ffffffff-ffff-ffff-ffff-ffffffffffff')",
-                    None,
-                    &[],
-                )
-                .map_err(|e| format!("SPI Select Error: {}", e))?;
+        self.transact(|| {
+            Spi::connect(|client| {
+                let mut tup_table = client
+                    .select(
+                        "SELECT COALESCE(current_setting('auth.user_id', true), 'ffffffff-ffff-ffff-ffff-ffffffffffff')",
+                        None,
+                        &[],
+                    )
+                    .map_err(|e| format!("SPI Select Error: {}", e))?;

-            let row = tup_table
-                .next()
-                .ok_or("No user id setting returned from context".to_string())?;
-            let user_id: Option<String> = row.get(1).map_err(|e| e.to_string())?;
+                let row = tup_table
+                    .next()
+                    .ok_or("No user id setting returned from context".to_string())?;
+                let user_id: Option<String> = row.get(1).map_err(|e| e.to_string())?;

-            user_id.ok_or("Missing user_id".to_string())
+                user_id.ok_or("Missing user_id".to_string())
+            })
         })
     }

     fn timestamp(&self) -> Result<String, String> {
-        Spi::connect(|client| {
-            let mut tup_table = client
-                .select("SELECT clock_timestamp()::text", None, &[])
-                .map_err(|e| format!("SPI Select Error: {}", e))?;
+        self.transact(|| {
+            Spi::connect(|client| {
+                let mut tup_table = client
+                    .select("SELECT clock_timestamp()::text", None, &[])
+                    .map_err(|e| format!("SPI Select Error: {}", e))?;

-            let row = tup_table
-                .next()
-                .ok_or("No clock timestamp returned".to_string())?;
-            let timestamp: Option<String> = row.get(1).map_err(|e| e.to_string())?;
+                let row = tup_table
+                    .next()
+                    .ok_or("No clock timestamp returned".to_string())?;
+                let timestamp: Option<String> = row.get(1).map_err(|e| e.to_string())?;

-            timestamp.ok_or("Missing timestamp".to_string())
+                timestamp.ok_or("Missing timestamp".to_string())
+            })
         })
     }
 }
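The new `transact` helper wraps every SPI call in an internal subtransaction, so a Postgres error aborts only that savepoint rather than the caller's whole transaction, and the error surfaces as a structured JSON string. The control flow can be sketched outside Postgres with a toy in-memory store, where the "savepoint" is just the store's length before the closure runs (illustrative names only — this is not the pgrx API):

```rust
/// Toy stand-in for a transactional store: a log of applied writes.
struct Store {
    rows: Vec<String>,
}

impl Store {
    /// Run `f` inside a "subtransaction": on Err, truncate back to the savepoint.
    fn transact<F, R>(&mut self, f: F) -> Result<R, String>
    where
        F: FnOnce(&mut Vec<String>) -> Result<R, String>,
    {
        let savepoint = self.rows.len(); // analogous to BeginInternalSubTransaction
        match f(&mut self.rows) {
            Ok(v) => Ok(v), // analogous to ReleaseCurrentSubTransaction
            Err(e) => {
                // analogous to RollbackAndReleaseCurrentSubTransaction
                self.rows.truncate(savepoint);
                Err(e)
            }
        }
    }
}

fn main() {
    let mut store = Store { rows: vec!["a".to_string()] };

    // A failed unit of work: its partial write is rolled back.
    let r = store.transact(|rows| {
        rows.push("b".to_string());
        Err::<(), String>("constraint violation".to_string())
    });
    assert!(r.is_err());
    assert_eq!(store.rows, vec!["a".to_string()]);

    // A successful unit of work persists.
    store.transact(|rows| { rows.push("c".to_string()); Ok(()) }).unwrap();
    assert_eq!(store.rows.len(), 2);
    println!("ok");
}
```

The real helper must additionally restore the memory context and resource owner and distinguish Rust panics (rethrown as bugs) from Postgres errors (converted to `Err`), which is what the `catch_rust_panic` / `catch_others` split in the diff does.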
@@ -507,10 +507,8 @@ impl Schema {
         let mut parent_type_name = None;
         if let Some(family) = &self.obj.family {
             parent_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
-        } else if let Some(id) = &self.obj.id {
-            parent_type_name = Some(id.split('.').next_back().unwrap_or("").to_string());
-        } else if let Some(ref_id) = &self.obj.r#ref {
-            parent_type_name = Some(ref_id.split('.').next_back().unwrap_or("").to_string());
+        } else if let Some(identifier) = self.obj.identifier() {
+            parent_type_name = Some(identifier);
         }

         if let Some(p_type) = parent_type_name {
@@ -531,12 +529,12 @@ impl Schema {

         if let Some(family) = &target_schema.obj.family {
             child_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
-        } else if let Some(ref_id) = target_schema.obj.r#ref.as_ref() {
-            child_type_name = Some(ref_id.split('.').next_back().unwrap_or("").to_string());
+        } else if let Some(ref_id) = target_schema.obj.identifier() {
+            child_type_name = Some(ref_id);
         } else if let Some(arr) = &target_schema.obj.one_of {
             if let Some(first) = arr.first() {
-                if let Some(ref_id) = first.obj.id.as_ref().or(first.obj.r#ref.as_ref()) {
-                    child_type_name = Some(ref_id.split('.').next_back().unwrap_or("").to_string());
+                if let Some(ref_id) = first.obj.identifier() {
+                    child_type_name = Some(ref_id);
                 }
             }
         }
@@ -697,6 +695,16 @@ impl<'de> Deserialize<'de> for Schema {
     }
 }

+impl SchemaObject {
+    pub fn identifier(&self) -> Option<String> {
+        if let Some(lookup_key) = self.id.as_ref().or(self.r#ref.as_ref()) {
+            Some(lookup_key.split('.').next_back().unwrap_or("").to_string())
+        } else {
+            None
+        }
+    }
+}
+
 #[derive(Debug, Clone, Serialize, Deserialize)]
 #[serde(untagged)]
 pub enum SchemaTypeOrArray {
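The refactor above centralizes the repeated `split('.').next_back()` logic into a single `SchemaObject::identifier()` helper that prefers `$id` over `$ref`. Its behavior in isolation can be sketched as a free function over plain `Option<&str>` fields (a simplification of the struct method in the diff):

```rust
/// Last dot-separated segment of the schema's `$id`, falling back to `$ref`.
fn identifier(id: Option<&str>, r#ref: Option<&str>) -> Option<String> {
    id.or(r#ref)
        .map(|key| key.rsplit('.').next().unwrap_or("").to_string())
}

fn main() {
    assert_eq!(identifier(Some("agreego.person"), None), Some("person".to_string()));
    // `$id` wins over `$ref` when both are present.
    assert_eq!(identifier(Some("a.b"), Some("c.d")), Some("b".to_string()));
    // A key with no dots is returned whole.
    assert_eq!(identifier(None, Some("email_address")), Some("email_address".to_string()));
    assert_eq!(identifier(None, None), None);
    println!("ok");
}
```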
@@ -45,12 +45,33 @@ impl Merger {
         let val_resolved = match result {
             Ok(val) => val,
             Err(msg) => {
+                let mut final_code = "MERGE_FAILED".to_string();
+                let mut final_message = msg.clone();
+                let mut final_cause = None;
+
+                if let Ok(Value::Object(map)) = serde_json::from_str::<Value>(&msg) {
+                    if let (Some(Value::String(e_msg)), Some(Value::String(e_code))) = (map.get("error"), map.get("code")) {
+                        final_message = e_msg.clone();
+                        final_code = e_code.clone();
+                        let mut cause_parts = Vec::new();
+                        if let Some(Value::String(d)) = map.get("detail") {
+                            if !d.is_empty() { cause_parts.push(d.clone()); }
+                        }
+                        if let Some(Value::String(h)) = map.get("hint") {
+                            if !h.is_empty() { cause_parts.push(h.clone()); }
+                        }
+                        if !cause_parts.is_empty() {
+                            final_cause = Some(cause_parts.join("\n"));
+                        }
+                    }
+                }
+
                 return crate::drop::Drop::with_errors(vec![crate::drop::Error {
-                    code: "MERGE_FAILED".to_string(),
-                    message: msg,
+                    code: final_code,
+                    message: final_message,
                     details: crate::drop::ErrorDetails {
                         path: "".to_string(),
-                        cause: None,
+                        cause: final_cause,
                         context: Some(data),
                         schema: None,
                     },
@@ -207,13 +228,15 @@ impl Merger {

         let mut entity_change_kind = None;
         let mut entity_fetched = None;
+        let mut entity_replaces = None;
+
         if !type_def.relationship {
-            let (fields, kind, fetched) =
+            let (fields, kind, fetched, replaces) =
                 self.stage_entity(entity_fields.clone(), type_def, &user_id, &timestamp)?;
             entity_fields = fields;
             entity_change_kind = kind;
             entity_fetched = fetched;
+            entity_replaces = replaces;
         }

         let mut entity_response = serde_json::Map::new();
@@ -287,11 +310,12 @@ impl Merger {
         }

         if type_def.relationship {
-            let (fields, kind, fetched) =
+            let (fields, kind, fetched, replaces) =
                 self.stage_entity(entity_fields.clone(), type_def, &user_id, &timestamp)?;
             entity_fields = fields;
             entity_change_kind = kind;
             entity_fetched = fetched;
+            entity_replaces = replaces;
         }

         self.merge_entity_fields(
@@ -367,6 +391,7 @@ impl Merger {
             entity_change_kind.as_deref(),
             &user_id,
             &timestamp,
+            entity_replaces.as_deref(),
         )?;

         if let Some(sql) = notify_sql {
@@ -398,6 +423,7 @@ impl Merger {
             serde_json::Map<String, Value>,
             Option<String>,
             Option<serde_json::Map<String, Value>>,
+            Option<String>,
         ),
         String,
     > {
@@ -417,11 +443,22 @@ impl Merger {
             .map_or(false, |s| !s.is_empty());

         if is_anchor && has_valid_id {
-            return Ok((entity_fields, None, None));
+            return Ok((entity_fields, None, None, None));
         }

         let entity_fetched = self.fetch_entity(&entity_fields, type_def)?;

+        let mut replaces_id = None;
+        if let Some(ref fetched_row) = entity_fetched {
+            let provided_id = entity_fields.get("id").and_then(|v| v.as_str());
+            let fetched_id = fetched_row.get("id").and_then(|v| v.as_str());
+            if let (Some(pid), Some(fid)) = (provided_id, fetched_id) {
+                if !pid.is_empty() && pid != fid {
+                    replaces_id = Some(pid.to_string());
+                }
+            }
+        }
+
         let system_keys = vec![
             "id".to_string(),
             "type".to_string(),
@@ -471,7 +508,7 @@ impl Merger {
             );

             entity_fields = new_fields;
-        } else if changes.is_empty() {
+        } else if changes.is_empty() && replaces_id.is_none() {
             let mut new_fields = serde_json::Map::new();
             new_fields.insert(
                 "id".to_string(),
@@ -487,6 +524,8 @@ impl Merger {
                 .unwrap_or(false);
             entity_change_kind = if is_archived {
                 Some("delete".to_string())
+            } else if changes.is_empty() && replaces_id.is_some() {
+                Some("replace".to_string())
             } else {
                 Some("update".to_string())
             };
@@ -509,7 +548,7 @@ impl Merger {
             entity_fields = new_fields;
         }

-        Ok((entity_fields, entity_change_kind, entity_fetched))
+        Ok((entity_fields, entity_change_kind, entity_fetched, replaces_id))
     }

     fn fetch_entity(
@@ -564,11 +603,14 @@ impl Merger {
             template
         };

-        let where_clause = if let Some(id) = id_val {
-            format!("WHERE t1.id = {}", Self::quote_literal(id))
-        } else if lookup_complete {
-            let mut lookup_predicates = Vec::new();
+        let mut where_parts = Vec::new();
+
+        if let Some(id) = id_val {
+            where_parts.push(format!("t1.id = {}", Self::quote_literal(id)));
+        }
+
+        if lookup_complete {
+            let mut lookup_predicates = Vec::new();
             for column in &entity_type.lookup_fields {
                 let val = entity_fields.get(column).unwrap_or(&Value::Null);
                 if column == "type" {
@@ -577,10 +619,14 @@ impl Merger {
                     lookup_predicates.push(format!("\"{}\" = {}", column, Self::quote_literal(val)));
                 }
             }
-            format!("WHERE {}", lookup_predicates.join(" AND "))
-        } else {
+            where_parts.push(format!("({})", lookup_predicates.join(" AND ")));
+        }
+
+        if where_parts.is_empty() {
             return Ok(None);
-        };
+        }
+
+        let where_clause = format!("WHERE {}", where_parts.join(" OR "));

         let final_sql = format!("{} {}", fetch_sql_template, where_clause);

@@ -691,8 +737,7 @@ impl Merger {
             );
             self
                 .db
-                .execute(&sql, None)
-                .map_err(|e| format!("SPI Error in INSERT: {:?}", e))?;
+                .execute(&sql, None)?;
         } else if change_kind == "update" || change_kind == "delete" {
             entity_pairs.remove("id");
             entity_pairs.remove("type");
@@ -726,8 +771,7 @@ impl Merger {
             );
             self
                 .db
-                .execute(&sql, None)
-                .map_err(|e| format!("SPI Error in UPDATE: {:?}", e))?;
+                .execute(&sql, None)?;
         }
     }

@@ -742,6 +786,7 @@ impl Merger {
         entity_change_kind: Option<&str>,
         user_id: &str,
         timestamp: &str,
+        replaces_id: Option<&str>,
     ) -> Result<Option<String>, String> {
         let change_kind = match entity_change_kind {
             Some(k) => k,
@@ -753,9 +798,9 @@ impl Merger {

         let mut old_vals = serde_json::Map::new();
         let mut new_vals = serde_json::Map::new();
-        let is_update = change_kind == "update" || change_kind == "delete";
+        let exists = change_kind == "update" || change_kind == "delete" || change_kind == "replace";

-        if !is_update {
+        if !exists {
             let system_keys = vec![
                 "id".to_string(),
                 "created_by".to_string(),
@@ -792,7 +837,7 @@ impl Merger {
         }

         let mut complete = entity_fields.clone();
-        if is_update {
+        if exists {
             if let Some(fetched) = entity_fetched {
                 let mut temp = fetched.clone();
                 for (k, v) in entity_fields {
@@ -816,9 +861,13 @@ impl Merger {
         if old_val_obj != Value::Null {
             notification.insert("old".to_string(), old_val_obj.clone());
         }
+
+        if let Some(rep) = replaces_id {
+            notification.insert("replaces".to_string(), Value::String(rep.to_string()));
+        }

         let mut notify_sql = None;
-        if type_obj.historical {
+        if type_obj.historical && change_kind != "replace" {
             let change_sql = format!(
                 "INSERT INTO agreego.change (\"old\", \"new\", entity_id, id, kind, modified_at, modified_by) VALUES ({}, {}, {}, {}, {}, {}, {})",
                 Self::quote_literal(&old_val_obj),
@@ -830,10 +879,7 @@ impl Merger {
                 Self::quote_literal(&Value::String(user_id.to_string()))
             );

-            self
-                .db
-                .execute(&change_sql, None)
-                .map_err(|e| format!("Executor Error in change: {:?}", e))?;
+            self.db.execute(&change_sql, None)?;
         }

         if type_obj.notify {
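The `fetch_entity` change above replaces the either/or `WHERE` construction (match by id *or* by lookup fields) with a `where_parts` vector joined by `OR`, so an entity can be found by its id or by its natural-key lookup columns in a single query. A minimal sketch of that clause builder (a hypothetical free function; the real method also quotes literals via `Self::quote_literal` and builds the predicates from `lookup_fields`):

```rust
/// Build a WHERE clause from an optional id and pre-rendered lookup predicates.
fn build_where(id: Option<&str>, lookup_predicates: &[String]) -> Option<String> {
    let mut where_parts = Vec::new();
    if let Some(id) = id {
        where_parts.push(format!("t1.id = '{}'", id));
    }
    if !lookup_predicates.is_empty() {
        // the lookup columns must all match, so they form one parenthesized AND group
        where_parts.push(format!("({})", lookup_predicates.join(" AND ")));
    }
    if where_parts.is_empty() {
        return None; // nothing to match on: skip the fetch entirely
    }
    Some(format!("WHERE {}", where_parts.join(" OR ")))
}

fn main() {
    let lookups = vec!["\"type\" = 'person'".to_string(), "\"name\" = 'Doe'".to_string()];
    assert_eq!(
        build_where(Some("42"), &lookups).unwrap(),
        "WHERE t1.id = '42' OR (\"type\" = 'person' AND \"name\" = 'Doe')"
    );
    assert_eq!(build_where(None, &[]), None);
    println!("ok");
}
```

This dual match is what makes the new `replaces_id` detection possible: if the row comes back via the lookup branch with a different id than the one provided, the merge is treated as a "replace".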
@ -63,37 +63,33 @@ impl<'a> Compiler<'a> {
|
|||||||
}
|
}
|
||||||
|
|
||||||
fn compile_array(&mut self, node: Node<'a>) -> Result<(String, String), String> {
|
fn compile_array(&mut self, node: Node<'a>) -> Result<(String, String), String> {
|
||||||
|
// 1. Array of DB Entities (`$ref` or `$family` pointing to a table limit)
|
||||||
if let Some(items) = &node.schema.obj.items {
|
if let Some(items) = &node.schema.obj.items {
|
||||||
let next_path = if node.ast_path.is_empty() {
|
let mut resolved_type = None;
|
||||||
String::from("#")
|
if let Some(family_target) = items.obj.family.as_ref() {
|
||||||
} else {
|
let base_type_name = family_target.split('.').next_back().unwrap_or(family_target);
|
||||||
format!("{}.#", node.ast_path)
|
resolved_type = self.db.types.get(base_type_name);
|
||||||
};
|
} else if let Some(base_type_name) = items.obj.identifier() {
|
||||||
|
resolved_type = self.db.types.get(&base_type_name);
|
||||||
if let Some(ref_id) = &items.obj.r#ref {
|
|
||||||
if let Some(type_def) = self.db.types.get(ref_id) {
|
|
||||||
let mut entity_node = node.clone();
|
|
||||||
entity_node.ast_path = next_path;
|
|
||||||
entity_node.schema = std::sync::Arc::clone(items);
|
|
||||||
return self.compile_entity(type_def, entity_node, true);
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
let mut next_node = node.clone();
|
if let Some(type_def) = resolved_type {
|
||||||
next_node.depth += 1;
|
let mut entity_node = node.clone();
|
||||||
next_node.ast_path = next_path;
|
entity_node.schema = std::sync::Arc::clone(items);
|
||||||
next_node.schema = std::sync::Arc::clone(items);
|
return self.compile_entity(type_def, entity_node, true);
|
||||||
let (item_sql, _) = self.compile_node(next_node)?;
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// 2. Arrays of mapped Native Postgres Columns (e.g. `jsonb`, `text[]`)
|
||||||
|
if let Some(prop) = &node.property_name {
|
||||||
return Ok((
|
return Ok((
|
||||||
format!("(SELECT jsonb_agg({}) FROM TODO)", item_sql),
|
format!("{}.{}", node.parent_alias, prop),
|
||||||
"array".to_string(),
|
"array".to_string(),
|
||||||
));
|
));
|
||||||
}
|
}
|
||||||
|
|
||||||
Ok((
|
// 3. Fallback for root execution of standalone non-entity arrays
|
||||||
"SELECT jsonb_agg(TODO) FROM TODO".to_string(),
|
Err("Cannot compile a root array without a valid entity reference or table mapped via `items`.".to_string())
|
||||||
"array".to_string(),
|
|
||||||
))
|
|
||||||
}
|
}
|
||||||
|
|
||||||
fn compile_reference(&mut self, node: Node<'a>) -> Result<(String, String), String> {
|
fn compile_reference(&mut self, node: Node<'a>) -> Result<(String, String), String> {
|
||||||
@@ -102,14 +98,7 @@ impl<'a> Compiler<'a> {
 
         if let Some(family_target) = node.schema.obj.family.as_ref() {
             resolved_type = self.db.types.get(family_target);
-        } else if let Some(lookup_key) = node
-            .schema
-            .obj
-            .id
-            .as_ref()
-            .or(node.schema.obj.r#ref.as_ref())
-        {
-            let base_type_name = lookup_key.split('.').next_back().unwrap_or("").to_string();
+        } else if let Some(base_type_name) = node.schema.obj.identifier() {
             resolved_type = self.db.types.get(&base_type_name);
         }
 
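The hunk above repeatedly collapses the `id`-or-`$ref` lookup plus trailing-segment split into a single `identifier()` call. The helper below is a hypothetical free-function sketch of that behavior (it is not the crate's actual `Schema::identifier` signature): prefer `id`, fall back to `$ref`, and keep only the final dot-separated segment as the type name.

```rust
// Hypothetical stand-in for the identifier() helper the diff introduces:
// prefer `id`, fall back to `$ref`, return the last '.'-separated segment.
fn identifier(id: Option<&str>, r#ref: Option<&str>) -> Option<String> {
    id.or(r#ref)
        .map(|key| key.split('.').next_back().unwrap_or(key).to_string())
}

fn main() {
    // "catalog.book" resolves to the bare type name "book".
    assert_eq!(identifier(Some("catalog.book"), None).as_deref(), Some("book"));
    // A plain ref without a namespace is returned unchanged.
    assert_eq!(identifier(None, Some("author")).as_deref(), Some("author"));
    assert_eq!(identifier(None, None), None);
}
```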
@@ -448,22 +437,21 @@ impl<'a> Compiler<'a> {
             }
         }
 
-        let mut child_node = node.clone();
-        child_node.parent_alias = owner_alias.clone();
-        let arc_aliases = std::sync::Arc::new(table_aliases.clone());
-        child_node.parent_type_aliases = Some(arc_aliases);
-        child_node.parent_type = Some(r#type);
-        child_node.parent_schema = Some(std::sync::Arc::clone(&node.schema));
-        child_node.property_name = Some(prop_key.clone());
-        child_node.depth += 1;
-        let next_path = if node.ast_path.is_empty() {
-            prop_key.clone()
-        } else {
-            format!("{}.{}", node.ast_path, prop_key)
-        };
-
-        child_node.ast_path = next_path;
-        child_node.schema = std::sync::Arc::clone(prop_schema);
+        let child_node = Node {
+            schema: std::sync::Arc::clone(prop_schema),
+            parent_alias: owner_alias.clone(),
+            parent_type_aliases: Some(std::sync::Arc::new(table_aliases.clone())),
+            parent_type: Some(r#type),
+            parent_schema: Some(std::sync::Arc::clone(&node.schema)),
+            property_name: Some(prop_key.clone()),
+            depth: node.depth + 1,
+            ast_path: if node.ast_path.is_empty() {
+                prop_key.clone()
+            } else {
+                format!("{}/{}", node.ast_path, prop_key)
+            },
+        };
 
         let (val_sql, val_type) = self.compile_node(child_node)?;
 
@@ -501,6 +489,7 @@ impl<'a> Compiler<'a> {
         }
 
         self.compile_filter_conditions(r#type, type_aliases, &node, &base_alias, &mut where_clauses);
+        self.compile_polymorphic_bounds(r#type, type_aliases, &node, &mut where_clauses);
         self.compile_relation_conditions(
             r#type,
             type_aliases,
@@ -512,6 +501,54 @@ impl<'a> Compiler<'a> {
         Ok(where_clauses)
     }
 
+    fn compile_polymorphic_bounds(
+        &self,
+        _type: &crate::database::r#type::Type,
+        type_aliases: &std::collections::HashMap<String, String>,
+        node: &Node,
+        where_clauses: &mut Vec<String>,
+    ) {
+        if let Some(edges) = node.schema.obj.compiled_edges.get() {
+            if let Some(props) = node.schema.obj.compiled_properties.get() {
+                for (prop_name, edge) in edges {
+                    if let Some(prop_schema) = props.get(prop_name) {
+                        // Determine if the property schema resolves to a physical Database Entity
+                        let mut bound_type_name = None;
+                        if let Some(family_target) = prop_schema.obj.family.as_ref() {
+                            bound_type_name = Some(family_target.split('.').next_back().unwrap_or(family_target).to_string());
+                        } else if let Some(lookup_key) = prop_schema.obj.identifier() {
+                            bound_type_name = Some(lookup_key);
+                        }
+
+                        if let Some(type_name) = bound_type_name {
+                            // Ensure this type actually exists
+                            if self.db.types.contains_key(&type_name) {
+                                if let Some(relation) = self.db.relations.get(&edge.constraint) {
+                                    let mut poly_col = None;
+                                    let mut table_to_alias = "";
+
+                                    if edge.forward && relation.source_columns.len() > 1 {
+                                        poly_col = Some(&relation.source_columns[1]); // e.g., target_type
+                                        table_to_alias = &relation.source_type; // e.g., relationship
+                                    } else if !edge.forward && relation.destination_columns.len() > 1 {
+                                        poly_col = Some(&relation.destination_columns[1]); // e.g., source_type
+                                        table_to_alias = &relation.destination_type; // e.g., relationship
+                                    }
+
+                                    if let Some(col) = poly_col {
+                                        if let Some(alias) = type_aliases.get(table_to_alias).or_else(|| type_aliases.get(&node.parent_alias)) {
+                                            where_clauses.push(format!("{}.{} = '{}'", alias, col, type_name));
+                                        }
+                                    }
+                                }
+                            }
+                        }
+                    }
+                }
+            }
+        }
+    }
+
     fn resolve_filter_alias(
         r#type: &crate::database::r#type::Type,
         type_aliases: &std::collections::HashMap<String, String>,
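The new `compile_polymorphic_bounds` ultimately emits one equality predicate per polymorphic edge, pinning the join table's discriminator column to the resolved type name. A minimal sketch of that final formatting step, with illustrative alias and column names (the real method derives them from the relation metadata shown above):

```rust
// Sketch of the WHERE-clause fragment pushed for a polymorphic edge:
// `<alias>.<discriminator_column> = '<type_name>'`.
fn polymorphic_bound(alias: &str, column: &str, type_name: &str) -> String {
    format!("{}.{} = '{}'", alias, column, type_name)
}

fn main() {
    // e.g. a `relationship` join table aliased as t1, discriminated on target_type.
    assert_eq!(polymorphic_bound("t1", "target_type", "book"), "t1.target_type = 'book'");
}
```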
@@ -587,15 +624,15 @@ impl<'a> Compiler<'a> {
         let op = parts.next().unwrap_or("$eq");
 
         let field_name = if node.ast_path.is_empty() {
-            if full_field_path.contains('.') || full_field_path.contains('#') {
+            if full_field_path.contains('/') {
                 continue;
             }
             full_field_path
         } else {
-            let prefix = format!("{}.", node.ast_path);
+            let prefix = format!("{}/", node.ast_path);
             if full_field_path.starts_with(&prefix) {
                 let remainder = &full_field_path[prefix.len()..];
-                if remainder.contains('.') || remainder.contains('#') {
+                if remainder.contains('/') {
                     continue;
                 }
                 remainder
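This hunk standardizes filter-path scoping on a single `/` separator (replacing the old `.`/`#` pair): a filter key applies at the current node only if, after stripping the node's `ast_path` prefix, exactly one segment remains. A self-contained sketch of that rule, under the same `/` convention:

```rust
// Returns the local field name if `full_field_path` belongs directly to the
// node at `ast_path`; None if it targets a deeper (or unrelated) node.
fn local_field<'a>(ast_path: &str, full_field_path: &'a str) -> Option<&'a str> {
    if ast_path.is_empty() {
        // Root node: only single-segment paths apply here.
        if full_field_path.contains('/') {
            return None;
        }
        return Some(full_field_path);
    }
    let prefix = format!("{}/", ast_path);
    let remainder = full_field_path.strip_prefix(&prefix)?;
    // A remaining '/' means the filter targets a deeper descendant.
    if remainder.contains('/') {
        return None;
    }
    Some(remainder)
}

fn main() {
    assert_eq!(local_field("", "title"), Some("title"));
    assert_eq!(local_field("", "author/name"), None);
    assert_eq!(local_field("author", "author/name"), Some("name"));
    assert_eq!(local_field("author", "author/address/city"), None);
}
```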
@@ -54,6 +54,45 @@ impl Queryer {
         self.execute_sql(schema_id, &sql, &args)
     }
 
+    fn extract_filters(
+        prefix: String,
+        val: &serde_json::Value,
+        entries: &mut Vec<(String, serde_json::Value)>,
+    ) -> Result<(), String> {
+        if let Some(obj) = val.as_object() {
+            let mut is_op_obj = false;
+            if let Some(first_key) = obj.keys().next() {
+                if first_key.starts_with('$') {
+                    is_op_obj = true;
+                }
+            }
+
+            if is_op_obj {
+                for (op, op_val) in obj {
+                    if !op.starts_with('$') {
+                        return Err(format!("Filter operator must start with '$', got: {}", op));
+                    }
+                    entries.push((format!("{}:{}", prefix, op), op_val.clone()));
+                }
+            } else {
+                for (k, v) in obj {
+                    let next_prefix = if prefix.is_empty() {
+                        k.clone()
+                    } else {
+                        format!("{}/{}", prefix, k)
+                    };
+                    Self::extract_filters(next_prefix, v, entries)?;
+                }
+            }
+        } else {
+            return Err(format!(
+                "Filter for path '{}' must be an operator object like {{$eq: ...}} or a nested map.",
+                prefix
+            ));
+        }
+        Ok(())
+    }
+
     fn parse_filter_entries(
         &self,
         filters_map: Option<&serde_json::Map<String, serde_json::Value>>,
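The recursive `extract_filters` added above flattens a nested filter map into `path:operator` entries, e.g. `{"author": {"name": {"$eq": "Ada"}}}` into `"author/name:$eq"`. A stand-alone demonstration of that flattening, using a minimal `Value` enum in place of `serde_json::Value` so the sketch runs on the standard library alone:

```rust
use std::collections::BTreeMap;

// Minimal stand-in for serde_json::Value: string leaves and nested maps.
enum Value {
    Str(String),
    Map(BTreeMap<String, Value>),
}

// Flatten nested filter maps into (path:op, value) entries, mirroring the
// diff: a map whose first key starts with '$' is treated as an operator object.
fn extract_filters(prefix: String, val: &Value, entries: &mut Vec<(String, String)>) -> Result<(), String> {
    match val {
        Value::Map(obj) => {
            let is_op_obj = obj.keys().next().map_or(false, |k| k.starts_with('$'));
            if is_op_obj {
                for (op, op_val) in obj {
                    if let Value::Str(s) = op_val {
                        entries.push((format!("{}:{}", prefix, op), s.clone()));
                    }
                }
            } else {
                for (k, v) in obj {
                    let next = if prefix.is_empty() { k.clone() } else { format!("{}/{}", prefix, k) };
                    extract_filters(next, v, entries)?;
                }
            }
            Ok(())
        }
        Value::Str(_) => Err(format!("Filter for path '{}' must be an operator object or a nested map.", prefix)),
    }
}

fn main() {
    let mut ops = BTreeMap::new();
    ops.insert("$eq".to_string(), Value::Str("Ada".to_string()));
    let mut name = BTreeMap::new();
    name.insert("name".to_string(), Value::Map(ops));
    let mut root = BTreeMap::new();
    root.insert("author".to_string(), Value::Map(name));

    let mut entries = Vec::new();
    extract_filters(String::new(), &Value::Map(root), &mut entries).unwrap();
    assert_eq!(entries, vec![("author/name:$eq".to_string(), "Ada".to_string())]);
}
```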
@@ -61,19 +100,7 @@ impl Queryer {
         let mut filter_entries: Vec<(String, serde_json::Value)> = Vec::new();
         if let Some(fm) = filters_map {
             for (key, val) in fm {
-                if let Some(obj) = val.as_object() {
-                    for (op, op_val) in obj {
-                        if !op.starts_with('$') {
-                            return Err(format!("Filter operator must start with '$', got: {}", op));
-                        }
-                        filter_entries.push((format!("{}:{}", key, op), op_val.clone()));
-                    }
-                } else {
-                    return Err(format!(
-                        "Filter for field '{}' must be an object with operators like $eq, $in, etc.",
-                        key
-                    ));
-                }
+                Self::extract_filters(key.clone(), val, &mut filter_entries)?;
             }
         }
         filter_entries.sort_by(|a, b| a.0.cmp(&b.0));
@@ -1457,6 +1457,12 @@ fn test_queryer_0_7() {
     crate::tests::runner::run_test_case(&path, 0, 7).unwrap();
 }
 
+#[test]
+fn test_queryer_0_8() {
+    let path = format!("{}/fixtures/queryer.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 8).unwrap();
+}
+
 #[test]
 fn test_not_0_0() {
     let path = format!("{}/fixtures/not.json", env!("CARGO_MANIFEST_DIR"));
@@ -2921,6 +2927,36 @@ fn test_minimum_1_6() {
     crate::tests::runner::run_test_case(&path, 1, 6).unwrap();
 }
 
+#[test]
+fn test_paths_0_0() {
+    let path = format!("{}/fixtures/paths.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 0).unwrap();
+}
+
+#[test]
+fn test_paths_0_1() {
+    let path = format!("{}/fixtures/paths.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 1).unwrap();
+}
+
+#[test]
+fn test_paths_0_2() {
+    let path = format!("{}/fixtures/paths.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 2).unwrap();
+}
+
+#[test]
+fn test_paths_0_3() {
+    let path = format!("{}/fixtures/paths.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 3).unwrap();
+}
+
+#[test]
+fn test_paths_0_4() {
+    let path = format!("{}/fixtures/paths.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 4).unwrap();
+}
+
 #[test]
 fn test_one_of_0_0() {
     let path = format!("{}/fixtures/oneOf.json", env!("CARGO_MANIFEST_DIR"));
@@ -8560,3 +8596,15 @@ fn test_merger_0_10() {
     let path = format!("{}/fixtures/merger.json", env!("CARGO_MANIFEST_DIR"));
     crate::tests::runner::run_test_case(&path, 0, 10).unwrap();
 }
+
+#[test]
+fn test_merger_0_11() {
+    let path = format!("{}/fixtures/merger.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 11).unwrap();
+}
+
+#[test]
+fn test_merger_0_12() {
+    let path = format!("{}/fixtures/merger.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 12).unwrap();
+}
@@ -134,12 +134,12 @@ fn test_library_api() {
                 {
                     "code": "REQUIRED_FIELD_MISSING",
                     "message": "Missing name",
-                    "details": { "path": "/name" }
+                    "details": { "path": "name" }
                 },
                 {
                     "code": "STRICT_PROPERTY_VIOLATION",
                     "message": "Unexpected property 'wrong'",
-                    "details": { "path": "/wrong" }
+                    "details": { "path": "wrong" }
                 }
             ]
         })
@@ -41,6 +41,14 @@ impl<'a> ValidationContext<'a> {
         }
     }
 
+    pub fn join_path(&self, key: &str) -> String {
+        if self.path.is_empty() {
+            key.to_string()
+        } else {
+            format!("{}/{}", self.path, key)
+        }
+    }
+
     pub fn derive(
         &self,
         schema: &'a Schema,
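The `join_path` helper added above centralizes the error-path convention used throughout the rest of this diff: segments are joined with `/`, and an empty base path yields the key alone (no leading slash, matching the updated `test_library_api` expectations). A free-function sketch of the same logic:

```rust
// Mirror of the join_path semantics from the diff: '/'-joined segments,
// with an empty base path producing the bare key.
fn join_path(path: &str, key: &str) -> String {
    if path.is_empty() {
        key.to_string()
    } else {
        format!("{}/{}", path, key)
    }
}

fn main() {
    assert_eq!(join_path("", "name"), "name");
    assert_eq!(join_path("author", "name"), "author/name");
    assert_eq!(join_path("author/name", "first"), "author/name/first");
}
```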
@@ -91,12 +91,17 @@ impl<'a> ValidationContext<'a> {
         if let Some(ref prefix) = self.schema.prefix_items {
             for (i, sub_schema) in prefix.iter().enumerate() {
                 if i < len {
-                    let path = format!("{}/{}", self.path, i);
                     if let Some(child_instance) = arr.get(i) {
+                        let mut item_path = self.join_path(&i.to_string());
+                        if let Some(obj) = child_instance.as_object() {
+                            if let Some(id_str) = obj.get("id").and_then(|v| v.as_str()) {
+                                item_path = self.join_path(id_str);
+                            }
+                        }
                         let derived = self.derive(
                             sub_schema,
                             child_instance,
-                            &path,
+                            &item_path,
                             HashSet::new(),
                             self.extensible,
                             false,
@@ -112,12 +117,17 @@ impl<'a> ValidationContext<'a> {
 
         if let Some(ref items_schema) = self.schema.items {
             for i in validation_index..len {
-                let path = format!("{}/{}", self.path, i);
                 if let Some(child_instance) = arr.get(i) {
+                    let mut item_path = self.join_path(&i.to_string());
+                    if let Some(obj) = child_instance.as_object() {
+                        if let Some(id_str) = obj.get("id").and_then(|v| v.as_str()) {
+                            item_path = self.join_path(id_str);
+                        }
+                    }
                     let derived = self.derive(
                         items_schema,
                         child_instance,
-                        &path,
+                        &item_path,
                         HashSet::new(),
                         self.extensible,
                         false,
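Both hunks above change how array elements are addressed in validation paths: an element that is an object carrying a string `"id"` is addressed by that id, and only otherwise by its numeric index. A minimal sketch of that addressing rule, with the id passed as a plain `Option` instead of being dug out of a JSON value:

```rust
// Array items are addressed by their "id" when present, else by index.
fn item_path(path: &str, index: usize, id: Option<&str>) -> String {
    let key = id.map(|s| s.to_string()).unwrap_or_else(|| index.to_string());
    if path.is_empty() { key } else { format!("{}/{}", path, key) }
}

fn main() {
    // An object item with an id is addressed stably by that id...
    assert_eq!(item_path("books", 0, Some("b-42")), "books/b-42");
    // ...while scalar or id-less items fall back to their position.
    assert_eq!(item_path("books", 3, None), "books/3");
}
```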
@@ -44,7 +44,7 @@ impl<'a> ValidationContext<'a> {
                 result.errors.push(ValidationError {
                     code: "STRICT_PROPERTY_VIOLATION".to_string(),
                     message: format!("Unexpected property '{}'", key),
-                    path: format!("{}/{}", self.path, key),
+                    path: self.join_path(key),
                 });
             }
         }
@@ -53,10 +53,18 @@ impl<'a> ValidationContext<'a> {
         if let Some(arr) = self.instance.as_array() {
             for i in 0..arr.len() {
                 if !result.evaluated_indices.contains(&i) {
+                    let mut item_path = self.join_path(&i.to_string());
+                    if let Some(child_instance) = arr.get(i) {
+                        if let Some(obj) = child_instance.as_object() {
+                            if let Some(id_str) = obj.get("id").and_then(|v| v.as_str()) {
+                                item_path = self.join_path(id_str);
+                            }
+                        }
+                    }
                     result.errors.push(ValidationError {
                         code: "STRICT_ITEM_VIOLATION".to_string(),
                         message: format!("Unexpected item at index {}", i),
-                        path: format!("{}/{}", self.path, i),
+                        path: item_path,
                     });
                 }
             }
@@ -14,16 +14,13 @@ impl<'a> ValidationContext<'a> {
         let current = self.instance;
         if let Some(obj) = current.as_object() {
             // Entity implicit type validation
-            // Use the specific schema id or ref as a fallback
-            if let Some(identifier) = self.schema.id.as_ref().or(self.schema.r#ref.as_ref()) {
+            if let Some(schema_identifier) = self.schema.identifier() {
                 // Kick in if the data object has a type field
                 if let Some(type_val) = obj.get("type")
                     && let Some(type_str) = type_val.as_str()
                 {
-                    // Get the string or the final segment as the base
-                    let base = identifier.split('.').next_back().unwrap_or("").to_string();
-                    // Check if the base is a global type name
-                    if let Some(type_def) = self.db.types.get(&base) {
+                    // Check if the identifier is a global type name
+                    if let Some(type_def) = self.db.types.get(&schema_identifier) {
                         // Ensure the instance type is a variation of the global type
                         if type_def.variations.contains(type_str) {
                             // Ensure it passes strict mode
@@ -35,12 +32,12 @@ impl<'a> ValidationContext<'a> {
                                     "Type '{}' is not a valid descendant for this entity bound schema",
                                     type_str
                                 ),
-                                path: format!("{}/type", self.path),
+                                path: self.join_path("type"),
                             });
                         }
                     } else {
                         // Ad-Hoc schemas natively use strict schema discriminator strings instead of variation inheritance
-                        if type_str == identifier {
+                        if type_str == schema_identifier.as_str() {
                             result.evaluated_keys.insert("type".to_string());
                         }
                     }
@@ -73,7 +70,7 @@ impl<'a> ValidationContext<'a> {
                 result.errors.push(ValidationError {
                     code: "REQUIRED_FIELD_MISSING".to_string(),
                     message: format!("Missing {}", field),
-                    path: format!("{}/{}", self.path, field),
+                    path: self.join_path(field),
                 });
             }
         }
@@ -112,7 +109,7 @@ impl<'a> ValidationContext<'a> {
         }
 
         if let Some(child_instance) = obj.get(key) {
-            let new_path = format!("{}/{}", self.path, key);
+            let new_path = self.join_path(key);
             let is_ref = sub_schema.r#ref.is_some();
             let next_extensible = if is_ref { false } else { self.extensible };
 
@@ -128,14 +125,9 @@ impl<'a> ValidationContext<'a> {
 
             // Entity Bound Implicit Type Interception
             if key == "type"
-                && let Some(schema_bound) = sub_schema.id.as_ref().or(sub_schema.r#ref.as_ref())
+                && let Some(schema_bound) = sub_schema.identifier()
             {
-                let physical_type_name = schema_bound
-                    .split('.')
-                    .next_back()
-                    .unwrap_or("")
-                    .to_string();
-                if let Some(type_def) = self.db.types.get(&physical_type_name)
+                if let Some(type_def) = self.db.types.get(&schema_bound)
                     && let Some(instance_type) = child_instance.as_str()
                     && type_def.variations.contains(instance_type)
                 {
@@ -155,7 +147,7 @@ impl<'a> ValidationContext<'a> {
             for (compiled_re, sub_schema) in compiled_pp {
                 for (key, child_instance) in obj {
                     if compiled_re.0.is_match(key) {
-                        let new_path = format!("{}/{}", self.path, key);
+                        let new_path = self.join_path(key);
                         let is_ref = sub_schema.r#ref.is_some();
                         let next_extensible = if is_ref { false } else { self.extensible };
 
@@ -194,7 +186,7 @@ impl<'a> ValidationContext<'a> {
             }
 
             if !locally_matched {
-                let new_path = format!("{}/{}", self.path, key);
+                let new_path = self.join_path(key);
                 let is_ref = additional_schema.r#ref.is_some();
                 let next_extensible = if is_ref { false } else { self.extensible };
 
@@ -215,7 +207,7 @@ impl<'a> ValidationContext<'a> {
 
         if let Some(ref property_names) = self.schema.property_names {
             for key in obj.keys() {
-                let _new_path = format!("{}/propertyNames/{}", self.path, key);
+                let _new_path = self.join_path(&format!("propertyNames/{}", key));
                 let val_str = Value::String(key.to_string());
 
                 let ctx = ValidationContext::new(