Compare commits

11 Commits

| SHA1 |
|---|
| 4a10833f50 |
| 46fc032026 |
| 7ec06b81cc |
| c4e8e0309f |
| eb91b65e65 |
| 8bf3649465 |
| 9fe5a34163 |
| f5bf21eb58 |
| 9dcafed406 |
| ffd6c27da3 |
| 4941dc6069 |

GEMINI.md (23 changes)
@@ -7,14 +7,14 @@
 JSPG operates by deeply integrating the JSON Schema Draft 2020-12 specification directly into the Postgres session lifecycle. It is built around three core pillars:

 * **Validator**: In-memory, near-instant JSON structural validation and type polymorphism routing.
 * **Merger**: Automatically traverse and UPSERT deeply nested JSON graphs into normalized relational tables.
-* **Queryer**: Compile JSON Schemas into static, cached SQL SPI `SELECT` plans for fetching full entities or isolated "Stems".
+* **Queryer**: Compile JSON Schemas into static, cached SQL SPI `SELECT` plans for fetching full entities or isolated ad-hoc object boundaries.

 ### 🎯 Goals

 1. **Draft 2020-12 Compliance**: Attempt to adhere to the official JSON Schema Draft 2020-12 specification.
 2. **Ultra-Fast Execution**: Compile schemas into optimized in-memory validation trees and cached SQL SPIs to bypass Postgres Query Builder overheads.
 3. **Connection-Bound Caching**: Leverage the PostgreSQL session lifecycle using an **Atomic Swap** pattern. Schemas are 100% frozen, completely eliminating locks during read access.
 4. **Structural Inheritance**: Support object-oriented schema design via Implicit Keyword Shadowing and virtual `$family` references natively mapped to Postgres table constraints.
-5. **Reactive Beats**: Provide natively generated "Stems" (isolated payload fragments) for dynamic websocket reactivity.
+5. **Reactive Beats**: Provide ultra-fast natively generated flat payloads mapping directly to the Dart topological state for dynamic websocket reactivity.

 ### Concurrency & Threading ("Immutable Graphs")

 To support high-throughput operations while allowing for runtime updates (e.g., during hot-reloading), JSPG uses an **Atomic Swap** pattern:
@@ -118,22 +118,11 @@ The Queryer transforms Postgres into a pre-compiled Semantic Query Engine, desig
 * **Multi-Table Branching**: If the Physical Table is a parent to other tables (e.g. `organization` has variations `["organization", "bot", "person"]`), the compiler generates a dynamic `CASE WHEN type = '...' THEN ...` query, expanding into `JOIN`s for each variation.
 * **Single-Table Bypass**: If the Physical Table is a leaf node with only one variation (e.g. `person` has variations `["person"]`), the compiler cleanly bypasses `CASE` generation and compiles a simple `SELECT` across the base table, as all schema extensions (e.g. `light.person`, `full.person`) are guaranteed to reside in the exact same physical row.

-### The Stem Engine
+### Ad-Hoc Schema Promotion

-Rather than over-fetching heavy Entity payloads and trimming them, Punc Framework Websockets depend on isolated subgraphs defined as **Stems**.
-A `Stem` is a declaration of an **Entity Type boundary** that exists somewhere within the compiled JSON Schema graph, expressed using **`gjson` multipath syntax** (e.g., `contacts.#.phone_numbers.#`).
-
-Because `pg_notify` (Beats) fire rigidly from physical Postgres tables (e.g. `{"type": "phone_number"}`), the Go Framework only ever needs to know: "Does the schema `with_contacts.person` contain the `phone_number` Entity anywhere inside its tree, and if so, what is the gjson path to iterate its payload?"
-
-* **Initialization:** During startup (`jspg_stems()`), the database crawls all Schemas and maps out every physical Entity Type it references. It builds a highly optimized `HashMap<String, HashMap<String, Arc<Stem>>>` providing strictly `O(1)` memory lookups mapping `Schema ID -> { Stem Path -> Entity Type }`.
-* **GJSON Pathing:** Unlike standard JSON Pointers, stems utilize `.#` array iterator syntax. The Go web server consumes this native path (e.g. `lines.#`) across the raw Postgres JSON byte payload, extracting all active UUIDs in one massive sub-millisecond sweep without unmarshaling Go ASTs.
-* **Polymorphic Condition Selectors:** When trailing paths would otherwise collide because of abstract polymorphic type definitions (e.g., a `target` property bounded by a `oneOf` taking either `phone_number` or `email_address`), JSPG natively appends evaluated `gjson` type conditions into the path (e.g. `contacts.#.target#(type=="phone_number")`). This guarantees `O(1)` key uniqueness in the HashMap while retaining extreme array extraction speeds without runtime AST evaluation.
-* **Identifier Prioritization:** When determining if a nested object boundary is an Entity, JSPG natively prioritizes defined `$id` tags over `$ref` inheritance pointers to prevent polymorphic boundaries from devolving into their generic base classes.
-* **Cyclical Deduplication:** Because Punc relationships often reference back on themselves via deeply nested classes, the Stem Engine applies intelligent path deduplication. If the active `current_path` already ends with the target entity string, it traverses the inheritance properties without appending the entity to the stem path again, eliminating infinite powerset loops.
-* **Relationship Path Squashing:** When calculating string paths structurally, JSPG intentionally **omits** properties natively named `target` or `source` if they belong to a native database `relationship` table override.
-* **The Go Router**: The Golang Punc framework uses this exact mapping to register WebSocket Beat frequencies exclusively on the Entity types discovered.
-* **The Queryer Execution**: When the Go framework asks JSPG to hydrate a partial `phone_number` stem for the `with_contacts.person` schema, instead of jumping through string paths, the SQL Compiler simply reaches into the Schema's AST using the `phone_number` Type string, pulls out exactly that entity's mapping rules, and returns a fully correlated `SELECT` block. This natively handles nested array properties injected via `oneOf` or array references, efficiently bypassing runtime powerset expansion.
-* **Performance:** These Stem execution structures are fully statically compiled via SPI and map perfectly to `O(1)` real-time routing logic on the application tier.
+To seamlessly support deeply nested, inline Object definitions that don't declare an explicit `$id`, JSPG aggressively promotes them to standalone topological entities during the database compilation phase.
+
+* **Hash Generation:** While evaluating the unified graph, if the compiler enters an `Object` or `Array` structure completely lacking an `$id`, it dynamically calculates a localized hash alias representing exactly its structural constraints.
+* **Promotion:** This inline chunk is mathematically elevated to its own `$id` in the `db.schemas` cache registry. This guarantees that `O(1)` WebSockets or isolated queries can natively target any arbitrary sub-object of a massive database topology directly without recursively re-parsing its parent's AST block every read.

 ## 5. Testing & Execution Architecture
@@ -142,7 +142,7 @@
   "errors": [
     {
       "code": "CONST_VIOLATED",
-      "path": "/con"
+      "path": "con"
     }
   ]
 }

@@ -48,7 +48,7 @@
   "errors": [
     {
       "code": "TYPE_MISMATCH",
-      "path": "/base_prop"
+      "path": "base_prop"
     }
   ]
 }

@@ -109,7 +109,7 @@
   "errors": [
     {
       "code": "REQUIRED_FIELD_MISSING",
-      "path": "/a"
+      "path": "a"
     }
   ]
 }

@@ -126,7 +126,7 @@
   "errors": [
     {
       "code": "REQUIRED_FIELD_MISSING",
-      "path": "/b"
+      "path": "b"
     }
   ]
 }

@@ -196,7 +196,7 @@
   "errors": [
     {
       "code": "DEPENDENCY_FAILED",
-      "path": "/base_dep"
+      "path": "base_dep"
     }
   ]
 }

@@ -214,7 +214,7 @@
   "errors": [
     {
       "code": "DEPENDENCY_FAILED",
-      "path": "/child_dep"
+      "path": "child_dep"
     }
   ]
 }
@@ -972,7 +972,119 @@
           "LEFT JOIN agreego.\"user\" t2 ON t2.id = t1.id",
           "LEFT JOIN agreego.\"organization\" t3 ON t3.id = t1.id",
           "LEFT JOIN agreego.\"entity\" t4 ON t4.id = t1.id",
-          "WHERE \"first_name\" = 'LookupFirst' AND \"last_name\" = 'LookupLast' AND \"date_of_birth\" = '1990-01-01T00:00:00Z' AND \"pronouns\" = 'they/them'"
+          "WHERE (",
+          "  \"first_name\" = 'LookupFirst'",
+          "  AND \"last_name\" = 'LookupLast'",
+          "  AND \"date_of_birth\" = '1990-01-01T00:00:00Z'",
+          "  AND \"pronouns\" = 'they/them'",
+          ")"
+        ],
+        [
+          "UPDATE agreego.\"person\"",
+          "SET",
+          "  \"contact_id\" = 'abc-contact'",
+          "WHERE",
+          "  id = '22222222-2222-2222-2222-222222222222'"
+        ],
+        [
+          "UPDATE agreego.\"entity\"",
+          "SET",
+          "  \"modified_at\" = '2026-03-10T00:00:00Z',",
+          "  \"modified_by\" = '00000000-0000-0000-0000-000000000000'",
+          "WHERE",
+          "  id = '22222222-2222-2222-2222-222222222222'"
+        ],
+        [
+          "INSERT INTO agreego.change (",
+          "  \"old\",",
+          "  \"new\",",
+          "  entity_id,",
+          "  id,",
+          "  kind,",
+          "  modified_at,",
+          "  modified_by",
+          ")",
+          "VALUES (",
+          "  '{",
+          "    \"contact_id\":\"old-contact\"",
+          "  }',",
+          "  '{",
+          "    \"contact_id\":\"abc-contact\",",
+          "    \"type\":\"person\"",
+          "  }',",
+          "  '22222222-2222-2222-2222-222222222222',",
+          "  '{{uuid}}',",
+          "  'update',",
+          "  '{{timestamp}}',",
+          "  '00000000-0000-0000-0000-000000000000'",
+          ")"
+        ],
+        [
+          "SELECT pg_notify('entity', '{",
+          "  \"complete\":{",
+          "    \"contact_id\":\"abc-contact\",",
+          "    \"date_of_birth\":\"1990-01-01T00:00:00Z\",",
+          "    \"first_name\":\"LookupFirst\",",
+          "    \"id\":\"22222222-2222-2222-2222-222222222222\",",
+          "    \"last_name\":\"LookupLast\",",
+          "    \"modified_at\":\"2026-03-10T00:00:00Z\",",
+          "    \"modified_by\":\"00000000-0000-0000-0000-000000000000\",",
+          "    \"pronouns\":\"they/them\",",
+          "    \"type\":\"person\"",
+          "  },",
+          "  \"new\":{",
+          "    \"contact_id\":\"abc-contact\",",
+          "    \"type\":\"person\"",
+          "  },",
+          "  \"old\":{",
+          "    \"contact_id\":\"old-contact\"",
+          "  }",
+          "}')"
+        ]
+      ]
+    }
+  },
+  {
+    "description": "Update existing person with id (lookup)",
+    "action": "merge",
+    "data": {
+      "id": "33333333-3333-3333-3333-333333333333",
+      "type": "person",
+      "first_name": "LookupFirst",
+      "last_name": "LookupLast",
+      "date_of_birth": "1990-01-01T00:00:00Z",
+      "pronouns": "they/them",
+      "contact_id": "abc-contact"
+    },
+    "mocks": [
+      {
+        "id": "22222222-2222-2222-2222-222222222222",
+        "type": "person",
+        "first_name": "LookupFirst",
+        "last_name": "LookupLast",
+        "date_of_birth": "1990-01-01T00:00:00Z",
+        "pronouns": "they/them",
+        "contact_id": "old-contact"
+      }
+    ],
+    "schema_id": "person",
+    "expect": {
+      "success": true,
+      "sql": [
+        [
+          "SELECT to_jsonb(t1.*) || to_jsonb(t2.*) || to_jsonb(t3.*) || to_jsonb(t4.*)",
+          "FROM agreego.\"person\" t1",
+          "LEFT JOIN agreego.\"user\" t2 ON t2.id = t1.id",
+          "LEFT JOIN agreego.\"organization\" t3 ON t3.id = t1.id",
+          "LEFT JOIN agreego.\"entity\" t4 ON t4.id = t1.id",
+          "WHERE",
+          "  t1.id = '33333333-3333-3333-3333-333333333333'",
+          "  OR (",
+          "    \"first_name\" = 'LookupFirst'",
+          "    AND \"last_name\" = 'LookupLast'",
+          "    AND \"date_of_birth\" = '1990-01-01T00:00:00Z'",
+          "    AND \"pronouns\" = 'they/them'",
+          "  )"
+        ],
         ],
         [
           "UPDATE agreego.\"person\"",

@@ -1484,7 +1596,7 @@
           "SELECT to_jsonb(t1.*) || to_jsonb(t2.*)",
           "FROM agreego.\"order\" t1",
           "LEFT JOIN agreego.\"entity\" t2 ON t2.id = t1.id",
-          "WHERE t1.id = 'abc'"
+          "WHERE t1.id = 'abc' OR (\"id\" = 'abc')"
         ],
         [
           "INSERT INTO agreego.\"entity\" (",
fixtures/paths.json (new file, +214)

@@ -0,0 +1,214 @@
[
  {
    "description": "Hybrid Array Pathing",
    "database": {
      "schemas": [
        {
          "$id": "hybrid_pathing",
          "type": "object",
          "properties": {
            "primitives": {
              "type": "array",
              "items": { "type": "string" }
            },
            "ad_hoc_objects": {
              "type": "array",
              "items": {
                "type": "object",
                "properties": { "name": { "type": "string" } },
                "required": ["name"]
              }
            },
            "entities": {
              "type": "array",
              "items": {
                "type": "object",
                "properties": {
                  "id": { "type": "string" },
                  "value": { "type": "number", "minimum": 10 }
                }
              }
            },
            "deep_entities": {
              "type": "array",
              "items": {
                "type": "object",
                "properties": {
                  "id": { "type": "string" },
                  "nested": {
                    "type": "array",
                    "items": {
                      "type": "object",
                      "properties": {
                        "id": { "type": "string" },
                        "flag": { "type": "boolean" }
                      }
                    }
                  }
                }
              }
            }
          }
        }
      ]
    },
    "tests": [
      {
        "description": "happy path passes structural validation",
        "data": {
          "primitives": ["a", "b"],
          "ad_hoc_objects": [{ "name": "obj1" }],
          "entities": [{ "id": "entity-1", "value": 15 }],
          "deep_entities": [
            { "id": "parent-1", "nested": [{ "id": "child-1", "flag": true }] }
          ]
        },
        "schema_id": "hybrid_pathing",
        "action": "validate",
        "expect": { "success": true }
      },
      {
        "description": "primitive arrays use numeric indexing",
        "data": { "primitives": ["a", 123] },
        "schema_id": "hybrid_pathing",
        "action": "validate",
        "expect": {
          "success": false,
          "errors": [{ "code": "INVALID_TYPE", "path": "primitives/1" }]
        }
      },
      {
        "description": "ad-hoc objects without ids use numeric indexing",
        "data": { "ad_hoc_objects": [{ "name": "valid" }, { "age": 30 }] },
        "schema_id": "hybrid_pathing",
        "action": "validate",
        "expect": {
          "success": false,
          "errors": [{ "code": "REQUIRED_FIELD_MISSING", "path": "ad_hoc_objects/1/name" }]
        }
      },
      {
        "description": "arrays of objects with ids use topological uuid indexing",
        "data": {
          "entities": [
            { "id": "entity-alpha", "value": 20 },
            { "id": "entity-beta", "value": 5 }
          ]
        },
        "schema_id": "hybrid_pathing",
        "action": "validate",
        "expect": {
          "success": false,
          "errors": [{ "code": "MINIMUM_VIOLATED", "path": "entities/entity-beta/value" }]
        }
      },
      {
        "description": "deeply nested entity arrays retain full topological paths",
        "data": {
          "deep_entities": [
            {
              "id": "parent-omega",
              "nested": [
                { "id": "child-alpha", "flag": true },
                { "id": "child-beta", "flag": "invalid-string" }
              ]
            }
          ]
        },
        "schema_id": "hybrid_pathing",
        "action": "validate",
        "expect": {
          "success": false,
          "errors": [{ "code": "INVALID_TYPE", "path": "deep_entities/parent-omega/nested/child-beta/flag" }]
        }
      }
    ]
  }
]
@@ -20,6 +20,16 @@
       "$family": "base.person"
     }
   ]
 },
+{
+  "name": "get_orders",
+  "schemas": [
+    {
+      "$id": "get_orders.response",
+      "type": "array",
+      "items": { "$ref": "light.order" }
+    }
+  ]
+}
 ],
 "enums": [],
@@ -664,6 +674,15 @@
     }
   }
 },
+{
+  "$id": "light.order",
+  "$ref": "order",
+  "properties": {
+    "customer": {
+      "$ref": "base.person"
+    }
+  }
+},
 {
   "$id": "full.order",
   "$ref": "order",
@@ -1569,6 +1588,47 @@
       ]
     ]
   }
 },
+{
+  "description": "Root Array SQL evaluation for Order fetching Light Order",
+  "action": "query",
+  "schema_id": "get_orders.response",
+  "expect": {
+    "success": true,
+    "sql": [
+      [
+        "(SELECT COALESCE(jsonb_agg(jsonb_build_object(",
+        "  'archived', entity_2.archived,",
+        "  'created_at', entity_2.created_at,",
+        "  'customer',",
+        "  (SELECT jsonb_build_object(",
+        "    'age', person_3.age,",
+        "    'archived', entity_5.archived,",
+        "    'created_at', entity_5.created_at,",
+        "    'first_name', person_3.first_name,",
+        "    'id', entity_5.id,",
+        "    'last_name', person_3.last_name,",
+        "    'name', entity_5.name,",
+        "    'type', entity_5.type",
+        "  )",
+        "  FROM agreego.person person_3",
+        "  JOIN agreego.organization organization_4 ON organization_4.id = person_3.id",
+        "  JOIN agreego.entity entity_5 ON entity_5.id = organization_4.id",
+        "  WHERE",
+        "    NOT entity_5.archived",
+        "    AND order_1.customer_id = person_3.id),",
+        "  'customer_id', order_1.customer_id,",
+        "  'id', entity_2.id,",
+        "  'name', entity_2.name,",
+        "  'total', order_1.total,",
+        "  'type', entity_2.type",
+        ")), '[]'::jsonb)",
+        "FROM agreego.order order_1",
+        "JOIN agreego.entity entity_2 ON entity_2.id = order_1.id",
+        "WHERE NOT entity_2.archived)"
+      ]
+    ]
+  }
+}
 ]
 }
@@ -677,7 +677,7 @@
   "errors": [
     {
      "code": "TYPE_MISMATCH",
-      "path": "/type"
+      "path": "type"
     }
   ]
 }

@@ -782,7 +782,7 @@
   "errors": [
     {
       "code": "TYPE_MISMATCH",
-      "path": "/type"
+      "path": "type"
     }
   ]
 }
@@ -124,42 +124,23 @@ fn parse_and_match_mocks(sql: &str, mocks: &[Value]) -> Option<Vec<Value>> {
         return None;
     };

-    // 2. Extract WHERE conditions
-    let mut conditions = Vec::new();
+    // 2. Extract WHERE conditions string
+    let mut where_clause = String::new();
     if let Some(where_idx) = sql_upper.find(" WHERE ") {
-        let mut where_end = sql_upper.find(" ORDER BY ").unwrap_or(sql.len());
+        let mut where_end = sql_upper.find(" ORDER BY ").unwrap_or(sql_upper.len());
         if let Some(limit_idx) = sql_upper.find(" LIMIT ") {
             if limit_idx < where_end {
                 where_end = limit_idx;
             }
         }
-        let where_clause = &sql[where_idx + 7..where_end];
-        let and_regex = Regex::new(r"(?i)\s+AND\s+").ok()?;
-        let parts = and_regex.split(where_clause);
-        for part in parts {
-            if let Some(eq_idx) = part.find('=') {
-                let left = part[..eq_idx]
-                    .trim()
-                    .split('.')
-                    .last()
-                    .unwrap_or("")
-                    .trim_matches('"');
-                let right = part[eq_idx + 1..].trim().trim_matches('\'');
-                conditions.push((left.to_string(), right.to_string()));
-            } else if part.to_uppercase().contains(" IS NULL") {
-                let left = part[..part.to_uppercase().find(" IS NULL").unwrap()]
-                    .trim()
-                    .split('.')
-                    .last()
-                    .unwrap_or("")
-                    .replace('"', ""); // Remove quotes explicitly
-                conditions.push((left, "null".to_string()));
-            }
-        }
+        where_clause = sql[where_idx + 7..where_end].to_string();
     }

     // 3. Find matching mocks
     let mut matches = Vec::new();
+    let or_regex = Regex::new(r"(?i)\s+OR\s+").ok()?;
+    let and_regex = Regex::new(r"(?i)\s+AND\s+").ok()?;

     for mock in mocks {
         if let Some(mock_obj) = mock.as_object() {
             if let Some(t) = mock_obj.get("type") {

@@ -168,25 +149,66 @@ fn parse_and_match_mocks(sql: &str, mocks: &[Value]) -> Option<Vec<Value>> {
                 }
             }

-            let mut matches_all = true;
-            for (k, v) in &conditions {
-                let mock_val_str = match mock_obj.get(k) {
-                    Some(Value::String(s)) => s.clone(),
-                    Some(Value::Number(n)) => n.to_string(),
-                    Some(Value::Bool(b)) => b.to_string(),
-                    Some(Value::Null) => "null".to_string(),
-                    _ => {
-                        matches_all = false;
-                        break;
-                    }
-                };
-                if mock_val_str != *v {
-                    matches_all = false;
-                }
-            }
+            if where_clause.is_empty() {
+                matches.push(mock.clone());
+                continue;
+            }
+
+            let or_parts = or_regex.split(&where_clause);
+            let mut any_branch_matched = false;
+
+            for or_part in or_parts {
+                let branch_str = or_part.replace('(', "").replace(')', "");
+                let mut branch_matches = true;
+
+                for part in and_regex.split(&branch_str) {
+                    if let Some(eq_idx) = part.find('=') {
+                        let left = part[..eq_idx]
+                            .trim()
+                            .split('.')
+                            .last()
+                            .unwrap_or("")
+                            .trim_matches('"');
+                        let right = part[eq_idx + 1..].trim().trim_matches('\'');
+
+                        let mock_val_str = match mock_obj.get(left) {
+                            Some(Value::String(s)) => s.clone(),
+                            Some(Value::Number(n)) => n.to_string(),
+                            Some(Value::Bool(b)) => b.to_string(),
+                            Some(Value::Null) => "null".to_string(),
+                            _ => "".to_string(),
+                        };
+                        if mock_val_str != right {
+                            branch_matches = false;
+                            break;
+                        }
+                    } else if part.to_uppercase().contains(" IS NULL") {
+                        let left = part[..part.to_uppercase().find(" IS NULL").unwrap()]
+                            .trim()
+                            .split('.')
+                            .last()
+                            .unwrap_or("")
+                            .trim_matches('"');
+
+                        let mock_val_str = match mock_obj.get(left) {
+                            Some(Value::Null) => "null".to_string(),
+                            _ => "".to_string(),
+                        };
+
+                        if mock_val_str != "null" {
+                            branch_matches = false;
+                            break;
+                        }
+                    }
+                }
+
+                if branch_matches {
+                    any_branch_matched = true;
+                    break;
+                }
+            }

-            if matches_all {
+            if any_branch_matched {
                 matches.push(mock.clone());
             }
@@ -9,6 +9,61 @@ impl SpiExecutor {
     pub fn new() -> Self {
         Self {}
     }
+
+    fn transact<F, R>(&self, f: F) -> Result<R, String>
+    where
+        F: FnOnce() -> Result<R, String>,
+    {
+        unsafe {
+            let oldcontext = pgrx::pg_sys::CurrentMemoryContext;
+            let oldowner = pgrx::pg_sys::CurrentResourceOwner;
+            pgrx::pg_sys::BeginInternalSubTransaction(std::ptr::null());
+            pgrx::pg_sys::MemoryContextSwitchTo(oldcontext);
+
+            let runner = std::panic::AssertUnwindSafe(move || {
+                let res = f();
+
+                pgrx::pg_sys::ReleaseCurrentSubTransaction();
+                pgrx::pg_sys::MemoryContextSwitchTo(oldcontext);
+                pgrx::pg_sys::CurrentResourceOwner = oldowner;
+
+                res
+            });
+
+            pgrx::PgTryBuilder::new(runner)
+                .catch_rust_panic(|cause| {
+                    pgrx::pg_sys::RollbackAndReleaseCurrentSubTransaction();
+                    pgrx::pg_sys::MemoryContextSwitchTo(oldcontext);
+                    pgrx::pg_sys::CurrentResourceOwner = oldowner;
+
+                    // Rust panics are fatal bugs, not validation errors. Rethrow so they bubble up.
+                    cause.rethrow()
+                })
+                .catch_others(|cause| {
+                    pgrx::pg_sys::RollbackAndReleaseCurrentSubTransaction();
+                    pgrx::pg_sys::MemoryContextSwitchTo(oldcontext);
+                    pgrx::pg_sys::CurrentResourceOwner = oldowner;
+
+                    let error_msg = match &cause {
+                        pgrx::pg_sys::panic::CaughtError::PostgresError(e)
+                        | pgrx::pg_sys::panic::CaughtError::ErrorReport(e) => {
+                            let json_err = serde_json::json!({
+                                "error": e.message(),
+                                "code": format!("{:?}", e.sql_error_code()),
+                                "detail": e.detail(),
+                                "hint": e.hint()
+                            });
+                            json_err.to_string()
+                        }
+                        _ => format!("{:?}", cause),
+                    };
+
+                    pgrx::warning!("JSPG Caught Native Postgres Error: {}", error_msg);
+                    Err(error_msg)
+                })
+                .execute()
+        }
+    }
 }

 impl DatabaseExecutor for SpiExecutor {

@@ -24,7 +79,7 @@ impl DatabaseExecutor for SpiExecutor {
             }
         }

-        pgrx::PgTryBuilder::new(|| {
+        self.transact(|| {
             Spi::connect(|client| {
                 pgrx::notice!("JSPG_SQL: {}", sql);
                 match client.select(sql, Some(args_with_oid.len() as i64), &args_with_oid) {

@@ -41,11 +96,6 @@ impl DatabaseExecutor for SpiExecutor {
                 }
             })
         })
-        .catch_others(|cause| {
-            pgrx::warning!("JSPG Caught Native Postgres Error: {:?}", cause);
-            Err(format!("{:?}", cause))
-        })
-        .execute()
     }

     fn execute(&self, sql: &str, args: Option<&[Value]>) -> Result<(), String> {

@@ -60,7 +110,7 @@ impl DatabaseExecutor for SpiExecutor {
             }
         }

-        pgrx::PgTryBuilder::new(|| {
+        self.transact(|| {
             Spi::connect_mut(|client| {
                 pgrx::notice!("JSPG_SQL: {}", sql);
                 match client.update(sql, Some(args_with_oid.len() as i64), &args_with_oid) {

@@ -69,44 +119,43 @@ impl DatabaseExecutor for SpiExecutor {
                 }
             })
         })
-        .catch_others(|cause| {
-            pgrx::warning!("JSPG Caught Native Postgres Error: {:?}", cause);
-            Err(format!("{:?}", cause))
-        })
-        .execute()
     }

     fn auth_user_id(&self) -> Result<String, String> {
-        Spi::connect(|client| {
-            let mut tup_table = client
-                .select(
-                    "SELECT COALESCE(current_setting('auth.user_id', true), 'ffffffff-ffff-ffff-ffff-ffffffffffff')",
-                    None,
-                    &[],
-                )
-                .map_err(|e| format!("SPI Select Error: {}", e))?;
+        self.transact(|| {
+            Spi::connect(|client| {
+                let mut tup_table = client
+                    .select(
+                        "SELECT COALESCE(current_setting('auth.user_id', true), 'ffffffff-ffff-ffff-ffff-ffffffffffff')",
+                        None,
+                        &[],
+                    )
+                    .map_err(|e| format!("SPI Select Error: {}", e))?;

-            let row = tup_table
-                .next()
-                .ok_or("No user id setting returned from context".to_string())?;
-            let user_id: Option<String> = row.get(1).map_err(|e| e.to_string())?;
+                let row = tup_table
+                    .next()
+                    .ok_or("No user id setting returned from context".to_string())?;
+                let user_id: Option<String> = row.get(1).map_err(|e| e.to_string())?;

-            user_id.ok_or("Missing user_id".to_string())
+                user_id.ok_or("Missing user_id".to_string())
+            })
         })
     }

     fn timestamp(&self) -> Result<String, String> {
-        Spi::connect(|client| {
-            let mut tup_table = client
-                .select("SELECT clock_timestamp()::text", None, &[])
-                .map_err(|e| format!("SPI Select Error: {}", e))?;
+        self.transact(|| {
+            Spi::connect(|client| {
+                let mut tup_table = client
+                    .select("SELECT clock_timestamp()::text", None, &[])
+                    .map_err(|e| format!("SPI Select Error: {}", e))?;

-            let row = tup_table
-                .next()
-                .ok_or("No clock timestamp returned".to_string())?;
-            let timestamp: Option<String> = row.get(1).map_err(|e| e.to_string())?;
+                let row = tup_table
+                    .next()
+                    .ok_or("No clock timestamp returned".to_string())?;
+                let timestamp: Option<String> = row.get(1).map_err(|e| e.to_string())?;

-            timestamp.ok_or("Missing timestamp".to_string())
+                timestamp.ok_or("Missing timestamp".to_string())
+            })
         })
     }
 }
@@ -507,10 +507,8 @@ impl Schema {
         let mut parent_type_name = None;
         if let Some(family) = &self.obj.family {
             parent_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
-        } else if let Some(id) = &self.obj.id {
-            parent_type_name = Some(id.split('.').next_back().unwrap_or("").to_string());
-        } else if let Some(ref_id) = &self.obj.r#ref {
-            parent_type_name = Some(ref_id.split('.').next_back().unwrap_or("").to_string());
+        } else if let Some(identifier) = self.obj.identifier() {
+            parent_type_name = Some(identifier);
         }

         if let Some(p_type) = parent_type_name {

@@ -531,12 +529,12 @@ impl Schema {

         if let Some(family) = &target_schema.obj.family {
             child_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
-        } else if let Some(ref_id) = target_schema.obj.r#ref.as_ref() {
-            child_type_name = Some(ref_id.split('.').next_back().unwrap_or("").to_string());
+        } else if let Some(ref_id) = target_schema.obj.identifier() {
+            child_type_name = Some(ref_id);
         } else if let Some(arr) = &target_schema.obj.one_of {
             if let Some(first) = arr.first() {
-                if let Some(ref_id) = first.obj.id.as_ref().or(first.obj.r#ref.as_ref()) {
-                    child_type_name = Some(ref_id.split('.').next_back().unwrap_or("").to_string());
+                if let Some(ref_id) = first.obj.identifier() {
+                    child_type_name = Some(ref_id);
                 }
             }
         }

@@ -697,6 +695,16 @@ impl<'de> Deserialize<'de> for Schema {
     }
 }

+impl SchemaObject {
+    pub fn identifier(&self) -> Option<String> {
+        if let Some(lookup_key) = self.id.as_ref().or(self.r#ref.as_ref()) {
+            Some(lookup_key.split('.').next_back().unwrap_or("").to_string())
+        } else {
+            None
+        }
+    }
+}
+
 #[derive(Debug, Clone, Serialize, Deserialize)]
 #[serde(untagged)]
 pub enum SchemaTypeOrArray {
@@ -45,12 +45,33 @@ impl Merger {
         let val_resolved = match result {
             Ok(val) => val,
             Err(msg) => {
+                let mut final_code = "MERGE_FAILED".to_string();
+                let mut final_message = msg.clone();
+                let mut final_cause = None;
+
+                if let Ok(Value::Object(map)) = serde_json::from_str::<Value>(&msg) {
+                    if let (Some(Value::String(e_msg)), Some(Value::String(e_code))) = (map.get("error"), map.get("code")) {
+                        final_message = e_msg.clone();
+                        final_code = e_code.clone();
+                        let mut cause_parts = Vec::new();
+                        if let Some(Value::String(d)) = map.get("detail") {
+                            if !d.is_empty() { cause_parts.push(d.clone()); }
+                        }
+                        if let Some(Value::String(h)) = map.get("hint") {
+                            if !h.is_empty() { cause_parts.push(h.clone()); }
+                        }
+                        if !cause_parts.is_empty() {
+                            final_cause = Some(cause_parts.join("\n"));
+                        }
+                    }
+                }
+
                 return crate::drop::Drop::with_errors(vec![crate::drop::Error {
-                    code: "MERGE_FAILED".to_string(),
-                    message: msg,
+                    code: final_code,
+                    message: final_message,
                     details: crate::drop::ErrorDetails {
                         path: "".to_string(),
-                        cause: None,
+                        cause: final_cause,
                         context: Some(data),
                         schema: None,
                     },
@@ -564,11 +585,14 @@ impl Merger {
             template
         };
 
-        let where_clause = if let Some(id) = id_val {
-            format!("WHERE t1.id = {}", Self::quote_literal(id))
-        } else if lookup_complete {
-            let mut lookup_predicates = Vec::new();
+        let mut where_parts = Vec::new();
+
+        if let Some(id) = id_val {
+            where_parts.push(format!("t1.id = {}", Self::quote_literal(id)));
+        }
+
+        if lookup_complete {
+            let mut lookup_predicates = Vec::new();
             for column in &entity_type.lookup_fields {
                 let val = entity_fields.get(column).unwrap_or(&Value::Null);
                 if column == "type" {
@@ -577,10 +601,14 @@ impl Merger {
                     lookup_predicates.push(format!("\"{}\" = {}", column, Self::quote_literal(val)));
                 }
             }
-            format!("WHERE {}", lookup_predicates.join(" AND "))
-        } else {
+            where_parts.push(format!("({})", lookup_predicates.join(" AND ")));
+        }
+
+        if where_parts.is_empty() {
             return Ok(None);
-        };
+        }
+
+        let where_clause = format!("WHERE {}", where_parts.join(" OR "));
 
         let final_sql = format!("{} {}", fetch_sql_template, where_clause);
 
@@ -691,8 +719,7 @@ impl Merger {
             );
             self
                 .db
-                .execute(&sql, None)
-                .map_err(|e| format!("SPI Error in INSERT: {:?}", e))?;
+                .execute(&sql, None)?;
         } else if change_kind == "update" || change_kind == "delete" {
             entity_pairs.remove("id");
             entity_pairs.remove("type");
@@ -726,8 +753,7 @@ impl Merger {
             );
             self
                 .db
-                .execute(&sql, None)
-                .map_err(|e| format!("SPI Error in UPDATE: {:?}", e))?;
+                .execute(&sql, None)?;
         }
     }
 
@@ -830,10 +856,7 @@ impl Merger {
             Self::quote_literal(&Value::String(user_id.to_string()))
         );
 
-        self
-            .db
-            .execute(&change_sql, None)
-            .map_err(|e| format!("Executor Error in change: {:?}", e))?;
+        self.db.execute(&change_sql, None)?;
     }
 
     if type_obj.notify {
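The refactored lookup no longer picks one exclusive branch: it collects the direct `id` predicate and the composite lookup-field predicate independently, then ORs them together, returning `Ok(None)` only when neither applies. A simplified sketch of that assembly, with a hypothetical stand-in for the crate's `quote_literal`:

```rust
// Sketch of the OR-combined WHERE assembly; `quote` is a stand-in for
// `Self::quote_literal`, not the crate's actual implementation.
fn quote(v: &str) -> String {
    format!("'{}'", v.replace('\'', "''"))
}

fn build_where(id_val: Option<&str>, lookup: &[(&str, &str)]) -> Option<String> {
    let mut where_parts = Vec::new();
    if let Some(id) = id_val {
        where_parts.push(format!("t1.id = {}", quote(id)));
    }
    if !lookup.is_empty() {
        let preds: Vec<String> = lookup
            .iter()
            .map(|(col, val)| format!("\"{}\" = {}", col, quote(val)))
            .collect();
        where_parts.push(format!("({})", preds.join(" AND ")));
    }
    if where_parts.is_empty() {
        return None; // mirrors the refactor's early `Ok(None)` exit
    }
    Some(format!("WHERE {}", where_parts.join(" OR ")))
}

fn main() {
    assert_eq!(
        build_where(Some("42"), &[("slug", "abc")]),
        Some("WHERE t1.id = '42' OR (\"slug\" = 'abc')".to_string())
    );
    assert_eq!(build_where(None, &[]), None);
}
```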
@@ -63,33 +63,33 @@ impl<'a> Compiler<'a> {
     }
 
     fn compile_array(&mut self, node: Node<'a>) -> Result<(String, String), String> {
         // 1. Array of DB Entities (`$ref` or `$family` pointing to a table limit)
         if let Some(items) = &node.schema.obj.items {
-            let next_path = node.ast_path.clone();
-
-            if let Some(ref_id) = &items.obj.r#ref {
-                if let Some(type_def) = self.db.types.get(ref_id) {
-                    let mut entity_node = node.clone();
-                    entity_node.ast_path = next_path;
-                    entity_node.schema = std::sync::Arc::clone(items);
-                    return self.compile_entity(type_def, entity_node, true);
-                }
+            let mut resolved_type = None;
+            if let Some(family_target) = items.obj.family.as_ref() {
+                let base_type_name = family_target.split('.').next_back().unwrap_or(family_target);
+                resolved_type = self.db.types.get(base_type_name);
+            } else if let Some(base_type_name) = items.obj.identifier() {
+                resolved_type = self.db.types.get(&base_type_name);
             }
 
-            let mut next_node = node.clone();
-            next_node.depth += 1;
-            next_node.ast_path = next_path;
-            next_node.schema = std::sync::Arc::clone(items);
-            let (item_sql, _) = self.compile_node(next_node)?;
+            if let Some(type_def) = resolved_type {
+                let mut entity_node = node.clone();
+                entity_node.schema = std::sync::Arc::clone(items);
+                return self.compile_entity(type_def, entity_node, true);
+            }
         }
 
         // 2. Arrays of mapped Native Postgres Columns (e.g. `jsonb`, `text[]`)
         if let Some(prop) = &node.property_name {
             return Ok((
-                format!("(SELECT jsonb_agg({}) FROM TODO)", item_sql),
+                format!("{}.{}", node.parent_alias, prop),
                 "array".to_string(),
             ));
         }
 
-        Ok((
-            "SELECT jsonb_agg(TODO) FROM TODO".to_string(),
-            "array".to_string(),
-        ))
+        // 3. Fallback for root execution of standalone non-entity arrays
+        Err("Cannot compile a root array without a valid entity reference or table mapped via `items`.".to_string())
     }
 
     fn compile_reference(&mut self, node: Node<'a>) -> Result<(String, String), String> {
@@ -98,14 +98,7 @@ impl<'a> Compiler<'a> {
 
         if let Some(family_target) = node.schema.obj.family.as_ref() {
             resolved_type = self.db.types.get(family_target);
-        } else if let Some(lookup_key) = node
-            .schema
-            .obj
-            .id
-            .as_ref()
-            .or(node.schema.obj.r#ref.as_ref())
-        {
-            let base_type_name = lookup_key.split('.').next_back().unwrap_or("").to_string();
+        } else if let Some(base_type_name) = node.schema.obj.identifier() {
             resolved_type = self.db.types.get(&base_type_name);
         }
 
@@ -523,8 +516,8 @@ impl<'a> Compiler<'a> {
         let mut bound_type_name = None;
         if let Some(family_target) = prop_schema.obj.family.as_ref() {
             bound_type_name = Some(family_target.split('.').next_back().unwrap_or(family_target).to_string());
-        } else if let Some(lookup_key) = prop_schema.obj.id.as_ref().or(prop_schema.obj.r#ref.as_ref()) {
-            bound_type_name = Some(lookup_key.split('.').next_back().unwrap_or(lookup_key).to_string());
+        } else if let Some(lookup_key) = prop_schema.obj.identifier() {
+            bound_type_name = Some(lookup_key);
        }
 
         if let Some(type_name) = bound_type_name {
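Both `compile_array` and `compile_reference` now resolve the bound type the same way: an explicit `$family` target wins, otherwise the schema's own identifier (the `$id`/`$ref` suffix) is tried against the type registry. A condensed sketch of that precedence using a plain `HashMap` as a hypothetical stand-in for the compiler's registry:

```rust
use std::collections::HashMap;

// Stand-in for the dotted-identifier suffix rule.
fn base(key: &str) -> &str {
    key.split('.').next_back().unwrap_or(key)
}

// Sketch of the resolution order: `$family` first, then the schema identifier.
fn resolve<'a>(
    types: &'a HashMap<String, String>,
    family: Option<&str>,
    identifier: Option<&str>,
) -> Option<&'a String> {
    if let Some(family_target) = family {
        // `$family` takes precedence over the schema's own identifier.
        types.get(base(family_target))
    } else if let Some(id) = identifier {
        types.get(base(id))
    } else {
        None
    }
}

fn main() {
    let mut types = HashMap::new();
    types.insert("Media".to_string(), "media table".to_string());
    types.insert("Book".to_string(), "book table".to_string());

    // $family wins even when the identifier would also resolve.
    assert_eq!(
        resolve(&types, Some("library.Media"), Some("library.Book")),
        Some(&"media table".to_string())
    );
    // Without $family, the identifier suffix is used.
    assert_eq!(
        resolve(&types, None, Some("library.Book")),
        Some(&"book table".to_string())
    );
}
```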
@@ -1457,6 +1457,12 @@ fn test_queryer_0_7() {
     crate::tests::runner::run_test_case(&path, 0, 7).unwrap();
 }
 
+#[test]
+fn test_queryer_0_8() {
+    let path = format!("{}/fixtures/queryer.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 8).unwrap();
+}
+
 #[test]
 fn test_not_0_0() {
     let path = format!("{}/fixtures/not.json", env!("CARGO_MANIFEST_DIR"));
@@ -2921,6 +2927,36 @@ fn test_minimum_1_6() {
     crate::tests::runner::run_test_case(&path, 1, 6).unwrap();
 }
 
+#[test]
+fn test_paths_0_0() {
+    let path = format!("{}/fixtures/paths.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 0).unwrap();
+}
+
+#[test]
+fn test_paths_0_1() {
+    let path = format!("{}/fixtures/paths.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 1).unwrap();
+}
+
+#[test]
+fn test_paths_0_2() {
+    let path = format!("{}/fixtures/paths.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 2).unwrap();
+}
+
+#[test]
+fn test_paths_0_3() {
+    let path = format!("{}/fixtures/paths.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 3).unwrap();
+}
+
+#[test]
+fn test_paths_0_4() {
+    let path = format!("{}/fixtures/paths.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 4).unwrap();
+}
+
 #[test]
 fn test_one_of_0_0() {
     let path = format!("{}/fixtures/oneOf.json", env!("CARGO_MANIFEST_DIR"));
@@ -8560,3 +8596,9 @@ fn test_merger_0_10() {
     let path = format!("{}/fixtures/merger.json", env!("CARGO_MANIFEST_DIR"));
     crate::tests::runner::run_test_case(&path, 0, 10).unwrap();
 }
+
+#[test]
+fn test_merger_0_11() {
+    let path = format!("{}/fixtures/merger.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 11).unwrap();
+}
@@ -134,12 +134,12 @@ fn test_library_api() {
             {
                 "code": "REQUIRED_FIELD_MISSING",
                 "message": "Missing name",
-                "details": { "path": "/name" }
+                "details": { "path": "name" }
             },
             {
                 "code": "STRICT_PROPERTY_VIOLATION",
                 "message": "Unexpected property 'wrong'",
-                "details": { "path": "/wrong" }
+                "details": { "path": "wrong" }
             }
         ]
     })
@@ -41,6 +41,14 @@ impl<'a> ValidationContext<'a> {
         }
     }
 
+    pub fn join_path(&self, key: &str) -> String {
+        if self.path.is_empty() {
+            key.to_string()
+        } else {
+            format!("{}/{}", self.path, key)
+        }
+    }
+
     pub fn derive(
         &self,
         schema: &'a Schema,
@@ -91,12 +91,17 @@ impl<'a> ValidationContext<'a> {
         if let Some(ref prefix) = self.schema.prefix_items {
             for (i, sub_schema) in prefix.iter().enumerate() {
                 if i < len {
-                    let path = format!("{}/{}", self.path, i);
                     if let Some(child_instance) = arr.get(i) {
+                        let mut item_path = self.join_path(&i.to_string());
+                        if let Some(obj) = child_instance.as_object() {
+                            if let Some(id_str) = obj.get("id").and_then(|v| v.as_str()) {
+                                item_path = self.join_path(id_str);
+                            }
+                        }
                         let derived = self.derive(
                             sub_schema,
                             child_instance,
-                            &path,
+                            &item_path,
                             HashSet::new(),
                             self.extensible,
                             false,
@@ -112,12 +117,17 @@ impl<'a> ValidationContext<'a> {
 
         if let Some(ref items_schema) = self.schema.items {
             for i in validation_index..len {
-                let path = format!("{}/{}", self.path, i);
                 if let Some(child_instance) = arr.get(i) {
+                    let mut item_path = self.join_path(&i.to_string());
+                    if let Some(obj) = child_instance.as_object() {
+                        if let Some(id_str) = obj.get("id").and_then(|v| v.as_str()) {
+                            item_path = self.join_path(id_str);
+                        }
+                    }
                     let derived = self.derive(
                         items_schema,
                         child_instance,
-                        &path,
+                        &item_path,
                         HashSet::new(),
                         self.extensible,
                         false,
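`join_path` exists so that a key appended at the root does not pick up a leading separator, and the array hunks above layer one more rule on top: an item with an `"id"` field is addressed by that id rather than by its positional index. A standalone sketch of both behaviors (free functions standing in for the context methods):

```rust
// Sketch of the validator's `join_path` rule; `path` plays the role of `self.path`.
fn join_path(path: &str, key: &str) -> String {
    if path.is_empty() {
        key.to_string()
    } else {
        format!("{}/{}", path, key)
    }
}

// Prefer the item's "id" field over its positional index when present.
fn item_path(path: &str, index: usize, item_id: Option<&str>) -> String {
    match item_id {
        Some(id) => join_path(path, id),
        None => join_path(path, &index.to_string()),
    }
}

fn main() {
    // Root-level keys get no leading separator.
    assert_eq!(join_path("", "name"), "name");
    assert_eq!(join_path("author", "name"), "author/name");
    // Array items are addressed by id when available, index otherwise.
    assert_eq!(item_path("books", 0, Some("b-1")), "books/b-1");
    assert_eq!(item_path("books", 3, None), "books/3");
}
```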
@@ -44,7 +44,7 @@ impl<'a> ValidationContext<'a> {
                 result.errors.push(ValidationError {
                     code: "STRICT_PROPERTY_VIOLATION".to_string(),
                     message: format!("Unexpected property '{}'", key),
-                    path: format!("{}/{}", self.path, key),
+                    path: self.join_path(key),
                 });
             }
         }
@@ -53,10 +53,18 @@ impl<'a> ValidationContext<'a> {
         if let Some(arr) = self.instance.as_array() {
             for i in 0..arr.len() {
                 if !result.evaluated_indices.contains(&i) {
+                    let mut item_path = self.join_path(&i.to_string());
+                    if let Some(child_instance) = arr.get(i) {
+                        if let Some(obj) = child_instance.as_object() {
+                            if let Some(id_str) = obj.get("id").and_then(|v| v.as_str()) {
+                                item_path = self.join_path(id_str);
+                            }
+                        }
+                    }
                     result.errors.push(ValidationError {
                         code: "STRICT_ITEM_VIOLATION".to_string(),
                         message: format!("Unexpected item at index {}", i),
-                        path: format!("{}/{}", self.path, i),
+                        path: item_path,
                     });
                 }
             }
@@ -14,16 +14,13 @@ impl<'a> ValidationContext<'a> {
         let current = self.instance;
         if let Some(obj) = current.as_object() {
             // Entity implicit type validation
-            // Use the specific schema id or ref as a fallback
-            if let Some(identifier) = self.schema.id.as_ref().or(self.schema.r#ref.as_ref()) {
+            if let Some(schema_identifier) = self.schema.identifier() {
                 // Kick in if the data object has a type field
                 if let Some(type_val) = obj.get("type")
                     && let Some(type_str) = type_val.as_str()
                 {
-                    // Get the string or the final segment as the base
-                    let base = identifier.split('.').next_back().unwrap_or("").to_string();
-                    // Check if the base is a global type name
-                    if let Some(type_def) = self.db.types.get(&base) {
+                    // Check if the identifier is a global type name
+                    if let Some(type_def) = self.db.types.get(&schema_identifier) {
                         // Ensure the instance type is a variation of the global type
                         if type_def.variations.contains(type_str) {
                             // Ensure it passes strict mode
@@ -35,12 +32,12 @@ impl<'a> ValidationContext<'a> {
                                     "Type '{}' is not a valid descendant for this entity bound schema",
                                     type_str
                                 ),
-                                path: format!("{}/type", self.path),
+                                path: self.join_path("type"),
                             });
                         }
                     } else {
                         // Ad-Hoc schemas natively use strict schema discriminator strings instead of variation inheritance
-                        if type_str == identifier {
+                        if type_str == schema_identifier.as_str() {
                             result.evaluated_keys.insert("type".to_string());
                         }
                     }
@@ -73,7 +70,7 @@ impl<'a> ValidationContext<'a> {
                     result.errors.push(ValidationError {
                         code: "REQUIRED_FIELD_MISSING".to_string(),
                         message: format!("Missing {}", field),
-                        path: format!("{}/{}", self.path, field),
+                        path: self.join_path(field),
                     });
                 }
             }
@@ -112,7 +109,7 @@ impl<'a> ValidationContext<'a> {
             }
 
             if let Some(child_instance) = obj.get(key) {
-                let new_path = format!("{}/{}", self.path, key);
+                let new_path = self.join_path(key);
                 let is_ref = sub_schema.r#ref.is_some();
                 let next_extensible = if is_ref { false } else { self.extensible };
 
@@ -128,14 +125,9 @@ impl<'a> ValidationContext<'a> {
 
             // Entity Bound Implicit Type Interception
             if key == "type"
-                && let Some(schema_bound) = sub_schema.id.as_ref().or(sub_schema.r#ref.as_ref())
+                && let Some(schema_bound) = sub_schema.identifier()
             {
-                let physical_type_name = schema_bound
-                    .split('.')
-                    .next_back()
-                    .unwrap_or("")
-                    .to_string();
-                if let Some(type_def) = self.db.types.get(&physical_type_name)
+                if let Some(type_def) = self.db.types.get(&schema_bound)
                     && let Some(instance_type) = child_instance.as_str()
                     && type_def.variations.contains(instance_type)
                 {
@@ -155,7 +147,7 @@ impl<'a> ValidationContext<'a> {
             for (compiled_re, sub_schema) in compiled_pp {
                 for (key, child_instance) in obj {
                     if compiled_re.0.is_match(key) {
-                        let new_path = format!("{}/{}", self.path, key);
+                        let new_path = self.join_path(key);
                         let is_ref = sub_schema.r#ref.is_some();
                         let next_extensible = if is_ref { false } else { self.extensible };
 
@@ -194,7 +186,7 @@ impl<'a> ValidationContext<'a> {
             }
 
             if !locally_matched {
-                let new_path = format!("{}/{}", self.path, key);
+                let new_path = self.join_path(key);
                 let is_ref = additional_schema.r#ref.is_some();
                 let next_extensible = if is_ref { false } else { self.extensible };
 
@@ -215,7 +207,7 @@ impl<'a> ValidationContext<'a> {
 
             if let Some(ref property_names) = self.schema.property_names {
                 for key in obj.keys() {
-                    let _new_path = format!("{}/propertyNames/{}", self.path, key);
+                    let _new_path = self.join_path(&format!("propertyNames/{}", key));
                     let val_str = Value::String(key.to_string());
 
                     let ctx = ValidationContext::new(