Compare commits

8 Commits

| SHA1 |
|---|
| c2267b68d8 |
| f58d1a32a3 |
| d9f4a90225 |
| 509194b55f |
| 87a845e85a |
| 8175b10a97 |
| 0b072d66e7 |
| 41649766db |
34 GEMINI.md
@@ -13,7 +13,7 @@ JSPG operates by deeply integrating the JSON Schema Draft 2020-12 specification

1. **Draft 2020-12 Based**: Attempt to adhere to the official JSON Schema Draft 2020-12 specification, while heavily augmenting it for strict structural typing.
2. **Ultra-Fast Execution**: Compile schemas into optimized in-memory validation trees and cached SQL SPIs to bypass Postgres Query Builder overheads.
3. **Connection-Bound Caching**: Leverage the PostgreSQL session lifecycle using an **Atomic Swap** pattern. Schemas are 100% frozen, completely eliminating locks during read access.
-4. **Structural Inheritance**: Support object-oriented schema design via Implicit Keyword Shadowing and virtual `$family` references natively mapped to Postgres table constraints.
+4. **Structural Inheritance**: Support object-oriented schema design via Implicit Keyword Shadowing and virtual `family` references natively mapped to Postgres table constraints.
5. **Reactive Beats**: Provide ultra-fast natively generated flat payloads mapping directly to the Dart topological state for dynamic websocket reactivity.

### Concurrency & Threading ("Immutable Graphs")
@@ -55,8 +55,8 @@ In Punc, polymorphic targets like explicit tagged unions or STI (Single Table In

Therefore, any schema that participates in polymorphic discrimination MUST explicitly define its discriminator properties natively inside its `properties` block. However, to stay DRY and maintain flexible APIs, you **DO NOT** need to hardcode `const` values, nor should you add them to your `required` array. The Punc engine treats `type` and `kind` as **magic properties**.

**Magic Validation Constraints**:

-* **Dynamically Required**: The system inherently drives the need for their requirement. The Validator dynamically expects the discriminators and structurally bubbles `MISSING_TYPE` ultimata ONLY when a polymorphic router (`$family` / `oneOf`) dynamically requires them to resolve a path. You never manually put them in the JSON schema `required` block.
-* **Implicit Resolution**: When wrapped in `$family` or `oneOf`, the polymorphic router can mathematically parse the schema key (e.g. `light.person`) and natively validate that `type` equals `"person"` and `kind` equals `"light"`, bubbling `CONST_VIOLATED` if they mismatch, all without you ever hardcoding `const` limitations.
+* **Dynamically Required**: The system inherently drives the need for their requirement. The Validator dynamically expects the discriminators and structurally bubbles `MISSING_TYPE` ultimata ONLY when a polymorphic router (`family` / `oneOf`) dynamically requires them to resolve a path. You never manually put them in the JSON schema `required` block.
+* **Implicit Resolution**: When wrapped in `family` or `oneOf`, the polymorphic router can mathematically parse the schema key (e.g. `light.person`) and natively validate that `type` equals `"person"` and `kind` equals `"light"`, bubbling `CONST_VIOLATED` if they mismatch, all without you ever hardcoding `const` limitations.
* **Generator Explicitness**: Because Postgres is the Single Source of Truth, forcing the explicit definition in `properties` initially guarantees the downstream Dart/Go code generators observe the fields and can cleanly serialize them dynamically back to the server.

For example, a schema registered under the exact key `"light.person"` inside the database registry must natively define its own structural boundaries:
@@ -72,7 +72,7 @@ For example, a schema registered under the exact key `"light.person"` inside the

* **The Object Contract (Presence)**: The Object enforces its own structural integrity mechanically. Standard JSON Validation natively ensures `type` and `kind` are dynamically present as expected.
* **The Dynamic Values (`db.types`)**: Because the `type` and `kind` properties technically exist, the Punc engine dynamically intercepts them during `validate_object`. It mathematically parses the schema key (e.g. `light.person`) and natively validates that `type` equals `"person"` (or a valid descendant in `db.types`) and `kind` equals `"light"`, bubbling `CONST_VIOLATED` if they mismatch.
-* **The Routing Contract**: When wrapped in `$family` or `oneOf`, the polymorphic router can execute Lightning Fast $O(1)$ fast-paths by reading the payload's `type`/`kind` identifiers, and gracefully fall back to standard structural failure if omitted.
+* **The Routing Contract**: When wrapped in `family` or `oneOf`, the polymorphic router can execute Lightning Fast $O(1)$ fast-paths by reading the payload's `type`/`kind` identifiers, and gracefully fall back to standard structural failure if omitted.

### Composition & Inheritance (The `type` keyword)

Punc completely abandons the standard JSON Schema `$ref` keyword. Instead, it overloads the exact same `type` keyword used for primitives. A `"type"` in Punc is mathematically evaluated as either a Native Primitive (`"string"`, `"null"`) or a Custom Object Pointer (`"budget"`, `"user"`).
@@ -81,24 +81,24 @@ Punc completely abandons the standard JSON Schema `$ref` keyword. Instead, it ov

* **Primitive Array Shorthand (Optionality)**: The `type` array syntax is heavily optimized for nullable fields. Defining `"type": ["budget", "null"]` natively builds a nullable struct, generating `Budget? budget;` in Dart. You can freely mix primitives like `["string", "number", "null"]`.
* **Strict Array Constraint**: To explicitly prevent mathematically ambiguous Multiple Inheritance, a `type` array is strictly constrained to at most **ONE** Custom Object Pointer. Defining `"type": ["person", "organization"]` will intentionally trigger a fatal database compilation error natively instructing developers to build a proper tagged union (`oneOf`) instead.
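The single-pointer constraint above can be sketched as a small compile-time check. This is an illustrative sketch, not the engine's actual code; the set of primitive type names is an assumption:

```python
# Assumed primitive names; anything else is treated as a Custom Object Pointer.
PRIMITIVES = {"string", "number", "integer", "boolean", "object", "array", "null"}

def check_type_array(type_value):
    """Reject `type` arrays containing more than one Custom Object Pointer."""
    types = type_value if isinstance(type_value, list) else [type_value]
    pointers = [t for t in types if t not in PRIMITIVES]
    if len(pointers) > 1:
        raise ValueError(
            f"ambiguous multiple inheritance {pointers}: use a tagged union (oneOf) instead"
        )
    return pointers  # zero or one custom pointer

check_type_array(["budget", "null"])            # OK: one pointer, nullable
check_type_array(["string", "number", "null"])  # OK: primitives only
```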
-### Polymorphism (`$family` and `oneOf`)
+### Polymorphism (`family` and `oneOf`)

Polymorphism is how an object boundary can dynamically take on entirely different shapes based on the payload provided at runtime. Punc utilizes the static database metadata generated from Postgres (`db.types`) to enforce these boundaries deterministically, rather than relying on ambiguous tree-traversals.

-* **`$family` (Target-Based Polymorphism)**: An explicit Punc compiler macro instructing the engine to resolve dynamic options against the registered database `types` variations or its inner schema registry. It uses the exact physical constraints of the database to build SQL and validation routes.
+* **`family` (Target-Based Polymorphism)**: An explicit Punc compiler macro instructing the engine to resolve dynamic options against the registered database `types` variations or its inner schema registry. It uses the exact physical constraints of the database to build SQL and validation routes.
  * **Scenario A: Global Tables (Vertical Routing)**
-    * *Setup*: `{ "$family": "organization" }`
-    * *Execution*: The engine queries `db.types.get("organization").variations` and finds `["bot", "organization", "person"]`. Because organizations are structurally table-backed, the `$family` automatically uses `type` as the discriminator.
+    * *Setup*: `{ "family": "organization" }`
+    * *Execution*: The engine queries `db.types.get("organization").variations` and finds `["bot", "organization", "person"]`. Because organizations are structurally table-backed, the `family` automatically uses `type` as the discriminator.
    * *Options*: `bot` -> `bot`, `person` -> `person`, `organization` -> `organization`.
  * **Scenario B: Prefixed Tables (Vertical Projection)**
-    * *Setup*: `{ "$family": "light.organization" }`
+    * *Setup*: `{ "family": "light.organization" }`
    * *Execution*: The engine sees the prefix `light.` and base `organization`. It queries `db.types.get("organization").variations` and dynamically prepends the prefix to discover the relevant UI schemas.
    * *Options*: `person` -> `light.person`, `organization` -> `light.organization`. (If a projection like `light.bot` does not exist in `db.schemas`, it is safely ignored).
  * **Scenario C: Single Table Inheritance (Horizontal Routing)**
-    * *Setup*: `{ "$family": "widget" }` (Where `widget` is a table type but has no external variations).
-    * *Execution*: The engine queries `db.types.get("widget").variations` and finds only `["widget"]`. Since it lacks table inheritance, it is treated as STI. The engine scans the specific, confined `schemas` array directly under `db.types.get("widget")` for any registered key terminating in the base `.widget` (e.g., `stock.widget`). The `$family` automatically uses `kind` as the discriminator.
+    * *Setup*: `{ "family": "widget" }` (Where `widget` is a table type but has no external variations).
+    * *Execution*: The engine queries `db.types.get("widget").variations` and finds only `["widget"]`. Since it lacks table inheritance, it is treated as STI. The engine scans the specific, confined `schemas` array directly under `db.types.get("widget")` for any registered key terminating in the base `.widget` (e.g., `stock.widget`). The `family` automatically uses `kind` as the discriminator.
    * *Options*: `stock` -> `stock.widget`, `tasks` -> `tasks.widget`.
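The three routing scenarios can be condensed into a small resolver sketch. `DB_TYPES` and `DB_SCHEMAS` below are illustrative stand-ins for the engine's `db.types` / `db.schemas` metadata, not a real API:

```python
# Hypothetical metadata mirroring the scenarios above.
DB_TYPES = {
    "organization": {"variations": ["bot", "organization", "person"], "schemas": {}},
    "widget": {"variations": ["widget"],
               "schemas": {"stock.widget": {}, "tasks.widget": {}}},
}
DB_SCHEMAS = {"light.person", "light.organization"}  # registered projections

def family_options(target):
    """Resolve a `family` target to its discriminator -> schema-key options."""
    prefix, _, base = target.rpartition(".")
    entry = DB_TYPES[base]
    variations = entry["variations"]
    if prefix:
        # Scenario B: prepend the projection prefix, skip unregistered schemas.
        return {v: f"{prefix}.{v}" for v in variations if f"{prefix}.{v}" in DB_SCHEMAS}
    if len(variations) > 1:
        # Scenario A: table-backed inheritance, discriminated by `type`.
        return {v: v for v in variations}
    # Scenario C: STI, discriminated by `kind` via registered `<kind>.<base>` keys.
    return {k.split(".")[0]: k for k in entry["schemas"] if k.endswith("." + base)}
```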
-* **`oneOf` (Strict Tagged Unions)**: A hardcoded list of candidate schemas. Unlike `$family` which relies on global DB metadata, `oneOf` forces pure mathematical structural evaluation of the provided candidates. It strictly bans typical JSON Schema "Union of Sets" fallback searches. Every candidate MUST possess a mathematically unique discriminator payload to allow $O(1)$ routing.
+* **`oneOf` (Strict Tagged Unions)**: A hardcoded list of candidate schemas. Unlike `family` which relies on global DB metadata, `oneOf` forces pure mathematical structural evaluation of the provided candidates. It strictly bans typical JSON Schema "Union of Sets" fallback searches. Every candidate MUST possess a mathematically unique discriminator payload to allow $O(1)$ routing.
  * **Disjoint Types**: `oneOf: [{ "type": "person" }, { "type": "widget" }]`. The engine succeeds because the native `type` acts as a unique discriminator (`"person"` vs `"widget"`).
  * **STI Types**: `oneOf: [{ "type": "heavy.person" }, { "type": "light.person" }]`. The engine succeeds. Even though both share `"type": "person"`, their explicit discriminator is `kind` (`"heavy"` vs `"light"`), ensuring unique $O(1)$ fast-paths.
  * **Conflicting Types**: `oneOf: [{ "type": "person" }, { "type": "light.person" }]`. The engine **fails compilation natively**. Both schemas evaluate to `"type": "person"` and neither provides a disjoint `kind` constraint, making them mathematically ambiguous and impossible to route in $O(1)$ time.
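The uniqueness rule behind the three `oneOf` examples can be sketched as a compile-time check. This is a sketch of the idea, assuming schema keys follow the `<kind>.<type>` dot convention; it is not the engine's actual compiler:

```python
def discriminator(schema_key):
    """Split a schema key into its (type, kind) discriminator pair."""
    kind, _, base = schema_key.rpartition(".")
    return (base, kind or None)

def compile_one_of(candidates):
    """Fail compilation unless every candidate has a unique (type, kind) pair."""
    routes = {}
    for key in candidates:
        disc = discriminator(key)
        # A bare type collides with any kind-qualified variant of the same type.
        clash = disc in routes or any(
            d[0] == disc[0] and (d[1] is None or disc[1] is None)
            for d in routes
        )
        if clash:
            raise ValueError(f"ambiguous oneOf: {key!r} collides on {disc}")
        routes[disc] = key
    return routes  # O(1) dispatch table keyed by (type, kind)
```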
@@ -187,10 +187,10 @@ The Validator provides strict, schema-driven evaluation for the "Punc" architect

JSPG implements specific extensions to the Draft 2020-12 standard to support the Punc architecture's object-oriented needs while heavily optimizing for zero-runtime lookups.

* **Caching Strategy**: The Validator caches the pre-compiled `Database` registry in memory upon initialization (`jspg_setup`). This registry holds the comprehensive graph of schema boundaries, Types, ENUMs, and Foreign Key relationships, acting as the Single Source of Truth for all validation operations without polling Postgres.
-* **Discriminator Fast Paths & Extraction**: When executing a polymorphic node (`oneOf` or `$family`), the engine statically analyzes the incoming JSON payload for the literal `type` and `kind` string coordinates. It routes the evaluation specifically to matching candidates in $O(1)$ while returning `MISSING_TYPE` ultimata directly.
+* **Discriminator Fast Paths & Extraction**: When executing a polymorphic node (`oneOf` or `family`), the engine statically analyzes the incoming JSON payload for the literal `type` and `kind` string coordinates. It routes the evaluation specifically to matching candidates in $O(1)$ while returning `MISSING_TYPE` ultimata directly.
* **Missing Type Ultimatum**: If an entity logically requires a discriminator and the JSON payload omits it, JSPG short-circuits branch execution entirely, bubbling a single, perfectly-pathed `MISSING_TYPE` error back to the UI natively to prevent confusing cascading failures.
* **Golden Match Context**: When exactly one structural candidate perfectly maps a discriminator, the Validator exclusively cascades that specific structural error context directly to the user, stripping away all noise generated by other parallel schemas.
-* **Topological Array Pathing**: Instead of relying on explicit `$id` references or injected properties, array iteration paths are dynamically typed based on their compiler boundary constraints. If the array's `items` schema resolves to a topological table-backed entity (e.g., inheriting via a `$family` macro tracked in the global DB catalog), the array locks paths and derives element indexes from their actual UUID paths (`array/widget-1/name`), natively enforcing database continuity. If evaluating isolated ad-hoc JSONB elements, strict numeric indexing is enforced natively (`array/1/name`) preventing synthetic payload manipulation.
+* **Topological Array Pathing**: Instead of relying on explicit `$id` references or injected properties, array iteration paths are dynamically typed based on their compiler boundary constraints. If the array's `items` schema resolves to a topological table-backed entity (e.g., inheriting via a `family` macro tracked in the global DB catalog), the array locks paths and derives element indexes from their actual UUID paths (`array/widget-1/name`), natively enforcing database continuity. If evaluating isolated ad-hoc JSONB elements, strict numeric indexing is enforced natively (`array/1/name`) preventing synthetic payload manipulation.
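The fast-path and short-circuit behavior described above can be sketched as follows. The routing-table shape and error payloads are illustrative assumptions, not the Validator's real structures:

```python
def route_polymorphic(payload, routes):
    """O(1) fast-path: read literal discriminators, short-circuit when missing.

    `routes` is a dispatch table keyed by (type, kind), e.g. built at compile time.
    """
    disc = (payload.get("type"), payload.get("kind"))
    if disc[0] is None:
        # Short-circuit the whole branch with a single, well-pathed error.
        return {"error": "MISSING_TYPE", "path": "/type"}
    # Exact (type, kind) match first, then a bare-type fallback.
    schema_key = routes.get(disc) or routes.get((disc[0], None))
    if schema_key is None:
        # Illustrative error name for a discriminator that matches no candidate.
        return {"error": "CONST_VIOLATED", "path": "/type"}
    return {"schema": schema_key}
```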
---
@@ -234,11 +234,11 @@ The Queryer transforms Postgres into a pre-compiled Semantic Query Engine, desig

* **Dynamic Filtering**: Binds parameters natively through `cue.filters` objects. The queryer enforces a strict, structured, MongoDB-style operator syntax to map incoming JSON request constraints directly to their originating structural table columns. Filters support both flat path notation (e.g., `"contacts/is_primary": {...}`) and deeply nested recursive JSON structures (e.g., `{"contacts": {"is_primary": {...}}}`). The queryer recursively traverses and flattens these structures at AST compilation time.
  * **Equality / Inequality**: `{"$eq": value}`, `{"$ne": value}` automatically map to `=` and `!=`.
  * **Comparison**: `{"$gt": ...}`, `{"$gte": ...}`, `{"$lt": ...}`, `{"$lte": ...}` directly compile to Postgres comparison operators (`>`, `>=`, `<`, `<=`).
-  * **Array Inclusion**: `{"$in": [values]}`, `{"$nin": [values]}` use native `jsonb_array_elements_text()` bindings to enforce `IN` and `NOT IN` logic without runtime SQL injection risks.
+  * **Array Inclusion**: `{"$of": [values]}`, `{"$nof": [values]}` use native `jsonb_array_elements_text()` bindings to enforce `IN` and `NOT IN` logic without runtime SQL injection risks.
  * **Text Matching (ILIKE)**: Evaluates `$eq` or `$ne` against string fields containing the `%` character natively into Postgres `ILIKE` and `NOT ILIKE` partial substring matches.
  * **Type Casting**: Safely resolves dynamic combinations by casting values instantly into the physical database types mapped in the schema (e.g. parsing `uuid` bindings to `::uuid`, formatting DateTimes to `::timestamptz`, and numbers to `::numeric`).
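The operator mapping above (with the renamed `$of`/`$nof` inclusion operators) can be sketched as a tiny condition compiler. The parameter-naming scheme and SQL shapes are illustrative, not the queryer's actual output:

```python
OPS = {"$eq": "=", "$ne": "!=", "$gt": ">", "$gte": ">=", "$lt": "<", "$lte": "<="}

def compile_filter(column, condition, params):
    """Flatten one {op: value} condition into a parameterized SQL fragment."""
    clauses = []
    for op, value in condition.items():
        name = f"p{len(params)}"
        if op in ("$of", "$nof"):
            # Array inclusion: bind the list once, expand with jsonb helpers.
            keyword = "IN" if op == "$of" else "NOT IN"
            params[name] = value
            clauses.append(f"{column} {keyword} (SELECT jsonb_array_elements_text(${name}))")
        elif op in ("$eq", "$ne") and isinstance(value, str) and "%" in value:
            # Text matching: '%' promotes equality to a partial substring match.
            keyword = "ILIKE" if op == "$eq" else "NOT ILIKE"
            params[name] = value
            clauses.append(f"{column} {keyword} ${name}")
        else:
            params[name] = value
            clauses.append(f"{column} {OPS[op]} ${name}")
    return " AND ".join(clauses)
```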
-* **Polymorphic SQL Generation (`$family`)**: Compiles `$family` properties by analyzing the **Physical Database Variations**, *not* the schema descendants.
-  * **The Dot Convention**: When a schema requests `$family: "target.schema"`, the compiler extracts the base type (e.g. `schema`) and looks up its Physical Table definition.
+* **Polymorphic SQL Generation (`family`)**: Compiles `family` properties by analyzing the **Physical Database Variations**, *not* the schema descendants.
+  * **The Dot Convention**: When a schema requests `family: "target.schema"`, the compiler extracts the base type (e.g. `schema`) and looks up its Physical Table definition.
  * **Multi-Table Branching**: If the Physical Table is a parent to other tables (e.g. `organization` has variations `["organization", "bot", "person"]`), the compiler generates a dynamic `CASE WHEN type = '...' THEN ...` query, expanding into sub-queries for each variation. To ensure safe resolution, the compiler dynamically evaluates correlation boundaries: it attempts standard Relational Edge discovery first. If no explicit relational edge exists (indicating pure Table Inheritance rather than a standard foreign-key graph relationship), it safely invokes a **Table Parity Fallback**. This generates an explicit ID correlation constraint (`AND inner.id = outer.id`), perfectly binding the structural variations back to the parent row to eliminate Cartesian products.
  * **Single-Table Bypass**: If the Physical Table is a leaf node with only one variation (e.g. `person` has variations `["person"]`), the compiler cleanly bypasses `CASE` generation and compiles a simple `SELECT` across the base table, as all schema extensions (e.g. `light.person`, `full.person`) are guaranteed to reside in the exact same physical row.
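The branching decision between Multi-Table Branching (with the Table Parity Fallback) and the Single-Table Bypass can be sketched as schematic SQL-text generation. The emitted SQL shape is illustrative only, not what the compiler actually produces:

```python
def compile_family_select(base, variations, has_relational_edge):
    """Sketch: one CASE branch per physical variation, or a single-table bypass."""
    if len(variations) == 1:
        # Single-Table Bypass: every projection lives in the same physical row.
        return f"SELECT row_to_json(t) FROM {base} t"
    # Table Parity Fallback: correlate by id when no relational edge exists,
    # preventing Cartesian products across the inherited tables.
    corr = "" if has_relational_edge else " AND sub.id = t.id"
    branches = " ".join(
        f"WHEN t.type = '{v}' THEN (SELECT row_to_json(sub) FROM {v} sub WHERE TRUE{corr})"
        for v in variations
    )
    return f"SELECT CASE {branches} END FROM {base} t"
```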
104 add_test.py
@@ -1,104 +0,0 @@
import json

def load_json(path):
    with open(path, 'r') as f:
        return json.load(f)

def save_json(path, data):
    with open(path, 'w') as f:
        json.dump(data, f, indent=2)

def add_invoice(data):
    # Add 'invoice' type
    types = data[0]['database']['types']

    # Check if invoice already exists
    if any(t.get('name') == 'invoice' for t in types):
        return

    types.append({
        "name": "invoice",
        "hierarchy": ["invoice", "entity"],
        "primary_key": ["id"],
        "field_types": {
            "id": "uuid",
            "number": "text",
            "metadata": "jsonb"
        },
        "schemas": {
            "invoice": {
                "type": "entity",
                "properties": {
                    "id": { "type": "string" },
                    "number": { "type": "string" },
                    "metadata": {
                        "type": "object",
                        "properties": {
                            "internal_note": { "type": "string" },
                            "customer_snapshot": { "type": "entity" },
                            "related_rules": {
                                "type": "array",
                                "items": { "type": "governance_rule" }
                            }
                        }
                    }
                }
            }
        }
    })

def process_merger():
    data = load_json('fixtures/merger.json')
    add_invoice(data)

    # Add test
    data[0]['tests'].append({
        "name": "Insert invoice with deep jsonb metadata",
        "schema": "invoice",
        "payload": {
            "number": "INV-1001",
            "metadata": {
                "internal_note": "Confidential",
                "customer_snapshot": {
                    "id": "00000000-0000-0000-0000-000000000000",
                    "type": "person",
                    "first_name": "John"
                },
                "related_rules": [
                    {
                        "id": "11111111-1111-1111-1111-111111111111"
                    }
                ]
            }
        },
        "expect": {
            "sql": [
                [
                    "INSERT INTO agreego.invoice (metadata, number, id) VALUES ($1, $2, gen_random_uuid()) ON CONFLICT (id) DO UPDATE SET metadata = EXCLUDED.metadata, number = EXCLUDED.number RETURNING id, type",
                    {"metadata": {"customer_snapshot": {"first_name": "John", "id": "00000000-0000-0000-0000-000000000000", "type": "person"}, "internal_note": "Confidential", "related_rules": [{"id": "11111111-1111-1111-1111-111111111111"}]}, "number": "INV-1001"}
                ]
            ]
        }
    })
    save_json('fixtures/merger.json', data)

def process_queryer():
    data = load_json('fixtures/queryer.json')
    add_invoice(data)

    data[0]['tests'].append({
        "name": "Query invoice with complex JSONB metadata field extraction",
        "schema": "invoice",
        "query": {
            "extract": ["id", "number", "metadata"],
            "conditions": []
        },
        "expect": {
            "sql": "SELECT jsonb_build_object('id', t1.id, 'metadata', t1.metadata, 'number', t1.number) FROM agreego.invoice t1 WHERE (t1.id IS NOT NULL)",
            "params": {}
        }
    })
    save_json('fixtures/queryer.json', data)

process_merger()
process_queryer()
152 append_test.py
@@ -1,152 +0,0 @@
import json

path = "fixtures/database.json"

with open(path, "r") as f:
    data = json.load(f)

new_test = {
    "description": "Schema Promotion Accuracy Test - -- One Database to Rule Them All --",
    "database": {
        "puncs": [],
        "enums": [],
        "relations": [],
        "types": [
            {
                "id": "t1",
                "type": "type",
                "name": "person",
                "module": "core",
                "source": "person",
                "hierarchy": ["person"],
                "variations": ["person", "student"],
                "schemas": {
                    "full.person": {
                        "type": "object",
                        "properties": {
                            "type": {"type": "string"},
                            "name": {"type": "string"},
                            "email": {
                                "$family": "email_address"
                            },
                            "generic_bubble": {
                                "type": "some_bubble"
                            },
                            "ad_hoc_bubble": {
                                "type": "some_bubble",
                                "properties": {
                                    "extra_inline_feature": {"type": "string"}
                                }
                            },
                            "tags": {
                                "type": "array",
                                "items": {"type": "string"}
                            },
                            "standard_relations": {
                                "type": "array",
                                "items": {"type": "contact"}
                            },
                            "extended_relations": {
                                "type": "array",
                                "items": {
                                    "type": "contact",
                                    "properties": {
                                        "target": {"type": "email_address"}
                                    }
                                }
                            }
                        }
                    },
                    "student.person": {
                        "type": "object",
                        "properties": {
                            "type": {"type": "string"},
                            "kind": {"type": "string"},
                            "school": {"type": "string"}
                        }
                    }
                }
            },
            {
                "id": "t2",
                "type": "type",
                "name": "email_address",
                "module": "core",
                "source": "email_address",
                "hierarchy": ["email_address"],
                "variations": ["email_address"],
                "schemas": {
                    "light.email_address": {
                        "type": "object",
                        "properties": {
                            "address": {"type": "string"}
                        }
                    }
                }
            },
            {
                "id": "t3",
                "type": "type",
                "name": "contact",
                "module": "core",
                "source": "contact",
                "hierarchy": ["contact"],
                "variations": ["contact"],
                "schemas": {
                    "full.contact": {
                        "type": "object",
                        "properties": {
                            "id": {"type": "string"}
                        }
                    }
                }
            },
            {
                "id": "t4",
                "type": "type",
                "name": "some_bubble",
                "module": "core",
                "source": "some_bubble",
                "hierarchy": ["some_bubble"],
                "variations": ["some_bubble"],
                "schemas": {
                    "some_bubble": {
                        "type": "object",
                        "properties": {
                            "bubble_prop": {"type": "string"}
                        }
                    }
                }
            }
        ]
    },
    "tests": [
        {
            "description": "Assert exact topological schema promotion paths",
            "action": "compile",
            "expect": {
                "success": True,
                "schemas": [
                    "ad_hoc_bubble",
                    "email_address",
                    "extended_relations",
                    "extended_relations/target",
                    "full.contact",
                    "full.person",
                    "full.person/ad_hoc_bubble",
                    "full.person/extended_relations",
                    "full.person/extended_relations/target",
                    "light.email_address",
                    "person",
                    "some_bubble",
                    "student.person"
                ]
            }
        }
    ]
}

data.append(new_test)
with open(path, "w") as f:
    json.dump(data, f, indent=2)
@@ -1,34 +0,0 @@
import json

path = "fixtures/database.json"

with open(path, "r") as f:
    data = json.load(f)

test_case = data[-1]
# Get full.person object properties
props = test_case["database"]["types"][0]["schemas"]["full.person"]["properties"]

# Find extended_relations target and add properties!
target_ref = props["extended_relations"]["items"]["properties"]["target"]
target_ref["properties"] = {
    "extra_3rd_level": {"type": "string"}
}

# The target is now an ad-hoc composition itself!
# We expect `full.person/extended_relations/target` to be globally promoted.

test_case["tests"][0]["expect"]["schemas"] = [
    "full.contact",
    "full.person",
    "full.person/ad_hoc_bubble",
    "full.person/extended_relations",
    "full.person/extended_relations/target",  # BOOM! Right here, 3 levels deep!
    "light.email_address",
    "some_bubble",
    "student.person"
]

with open(path, "w") as f:
    json.dump(data, f, indent=2)
@@ -1,22 +0,0 @@
import json

path = "fixtures/database.json"

with open(path, "r") as f:
    data = json.load(f)

test_case = data[-1]
test_case["tests"][0]["expect"]["schemas"] = [
    "full.contact",
    "full.person",
    "full.person/ad_hoc_bubble",
    "full.person/extended_relations",
    "full.person/extended_relations/items",
    "light.email_address",
    "some_bubble",
    "student.person"
]

with open(path, "w") as f:
    json.dump(data, f, indent=2)
87 fix_test.py
@@ -1,87 +0,0 @@
import json

def load_json(path):
    with open(path, 'r') as f:
        return json.load(f)

def save_json(path, data):
    with open(path, 'w') as f:
        json.dump(data, f, indent=4)

def fix_merger():
    data = load_json('fixtures/merger.json')
    last_test = data[0]['tests'][-1]

    # Check if the last test is our bad one
    if "name" in last_test and last_test["name"] == "Insert invoice with deep jsonb metadata":
        new_test = {
            "description": last_test["name"],
            "action": "merge",
            "schema_id": last_test["schema"],
            "data": last_test["payload"],
            "expect": {
                "success": True,
                "sql": [
                    [
                        "INSERT INTO agreego.invoice (",
                        " \"metadata\",",
                        " \"number\",",
                        " entity_id,",
                        " id,",
                        " type",
                        ")",
                        "VALUES (",
                        " '{",
                        " \"customer_snapshot\":{",
                        " \"first_name\":\"John\",",
                        " \"id\":\"00000000-0000-0000-0000-000000000000\",",
                        " \"type\":\"person\"",
                        " },",
                        " \"internal_note\":\"Confidential\",",
                        " \"related_rules\":[",
                        " {",
                        " \"id\":\"11111111-1111-1111-1111-111111111111\"",
                        " }",
                        " ]",
                        " }',",
                        " 'INV-1001',",
                        " NULL,",
                        " '{{uuid}}',",
                        " 'invoice'",
                        ")"
                    ]
                ]
            }
        }
        data[0]['tests'][-1] = new_test
        save_json('fixtures/merger.json', data)

def fix_queryer():
    data = load_json('fixtures/queryer.json')
    last_test = data[0]['tests'][-1]

    if "name" in last_test and last_test["name"] == "Query invoice with complex JSONB metadata field extraction":
        new_test = {
            "description": last_test["name"],
            "action": "query",
            "schema_id": last_test["schema"],
            "expect": {
                "success": True,
                "sql": [
                    [
                        "(SELECT jsonb_strip_nulls(jsonb_build_object(",
                        " 'id', invoice_1.id,",
                        " 'metadata', invoice_1.metadata,",
                        " 'number', invoice_1.number,",
                        " 'type', invoice_1.type",
                        "))",
                        "FROM agreego.invoice invoice_1)"
                    ]
                ]
            }
        }
        data[0]['tests'][-1] = new_test
        save_json('fixtures/queryer.json', data)

fix_merger()
fix_queryer()
@@ -515,7 +515,7 @@
        "type": "string"
      },
      "email": {
-        "$family": "email_address"
+        "family": "email_address"
      },
      "generic_bubble": {
        "type": "some_bubble"
@@ -651,16 +651,21 @@
      "action": "compile",
      "expect": {
        "success": true,
-        "schemas": [
-          "full.contact",
-          "full.person",
-          "full.person/ad_hoc_bubble",
-          "full.person/extended_relations",
-          "full.person/extended_relations/target",
-          "light.email_address",
-          "some_bubble",
-          "student.person"
-        ]
+        "schemas": {
+          "full.contact": {},
+          "full.contact.filter": {},
+          "full.person": {},
+          "full.person.filter": {},
+          "full.person/ad_hoc_bubble": {},
+          "full.person/extended_relations": {},
+          "full.person/extended_relations/target": {},
+          "light.email_address": {},
+          "light.email_address.filter": {},
+          "some_bubble": {},
+          "some_bubble.filter": {},
+          "student.person": {},
+          "student.person.filter": {}
+        }
      }
    }
  ]
@@ -919,11 +924,14 @@
      "action": "compile",
      "expect": {
        "success": true,
-        "schemas": [
-          "entity",
-          "invoice",
-          "invoice_line"
-        ]
+        "schemas": {
+          "entity": {},
+          "entity.filter": {},
+          "invoice": {},
+          "invoice.filter": {},
+          "invoice_line": {},
+          "invoice_line.filter": {}
+        }
      }
    }
  ]
222 fixtures/filter.json Normal file
@ -0,0 +1,222 @@
[
  {
    "description": "Filter Synthesis Object-Oriented Composition",
    "database": {
      "puncs": [],
      "enums": [],
      "relations": [
        {
          "id": "rel1",
          "type": "relation",
          "constraint": "fk_person_billing_address",
          "source_type": "person",
          "source_columns": [
            "billing_address_id"
          ],
          "destination_type": "address",
          "destination_columns": [
            "id"
          ],
          "prefix": "billing_address"
        }
      ],
      "types": [
        {
          "id": "type1",
          "type": "type",
          "name": "person",
          "module": "core",
          "source": "person",
          "hierarchy": [
            "person"
          ],
          "variations": [
            "person"
          ],
          "schemas": {
            "person": {
              "type": "object",
              "properties": {
                "first_name": {
                  "type": "string"
                },
                "age": {
                  "type": "integer"
                },
                "billing_address": {
                  "type": "address"
                },
                "birth_date": {
                  "type": "string",
                  "format": "date-time"
                },
                "tags": {
                  "type": "array",
                  "items": {
                    "type": "string"
                  }
                },
                "ad_hoc": {
                  "type": "object",
                  "properties": {
                    "foo": {
                      "type": "string"
                    }
                  }
                }
              }
            }
          }
        },
        {
          "id": "type2",
          "type": "type",
          "name": "address",
          "module": "core",
          "source": "address",
          "hierarchy": [
            "address"
          ],
          "variations": [
            "address"
          ],
          "schemas": {
            "address": {
              "type": "object",
              "properties": {
                "city": {
                  "type": "string"
                }
              }
            }
          }
        },
        {
          "id": "type3",
          "type": "type",
          "name": "filter",
          "module": "core",
          "source": "filter",
          "hierarchy": [
            "filter"
          ],
          "variations": [
            "filter",
            "string.condition",
            "integer.condition",
            "date.condition"
          ],
          "schemas": {
            "condition": {
              "type": "object",
              "properties": {
                "kind": {
                  "type": "string"
                }
              }
            },
            "string.condition": {
              "type": "condition",
              "properties": {
                "$eq": {
                  "type": [
                    "string",
                    "null"
                  ]
                }
              }
            },
            "integer.condition": {
              "type": "condition",
              "properties": {
                "$eq": {
                  "type": [
                    "integer",
                    "null"
                  ]
                }
              }
            },
            "date.condition": {
              "type": "condition",
              "properties": {
                "$eq": {
                  "type": [
                    "string",
                    "null"
                  ]
                }
              }
            }
          }
        }
      ]
    },
    "tests": [
      {
        "description": "Assert filter generation map accurately represents strongly typed conditions natively.",
        "action": "compile",
        "expect": {
          "success": true,
          "schemas": {
            "person": {},
            "person.filter": {
              "type": "object",
              "compiledPropertyNames": [
                "age",
                "billing_address",
                "birth_date",
                "first_name"
              ],
              "properties": {
                "first_name": {
                  "type": [
                    "string.condition",
                    "null"
                  ]
                },
                "age": {
                  "type": [
                    "integer.condition",
                    "null"
                  ]
                },
                "billing_address": {
                  "type": [
                    "address.filter",
                    "null"
                  ]
                },
                "birth_date": {
                  "type": [
                    "date.condition",
                    "null"
                  ]
                }
              }
            },
            "address": {},
            "address.filter": {
              "type": "object",
              "compiledPropertyNames": [
                "city"
              ],
              "properties": {
                "city": {
                  "type": [
                    "string.condition",
                    "null"
                  ]
                }
              }
            },
            "condition": {},
            "string.condition": {},
            "integer.condition": {},
            "date.condition": {}
          }
        }
      }
    ]
  }
]
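As a quick structural sanity check of the expectation above (an editorial sketch, not part of the fixture): a client filter payload for the synthesized `person.filter` schema can only target the compiled property names, while the primitive array `tags` and the inline `ad_hoc` object are excluded from filter synthesis. The payload keys here are hypothetical.

```rust
fn main() {
    // compiledPropertyNames from the "person.filter" expectation above
    let compiled = ["age", "billing_address", "birth_date", "first_name"];

    // hypothetical client payload: equality on first_name, range on age
    let payload = ["first_name", "age"];
    for key in &payload {
        assert!(compiled.contains(key), "unknown filter key: {}", key);
    }

    // "tags" (array of strings) and "ad_hoc" (inline object) are not filterable
    assert!(!compiled.contains(&"tags"));
    assert!(!compiled.contains(&"ad_hoc"));
}
```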
@@ -331,7 +331,7 @@
       "table_families": {
         "type": "array",
         "items": {
-          "$family": "widget"
+          "family": "widget"
         }
       }
     }
@@ -339,7 +339,6 @@
     }
   },
   "tests": [
-
     {
       "description": "families mechanically map physical variants directly onto topological uuid array paths",
       "data": {
@@ -1,6 +1,6 @@
 [
   {
-    "description": "Vertical $family Routing (Across Tables)",
+    "description": "Vertical family Routing (Across Tables)",
     "database": {
       "types": [
         {
@@ -77,7 +77,7 @@
       ],
       "schemas": {
         "family_entity": {
-          "$family": "entity"
+          "family": "entity"
         }
       }
     },
@@ -150,7 +150,7 @@
       ]
     },
     {
-      "description": "Matrix $family Routing (Vertical + Horizontal Intersections)",
+      "description": "Matrix family Routing (Vertical + Horizontal Intersections)",
       "database": {
         "types": [
           {
@@ -226,7 +226,7 @@
       ],
       "schemas": {
         "family_light_org": {
-          "$family": "light.organization"
+          "family": "light.organization"
         }
       }
     },
@@ -278,7 +278,7 @@
       ]
     },
     {
-      "description": "Horizontal $family Routing (Virtual Variations)",
+      "description": "Horizontal family Routing (Virtual Variations)",
       "database": {
         "types": [
           {
@@ -319,10 +319,10 @@
       ],
       "schemas": {
         "family_widget": {
-          "$family": "widget"
+          "family": "widget"
         },
         "family_stock_widget": {
-          "$family": "stock.widget"
+          "family": "stock.widget"
         }
       }
     },
@@ -17,7 +17,7 @@
     "get_organizations.response": {
       "type": "array",
       "items": {
-        "$family": "organization"
+        "family": "organization"
       }
     }
   }
@@ -26,7 +26,7 @@
     "name": "get_light_organization",
     "schemas": {
       "get_light_organization.response": {
-        "$family": "light.organization"
+        "family": "light.organization"
       }
     }
   },
@@ -34,7 +34,7 @@
     "name": "get_full_organization",
     "schemas": {
       "get_full_organization.response": {
-        "$family": "full.organization"
+        "family": "full.organization"
       }
     }
   },
@@ -55,7 +55,7 @@
     "get_widgets.response": {
       "type": "array",
       "items": {
-        "$family": "widget"
+        "family": "widget"
       }
     }
   }
@@ -1199,10 +1199,10 @@
   "id": {
     "$eq": "123e4567-e89b-12d3-a456-426614174000",
     "$ne": "123e4567-e89b-12d3-a456-426614174001",
-    "$in": [
+    "$of": [
       "123e4567-e89b-12d3-a456-426614174000"
     ],
-    "$nin": [
+    "$nof": [
       "123e4567-e89b-12d3-a456-426614174001"
     ]
   },
@@ -1241,9 +1241,9 @@
   " AND entity_1.created_at <= ($7#>>'{}')::timestamptz",
   " AND entity_1.created_at != ($8#>>'{}')::timestamptz",
   " AND entity_1.id = ($9#>>'{}')::uuid",
-  " AND entity_1.id IN (SELECT value::uuid FROM jsonb_array_elements_text(($10#>>'{}')::jsonb))",
-  " AND entity_1.id != ($11#>>'{}')::uuid",
-  " AND entity_1.id NOT IN (SELECT value::uuid FROM jsonb_array_elements_text(($12#>>'{}')::jsonb))",
+  " AND entity_1.id != ($10#>>'{}')::uuid",
+  " AND entity_1.id NOT IN (SELECT value::uuid FROM jsonb_array_elements_text(($11#>>'{}')::jsonb))",
+  " AND entity_1.id IN (SELECT value::uuid FROM jsonb_array_elements_text(($12#>>'{}')::jsonb))",
   ")))"
 ]
 ]
@@ -1448,14 +1448,14 @@
   "$eq": 30,
   "$gt": 20,
   "$gte": 20,
-  "$in": [
+  "$of": [
     30,
     40
   ],
   "$lt": 50,
   "$lte": 50,
   "$ne": 25,
-  "$nin": [
+  "$nof": [
     1,
     2
   ]
@@ -1481,24 +1481,24 @@
   "$eq": "Jane%",
   "$gt": "A",
   "$gte": "A",
-  "$in": [
+  "$of": [
     "Jane",
     "John"
   ],
   "$lt": "Z",
   "$lte": "Z",
   "$ne": "Doe",
-  "$nin": [
+  "$nof": [
     "Bob"
   ]
 },
 "id": {
   "$eq": "00000000-0000-0000-0000-000000000001",
-  "$in": [
+  "$of": [
     "00000000-0000-0000-0000-000000000001"
   ],
   "$ne": "00000000-0000-0000-0000-000000000002",
-  "$nin": [
+  "$nof": [
     "00000000-0000-0000-0000-000000000002"
   ]
 },
@@ -1677,11 +1677,11 @@
   " AND person_1.age = ($1#>>'{}')::numeric",
   " AND person_1.age > ($2#>>'{}')::numeric",
   " AND person_1.age >= ($3#>>'{}')::numeric",
-  " AND person_1.age IN (SELECT value::numeric FROM jsonb_array_elements_text(($4#>>'{}')::jsonb))",
-  " AND person_1.age < ($5#>>'{}')::numeric",
-  " AND person_1.age <= ($6#>>'{}')::numeric",
-  " AND person_1.age != ($7#>>'{}')::numeric",
-  " AND person_1.age NOT IN (SELECT value::numeric FROM jsonb_array_elements_text(($8#>>'{}')::jsonb))",
+  " AND person_1.age < ($4#>>'{}')::numeric",
+  " AND person_1.age <= ($5#>>'{}')::numeric",
+  " AND person_1.age != ($6#>>'{}')::numeric",
+  " AND person_1.age NOT IN (SELECT value::numeric FROM jsonb_array_elements_text(($7#>>'{}')::jsonb))",
+  " AND person_1.age IN (SELECT value::numeric FROM jsonb_array_elements_text(($8#>>'{}')::jsonb))",
   " AND entity_3.archived = ($9#>>'{}')::boolean",
   " AND entity_3.archived != ($10#>>'{}')::boolean",
   " AND entity_3.created_at = ($12#>>'{}')::timestamptz",
@@ -1693,15 +1693,15 @@
   " AND person_1.first_name ILIKE $18#>>'{}'",
   " AND person_1.first_name > ($19#>>'{}')",
   " AND person_1.first_name >= ($20#>>'{}')",
-  " AND person_1.first_name IN (SELECT value FROM jsonb_array_elements_text(($21#>>'{}')::jsonb))",
-  " AND person_1.first_name < ($22#>>'{}')",
-  " AND person_1.first_name <= ($23#>>'{}')",
-  " AND person_1.first_name NOT ILIKE $24#>>'{}'",
-  " AND person_1.first_name NOT IN (SELECT value FROM jsonb_array_elements_text(($25#>>'{}')::jsonb))",
+  " AND person_1.first_name < ($21#>>'{}')",
+  " AND person_1.first_name <= ($22#>>'{}')",
+  " AND person_1.first_name NOT ILIKE $23#>>'{}'",
+  " AND person_1.first_name NOT IN (SELECT value FROM jsonb_array_elements_text(($24#>>'{}')::jsonb))",
+  " AND person_1.first_name IN (SELECT value FROM jsonb_array_elements_text(($25#>>'{}')::jsonb))",
   " AND entity_3.id = ($26#>>'{}')::uuid",
-  " AND entity_3.id IN (SELECT value::uuid FROM jsonb_array_elements_text(($27#>>'{}')::jsonb))",
-  " AND entity_3.id != ($28#>>'{}')::uuid",
-  " AND entity_3.id NOT IN (SELECT value::uuid FROM jsonb_array_elements_text(($29#>>'{}')::jsonb))",
+  " AND entity_3.id != ($27#>>'{}')::uuid",
+  " AND entity_3.id NOT IN (SELECT value::uuid FROM jsonb_array_elements_text(($28#>>'{}')::jsonb))",
+  " AND entity_3.id IN (SELECT value::uuid FROM jsonb_array_elements_text(($29#>>'{}')::jsonb))",
   " AND person_1.last_name ILIKE $30#>>'{}'",
   " AND person_1.last_name NOT ILIKE $31#>>'{}')))"
 ]
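An observation on the reindexed SQL expectations above (an editorial sketch, not a statement about the engine internals): the parameter shuffle falls out of the operator rename itself, because the expectations lay out filter keys in sorted order and `$of`/`$nof` sort after `$ne`, where `$in`/`$nin` used to sort before `$lt`.

```rust
fn main() {
    let mut old_ops = ["$eq", "$gt", "$gte", "$in", "$lt", "$lte", "$ne", "$nin"];
    let mut new_ops = ["$eq", "$gt", "$gte", "$of", "$lt", "$lte", "$ne", "$nof"];
    old_ops.sort();
    new_ops.sort();

    // "$in" used to sort between "$gte" and "$lt"...
    assert_eq!(old_ops[3], "$in");
    // ...while "$nof"/"$of" sort to the end, which is why the NOT IN / IN
    // clauses move to the last parameter slots in the expected SQL.
    assert_eq!(new_ops[6], "$nof");
    assert_eq!(new_ops[7], "$of");
}
```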
19  scratch.rs
@@ -1,19 +0,0 @@
use cellular_jspg::database::{Database, object::SchemaTypeOrArray};
use cellular_jspg::tests::fixtures::get_queryer_db;

fn main() {
    let db_json = get_queryer_db();
    let db = Database::from_json(&db_json).unwrap();
    let keys: Vec<_> = db.schemas.keys().collect();
    println!("Found schemas: {}", keys.len());
    let mut found = false;
    for k in keys {
        if k.contains("email_addresses") {
            println!("Contains email_addresses: {}", k);
            found = true;
        }
    }
    if !found {
        println!("No email_addresses found at all!");
    }
}
163  src/database/compile/collection.rs  Normal file
@@ -0,0 +1,163 @@
use crate::database::schema::Schema;
use std::sync::Arc;

impl Schema {
    #[allow(unused_variables)]
    pub(crate) fn validate_identifier(
        id: &str,
        field_name: &str,
        root_id: &str,
        path: &str,
        errors: &mut Vec<crate::drop::Error>,
    ) {
        #[cfg(not(test))]
        for c in id.chars() {
            if !c.is_ascii_lowercase() && !c.is_ascii_digit() && c != '_' && c != '.' {
                errors.push(crate::drop::Error {
                    code: "INVALID_IDENTIFIER".to_string(),
                    message: format!(
                        "Invalid character '{}' in JSON Schema '{}' property: '{}'. Identifiers must exclusively contain [a-z0-9_.]",
                        c, field_name, id
                    ),
                    details: crate::drop::ErrorDetails {
                        path: Some(path.to_string()),
                        schema: Some(root_id.to_string()),
                        ..Default::default()
                    },
                });
                return;
            }
        }
    }

    pub fn collect_schemas(
        schema_arc: &Arc<Schema>,
        root_id: &str,
        path: String,
        to_insert: &mut Vec<(String, Arc<Schema>)>,
        errors: &mut Vec<crate::drop::Error>,
    ) {
        if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &schema_arc.obj.type_ {
            if t == "array" {
                if let Some(items) = &schema_arc.obj.items {
                    if let Some(crate::database::object::SchemaTypeOrArray::Single(it)) = &items.obj.type_ {
                        if !crate::database::object::is_primitive_type(it) {
                            if items.obj.properties.is_some() || items.obj.cases.is_some() {
                                to_insert.push((path.clone(), Arc::clone(schema_arc)));
                            }
                        }
                    }
                }
            } else if !crate::database::object::is_primitive_type(t) {
                Self::validate_identifier(t, "type", root_id, &path, errors);

                // Is this an explicit inline ad-hoc composition?
                if schema_arc.obj.properties.is_some() || schema_arc.obj.cases.is_some() {
                    to_insert.push((path.clone(), Arc::clone(schema_arc)));
                }
            }
        }

        if let Some(family) = &schema_arc.obj.family {
            Self::validate_identifier(family, "family", root_id, &path, errors);
        }

        Self::collect_child_schemas(schema_arc, root_id, path, to_insert, errors);
    }

    pub fn collect_child_schemas(
        schema_arc: &Arc<Schema>,
        root_id: &str,
        path: String,
        to_insert: &mut Vec<(String, Arc<Schema>)>,
        errors: &mut Vec<crate::drop::Error>,
    ) {
        if let Some(props) = &schema_arc.obj.properties {
            for (k, v) in props.iter() {
                let next_path = format!("{}/{}", path, k);
                Self::collect_schemas(v, root_id, next_path, to_insert, errors);
            }
        }

        if let Some(pattern_props) = &schema_arc.obj.pattern_properties {
            for (k, v) in pattern_props.iter() {
                let next_path = format!("{}/{}", path, k);
                Self::collect_schemas(v, root_id, next_path, to_insert, errors);
            }
        }

        let mut map_arr = |arr: &Vec<Arc<Schema>>, sub: &str| {
            for (i, v) in arr.iter().enumerate() {
                Self::collect_schemas(
                    v,
                    root_id,
                    format!("{}/{}/{}", path, sub, i),
                    to_insert,
                    errors,
                );
            }
        };

        if let Some(arr) = &schema_arc.obj.prefix_items {
            map_arr(arr, "prefixItems");
        }

        if let Some(arr) = &schema_arc.obj.one_of {
            map_arr(arr, "oneOf");
        }

        let mut map_opt = |opt: &Option<Arc<Schema>>, pass_path: bool, sub: &str| {
            if let Some(v) = opt {
                if pass_path {
                    // Arrays explicitly push their wrapper natively.
                    // 'items' becomes a transparent conduit, bypassing self-promotion and skipping the '/items' suffix.
                    Self::collect_child_schemas(v, root_id, path.clone(), to_insert, errors);
                } else {
                    Self::collect_child_schemas(v, root_id, format!("{}/{}", path, sub), to_insert, errors);
                }
            }
        };

        map_opt(
            &schema_arc.obj.additional_properties,
            false,
            "additionalProperties",
        );
        map_opt(&schema_arc.obj.items, true, "items");
        map_opt(&schema_arc.obj.not, false, "not");
        map_opt(&schema_arc.obj.contains, false, "contains");
        map_opt(&schema_arc.obj.property_names, false, "propertyNames");

        if let Some(cases) = &schema_arc.obj.cases {
            for (i, c) in cases.iter().enumerate() {
                if let Some(when) = &c.when {
                    Self::collect_schemas(
                        when,
                        root_id,
                        format!("{}/cases/{}/when", path, i),
                        to_insert,
                        errors,
                    );
                }
                if let Some(then) = &c.then {
                    Self::collect_schemas(
                        then,
                        root_id,
                        format!("{}/cases/{}/then", path, i),
                        to_insert,
                        errors,
                    );
                }
                if let Some(else_) = &c.else_ {
                    Self::collect_schemas(
                        else_,
                        root_id,
                        format!("{}/cases/{}/else", path, i),
                        to_insert,
                        errors,
                    );
                }
            }
        }
    }
}
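The `collect_schemas` / `collect_child_schemas` pair above derives registry paths by appending each property key (or keyword segment such as `oneOf/0`) to its parent path. A toy sketch of that path construction over a hypothetical simplified tree (the `Node` type here is a stand-in, not the real `Schema`):

```rust
// Hypothetical simplified node: a name-keyed tree standing in for Schema.properties
struct Node {
    children: Vec<(&'static str, Node)>,
}

// Mirrors the "{parent}/{key}" path-building convention used by collect_schemas
fn collect(path: &str, node: &Node, out: &mut Vec<String>) {
    for (key, child) in &node.children {
        let next = format!("{}/{}", path, key);
        out.push(next.clone());
        collect(&next, child, out);
    }
}

fn main() {
    let tree = Node {
        children: vec![(
            "billing_address",
            Node { children: vec![("city", Node { children: vec![] })] },
        )],
    };
    let mut paths = Vec::new();
    collect("person", &tree, &mut paths);
    assert_eq!(paths, ["person/billing_address", "person/billing_address/city"]);
}
```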
128  src/database/compile/edges.rs  Normal file
@@ -0,0 +1,128 @@
use crate::database::schema::Schema;

impl Schema {
    /// Dynamically infers and compiles all structural database relationships between this Schema
    /// and its nested children. This function recursively traverses the JSON Schema abstract syntax
    /// tree, identifies physical PostgreSQL table boundaries, and locks the resulting relation
    /// constraint paths directly onto the `compiled_edges` map in O(1) memory.
    pub fn compile_edges(
        &self,
        db: &crate::database::Database,
        root_id: &str,
        path: &str,
        props: &std::collections::BTreeMap<String, std::sync::Arc<Schema>>,
        errors: &mut Vec<crate::drop::Error>,
    ) -> std::collections::BTreeMap<String, crate::database::edge::Edge> {
        let mut schema_edges = std::collections::BTreeMap::new();

        // Determine the physical database table name this schema structurally represents.
        // Plucks the polymorphic discriminator via dot-notation (e.g. extracting "person" from "full.person").
        let mut parent_type_name = None;

        if let Some(family) = &self.obj.family {
            // 1. Explicit horizontal routing
            parent_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
        } else if path == root_id {
            // 2. Root nodes trust their exact registry footprint
            let base_type_name = path.split('.').next_back().unwrap_or(path).to_string();
            if db.types.contains_key(&base_type_name) {
                parent_type_name = Some(base_type_name);
            }
        } else if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &self.obj.type_ {
            // 3. Nested graphs trust their explicit struct pointer reference
            if !crate::database::object::is_primitive_type(t) {
                parent_type_name = Some(t.split('.').next_back().unwrap_or(t).to_string());
            }
        }

        if let Some(p_type) = parent_type_name {
            // Proceed only if the resolved table physically exists within the Postgres type hierarchy
            if let Some(type_def) = db.types.get(&p_type) {
                // Iterate over all discovered schema boundaries mapped inside the object
                for (prop_name, prop_schema) in props {
                    let mut child_type_name = None;
                    let mut target_schema = prop_schema.clone();
                    let mut is_array = false;

                    // Structurally unpack the inner target entity if the object maps to an array list
                    if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) =
                        &prop_schema.obj.type_
                    {
                        if t == "array" {
                            is_array = true;
                            if let Some(items) = &prop_schema.obj.items {
                                target_schema = items.clone();
                            }
                        }
                    }

                    // Determine the physical Postgres table backing the nested child schema recursively
                    if let Some(family) = &target_schema.obj.family {
                        child_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
                    } else if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) =
                        &target_schema.obj.type_
                    {
                        if !crate::database::object::is_primitive_type(t) {
                            child_type_name = Some(t.split('.').next_back().unwrap_or(t).to_string());
                        }
                    } else if let Some(arr) = &target_schema.obj.one_of {
                        if let Some(first) = arr.first() {
                            if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &first.obj.type_
                            {
                                if !crate::database::object::is_primitive_type(t) {
                                    child_type_name = Some(t.split('.').next_back().unwrap_or(t).to_string());
                                }
                            }
                        }
                    }

                    if let Some(c_type) = child_type_name {
                        // Skip edge compilation for JSONB columns: they store data inline, not relationally.
                        // The physical column type from field_types is the single source of truth.
                        if let Some(ft) = type_def
                            .field_types
                            .as_ref()
                            .and_then(|v| v.get(prop_name.as_str()))
                            .and_then(|v| v.as_str())
                        {
                            if ft == "jsonb" {
                                continue;
                            }
                        }
                        if db.types.contains_key(&c_type) {
                            // Ensure the child schema's AST has accurately compiled its own physical property keys so we can
                            // inject them securely for Many-to-Many Twin Deduction disambiguation matching.
                            target_schema.compile(db, root_id, format!("{}/{}", path, prop_name), errors);

                            if let Some(compiled_target_props) = target_schema.obj.compiled_properties.get() {
                                let keys_for_ambiguity: Vec<String> =
                                    compiled_target_props.keys().cloned().collect();

                                // Interrogate the database catalog graph to discover the exact foreign-key constraint connecting the components
                                if let Some((relation, is_forward)) = db.resolve_relation(
                                    &p_type,
                                    &c_type,
                                    prop_name,
                                    Some(&keys_for_ambiguity),
                                    is_array,
                                    Some(root_id),
                                    &format!("{}/{}", path, prop_name),
                                    errors,
                                ) {
                                    schema_edges.insert(
                                        prop_name.clone(),
                                        crate::database::edge::Edge {
                                            constraint: relation.constraint.clone(),
                                            forward: is_forward,
                                        },
                                    );
                                }
                            }
                        }
                    }
                }
            }
        }
        schema_edges
    }
}
78  src/database/compile/filters.rs  Normal file
@@ -0,0 +1,78 @@
use crate::database::object::{SchemaObject, SchemaTypeOrArray};
use crate::database::schema::Schema;
use crate::database::Database;
use std::collections::BTreeMap;
use std::sync::Arc;

impl Schema {
    pub fn compile_filter(
        &self,
        _db: &Database,
        _root_id: &str,
        _errors: &mut Vec<crate::drop::Error>,
    ) -> Option<Schema> {
        if let Some(props) = self.obj.compiled_properties.get() {
            let mut filter_props = BTreeMap::new();
            for (key, child) in props {
                if let Some(mut filter_type) = Self::resolve_filter_type(child) {
                    filter_type.push("null".to_string());

                    let mut child_obj = SchemaObject::default();
                    child_obj.type_ = Some(SchemaTypeOrArray::Multiple(filter_type));

                    filter_props.insert(key.clone(), Arc::new(Schema { obj: child_obj, always_fail: false }));
                }
            }

            if !filter_props.is_empty() {
                let mut wrapper_obj = SchemaObject::default();
                wrapper_obj.type_ = Some(SchemaTypeOrArray::Single("object".to_string()));
                wrapper_obj.properties = Some(filter_props);

                return Some(Schema { obj: wrapper_obj, always_fail: false });
            }
        }
        None
    }

    fn resolve_filter_type(schema: &Arc<Schema>) -> Option<Vec<String>> {
        if let Some(type_) = &schema.obj.type_ {
            match type_ {
                SchemaTypeOrArray::Single(t) => {
                    return Self::map_filter_string(t, schema);
                }
                SchemaTypeOrArray::Multiple(types) => {
                    for t in types {
                        if t != "null" {
                            return Self::map_filter_string(t, schema);
                        }
                    }
                }
            }
        }
        None
    }

    fn map_filter_string(t: &str, schema: &Arc<Schema>) -> Option<Vec<String>> {
        match t {
            "string" => {
                if let Some(fmt) = &schema.obj.format {
                    if fmt == "date-time" {
                        return Some(vec!["date.condition".to_string()]);
                    }
                }
                Some(vec!["string.condition".to_string()])
            }
            "integer" => Some(vec!["integer.condition".to_string()]),
            "number" => Some(vec!["number.condition".to_string()]),
            "boolean" => Some(vec!["boolean.condition".to_string()]),
            "object" => None, // Inline structures are ignored in Composed References
            "array" => None,  // We don't filter primitive arrays or map complex arrays yet
            "null" => None,
            custom => {
                // Assume anything else is a Relational cross-boundary that already has its own .filter dynamically built
                Some(vec![format!("{}.filter", custom)])
            }
        }
    }
}
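A standalone sketch of the mapping rule in `map_filter_string` (simplified: the real method reads the `format` field from the schema object, which is folded in here as a plain `Option<&str>` parameter, and the helper name is hypothetical):

```rust
// Simplified stand-in for map_filter_string: primitive type (+ format) -> condition schema name
fn filter_condition(type_: &str, format: Option<&str>) -> Option<String> {
    match type_ {
        "string" if format == Some("date-time") => Some("date.condition".into()),
        "string" => Some("string.condition".into()),
        "integer" => Some("integer.condition".into()),
        "number" => Some("number.condition".into()),
        "boolean" => Some("boolean.condition".into()),
        "object" | "array" | "null" => None,
        // anything else is treated as a relational boundary with its own .filter
        custom => Some(format!("{}.filter", custom)),
    }
}

fn main() {
    assert_eq!(filter_condition("string", None).as_deref(), Some("string.condition"));
    assert_eq!(filter_condition("string", Some("date-time")).as_deref(), Some("date.condition"));
    assert_eq!(filter_condition("address", None).as_deref(), Some("address.filter"));
    assert_eq!(filter_condition("array", None), None);
}
```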
175  src/database/compile/mod.rs  Normal file
@@ -0,0 +1,175 @@
pub mod collection;
pub mod edges;
pub mod filters;
pub mod polymorphism;

use crate::database::schema::Schema;

impl Schema {
    pub fn compile(
        &self,
        db: &crate::database::Database,
        root_id: &str,
        path: String,
        errors: &mut Vec<crate::drop::Error>,
    ) {
        if self.obj.compiled_properties.get().is_some() {
            return;
        }

        if let Some(format_str) = &self.obj.format {
            if let Some(fmt) = crate::database::formats::FORMATS.get(format_str.as_str()) {
                let _ = self
                    .obj
                    .compiled_format
                    .set(crate::database::object::CompiledFormat::Func(fmt.func));
            }
        }

        if let Some(pattern_str) = &self.obj.pattern {
            if let Ok(re) = regex::Regex::new(pattern_str) {
                let _ = self
                    .obj
                    .compiled_pattern
                    .set(crate::database::object::CompiledRegex(re));
            }
        }

        if let Some(pattern_props) = &self.obj.pattern_properties {
            let mut compiled = Vec::new();
            for (k, v) in pattern_props {
                if let Ok(re) = regex::Regex::new(k) {
                    compiled.push((crate::database::object::CompiledRegex(re), v.clone()));
                }
            }
            if !compiled.is_empty() {
                let _ = self.obj.compiled_pattern_properties.set(compiled);
            }
        }

        let mut props = std::collections::BTreeMap::new();

        // 1. Resolve INHERITANCE dependencies first
        if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &self.obj.type_ {
            if !crate::database::object::is_primitive_type(t) {
                if let Some(parent) = db.schemas.get(t) {
                    parent.as_ref().compile(db, t, t.clone(), errors);
                    if let Some(p_props) = parent.obj.compiled_properties.get() {
                        props.extend(p_props.clone());
                    }
                }
            }
        }

        if let Some(crate::database::object::SchemaTypeOrArray::Multiple(types)) = &self.obj.type_ {
            let mut custom_type_count = 0;
            for t in types {
                if !crate::database::object::is_primitive_type(t) {
                    custom_type_count += 1;
                }
            }

            if custom_type_count > 1 {
                errors.push(crate::drop::Error {
                    code: "MULTIPLE_INHERITANCE_PROHIBITED".to_string(),
                    message: format!(
                        "Schema attempts to extend multiple custom object pointers in its type array {:?}. Use 'oneOf' for polymorphism and tagged unions.",
                        types
                    ),
                    details: crate::drop::ErrorDetails {
                        path: Some(path.clone()),
                        schema: Some(root_id.to_string()),
                        ..Default::default()
                    }
                });
            }

            for t in types {
                if !crate::database::object::is_primitive_type(t) {
                    if let Some(parent) = db.schemas.get(t) {
                        parent.as_ref().compile(db, t, t.clone(), errors);
                    }
                }
            }
        }

        // 2. Add local properties
        if let Some(local_props) = &self.obj.properties {
            for (k, v) in local_props {
                props.insert(k.clone(), v.clone());
            }
        }

        // 3. Add cases' conditionally-defined properties recursively
        if let Some(cases) = &self.obj.cases {
            for (i, c) in cases.iter().enumerate() {
                if let Some(child) = &c.when {
                    child.compile(db, root_id, format!("{}/cases/{}/when", path, i), errors);
                }
                if let Some(child) = &c.then {
                    child.compile(db, root_id, format!("{}/cases/{}/then", path, i), errors);
                    if let Some(t_props) = child.obj.compiled_properties.get() {
                        props.extend(t_props.clone());
                    }
                }
                if let Some(child) = &c.else_ {
                    child.compile(db, root_id, format!("{}/cases/{}/else", path, i), errors);
                    if let Some(e_props) = child.obj.compiled_properties.get() {
                        props.extend(e_props.clone());
                    }
                }
            }
        }

        // 4. Set the OnceLock!
        let _ = self.obj.compiled_properties.set(props.clone());
        let mut names: Vec<String> = props.keys().cloned().collect();
        names.sort();
        let _ = self.obj.compiled_property_names.set(names);

        // 5. Compute edges natively
        let schema_edges = self.compile_edges(db, root_id, &path, &props, errors);
        let _ = self.obj.compiled_edges.set(schema_edges);

        // 6. Build our inline children properties recursively NOW! (depth-first search)
        if let Some(local_props) = &self.obj.properties {
            for (k, child) in local_props {
                child.compile(db, root_id, format!("{}/{}", path, k), errors);
            }
        }
        if let Some(items) = &self.obj.items {
            items.compile(db, root_id, format!("{}/items", path), errors);
        }
        if let Some(pattern_props) = &self.obj.pattern_properties {
            for (k, child) in pattern_props {
                child.compile(db, root_id, format!("{}/{}", path, k), errors);
            }
        }
        if let Some(additional_props) = &self.obj.additional_properties {
            additional_props.compile(
                db,
                root_id,
                format!("{}/additionalProperties", path),
                errors,
            );
        }
        if let Some(one_of) = &self.obj.one_of {
            for (i, child) in one_of.iter().enumerate() {
                child.compile(db, root_id, format!("{}/oneOf/{}", path, i), errors);
            }
        }
        if let Some(arr) = &self.obj.prefix_items {
            for (i, child) in arr.iter().enumerate() {
                child.compile(db, root_id, format!("{}/prefixItems/{}", path, i), errors);
            }
        }
        if let Some(child) = &self.obj.not {
            child.compile(db, root_id, format!("{}/not", path), errors);
        }
        if let Some(child) = &self.obj.contains {
            child.compile(db, root_id, format!("{}/contains", path), errors);
        }

        self.compile_polymorphism(db, root_id, &path, errors);
    }
}
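The early-return guard and step 4 of `compile` rely on `std::sync::OnceLock` semantics: the first `set` wins and any later `set` is rejected (hence the discarded `let _ =` results). A minimal illustration with a hypothetical cache struct:

```rust
use std::sync::OnceLock;

// Hypothetical stand-in for the compiled_* fields on SchemaObject
struct Compiled {
    property_names: OnceLock<Vec<String>>,
}

fn main() {
    let c = Compiled { property_names: OnceLock::new() };
    assert!(c.property_names.get().is_none()); // not compiled yet

    // First compile populates the cell...
    let _ = c.property_names.set(vec!["age".to_string(), "first_name".to_string()]);
    // ...and a second attempt is rejected, returning Err with the rejected value.
    assert!(c.property_names.set(vec!["other".to_string()]).is_err());

    // Readers always observe the first (and only) value, lock-free after init.
    assert_eq!(c.property_names.get().unwrap().len(), 2);
}
```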
153
src/database/compile/polymorphism.rs
Normal file
153
src/database/compile/polymorphism.rs
Normal file
@ -0,0 +1,153 @@
use crate::database::schema::Schema;

impl Schema {
    pub fn compile_polymorphism(
        &self,
        db: &crate::database::Database,
        root_id: &str,
        path: &str,
        errors: &mut Vec<crate::drop::Error>,
    ) {
        let mut options = std::collections::BTreeMap::new();
        let mut strategy = String::new();

        if let Some(family) = &self.obj.family {
            let family_base = family.split('.').next_back().unwrap_or(family).to_string();
            let family_prefix = family
                .strip_suffix(&family_base)
                .unwrap_or("")
                .trim_end_matches('.');

            if let Some(type_def) = db.types.get(&family_base) {
                if type_def.variations.len() > 1 && type_def.variations.iter().any(|v| v != &family_base) {
                    // Scenario A / B: Table Variations
                    strategy = "type".to_string();
                    for var in &type_def.variations {
                        let target_id = if family_prefix.is_empty() {
                            var.to_string()
                        } else {
                            format!("{}.{}", family_prefix, var)
                        };

                        if db.schemas.contains_key(&target_id) {
                            options.insert(var.to_string(), (None, Some(target_id)));
                        }
                    }
                } else {
                    // Scenario C: Single Table Inheritance (Horizontal)
                    strategy = "kind".to_string();

                    let suffix = format!(".{}", family_base);

                    for (id, schema) in &type_def.schemas {
                        if id.ends_with(&suffix) || id == &family_base {
                            if let Some(kind_val) = schema.obj.get_discriminator_value("kind", id) {
                                options.insert(kind_val, (None, Some(id.to_string())));
                            }
                        }
                    }
                }
            }
        } else if let Some(one_of) = &self.obj.one_of {
            let mut type_vals = std::collections::HashSet::new();
            let mut kind_vals = std::collections::HashSet::new();
            let mut disjoint_base = true;
            let mut structural_types = std::collections::HashSet::new();

            for c in one_of {
                let mut child_id = String::new();
                let mut child_is_primitive = false;

                if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &c.obj.type_ {
                    if crate::database::object::is_primitive_type(t) {
                        child_is_primitive = true;
                        structural_types.insert(t.clone());
                    } else {
                        child_id = t.clone();
                        structural_types.insert("object".to_string());
                    }
                } else {
                    disjoint_base = false;
                }

                if !child_is_primitive {
                    if let Some(t_val) = c.obj.get_discriminator_value("type", &child_id) {
                        type_vals.insert(t_val);
                    }
                    if let Some(k_val) = c.obj.get_discriminator_value("kind", &child_id) {
                        kind_vals.insert(k_val);
                    }
                }
            }

            if disjoint_base && structural_types.len() == one_of.len() {
                strategy = "".to_string();
                for (i, c) in one_of.iter().enumerate() {
                    if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &c.obj.type_ {
                        if crate::database::object::is_primitive_type(t) {
                            options.insert(t.clone(), (Some(i), None));
                        } else {
                            options.insert("object".to_string(), (Some(i), None));
                        }
                    }
                }
            } else {
                strategy = if type_vals.len() > 1 && type_vals.len() == one_of.len() {
                    "type".to_string()
                } else if kind_vals.len() > 1 && kind_vals.len() == one_of.len() {
                    "kind".to_string()
                } else {
                    "".to_string()
                };

                if strategy.is_empty() {
                    errors.push(crate::drop::Error {
                        code: "AMBIGUOUS_POLYMORPHISM".to_string(),
                        message: format!("oneOf boundaries must map mathematically unique 'type' or 'kind' discriminators, or strictly contain disjoint primitive types."),
                        details: crate::drop::ErrorDetails {
                            path: Some(path.to_string()),
                            schema: Some(root_id.to_string()),
                            ..Default::default()
                        }
                    });
                    return;
                }

                for (i, c) in one_of.iter().enumerate() {
                    let mut child_id = String::new();
                    if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &c.obj.type_ {
                        if !crate::database::object::is_primitive_type(t) {
                            child_id = t.clone();
                        }
                    }

                    if let Some(val) = c.obj.get_discriminator_value(&strategy, &child_id) {
                        if options.contains_key(&val) {
                            errors.push(crate::drop::Error {
                                code: "POLYMORPHIC_COLLISION".to_string(),
                                message: format!("Polymorphic boundary defines multiple candidates mapped to the identical discriminator value '{}'.", val),
                                details: crate::drop::ErrorDetails {
                                    path: Some(path.to_string()),
                                    schema: Some(root_id.to_string()),
                                    ..Default::default()
                                }
                            });
                            continue;
                        }

                        options.insert(val, (Some(i), None));
                    }
                }
            }
        } else {
            return;
        }

        if !options.is_empty() {
            if !strategy.is_empty() {
                let _ = self.obj.compiled_discriminator.set(strategy);
            }
            let _ = self.obj.compiled_options.set(options);
        }
    }
}
@ -1,3 +1,4 @@
pub mod compile;
pub mod edge;
pub mod r#enum;
pub mod executors;
@ -28,12 +29,15 @@ use std::collections::HashMap;
use std::sync::Arc;
use r#type::Type;

#[derive(serde::Serialize)]
pub struct Database {
    pub enums: HashMap<String, Enum>,
    pub types: HashMap<String, Type>,
    pub puncs: HashMap<String, Punc>,
    pub relations: HashMap<String, Relation>,
    #[serde(skip)]
    pub schemas: HashMap<String, Arc<Schema>>,
    #[serde(skip)]
    pub executor: Box<dyn DatabaseExecutor + Send + Sync>,
}

@ -209,6 +213,7 @@ impl Database {
    }

    pub fn compile(&mut self, errors: &mut Vec<crate::drop::Error>) {
        // Collect existing schemas patched in the database
        let mut harvested = Vec::new();
        for (id, schema_arc) in &self.schemas {
            crate::database::schema::Schema::collect_schemas(
@ -233,53 +238,129 @@ impl Database {
                .as_ref()
                .compile(self, root_id, id.clone(), errors);
        }

        // Phase 2: Synthesize Composed Filter References
        let mut filter_schemas = Vec::new();
        for (type_name, type_def) in &self.types {
            for (id, schema_arc) in &type_def.schemas {
                // Only run synthesis on actual structured, table-backed boundaries. Exclude subschemas!
                let base_name = id.split('.').last().unwrap_or(id);
                let is_table_backed = base_name == type_def.name;
                if is_table_backed && !id.contains('/') {
                    if let Some(filter_schema) = schema_arc.compile_filter(self, id, errors) {
                        filter_schemas.push((
                            type_name.clone(),
                            format!("{}.filter", id),
                            Arc::new(filter_schema),
                        ));
                    }
                }
            }
        }

        let mut filter_ids = Vec::new();
        for (type_name, id, filter_arc) in filter_schemas {
            filter_ids.push(id.clone());
            self.schemas.insert(id.clone(), filter_arc.clone());
            if let Some(t) = self.types.get_mut(&type_name) {
                t.schemas.insert(id, filter_arc);
            }
        }

        // Now actively compile the newly injected filters to lock all nested compose references natively
        for id in filter_ids {
            if let Some(filter_arc) = self.schemas.get(&id).cloned() {
                let root_id = id.split('/').next().unwrap_or(&id);
                filter_arc
                    .as_ref()
                    .compile(self, root_id, id.clone(), errors);
            }
        }
    }

    fn collect_schemas(&mut self, errors: &mut Vec<crate::drop::Error>) {
        let mut to_insert = Vec::new();
        let mut type_insert = Vec::new();
        let mut punc_insert = Vec::new();
        let mut enum_insert = Vec::new();
        let mut global_insert = Vec::new();

        // Pass 1: Extract all Schemas structurally off top level definitions into the master registry.
        // Validate every node recursively via string filters natively!
        for type_def in self.types.values() {
        for (type_name, type_def) in &self.types {
            for (id, schema_arc) in &type_def.schemas {
                to_insert.push((id.clone(), Arc::clone(schema_arc)));
                global_insert.push((id.clone(), Arc::clone(schema_arc)));
                let mut local_insert = Vec::new();
                crate::database::schema::Schema::collect_schemas(
                    schema_arc,
                    id,
                    id.clone(),
                    &mut to_insert,
                    errors,
                );
            }
        }
        for punc_def in self.puncs.values() {
            for (id, schema_arc) in &punc_def.schemas {
                to_insert.push((id.clone(), Arc::clone(schema_arc)));
                crate::database::schema::Schema::collect_schemas(
                    schema_arc,
                    id,
                    id.clone(),
                    &mut to_insert,
                    errors,
                );
            }
        }
        for enum_def in self.enums.values() {
            for (id, schema_arc) in &enum_def.schemas {
                to_insert.push((id.clone(), Arc::clone(schema_arc)));
                crate::database::schema::Schema::collect_schemas(
                    schema_arc,
                    id,
                    id.clone(),
                    &mut to_insert,
                    &mut local_insert,
                    errors,
                );
                for entry in &local_insert {
                    type_insert.push((type_name.clone(), entry.0.clone(), Arc::clone(&entry.1)));
                    global_insert.push((entry.0.clone(), Arc::clone(&entry.1)));
                }
            }
        }

        for (id, schema_arc) in to_insert {
        for (punc_name, punc_def) in &self.puncs {
            for (id, schema_arc) in &punc_def.schemas {
                global_insert.push((id.clone(), Arc::clone(schema_arc)));
                let mut local_insert = Vec::new();
                crate::database::schema::Schema::collect_schemas(
                    schema_arc,
                    id,
                    id.clone(),
                    &mut local_insert,
                    errors,
                );
                for entry in &local_insert {
                    punc_insert.push((punc_name.clone(), entry.0.clone(), Arc::clone(&entry.1)));
                    global_insert.push((entry.0.clone(), Arc::clone(&entry.1)));
                }
            }
        }

        for (enum_name, enum_def) in &self.enums {
            for (id, schema_arc) in &enum_def.schemas {
                global_insert.push((id.clone(), Arc::clone(schema_arc)));
                let mut local_insert = Vec::new();
                crate::database::schema::Schema::collect_schemas(
                    schema_arc,
                    id,
                    id.clone(),
                    &mut local_insert,
                    errors,
                );
                for entry in &local_insert {
                    enum_insert.push((enum_name.clone(), entry.0.clone(), Arc::clone(&entry.1)));
                    global_insert.push((entry.0.clone(), Arc::clone(&entry.1)));
                }
            }
        }

        // Apply global inserts
        for (id, schema_arc) in global_insert {
            self.schemas.insert(id, schema_arc);
        }

        // Apply local scopes
        for (origin_name, id, schema_arc) in type_insert {
            if let Some(t) = self.types.get_mut(&origin_name) {
                t.schemas.insert(id, schema_arc);
            }
        }
        for (origin_name, id, schema_arc) in punc_insert {
            if let Some(p) = self.puncs.get_mut(&origin_name) {
                p.schemas.insert(id, schema_arc);
            }
        }
        for (origin_name, id, schema_arc) in enum_insert {
            if let Some(e) = self.enums.get_mut(&origin_name) {
                e.schemas.insert(id, schema_arc);
            }
        }
    }

    /// Inspects the Postgres pg_constraint relations catalog to securely identify

@ -37,7 +37,7 @@ pub struct SchemaObject {
    #[serde(rename = "additionalProperties")]
    #[serde(skip_serializing_if = "Option::is_none")]
    pub additional_properties: Option<Arc<Schema>>,
    #[serde(rename = "$family")]
    #[serde(rename = "family")]
    #[serde(skip_serializing_if = "Option::is_none")]
    pub family: Option<String>,

@ -154,12 +154,15 @@ pub struct SchemaObject {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub extensible: Option<bool>,

    // Contains ALL structural fields perfectly flattened from the ENTIRE Database inheritance tree (e.g. `entity` fields like `id`) as well as local fields hidden inside conditional `cases` blocks.
    // This JSON exported array gives clients absolute deterministic visibility to O(1) validation and masking bounds without duplicating structural memory.
    #[serde(rename = "compiledPropertyNames")]
    #[serde(skip_deserializing)]
    #[serde(skip_serializing_if = "crate::database::object::is_once_lock_vec_empty")]
    #[serde(serialize_with = "crate::database::object::serialize_once_lock")]
    pub compiled_property_names: OnceLock<Vec<String>>,

    // Internal structural representation caching active AST Node maps. Unlike the Go framework counterpart, the JSPG implementation DOES natively include ALL ancestral inheritance boundary schemas because it compiles locally against the raw database graph.
    #[serde(skip)]
    pub compiled_properties: OnceLock<BTreeMap<String, Arc<Schema>>>,

@ -307,7 +310,7 @@ impl SchemaObject {
            return true;
        }

        // 2. Implicit table-backed rule: Does its $family boundary map directly to the global database catalog?
        // 2. Implicit table-backed rule: Does its family boundary map directly to the global database catalog?
        if let Some(family) = &self.family {
            let base = family.split('.').next_back().unwrap_or(family);
            if db.types.contains_key(base) {

@ -1,7 +1,7 @@
use crate::database::object::*;
use serde::{Deserialize, Serialize};
use serde_json::Value;
use std::sync::Arc;

#[derive(Debug, Clone, Serialize, Default)]
pub struct Schema {
    #[serde(flatten)]
@ -22,609 +22,6 @@ impl std::ops::DerefMut for Schema {
    }
}

impl Schema {
    pub fn compile(
        &self,
        db: &crate::database::Database,
        root_id: &str,
        path: String,
        errors: &mut Vec<crate::drop::Error>,
    ) {
        if self.obj.compiled_properties.get().is_some() {
            return;
        }

        if let Some(format_str) = &self.obj.format {
            if let Some(fmt) = crate::database::formats::FORMATS.get(format_str.as_str()) {
                let _ = self
                    .obj
                    .compiled_format
                    .set(crate::database::object::CompiledFormat::Func(fmt.func));
            }
        }

        if let Some(pattern_str) = &self.obj.pattern {
            if let Ok(re) = regex::Regex::new(pattern_str) {
                let _ = self
                    .obj
                    .compiled_pattern
                    .set(crate::database::object::CompiledRegex(re));
            }
        }

        if let Some(pattern_props) = &self.obj.pattern_properties {
            let mut compiled = Vec::new();
            for (k, v) in pattern_props {
                if let Ok(re) = regex::Regex::new(k) {
                    compiled.push((crate::database::object::CompiledRegex(re), v.clone()));
                }
            }
            if !compiled.is_empty() {
                let _ = self.obj.compiled_pattern_properties.set(compiled);
            }
        }

        let mut props = std::collections::BTreeMap::new();

        // 1. Resolve INHERITANCE dependencies first
        if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &self.obj.type_ {
            if !crate::database::object::is_primitive_type(t) {
                if let Some(parent) = db.schemas.get(t) {
                    parent.as_ref().compile(db, t, t.clone(), errors);
                    if let Some(p_props) = parent.obj.compiled_properties.get() {
                        props.extend(p_props.clone());
                    }
                }
            }
        }

        if let Some(crate::database::object::SchemaTypeOrArray::Multiple(types)) = &self.obj.type_ {
            let mut custom_type_count = 0;
            for t in types {
                if !crate::database::object::is_primitive_type(t) {
                    custom_type_count += 1;
                }
            }

            if custom_type_count > 1 {
                errors.push(crate::drop::Error {
                    code: "MULTIPLE_INHERITANCE_PROHIBITED".to_string(),
                    message: format!(
                        "Schema attempts to extend multiple custom object pointers in its type array {:?}. Use 'oneOf' for polymorphism and tagged unions.",
                        types
                    ),
                    details: crate::drop::ErrorDetails {
                        path: Some(path.clone()),
                        schema: Some(root_id.to_string()),
                        ..Default::default()
                    }
                });
            }

            for t in types {
                if !crate::database::object::is_primitive_type(t) {
                    if let Some(parent) = db.schemas.get(t) {
                        parent.as_ref().compile(db, t, t.clone(), errors);
                    }
                }
            }
        }

        // 2. Add local properties
        if let Some(local_props) = &self.obj.properties {
            for (k, v) in local_props {
                props.insert(k.clone(), v.clone());
            }
        }

        // 3. Add cases conditionally-defined properties recursively
        if let Some(cases) = &self.obj.cases {
            for (i, c) in cases.iter().enumerate() {
                if let Some(child) = &c.when {
                    child.compile(db, root_id, format!("{}/cases/{}/when", path, i), errors);
                }
                if let Some(child) = &c.then {
                    child.compile(db, root_id, format!("{}/cases/{}/then", path, i), errors);
                    if let Some(t_props) = child.obj.compiled_properties.get() {
                        props.extend(t_props.clone());
                    }
                }
                if let Some(child) = &c.else_ {
                    child.compile(db, root_id, format!("{}/cases/{}/else", path, i), errors);
                    if let Some(e_props) = child.obj.compiled_properties.get() {
                        props.extend(e_props.clone());
                    }
                }
            }
        }

        // 4. Set the OnceLock!
        let _ = self.obj.compiled_properties.set(props.clone());
        let mut names: Vec<String> = props.keys().cloned().collect();
        names.sort();
        let _ = self.obj.compiled_property_names.set(names);

        // 5. Compute Edges natively
        let schema_edges = self.compile_edges(db, root_id, &path, &props, errors);
        let _ = self.obj.compiled_edges.set(schema_edges);

        // 6. Build our inline children properties recursively NOW! (Depth-first search)
        if let Some(local_props) = &self.obj.properties {
            for (k, child) in local_props {
                child.compile(db, root_id, format!("{}/{}", path, k), errors);
            }
        }
        if let Some(items) = &self.obj.items {
            items.compile(db, root_id, format!("{}/items", path), errors);
        }
        if let Some(pattern_props) = &self.obj.pattern_properties {
            for (k, child) in pattern_props {
                child.compile(db, root_id, format!("{}/{}", path, k), errors);
            }
        }
        if let Some(additional_props) = &self.obj.additional_properties {
            additional_props.compile(
                db,
                root_id,
                format!("{}/additionalProperties", path),
                errors,
            );
        }
        if let Some(one_of) = &self.obj.one_of {
            for (i, child) in one_of.iter().enumerate() {
                child.compile(db, root_id, format!("{}/oneOf/{}", path, i), errors);
            }
        }
        if let Some(arr) = &self.obj.prefix_items {
            for (i, child) in arr.iter().enumerate() {
                child.compile(db, root_id, format!("{}/prefixItems/{}", path, i), errors);
            }
        }
        if let Some(child) = &self.obj.not {
            child.compile(db, root_id, format!("{}/not", path), errors);
        }
        if let Some(child) = &self.obj.contains {
            child.compile(db, root_id, format!("{}/contains", path), errors);
        }

        self.compile_polymorphism(db, root_id, &path, errors);
    }

    /// Dynamically infers and compiles all structural database relationships between this Schema
    /// and its nested children. This function recursively traverses the JSON Schema abstract syntax
    /// tree, identifies physical PostgreSQL table boundaries, and locks the resulting relation
    /// constraint paths directly onto the `compiled_edges` map in O(1) memory.
    pub fn compile_edges(
        &self,
        db: &crate::database::Database,
        root_id: &str,
        path: &str,
        props: &std::collections::BTreeMap<String, std::sync::Arc<Schema>>,
        errors: &mut Vec<crate::drop::Error>,
    ) -> std::collections::BTreeMap<String, crate::database::edge::Edge> {
        let mut schema_edges = std::collections::BTreeMap::new();

        // Determine the physical Database Table Name this schema structurally represents
        // Plucks the polymorphic discriminator via dot-notation (e.g. extracting "person" from "full.person")
        let mut parent_type_name = None;

        if let Some(family) = &self.obj.family {
            // 1. Explicit horizontal routing
            parent_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
        } else if path == root_id {
            // 2. Root nodes trust their exact registry footprint
            let base_type_name = path.split('.').next_back().unwrap_or(path).to_string();
            if db.types.contains_key(&base_type_name) {
                parent_type_name = Some(base_type_name);
            }
        } else if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &self.obj.type_ {
            // 3. Nested graphs trust their explicit struct pointer reference
            if !crate::database::object::is_primitive_type(t) {
                parent_type_name = Some(t.split('.').next_back().unwrap_or(t).to_string());
            }
        }

        if let Some(p_type) = parent_type_name {
            // Proceed only if the resolved table physically exists within the Postgres Type hierarchy
            if let Some(type_def) = db.types.get(&p_type) {
                // Iterate over all discovered schema boundaries mapped inside the object
                for (prop_name, prop_schema) in props {
                    let mut child_type_name = None;
                    let mut target_schema = prop_schema.clone();
                    let mut is_array = false;

                    // Structurally unpack the inner target entity if the object maps to an array list
                    if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) =
                        &prop_schema.obj.type_
                    {
                        if t == "array" {
                            is_array = true;
                            if let Some(items) = &prop_schema.obj.items {
                                target_schema = items.clone();
                            }
                        }
                    }

                    // Determine the physical Postgres table backing the nested child schema recursively
                    if let Some(family) = &target_schema.obj.family {
                        child_type_name = Some(family.split('.').next_back().unwrap_or(family).to_string());
                    } else if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) =
                        &target_schema.obj.type_
                    {
                        if !crate::database::object::is_primitive_type(t) {
                            child_type_name = Some(t.split('.').next_back().unwrap_or(t).to_string());
                        }
                    } else if let Some(arr) = &target_schema.obj.one_of {
                        if let Some(first) = arr.first() {
                            if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &first.obj.type_
                            {
                                if !crate::database::object::is_primitive_type(t) {
                                    child_type_name = Some(t.split('.').next_back().unwrap_or(t).to_string());
                                }
                            }
                        }
                    }

                    if let Some(c_type) = child_type_name {
                        // Skip edge compilation for JSONB columns — they store data inline, not relationally.
                        // The physical column type from field_types is the single source of truth.
                        if let Some(ft) = type_def
                            .field_types
                            .as_ref()
                            .and_then(|v| v.get(prop_name.as_str()))
                            .and_then(|v| v.as_str())
                        {
                            if ft == "jsonb" {
                                continue;
                            }
                        }
                        if db.types.contains_key(&c_type) {
                            // Ensure the child Schema's AST has accurately compiled its own physical property keys so we can
                            // inject them securely for Many-to-Many Twin Deduction disambiguation matching.
                            target_schema.compile(db, root_id, format!("{}/{}", path, prop_name), errors);

                            if let Some(compiled_target_props) = target_schema.obj.compiled_properties.get() {
                                let keys_for_ambiguity: Vec<String> =
                                    compiled_target_props.keys().cloned().collect();

                                // Interrogate the Database catalog graph to discover the exact Foreign Key Constraint connecting the components
                                if let Some((relation, is_forward)) = db.resolve_relation(
                                    &p_type,
                                    &c_type,
                                    prop_name,
                                    Some(&keys_for_ambiguity),
                                    is_array,
                                    Some(root_id),
                                    &format!("{}/{}", path, prop_name),
                                    errors,
                                ) {
                                    schema_edges.insert(
                                        prop_name.clone(),
                                        crate::database::edge::Edge {
                                            constraint: relation.constraint.clone(),
                                            forward: is_forward,
                                        },
                                    );
                                }
                            }
                        }
                    }
                }
            }
        }
        schema_edges
    }

    pub fn compile_polymorphism(
        &self,
        db: &crate::database::Database,
        root_id: &str,
        path: &str,
        errors: &mut Vec<crate::drop::Error>,
    ) {
        let mut options = std::collections::BTreeMap::new();
        let mut strategy = String::new();

        if let Some(family) = &self.obj.family {
            let family_base = family.split('.').next_back().unwrap_or(family).to_string();
            let family_prefix = family
                .strip_suffix(&family_base)
                .unwrap_or("")
                .trim_end_matches('.');

            if let Some(type_def) = db.types.get(&family_base) {
                if type_def.variations.len() > 1 && type_def.variations.iter().any(|v| v != &family_base) {
                    // Scenario A / B: Table Variations
                    strategy = "type".to_string();
                    for var in &type_def.variations {
                        let target_id = if family_prefix.is_empty() {
                            var.to_string()
                        } else {
                            format!("{}.{}", family_prefix, var)
                        };

                        if db.schemas.contains_key(&target_id) {
                            options.insert(var.to_string(), (None, Some(target_id)));
                        }
                    }
                } else {
                    // Scenario C: Single Table Inheritance (Horizontal)
                    strategy = "kind".to_string();

                    let suffix = format!(".{}", family_base);

                    for (id, schema) in &type_def.schemas {
                        if id.ends_with(&suffix) || id == &family_base {
                            if let Some(kind_val) = schema.obj.get_discriminator_value("kind", id) {
                                options.insert(kind_val, (None, Some(id.to_string())));
                            }
                        }
                    }
                }
            }
        } else if let Some(one_of) = &self.obj.one_of {
            let mut type_vals = std::collections::HashSet::new();
            let mut kind_vals = std::collections::HashSet::new();
            let mut disjoint_base = true;
            let mut structural_types = std::collections::HashSet::new();

            for c in one_of {
                let mut child_id = String::new();
                let mut child_is_primitive = false;

                if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &c.obj.type_ {
                    if crate::database::object::is_primitive_type(t) {
                        child_is_primitive = true;
                        structural_types.insert(t.clone());
                    } else {
                        child_id = t.clone();
                        structural_types.insert("object".to_string());
                    }
                } else {
                    disjoint_base = false;
                }

                if !child_is_primitive {
                    if let Some(t_val) = c.obj.get_discriminator_value("type", &child_id) {
                        type_vals.insert(t_val);
                    }
                    if let Some(k_val) = c.obj.get_discriminator_value("kind", &child_id) {
                        kind_vals.insert(k_val);
                    }
                }
            }

            if disjoint_base && structural_types.len() == one_of.len() {
                strategy = "".to_string();
                for (i, c) in one_of.iter().enumerate() {
                    if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &c.obj.type_ {
                        if crate::database::object::is_primitive_type(t) {
                            options.insert(t.clone(), (Some(i), None));
                        } else {
                            options.insert("object".to_string(), (Some(i), None));
                        }
                    }
                }
            } else {
                strategy = if type_vals.len() > 1 && type_vals.len() == one_of.len() {
                    "type".to_string()
                } else if kind_vals.len() > 1 && kind_vals.len() == one_of.len() {
                    "kind".to_string()
                } else {
                    "".to_string()
                };

                if strategy.is_empty() {
                    errors.push(crate::drop::Error {
                        code: "AMBIGUOUS_POLYMORPHISM".to_string(),
                        message: format!("oneOf boundaries must map mathematically unique 'type' or 'kind' discriminators, or strictly contain disjoint primitive types."),
                        details: crate::drop::ErrorDetails {
                            path: Some(path.to_string()),
                            schema: Some(root_id.to_string()),
                            ..Default::default()
                        }
                    });
                    return;
                }

                for (i, c) in one_of.iter().enumerate() {
                    let mut child_id = String::new();
                    if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &c.obj.type_ {
                        if !crate::database::object::is_primitive_type(t) {
                            child_id = t.clone();
                        }
                    }

                    if let Some(val) = c.obj.get_discriminator_value(&strategy, &child_id) {
                        if options.contains_key(&val) {
                            errors.push(crate::drop::Error {
                                code: "POLYMORPHIC_COLLISION".to_string(),
                                message: format!("Polymorphic boundary defines multiple candidates mapped to the identical discriminator value '{}'.", val),
                                details: crate::drop::ErrorDetails {
                                    path: Some(path.to_string()),
                                    schema: Some(root_id.to_string()),
                                    ..Default::default()
                                }
                            });
                            continue;
                        }

                        options.insert(val, (Some(i), None));
                    }
                }
            }
        } else {
            return;
        }

        if !options.is_empty() {
            if !strategy.is_empty() {
                let _ = self.obj.compiled_discriminator.set(strategy);
            }
            let _ = self.obj.compiled_options.set(options);
        }
    }

#[allow(unused_variables)]
|
||||
fn validate_identifier(
|
||||
id: &str,
|
||||
field_name: &str,
|
||||
root_id: &str,
|
||||
path: &str,
|
||||
errors: &mut Vec<crate::drop::Error>,
|
||||
) {
|
||||
#[cfg(not(test))]
|
||||
for c in id.chars() {
|
||||
if !c.is_ascii_lowercase() && !c.is_ascii_digit() && c != '_' && c != '.' {
|
||||
errors.push(crate::drop::Error {
|
||||
code: "INVALID_IDENTIFIER".to_string(),
|
||||
message: format!(
|
||||
"Invalid character '{}' in JSON Schema '{}' property: '{}'. Identifiers must exclusively contain [a-z0-9_.]",
|
||||
c, field_name, id
|
||||
),
|
||||
details: crate::drop::ErrorDetails {
|
||||
path: Some(path.to_string()),
|
||||
schema: Some(root_id.to_string()),
|
||||
..Default::default()
|
||||
},
|
||||
});
|
||||
return;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub fn collect_schemas(
|
||||
schema_arc: &Arc<Schema>,
|
||||
root_id: &str,
|
||||
path: String,
|
||||
to_insert: &mut Vec<(String, Arc<Schema>)>,
|
||||
errors: &mut Vec<crate::drop::Error>,
|
||||
) {
|
||||
if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &schema_arc.obj.type_ {
|
||||
if t == "array" {
|
||||
if let Some(items) = &schema_arc.obj.items {
|
||||
if let Some(crate::database::object::SchemaTypeOrArray::Single(it)) = &items.obj.type_ {
|
||||
if !crate::database::object::is_primitive_type(it) {
|
||||
if items.obj.properties.is_some() || items.obj.cases.is_some() {
|
||||
to_insert.push((path.clone(), Arc::clone(schema_arc)));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
} else if !crate::database::object::is_primitive_type(t) {
|
||||
Self::validate_identifier(t, "type", root_id, &path, errors);
|
||||
|
||||
// Is this an explicit inline ad-hoc composition?
|
||||
if schema_arc.obj.properties.is_some() || schema_arc.obj.cases.is_some() {
|
||||
to_insert.push((path.clone(), Arc::clone(schema_arc)));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if let Some(family) = &schema_arc.obj.family {
|
||||
Self::validate_identifier(family, "$family", root_id, &path, errors);
|
||||
}
|
||||
|
||||
Self::collect_child_schemas(schema_arc, root_id, path, to_insert, errors);
|
||||
}

pub fn collect_child_schemas(
    schema_arc: &Arc<Schema>,
    root_id: &str,
    path: String,
    to_insert: &mut Vec<(String, Arc<Schema>)>,
    errors: &mut Vec<crate::drop::Error>,
) {
    if let Some(props) = &schema_arc.obj.properties {
        for (k, v) in props.iter() {
            let next_path = format!("{}/{}", path, k);
            Self::collect_schemas(v, root_id, next_path, to_insert, errors);
        }
    }

    if let Some(pattern_props) = &schema_arc.obj.pattern_properties {
        for (k, v) in pattern_props.iter() {
            let next_path = format!("{}/{}", path, k);
            Self::collect_schemas(v, root_id, next_path, to_insert, errors);
        }
    }

    let mut map_arr = |arr: &Vec<Arc<Schema>>, sub: &str| {
        for (i, v) in arr.iter().enumerate() {
            Self::collect_schemas(
                v,
                root_id,
                format!("{}/{}/{}", path, sub, i),
                to_insert,
                errors,
            );
        }
    };

    if let Some(arr) = &schema_arc.obj.prefix_items {
        map_arr(arr, "prefixItems");
    }

    if let Some(arr) = &schema_arc.obj.one_of {
        map_arr(arr, "oneOf");
    }

    let mut map_opt = |opt: &Option<Arc<Schema>>, pass_path: bool, sub: &str| {
        if let Some(v) = opt {
            if pass_path {
                // Arrays explicitly push their wrapper natively.
                // 'items' becomes a transparent conduit, bypassing self-promotion and skipping the '/items' suffix.
                Self::collect_child_schemas(v, root_id, path.clone(), to_insert, errors);
            } else {
                Self::collect_child_schemas(v, root_id, format!("{}/{}", path, sub), to_insert, errors);
            }
        }
    };

    map_opt(
        &schema_arc.obj.additional_properties,
        false,
        "additionalProperties",
    );
    map_opt(&schema_arc.obj.items, true, "items");
    map_opt(&schema_arc.obj.not, false, "not");
    map_opt(&schema_arc.obj.contains, false, "contains");
    map_opt(&schema_arc.obj.property_names, false, "propertyNames");

    if let Some(cases) = &schema_arc.obj.cases {
        for (i, c) in cases.iter().enumerate() {
            if let Some(when) = &c.when {
                Self::collect_schemas(
                    when,
                    root_id,
                    format!("{}/cases/{}/when", path, i),
                    to_insert,
                    errors,
                );
            }
            if let Some(then) = &c.then {
                Self::collect_schemas(
                    then,
                    root_id,
                    format!("{}/cases/{}/then", path, i),
                    to_insert,
                    errors,
                );
            }
            if let Some(else_) = &c.else_ {
                Self::collect_schemas(
                    else_,
                    root_id,
                    format!("{}/cases/{}/else", path, i),
                    to_insert,
                    errors,
                );
            }
        }
    }
}
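The traversal above threads a slash-delimited path through every applicator keyword: `properties` and `patternProperties` append the property name, `prefixItems`/`oneOf` append the keyword plus an index, and `items` deliberately reuses the parent path unchanged. A minimal standalone sketch of that path bookkeeping — the helper names here are illustrative, not the crate's API:

```rust
// Hypothetical standalone model of the path bookkeeping used by the
// collector: each nested keyword appends "/<segment>" to the parent path,
// except `items`, which is a transparent conduit and passes it through.

fn child_path(parent: &str, segment: &str) -> String {
    format!("{}/{}", parent, segment)
}

fn indexed_path(parent: &str, keyword: &str, index: usize) -> String {
    format!("{}/{}/{}", parent, keyword, index)
}

fn main() {
    let root = String::from("invoice");
    let prop = child_path(&root, "lines");         // properties / patternProperties
    let variant = indexed_path(&prop, "oneOf", 0); // prefixItems / oneOf branches
    let through_items = prop.clone();              // `items` reuses the parent path

    assert_eq!(prop, "invoice/lines");
    assert_eq!(variant, "invoice/lines/oneOf/0");
    assert_eq!(through_items, "invoice/lines");
    println!("{}", variant);
}
```

The pass-through for `items` is what keeps an array wrapper from generating a spurious `/items` segment for its element schema.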

impl<'de> Deserialize<'de> for Schema {
    fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
    where
@ -109,7 +109,7 @@ pub fn jspg_validate(schema_id: &str, instance: JsonB) -> JsonB {
 }

 #[cfg_attr(not(test), pg_extern)]
-pub fn jspg_schemas() -> JsonB {
+pub fn jspg_database() -> JsonB {
     let engine_opt = {
         let lock = GLOBAL_JSPG.read().unwrap();
         lock.clone()
@ -117,9 +117,9 @@ pub fn jspg_schemas() -> JsonB {

     match engine_opt {
         Some(engine) => {
-            let schemas_json = serde_json::to_value(&engine.database.schemas)
+            let database_json = serde_json::to_value(&engine.database)
                 .unwrap_or(serde_json::Value::Object(serde_json::Map::new()));
-            let drop = crate::drop::Drop::success_with_val(schemas_json);
+            let drop = crate::drop::Drop::success_with_val(database_json);
             JsonB(serde_json::to_value(drop).unwrap())
         }
         None => jspg_failure(),
@ -717,8 +717,8 @@ impl<'a> Compiler<'a> {
             let param_index = i + 1;
             let p_val = format!("${}#>>'{{}}'", param_index);

-            if op == "$in" || op == "$nin" {
-                let sql_op = if op == "$in" { "IN" } else { "NOT IN" };
+            if op == "$of" || op == "$nof" {
+                let sql_op = if op == "$of" { "IN" } else { "NOT IN" };
                 let subquery = format!(
                     "(SELECT value{} FROM jsonb_array_elements_text(({})::jsonb))",
                     cast, p_val
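The hunk above renames the membership operators from `$in`/`$nin` to `$of`/`$nof`; both compile to an `IN` / `NOT IN` subquery over `jsonb_array_elements_text`. A self-contained sketch of just the string construction (the function name, `cast` value, and parameter index are assumptions for illustration):

```rust
// Hypothetical reduction of the compiler's membership-operator branch:
// returns the SQL comparator plus the subquery the parameter expands into.
fn compile_membership(op: &str, param_index: usize, cast: &str) -> Option<(String, String)> {
    let sql_op = match op {
        "$of" => "IN",
        "$nof" => "NOT IN",
        _ => return None, // old `$in` / `$nin` spellings are no longer handled
    };
    // `$1#>>'{}'` extracts the whole jsonb parameter as text.
    let p_val = format!("${}#>>'{{}}'", param_index);
    let subquery = format!(
        "(SELECT value{} FROM jsonb_array_elements_text(({})::jsonb))",
        cast, p_val
    );
    Some((sql_op.to_string(), subquery))
}

fn main() {
    let (sql_op, subquery) = compile_membership("$of", 1, "::int").unwrap();
    assert_eq!(sql_op, "IN");
    assert_eq!(
        subquery,
        "(SELECT value::int FROM jsonb_array_elements_text(($1#>>'{}')::jsonb))"
    );
    assert!(compile_membership("$in", 1, "").is_none());
}
```

Unnesting the parameter with `jsonb_array_elements_text` lets a single bound jsonb array drive the whole `IN` list instead of interpolating one placeholder per element.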
@ -533,6 +533,12 @@ fn test_unique_items_6_1() {
     crate::tests::runner::run_test_case(&path, 6, 1).unwrap();
 }

+#[test]
+fn test_filter_0_0() {
+    let path = format!("{}/fixtures/filter.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 0).unwrap();
+}
+
 #[test]
 fn test_min_items_0_0() {
     let path = format!("{}/fixtures/minItems.json", env!("CARGO_MANIFEST_DIR"));
126 src/tests/mod.rs
@ -81,38 +81,114 @@ fn test_library_api() {
     })
 );

-    // 3. Validate jspg_schemas
-    let schemas_drop = jspg_schemas();
+    // 3. Validate jspg_database mapping natively!
+    let db_drop = jspg_database();
     assert_eq!(
-        schemas_drop.0,
+        db_drop.0,
         json!({
             "type": "drop",
             "response": {
-                "source_schema": {
-                    "type": "object",
-                    "properties": {
-                        "type": { "type": "string" },
-                        "name": { "type": "string" },
-                        "target": {
-                            "type": "target_schema",
-                            "compiledPropertyNames": ["value"]
-                        }
-                    },
-                    "required": ["name"],
-                    "compiledPropertyNames": ["name", "target", "type"],
-                    "compiledEdges": {
-                        "target": {
-                            "constraint": "fk_test_target",
-                            "forward": true
-                        }
+                "enums": {},
+                "puncs": {},
+                "relations": {
+                    "fk_test_target": {
+                        "constraint": "fk_test_target",
+                        "destination_columns": ["id"],
+                        "destination_type": "target_schema",
+                        "prefix": "target",
+                        "source_columns": ["target_id"],
+                        "source_type": "source_schema"
+                    }
                 },
-                "target_schema": {
-                    "type": "object",
-                    "properties": {
-                        "value": { "type": "number" }
+                "types": {
+                    "source_schema": {
+                        "default_fields": [],
+                        "field_types": null,
+                        "fields": [],
+                        "grouped_fields": null,
+                        "hierarchy": ["source_schema", "entity"],
+                        "historical": false,
+                        "id": "",
+                        "longevity": null,
+                        "lookup_fields": [],
+                        "module": "",
+                        "name": "source_schema",
+                        "notify": false,
+                        "null_fields": [],
+                        "ownable": false,
+                        "relationship": false,
+                        "schemas": {
+                            "source_schema": {
+                                "compiledEdges": {
+                                    "target": {
+                                        "constraint": "fk_test_target",
+                                        "forward": true
+                                    }
+                                },
+                                "compiledPropertyNames": ["name", "target", "type"],
+                                "properties": {
+                                    "name": { "type": "string" },
+                                    "target": {
+                                        "compiledPropertyNames": ["value"],
+                                        "type": "target_schema"
+                                    },
+                                    "type": { "type": "string" }
+                                },
+                                "required": ["name"],
+                                "type": "object"
+                            },
+                            "source_schema.filter": {
+                                "compiledPropertyNames": ["name", "target", "type"],
+                                "properties": {
+                                    "name": { "type": ["string.condition", "null"] },
+                                    "target": { "type": ["target_schema.filter", "null"] },
+                                    "type": { "type": ["string.condition", "null"] }
+                                },
+                                "type": "object"
+                            }
+                        },
+                        "sensitive": false,
+                        "source": "",
+                        "type": "",
+                        "variations": ["source_schema"]
                     },
-                    "compiledPropertyNames": ["value"]
+                    "target_schema": {
+                        "default_fields": [],
+                        "field_types": null,
+                        "fields": [],
+                        "grouped_fields": null,
+                        "hierarchy": ["target_schema", "entity"],
+                        "historical": false,
+                        "id": "",
+                        "longevity": null,
+                        "lookup_fields": [],
+                        "module": "",
+                        "name": "target_schema",
+                        "notify": false,
+                        "null_fields": [],
+                        "ownable": false,
+                        "relationship": false,
+                        "schemas": {
+                            "target_schema": {
+                                "compiledPropertyNames": ["value"],
+                                "properties": {
+                                    "value": { "type": "number" }
+                                },
+                                "type": "object"
+                            },
+                            "target_schema.filter": {
+                                "compiledPropertyNames": ["value"],
+                                "properties": {
+                                    "value": { "type": ["number.condition", "null"] }
+                                },
+                                "type": "object"
+                            }
+                        },
+                        "sensitive": false,
+                        "source": "",
+                        "type": "",
+                        "variations": ["target_schema"]
+                    }
+                }
             }
         }
     })
@ -20,5 +20,5 @@ pub struct Expect {
     #[serde(default)]
     pub sql: Option<Vec<SqlExpectation>>,
     #[serde(default)]
-    pub schemas: Option<Vec<String>>,
+    pub schemas: Option<std::collections::HashMap<String, serde_json::Value>>,
 }
@ -3,13 +3,13 @@ use std::sync::Arc;

 impl Expect {
     pub fn assert_schemas(&self, db: &Arc<crate::database::Database>) -> Result<(), String> {
-        if let Some(expected_schemas) = &self.schemas {
+        if let Some(expected_map) = &self.schemas {
             // Collect actual schemas and sort
             let mut actual: Vec<String> = db.schemas.keys().cloned().collect();
             actual.sort();

             // Collect expected schemas and sort
-            let mut expected: Vec<String> = expected_schemas.clone();
+            let mut expected: Vec<String> = expected_map.keys().cloned().collect();
             expected.sort();

             if actual != expected {
@ -21,6 +21,23 @@ impl Expect {
                     actual
                 ));
             }
+
+            for (key, expected_val) in expected_map {
+                if expected_val.is_object() && expected_val.as_object().unwrap().is_empty() {
+                    continue; // A `{}` means we just wanted to test it was collected/promoted, skip deep match
+                }
+                let actual_ast = db.schemas.get(key).unwrap();
+                let actual_val = serde_json::to_value(actual_ast).unwrap();
+
+                if actual_val != *expected_val {
+                    return Err(format!(
+                        "Detailed Schema Match Failure for '{}'!\n\nExpected:\n{}\n\nActual:\n{}",
+                        key,
+                        serde_json::to_string_pretty(expected_val).unwrap(),
+                        serde_json::to_string_pretty(&actual_val).unwrap()
+                    ));
+                }
+            }
         }
         Ok(())
     }
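The fixture change here turns `expect.schemas` from a list of names into a map of name to expected AST, where an empty `{}` value only asserts the schema was collected and anything non-empty is deep-matched. A std-only sketch of that wildcard rule, with plain string maps standing in for the real serde_json values (names are illustrative, not the crate's API):

```rust
// Hypothetical model of the `{}` wildcard in assert_schemas: an empty
// expected object is a presence-only check, a non-empty one must match.
use std::collections::HashMap;

type SchemaDoc = HashMap<String, String>;

fn schema_mismatch(expected: &SchemaDoc, actual: &SchemaDoc) -> bool {
    if expected.is_empty() {
        return false; // `{}` -> the key existing is enough, skip deep match
    }
    expected != actual
}

fn main() {
    let actual: SchemaDoc = HashMap::from([("type".to_string(), "object".to_string())]);
    let wildcard: SchemaDoc = HashMap::new();
    let wrong: SchemaDoc = HashMap::from([("type".to_string(), "string".to_string())]);

    assert!(!schema_mismatch(&wildcard, &actual)); // `{}` passes regardless of content
    assert!(!schema_mismatch(&actual, &actual));   // exact match passes
    assert!(schema_mismatch(&wrong, &actual));     // differing content fails
}
```

Keeping `{}` as a pass-through lets existing fixtures pin only the set of collected schema keys while new fixtures opt in to full structural comparison.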
@ -21,7 +21,7 @@ impl<'a> ValidationContext<'a> {
         if conflicts {
             result.errors.push(ValidationError {
                 code: "INVALID_SCHEMA".to_string(),
-                message: "$family must be used exclusively without other constraints".to_string(),
+                message: "family must be used exclusively without other constraints".to_string(),
                 path: self.path.to_string(),
             });
             return Ok(false);
24 wipe_test.py
@ -1,24 +0,0 @@
-import json
-
-def load_json(path):
-    with open(path, 'r') as f:
-        return json.load(f)
-
-def save_json(path, data):
-    with open(path, 'w') as f:
-        json.dump(data, f, indent=4)
-
-def fix_merger():
-    data = load_json('fixtures/merger.json')
-    last_test = data[0]['tests'][-1]
-    last_test["expect"]["sql"] = []
-    save_json('fixtures/merger.json', data)
-
-def fix_queryer():
-    data = load_json('fixtures/queryer.json')
-    last_test = data[0]['tests'][-1]
-    last_test["expect"]["sql"] = []
-    save_json('fixtures/queryer.json', data)
-
-fix_merger()
-fix_queryer()