Compare commits

...

7 Commits

34 changed files with 457 additions and 592 deletions

.DS_Store vendored (binary file, not shown)

@ -28,7 +28,7 @@ These functions operate on the global `GLOBAL_JSPG` engine instance and provide
* `jspg_setup(database jsonb) -> jsonb`: Initializes the engine. Deserializes the full database schema registry (types, enums, puncs, relations) from Postgres and compiles them into memory atomically.
* `jspg_teardown() -> jsonb`: Clears the current session's engine instance from `GLOBAL_JSPG`, resetting the cache.
* `jspg_database() -> jsonb`: Exports the fully compiled snapshot of the database registry (including Types, Puncs, Enums, and Relations) out of `GLOBAL_JSPG` into standard JSON Schema representations.
---
@ -36,6 +36,17 @@ These functions operate on the global `GLOBAL_JSPG` engine instance and provide
JSPG augments standard JSON Schema 2020-12 to provide an opinionated, strict, and highly ergonomic Object-Oriented paradigm. Developers defining Punc Data Models should follow these conventions.
### Realms (Topological Boundaries)
JSPG strictly organizes schemas into three distinct topological boundaries called **Realms** to prevent cross-contamination and ensure secure API generation:
* **Type Realm (`database.types`)**: Represents physical Postgres tables or structural JSONB bubbles. Table-backed entities here are strictly evaluated for their `type` or `kind` discriminators if they possess polymorphic variations.
* **Punc Realm (`database.puncs`)**: Represents API endpoint Contracts (functions). Contains strictly `.request` and `.response` shapes. These cannot be inherited by standard data models.
* **Enum Realm (`database.enums`)**: Represents simple restricted value lists. Handled universally across all lookups.
The core execution engines natively enforce these boundaries:
* **Validator**: Routes dynamically using a single schema key, transparently switching domains to validate Punc requests/responses from the `Punc` realm, or raw instance payloads from the `Type` realm.
* **Merger**: Strictly bounded to the `Type` Realm. It is philosophically impossible and mathematically illegal to attempt to UPSERT an API endpoint.
* **Queryer**: Routes recursively. Safely evaluates API boundary inputs directly from the `Punc` realm, while tracing underlying table targets back through the `Type` realm to physically compile SQL `SELECT` statements.
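The three Realms can be pictured as a single registry payload handed to `jspg_setup`. A minimal sketch — the `types` entry shape mirrors the fixtures in this PR, while the exact `enums` and `puncs` entry shapes shown here are illustrative assumptions:

```json
{
  "types": [
    {
      "name": "person",
      "schemas": {
        "person": { "properties": { "name": { "type": "string" } } }
      }
    }
  ],
  "enums": [
    { "name": "status", "values": ["active", "archived"] }
  ],
  "puncs": [
    {
      "name": "search_people",
      "schemas": {
        "search_people.request": {},
        "search_people.response": {}
      }
    }
  ]
}
```

Each top-level key maps to one Realm, so a schema key can never be resolved against the wrong boundary.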
### Types of Types
* **Table-Backed (Entity Types)**: Primarily defined in root `types` schemas. These represent physical Postgres tables.
  * They are implicitly registered in the Global Registry using their precise key name mapped from the database compilation phase.
@ -45,12 +56,14 @@ JSPG augments standard JSON Schema 2020-12 to provide an opinionated, strict, an
* **Global Schema Registration**: Roots must be attached to the top-level keys mapped from the `types`, `enums`, or `puncs` database tables.
* They can re-use the standard `type` discriminator locally for `oneOf` polymorphism without conflicting with global Postgres Table constraints.
### Discriminators & The `<Variant>.<Base>` Convention
In Punc, polymorphic targets like explicit tagged unions or STI (Single Table Inheritance) rely on discriminators. The system heavily leverages a standard `<Variant>.<Base>` dot-notation to enforce topological boundaries deterministically.
**The 2-Tier Paradigm**: The system prevents "God Tables" by restricting routing to exactly two dimensions, guaranteeing $O(1)$ lookups without ambiguity:
1. **Base (Vertical Routing)**: Represents the core physical lineage or foundational structural boundary. For entities, this is the table `type` (e.g. `person` or `widget`). For composed schemas, this is the root structural archetype (e.g., `filter`).
2. **Variant (Horizontal Routing)**: Represents the specific contextual projection or runtime mutation applied to the Base. For STI entities, this is the `kind` (e.g., `light`, `heavy`, `stock`). For composed filters, the variant identifies the entity it targets (e.g., `person`, `invoice`).
When an object is evaluated for STI polymorphism, the runtime natively extracts its `$kind` and `$type` values, dynamically concatenating them as `<Variant>.<Base>` (e.g. `light.person` or `stock.widget`) to yield the namespace-protected schema key.
Therefore, any schema that participates in polymorphic discrimination MUST explicitly define its discriminator properties natively inside its `properties` block. However, to stay DRY and maintain flexible APIs, you **DO NOT** need to hardcode `const` values, nor should you add them to your `required` array. The Punc engine treats `type` and `kind` as **magic properties**.
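As an illustrative sketch, a variant schema declares its discriminators as plain properties and lets the engine inject and check the values. The `light.person` key follows the STI example above; the `name` property is a hypothetical field added for context:

```json
{
  "light.person": {
    "properties": {
      "type": { "type": "string" },
      "kind": { "type": "string" },
      "name": { "type": "string" }
    },
    "required": ["name"]
  }
}
```

Note that `type` and `kind` carry no `const` values and are absent from `required`; as magic properties, the engine derives their expected values (`person` and `light`) from the registration key itself.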
@ -80,6 +93,7 @@ Punc completely abandons the standard JSON Schema `$ref` keyword. Instead, it ov
* **Implicit Keyword Shadowing**: Unlike standard JSON Schema inheritance, local property definitions natively override and shadow inherited properties.
* **Primitive Array Shorthand (Optionality)**: The `type` array syntax is heavily optimized for nullable fields. Defining `"type": ["budget", "null"]` natively builds a nullable struct, generating `Budget? budget;` in Dart. You can freely mix primitives like `["string", "number", "null"]`.
* **Strict Array Constraint**: To explicitly prevent mathematically ambiguous Multiple Inheritance, a `type` array is strictly constrained to at most **ONE** Custom Object Pointer. Defining `"type": ["person", "organization"]` will intentionally trigger a fatal database compilation error natively instructing developers to build a proper tagged union (`oneOf`) instead.
* **Dynamic Type Bindings (`"$sibling.[suffix]"`)**: If a `type` string begins with a `$` (e.g., `"type": "$kind.filter"`), the JSPG engine treats it as a Dynamic Pointer. During compile time, it safely defers boundary checks. During runtime validation, the engine dynamically reads the literal string value of the referenced sibling property (`kind`) on the *current parent JSON object*, evaluates the substitution (e.g., `"person.filter"`), and instantly routes execution to that schema in $O(1)$ time. This enables incredibly powerful dynamic JSONB shapes (like a generic `filter` column inside a `search` table) without forcing downstream code generators to build unmaintainable unions.
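The nullable shorthand and the Dynamic Pointer combine naturally. This sketch mirrors the `search` schema from the `fixtures/dynamicType.json` fixture added in this PR, with a hypothetical nullable `name` field added to illustrate the array shorthand:

```json
{
  "search": {
    "properties": {
      "kind": { "type": "string" },
      "name": { "type": ["string", "null"] },
      "filter": { "type": "$kind.filter" }
    }
  }
}
```

Given a payload like `{"kind": "person", "filter": {"age": 30}}`, the engine reads the sibling `kind` value, substitutes it into the pointer to produce `person.filter`, and validates `filter` against that schema directly.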
### Polymorphism (`family` and `oneOf`)
Polymorphism is how an object boundary can dynamically take on entirely different shapes based on the payload provided at runtime. Punc utilizes the static database metadata generated from Postgres (`db.types`) to enforce these boundaries deterministically, rather than relying on ambiguous tree-traversals.
@ -92,7 +106,7 @@ Polymorphism is how an object boundary can dynamically take on entirely differen
* **Scenario B: Prefixed Tables (Vertical Projection)**
  * *Setup*: `{ "family": "light.organization" }`
  * *Execution*: The engine sees the prefix `light.` and base `organization`. It queries `db.types.get("organization").variations` and dynamically prepends the prefix to discover the relevant UI schemas.
  * *Options*: `person` -> `light.person`, `organization` -> `light.organization`. (If a projection like `light.bot` does not exist in the Type Registry, it is safely ignored.)
* **Scenario C: Single Table Inheritance (Horizontal Routing)**
  * *Setup*: `{ "family": "widget" }` (Where `widget` is a table type but has no external variations).
  * *Execution*: The engine queries `db.types.get("widget").variations` and finds only `["widget"]`. Since it lacks table inheritance, it is treated as STI. The engine scans the specific, confined `schemas` array directly under `db.types.get("widget")` for any registered key terminating in the base `.widget` (e.g., `stock.widget`). The `family` automatically uses `kind` as the discriminator.
@ -171,7 +185,7 @@ When compiling nested object graphs or arrays, the JSPG engine must dynamically
### Subschema Promotion
To seamlessly support deeply nested Object and Array structures, JSPG aggressively promotes them to standalone topological entities during the database compilation phase.
* **Path Generation:** While evaluating a unified graph originating from a base `types`, `enums`, or `puncs` key, the compiler tracks its exact path descent into nested objects and arrays. It dynamically calculates a localized alias string by appending a `/` pathing syntax (e.g., `base_schema_key/nested/path`) representing exactly its structural constraints.
* **Promotion:** This nested subschema chunk is mathematically elevated to an independent subschema entry natively within its parent's internal scope (e.g. inside `db.types.get("base").schemas`) using its full path. This guarantees that $O(1)$ WebSockets or isolated queries can natively target any arbitrary nested sub-object of a massive database topology directly without recursively re-parsing its parent's AST block every read. Note that you cannot use the `type` attribute to statically inherit from these automatically promoted subschemas.
* **Primitive Confinement:** Purely scalar or primitive branches (like `oneOf: [{type: "string"}, {type: "null"}]`) bypass global topological promotion. They are evaluated directly within the execution engine via isolated Tuple Indexes to explicitly protect the global DB Registry and Go Mixer from memory bloat.
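For instance, assuming a hypothetical `search` root schema with a nested `results` object, the compiled scope inside `db.types.get("search").schemas` would carry both the root entry and the promoted path entry:

```json
{
  "search": {
    "properties": {
      "results": {
        "properties": {
          "count": { "type": "integer" }
        }
      }
    }
  },
  "search/results": {
    "properties": {
      "count": { "type": "integer" }
    }
  }
}
```

A validator or query can then target `search/results` directly without descending through the `search` root first.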
---

fixtures/dynamicType.json (new file, 155 lines)

@ -0,0 +1,155 @@
[
{
"description": "Dynamic type binding ($sibling.suffix) validation",
"database": {
"types": [
{
"name": "person",
"schemas": {
"person.filter": {
"properties": {
"age": {
"type": "integer"
}
}
}
}
},
{
"name": "widget",
"schemas": {
"widget.filter": {
"properties": {
"weight": {
"type": "integer"
}
}
}
}
},
{
"name": "search",
"schemas": {
"search": {
"properties": {
"kind": {
"type": "string"
},
"filter": {
"type": "$kind.filter"
}
}
}
}
}
]
},
"tests": [
{
"description": "Valid person filter payload",
"data": {
"kind": "person",
"filter": {
"age": 30
}
},
"schema_id": "search",
"action": "validate",
"expect": {
"success": true
}
},
{
"description": "Invalid person filter payload (fails constraint)",
"data": {
"kind": "person",
"filter": {
"age": "thirty"
}
},
"schema_id": "search",
"action": "validate",
"expect": {
"success": false,
"errors": [
{
"code": "INVALID_TYPE",
"details": {
"path": "filter/age"
}
}
]
}
},
{
"description": "Valid widget filter payload",
"data": {
"kind": "widget",
"filter": {
"weight": 500
}
},
"schema_id": "search",
"action": "validate",
"expect": {
"success": true
}
},
{
"description": "Fails resolution if kind doesn't match an existing schema",
"data": {
"kind": "unknown",
"filter": {
"weight": 500
}
},
"schema_id": "search",
"action": "validate",
"expect": {
"success": false,
"errors": [
{
"code": "DYNAMIC_TYPE_RESOLUTION_FAILED",
"details": {
"path": "filter"
}
},
{
"code": "STRICT_PROPERTY_VIOLATION",
"details": {
"path": "filter/weight"
}
}
]
}
},
{
"description": "Fails resolution if discriminator is missing",
"data": {
"filter": {
"weight": 500
}
},
"schema_id": "search",
"action": "validate",
"expect": {
"success": false,
"errors": [
{
"code": "DYNAMIC_TYPE_RESOLUTION_FAILED",
"details": {
"path": "filter"
}
},
{
"code": "STRICT_PROPERTY_VIOLATION",
"details": {
"path": "filter/weight"
}
}
]
}
}
]
}
]


@ -107,17 +107,17 @@
"search": {
  "type": "object",
  "properties": {
    "kind": {
      "type": "string"
    },
    "name": {
      "type": "string"
    },
    "filter": {
      "type": "$kind.filter"
    }
  }
},
"condition": {
  "type": "object",
  "properties": {
@ -172,7 +172,7 @@
"schemas": {
  "person": {},
  "person.filter": {
    "type": "object",
    "compiledPropertyNames": [
      "$and",
      "$or",
@ -244,7 +244,7 @@
},
"address": {},
"address.filter": {
  "type": "object",
  "compiledPropertyNames": [
    "$and",
    "$or",
@ -287,18 +287,18 @@
    }
  }
},
"condition": {},
"string.condition": {},
"integer.condition": {},
"date.condition": {},
"search": {},
"search.filter": {
  "type": "object",
  "compiledPropertyNames": [
    "$and",
    "$or",
    "filter",
    "kind",
    "name"
  ],
  "properties": {
@ -312,6 +312,7 @@
      "$and",
      "$or",
      "filter",
      "kind",
      "name"
    ],
    "type": "search.filter"
@ -327,6 +328,7 @@
      "$and",
      "$or",
      "filter",
      "kind",
      "name"
    ],
    "type": "search.filter"
@ -334,7 +336,13 @@
},
"filter": {
  "type": [
    "$kind.filter.filter",
    "null"
  ]
},
"kind": {
  "type": [
    "string.condition",
    "null"
  ]
},

log.txt (deleted, 23 lines)

@ -1,23 +0,0 @@
Finished `test` profile [unoptimized + debuginfo] target(s) in 0.60s
Running unittests src/lib.rs (target/debug/deps/jspg-d3f18ff3a7e2b386)
running 1 test
test tests::test_library_api ... FAILED
failures:
---- tests::test_library_api stdout ----
thread 'tests::test_library_api' (110325727) panicked at src/tests/mod.rs:86:3:
assertion `left == right` failed
left: Object {"response": Object {"enums": Object {}, "puncs": Object {}, "relations": Object {"fk_test_target": Object {"constraint": String("fk_test_target"), "destination_columns": Array [String("id")], "destination_type": String("target_schema"), "id": String("11111111-1111-1111-1111-111111111111"), "prefix": String("target"), "source_columns": Array [String("target_id")], "source_type": String("source_schema"), "type": String("relation")}}, "types": Object {"source_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("source_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("source_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"source_schema": Object {"compiledEdges": Object {"target": Object {"constraint": String("fk_test_target"), "forward": Bool(true)}}, "compiledPropertyNames": Array [String("name"), String("target"), String("type")], "properties": Object {"name": Object {"type": String("string")}, "target": Object {"compiledPropertyNames": Array [String("value")], "type": String("target_schema")}, "type": Object {"type": String("string")}}, "required": Array [String("name")], "type": String("object")}, "source_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "properties": Object {"$and": Object {"items": Object {"type": String("source_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"type": String("source_schema.filter")}, "type": Array [String("array"), String("null")]}, "name": Object {"type": Array [String("string.condition"), String("null")]}, "target": Object {"type": Array [String("target_schema.filter"), String("null")]}, "type": Object 
{"type": Array [String("string.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("source_schema")]}, "target_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("target_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("target_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"target_schema": Object {"compiledPropertyNames": Array [String("value")], "properties": Object {"value": Object {"type": String("number")}}, "type": String("object")}, "target_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "properties": Object {"$and": Object {"items": Object {"type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "value": Object {"type": Array [String("number.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("target_schema")]}}}, "type": String("drop")}
right: Object {"response": Object {"enums": Object {}, "puncs": Object {}, "relations": Object {"fk_test_target": Object {"constraint": String("fk_test_target"), "destination_columns": Array [String("id")], "destination_type": String("target_schema"), "id": String("11111111-1111-1111-1111-111111111111"), "prefix": String("target"), "source_columns": Array [String("target_id")], "source_type": String("source_schema"), "type": String("relation")}}, "types": Object {"source_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("source_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("source_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"source_schema": Object {"compiledEdges": Object {"target": Object {"constraint": String("fk_test_target"), "forward": Bool(true)}}, "compiledPropertyNames": Array [String("name"), String("target"), String("type")], "properties": Object {"name": Object {"type": String("string")}, "target": Object {"compiledPropertyNames": Array [String("value")], "type": String("target_schema")}, "type": Object {"type": String("string")}}, "required": Array [String("name")], "type": String("object")}, "source_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "properties": Object {"$and": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "type": String("source_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "type": String("source_schema.filter")}, 
"type": Array [String("array"), String("null")]}, "name": Object {"type": Array [String("string.condition"), String("null")]}, "target": Object {"type": Array [String("target_schema.filter"), String("null")]}, "type": Object {"type": Array [String("string.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("source_schema")]}, "target_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("target_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("target_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"target_schema": Object {"compiledPropertyNames": Array [String("value")], "properties": Object {"value": Object {"type": String("number")}}, "type": String("object")}, "target_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "properties": Object {"$and": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "value": Object {"type": Array [String("number.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("target_schema")]}}}, "type": String("drop")}
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
failures:
tests::test_library_api
test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 1357 filtered out; finished in 0.00s
error: test failed, to rerun pass `--lib`


@ -1,23 +0,0 @@
Finished `test` profile [unoptimized + debuginfo] target(s) in 0.35s
Running unittests src/lib.rs (target/debug/deps/jspg-d3f18ff3a7e2b386)
running 1 test
test tests::test_library_api ... FAILED
failures:
---- tests::test_library_api stdout ----
thread 'tests::test_library_api' (110334696) panicked at src/tests/mod.rs:86:3:
assertion `left == right` failed
left: Object {"response": Object {"enums": Object {}, "puncs": Object {}, "relations": Object {"fk_test_target": Object {"constraint": String("fk_test_target"), "destination_columns": Array [String("id")], "destination_type": String("target_schema"), "id": String("11111111-1111-1111-1111-111111111111"), "prefix": String("target"), "source_columns": Array [String("target_id")], "source_type": String("source_schema"), "type": String("relation")}}, "types": Object {"source_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("source_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("source_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"source_schema": Object {"compiledEdges": Object {"target": Object {"constraint": String("fk_test_target"), "forward": Bool(true)}}, "compiledPropertyNames": Array [String("name"), String("target"), String("type")], "properties": Object {"name": Object {"type": String("string")}, "target": Object {"compiledPropertyNames": Array [String("value")], "type": String("target_schema")}, "type": Object {"type": String("string")}}, "required": Array [String("name")], "type": String("object")}, "source_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "properties": Object {"$and": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "type": String("source_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "type": String("source_schema.filter")}, 
"type": Array [String("array"), String("null")]}, "name": Object {"type": Array [String("string.condition"), String("null")]}, "target": Object {"type": Array [String("target_schema.filter"), String("null")]}, "type": Object {"type": Array [String("string.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("source_schema")]}, "target_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("target_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("target_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"target_schema": Object {"compiledPropertyNames": Array [String("value")], "properties": Object {"value": Object {"type": String("number")}}, "type": String("object")}, "target_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "properties": Object {"$and": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "value": Object {"type": Array [String("number.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("target_schema")]}}}, "type": String("drop")}
right: Object {"response": Object {"enums": Object {}, "puncs": Object {}, "relations": Object {"fk_test_target": Object {"constraint": String("fk_test_target"), "destination_columns": Array [String("id")], "destination_type": String("target_schema"), "id": String("11111111-1111-1111-1111-111111111111"), "prefix": String("target"), "source_columns": Array [String("target_id")], "source_type": String("source_schema"), "type": String("relation")}}, "types": Object {"source_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("source_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("source_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"source_schema": Object {"compiledEdges": Object {"target": Object {"constraint": String("fk_test_target"), "forward": Bool(true)}}, "compiledPropertyNames": Array [String("name"), String("target"), String("type")], "properties": Object {"name": Object {"type": String("string")}, "target": Object {"compiledPropertyNames": Array [String("value")], "type": String("target_schema")}, "type": Object {"type": String("string")}}, "required": Array [String("name")], "type": String("object")}, "source_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "properties": Object {"$and": Object {"items": Object {"type": String("source_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"type": String("source_schema.filter")}, "type": Array [String("array"), String("null")]}, "name": Object {"type": Array [String("string.condition"), String("null")]}, "target": Object {"type": Array [String("target_schema.filter"), String("null")]}, "type": Object 
{"type": Array [String("string.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("source_schema")]}, "target_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("target_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("target_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"target_schema": Object {"compiledPropertyNames": Array [String("value")], "properties": Object {"value": Object {"type": String("number")}}, "type": String("object")}, "target_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "properties": Object {"$and": Object {"items": Object {"type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "value": Object {"type": Array [String("number.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("target_schema")]}}}, "type": String("drop")}
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
failures:
tests::test_library_api
test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 1357 filtered out; finished in 0.00s
error: test failed, to rerun pass `--lib`

View File

@ -1,55 +0,0 @@
import json
import os
import glob
fixtures_dir = 'fixtures'
for filepath in glob.glob(os.path.join(fixtures_dir, '*.json')):
try:
with open(filepath, 'r') as f:
data = json.load(f)
except Exception as e:
continue
changed = False
for suite in data:
db = suite.get("database")
if not db or "schemas" not in db:
continue
legacy_schemas = db["schemas"]
# Make sure types array is ready
if "types" not in db:
db["types"] = []
# Push schemas into types
for schema_id, schema_def in legacy_schemas.items():
base_name = schema_id.split('.')[-1]
# Find an existing type with base_name first
found = False
for t in db["types"]:
if t.get("name") == base_name:
if "schemas" not in t:
t["schemas"] = {}
t["schemas"][schema_id] = schema_def
found = True
break
if not found:
db["types"].append({
"name": base_name,
"variations": [base_name], # Optional placeholder, shouldn't break anything
"hierarchy": [base_name, "entity"],
"schemas": {
schema_id: schema_def
}
})
# Clean up legacy global map
del db["schemas"]
changed = True
if changed:
with open(filepath, 'w') as f:
json.dump(data, f, indent=2)
print("Migrated legacy schemas to types in", filepath)

View File

@ -1,54 +0,0 @@
import json
import os
import glob
fixtures_dir = 'fixtures'
for filepath in glob.glob(os.path.join(fixtures_dir, '*.json')):
try:
with open(filepath, 'r') as f:
data = json.load(f)
except Exception as e:
print(f"Failed to load {filepath}: {e}")
continue
changed = False
for suite in data:
db = suite.get("database")
if not db or "schemas" not in db:
continue
legacy_schemas = db["schemas"]
# Make sure types array is ready
if "types" not in db:
db["types"] = []
# Push schemas into types
for schema_id, schema_def in legacy_schemas.items():
base_name = schema_id.split('.')[-1]
# Find an existing type with base_name first
found = False
for t in db["types"]:
if t.get("name") == base_name:
if "schemas" not in t:
t["schemas"] = {}
t["schemas"][schema_id] = schema_def
found = True
break
if not found:
db["types"].append({
"name": base_name,
"schemas": {
schema_id: schema_def
}
})
# Clean up legacy global map
del db["schemas"]
changed = True
if changed:
with open(filepath, 'w') as f:
json.dump(data, f, indent=2)
print("Migrated legacy schemas to types properly in", filepath)

View File

@ -1,41 +0,0 @@
const fs = require('fs');
const path = require('path');
function updateFile(filePath) {
let content = fs.readFileSync(filePath, 'utf8');
let data;
try {
data = JSON.parse(content);
} catch (e) {
console.error("Failed to parse " + filePath, e);
return;
}
let changed = false;
for (let suite of data) {
if (suite.database && suite.database.puncs && suite.database.puncs.length > 0) {
if (!suite.database.types) suite.database.types = [];
for (let punc of suite.database.puncs) {
// Determine if we should push it to types.
// Basically all of them should go to types except maybe if they are explicitly being tested as Puncs?
// But the tests construct Queryer and Merger using these ids, which query the Type Realm.
suite.database.types.push(punc);
}
delete suite.database.puncs;
changed = true;
}
}
if (changed) {
fs.writeFileSync(filePath, JSON.stringify(data, null, 2));
console.log("Reverted puncs to types in " + filePath);
}
}
let fixturesDir = 'fixtures';
let files = fs.readdirSync(fixturesDir);
for (let file of files) {
if (file.endsWith('.json')) {
updateFile(path.join(fixturesDir, file));
}
}

View File

@ -1,29 +0,0 @@
import json
import os
import glob
fixtures_dir = 'fixtures'
for filepath in glob.glob(os.path.join(fixtures_dir, '*.json')):
with open(filepath, 'r') as f:
try:
data = json.load(f)
except Exception as e:
print("Failed to parse", filepath, e)
continue
changed = False
for suite in data:
db = suite.get("database", {})
puncs = db.get("puncs", [])
if puncs:
if "types" not in db:
db["types"] = []
for punc in puncs:
db["types"].append(punc)
del db["puncs"]
changed = True
if changed:
with open(filepath, 'w') as f:
json.dump(data, f, indent=2)
print("Reverted puncs to types in", filepath)

View File

@ -1,43 +0,0 @@
const fs = require('fs');
const path = require('path');
function updateFile(filePath) {
let content = fs.readFileSync(filePath, 'utf8');
let data;
try {
data = JSON.parse(content);
} catch (e) {
console.error("Failed to parse " + filePath, e);
return;
}
let changed = false;
for (let suite of data) {
if (suite.database && suite.database.schemas) {
if (!suite.database.puncs) suite.database.puncs = [];
for (let id of Object.keys(suite.database.schemas)) {
let schema = suite.database.schemas[id];
let puncType = {
name: id,
schemas: { [id]: schema }
};
suite.database.puncs.push(puncType);
}
delete suite.database.schemas;
changed = true;
}
}
if (changed) {
fs.writeFileSync(filePath, JSON.stringify(data, null, 2));
console.log("Updated " + filePath);
}
}
let fixturesDir = 'fixtures';
let files = fs.readdirSync(fixturesDir);
for (let file of files) {
if (file.endsWith('.json')) {
updateFile(path.join(fixturesDir, file));
}
}

View File

@ -1,33 +0,0 @@
import json
import os
fixtures_dir = 'fixtures'
for filename in os.listdir(fixtures_dir):
if not filename.endswith('.json'):
continue
filepath = os.path.join(fixtures_dir, filename)
with open(filepath, 'r') as f:
try:
data = json.load(f)
except json.JSONDecodeError:
print("Failed to parse", filepath)
continue
changed = False
for suite in data:
db = suite.get('database', {})
if 'schemas' in db:
if 'types' not in db:
db['types'] = []
for id_str, schema in db['schemas'].items():
target_type = {
'name': id_str,
'schemas': { id_str: schema }
}
db['types'].append(target_type)
del db['schemas']
changed = True
if changed:
with open(filepath, 'w') as f:
json.dump(data, f, indent=2)
print("Updated", filepath)

View File

@ -12,11 +12,11 @@ impl Schema {
    ) {
        #[cfg(not(test))]
        for c in id.chars() {
-           if !c.is_ascii_lowercase() && !c.is_ascii_digit() && c != '_' && c != '.' {
+           if !c.is_ascii_lowercase() && !c.is_ascii_digit() && c != '_' && c != '.' && c != '$' {
                errors.push(crate::drop::Error {
                    code: "INVALID_IDENTIFIER".to_string(),
                    message: format!(
-                       "Invalid character '{}' in JSON Schema '{}' property: '{}'. Identifiers must exclusively contain [a-z0-9_.]",
+                       "Invalid character '{}' in JSON Schema '{}' property: '{}'. Identifiers must exclusively contain [a-z0-9_.$]",
                        c, field_name, id
                    ),
                    details: crate::drop::ErrorDetails {

View File

@ -74,8 +74,8 @@ impl Schema {
        );
        let mut wrapper_obj = SchemaObject::default();
-       // Conceptually link this directly into the STI lineage of the base `filter` object
-       wrapper_obj.type_ = Some(SchemaTypeOrArray::Single("filter".to_string()));
+       // Filters are just plain objects containing conditions, no inheritance required
+       wrapper_obj.type_ = Some(SchemaTypeOrArray::Single("object".to_string()));
        wrapper_obj.properties = Some(filter_props);
        return Some(Schema {

View File

@ -51,8 +51,8 @@ impl Schema {
        // 1. Resolve INHERITANCE dependencies first
        if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &self.obj.type_ {
-           if !crate::database::object::is_primitive_type(t) {
-               if let Some(parent) = db.get_scoped_schema(crate::database::realm::SchemaRealm::Type, t) {
+           if !crate::database::object::is_primitive_type(t) && !t.starts_with('$') {
+               if let Some(parent) = db.schemas.get(t).cloned() {
                    parent.as_ref().compile(db, t, t.clone(), errors);
                    if let Some(p_props) = parent.obj.compiled_properties.get() {
                        props.extend(p_props.clone());
@ -85,8 +85,8 @@
        }
        for t in types {
-           if !crate::database::object::is_primitive_type(t) {
-               if let Some(parent) = db.get_scoped_schema(crate::database::realm::SchemaRealm::Type, t) {
+           if !crate::database::object::is_primitive_type(t) && !t.starts_with('$') {
+               if let Some(parent) = db.schemas.get(t).cloned() {
                    parent.as_ref().compile(db, t, t.clone(), errors);
                }
            }

View File

@ -12,7 +12,10 @@ impl Schema {
        let mut strategy = String::new();
        if let Some(family) = &self.obj.family {
+           // Formalize the <Variant>.<Base> topology
+           // family_base extracts the 'Base' (e.g. 'widget', 'person')
            let family_base = family.split('.').next_back().unwrap_or(family).to_string();
+           // family_prefix extracts the 'Variant' (e.g. 'stock', 'light')
            let family_prefix = family
                .strip_suffix(&family_base)
                .unwrap_or("")
@ -29,7 +32,7 @@
                format!("{}.{}", family_prefix, var)
            };
-           if db.get_scoped_schema(crate::database::realm::SchemaRealm::Type, &target_id).is_some() {
+           if db.schemas.get(&target_id).is_some() {
                options.insert(var.to_string(), (None, Some(target_id)));
            }
        }

View File

@ -6,7 +6,6 @@ pub mod formats;
pub mod object;
pub mod page;
pub mod punc;
-pub mod realm;
pub mod relation;
pub mod schema;
pub mod r#type;
@ -21,7 +20,6 @@ use executors::pgrx::SpiExecutor;
use executors::mock::MockExecutor;
use punc::Punc;
-use realm::SchemaRealm;
use relation::Relation;
use schema::Schema;
use serde_json::Value;
@ -36,6 +34,8 @@ pub struct Database {
    pub puncs: HashMap<String, Punc>,
    pub relations: HashMap<String, Relation>,
    #[serde(skip)]
+   pub schemas: HashMap<String, Arc<Schema>>,
+   #[serde(skip)]
    pub executor: Box<dyn DatabaseExecutor + Send + Sync>,
}
@ -46,6 +46,7 @@ impl Database {
            types: HashMap::new(),
            relations: HashMap::new(),
            puncs: HashMap::new(),
+           schemas: HashMap::new(),
            #[cfg(not(test))]
            executor: Box::new(SpiExecutor::new()),
            #[cfg(test)]
@ -196,19 +197,25 @@ impl Database {
        for (_, enum_def) in &self.enums {
            for (schema_id, schema_arc) in &enum_def.schemas {
                let root_id = schema_id.split('/').next().unwrap_or(schema_id);
-               schema_arc.as_ref().compile(self, root_id, schema_id.clone(), errors);
+               schema_arc
+                   .as_ref()
+                   .compile(self, root_id, schema_id.clone(), errors);
            }
        }
        for (_, type_def) in &self.types {
            for (schema_id, schema_arc) in &type_def.schemas {
                let root_id = schema_id.split('/').next().unwrap_or(schema_id);
-               schema_arc.as_ref().compile(self, root_id, schema_id.clone(), errors);
+               schema_arc
+                   .as_ref()
+                   .compile(self, root_id, schema_id.clone(), errors);
            }
        }
        for (_, punc_def) in &self.puncs {
            for (schema_id, schema_arc) in &punc_def.schemas {
                let root_id = schema_id.split('/').next().unwrap_or(schema_id);
-               schema_arc.as_ref().compile(self, root_id, schema_id.clone(), errors);
+               schema_arc
+                   .as_ref()
+                   .compile(self, root_id, schema_id.clone(), errors);
            }
        }
@ -234,6 +241,7 @@ impl Database {
        let mut filter_ids = Vec::new();
        for (type_name, id, filter_arc) in filter_schemas {
            filter_ids.push((type_name.clone(), id.clone()));
+           self.schemas.insert(id.clone(), filter_arc.clone());
            if let Some(t) = self.types.get_mut(&type_name) {
                t.schemas.insert(id, filter_arc);
            }
@ -241,7 +249,12 @@
        // Now actively compile the newly injected filters to lock all nested compose references natively
        for (type_name, id) in filter_ids {
-           if let Some(filter_arc) = self.types.get(&type_name).and_then(|t| t.schemas.get(&id)).cloned() {
+           if let Some(filter_arc) = self
+               .types
+               .get(&type_name)
+               .and_then(|t| t.schemas.get(&id))
+               .cloned()
+           {
                let root_id = id.split('/').next().unwrap_or(&id);
                filter_arc
                    .as_ref()
@ -259,6 +272,7 @@
        // Validate every node recursively via string filters natively!
        for (type_name, type_def) in &self.types {
            for (id, schema_arc) in &type_def.schemas {
+               self.schemas.insert(id.clone(), Arc::clone(schema_arc));
                let mut local_insert = Vec::new();
                crate::database::schema::Schema::collect_schemas(
                    schema_arc,
@ -275,6 +289,7 @@
        for (punc_name, punc_def) in &self.puncs {
            for (id, schema_arc) in &punc_def.schemas {
+               self.schemas.insert(id.clone(), Arc::clone(schema_arc));
                let mut local_insert = Vec::new();
                crate::database::schema::Schema::collect_schemas(
                    schema_arc,
@ -291,6 +306,7 @@
        for (enum_name, enum_def) in &self.enums {
            for (id, schema_arc) in &enum_def.schemas {
+               self.schemas.insert(id.clone(), Arc::clone(schema_arc));
                let mut local_insert = Vec::new();
                crate::database::schema::Schema::collect_schemas(
                    schema_arc,
@ -305,57 +321,27 @@
            }
        }
-       // Apply local scopes
+       // Apply local scopes and global schema map
        for (origin_name, id, schema_arc) in type_insert {
+           self.schemas.insert(id.clone(), schema_arc.clone());
            if let Some(t) = self.types.get_mut(&origin_name) {
                t.schemas.insert(id, schema_arc);
            }
        }
        for (origin_name, id, schema_arc) in punc_insert {
+           self.schemas.insert(id.clone(), schema_arc.clone());
            if let Some(p) = self.puncs.get_mut(&origin_name) {
                p.schemas.insert(id, schema_arc);
            }
        }
        for (origin_name, id, schema_arc) in enum_insert {
+           self.schemas.insert(id.clone(), schema_arc.clone());
            if let Some(e) = self.enums.get_mut(&origin_name) {
                e.schemas.insert(id, schema_arc);
            }
        }
    }
-   pub fn get_scoped_schema(&self, realm: SchemaRealm, schema_id: &str) -> Option<Arc<Schema>> {
-       // Punc Realm natively maps mathematically to `.request` and `.response` shapes
-       if realm == SchemaRealm::Punc {
-           if schema_id.ends_with(".request") || schema_id.ends_with(".response") {
-               let punc_name = schema_id
-                   .trim_end_matches(".request")
-                   .trim_end_matches(".response");
-               return self.puncs.get(punc_name).and_then(|p| p.schemas.get(schema_id).cloned());
-           }
-       }
-       let clean_id = schema_id.trim_end_matches(".filter");
-       let root_id = clean_id.split('/').next().unwrap_or(clean_id);
-       let base_name = root_id.split('.').next_back().unwrap_or(root_id);
-       // Puncs and Types can lookup Table boundaries
-       if realm == SchemaRealm::Type || realm == SchemaRealm::Punc {
-           if let Some(type_def) = self.types.get(base_name) {
-               if let Some(schema) = type_def.schemas.get(schema_id) {
-                   return Some(schema.clone());
-               }
-           }
-       }
-       // All realms can intrinsically look up enumerations
-       if let Some(enum_def) = self.enums.get(base_name) {
-           if let Some(schema) = enum_def.schemas.get(schema_id) {
-               return Some(schema.clone());
-           }
-       }
-       None
-   }
    /// Inspects the Postgres pg_constraint relations catalog to securely identify
    /// the precise Foreign Key connecting a parent and child hierarchy path.

View File

@ -1,6 +0,0 @@
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum SchemaRealm {
Enum,
Type,
Punc,
}

View File

@ -22,6 +22,27 @@ impl std::ops::DerefMut for Schema {
    }
}
+impl Schema {
+   /// Returns true if the schema acts purely as a type pointer (composition without overriding constraints)
+   pub fn is_proxy(&self) -> bool {
+       self.obj.properties.is_none()
+           && self.obj.pattern_properties.is_none()
+           && self.obj.additional_properties.is_none()
+           && self.obj.required.is_none()
+           && self.obj.dependencies.is_none()
+           && self.obj.items.is_none()
+           && self.obj.prefix_items.is_none()
+           && self.obj.contains.is_none()
+           && self.obj.format.is_none()
+           && self.obj.enum_.is_none()
+           && self.obj.const_.is_none()
+           && self.obj.cases.is_none()
+           && self.obj.one_of.is_none()
+           && self.obj.not.is_none()
+           && self.obj.family.is_none()
+   }
+}
impl<'de> Deserialize<'de> for Schema {
    fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
    where

View File

@ -4,7 +4,6 @@
pub mod cache;
use crate::database::Database;
-use crate::database::realm::SchemaRealm;
use crate::database::r#type::Type;
use serde_json::Value;
use std::sync::Arc;
@ -25,7 +24,7 @@ impl Merger {
    pub fn merge(&self, schema_id: &str, data: Value) -> crate::drop::Drop {
        let mut notifications_queue = Vec::new();
-       let target_schema = match self.db.get_scoped_schema(SchemaRealm::Type, schema_id) {
+       let target_schema = match self.db.schemas.get(schema_id) {
            Some(s) => Arc::clone(&s),
            None => {
                return crate::drop::Drop::with_errors(vec![crate::drop::Error {
@ -146,7 +145,7 @@
                if let Some((idx_opt, target_id_opt)) = options.get(v) {
                    if let Some(target_id) = target_id_opt {
                        if let Some(target_schema) =
-                           self.db.get_scoped_schema(SchemaRealm::Type, target_id)
+                           self.db.schemas.get(target_id)
                        {
                            schema = target_schema.clone();
                        } else {

View File

@ -1,5 +1,4 @@
use crate::database::Database;
-use crate::database::realm::SchemaRealm;
use std::sync::Arc;
pub struct Compiler<'a> {
@ -25,15 +24,11 @@ pub struct Node<'a> {
impl<'a> Compiler<'a> {
    /// Compiles a JSON schema into a nested PostgreSQL query returning JSONB
    pub fn compile(&self, schema_id: &str, filter_keys: &[String]) -> Result<String, String> {
-       let realm = if schema_id.ends_with(".request") || schema_id.ends_with(".response") {
-           SchemaRealm::Punc
-       } else {
-           SchemaRealm::Type
-       };
        let schema = self
            .db
-           .get_scoped_schema(realm, schema_id)
+           .schemas
+           .get(schema_id)
+           .cloned()
            .ok_or_else(|| format!("Schema not found: {}", schema_id))?;
        let target_schema = schema;
@ -157,7 +152,7 @@ impl<'a> Compiler<'a> {
        if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &node.schema.obj.type_ {
            if !crate::database::object::is_primitive_type(t) {
                // If it's just an ad-hoc struct ref, we should resolve it
-               if let Some(target_schema) = self.db.get_scoped_schema(SchemaRealm::Type, t) {
+               if let Some(target_schema) = self.db.schemas.get(t).cloned() {
                    let mut ref_node = node.clone();
                    ref_node.schema = target_schema.clone();
                    ref_node.schema_id = Some(t.clone());
@ -312,7 +307,7 @@ impl<'a> Compiler<'a> {
        for (disc_val, (idx_opt, target_id_opt)) in options {
            if let Some(target_id) = target_id_opt {
-               if let Some(target_schema) = self.db.get_scoped_schema(SchemaRealm::Type, target_id) {
+               if let Some(target_schema) = self.db.schemas.get(target_id).cloned() {
                    let mut child_node = node.clone();
                    child_node.schema = target_schema.clone();
                    child_node.schema_id = Some(target_id.clone());

View File

@ -1247,6 +1247,36 @@ fn test_const_17_1() {
    crate::tests::runner::run_test_case(&path, 17, 1).unwrap();
}
+#[test]
+fn test_dynamic_type_0_0() {
+   let path = format!("{}/fixtures/dynamicType.json", env!("CARGO_MANIFEST_DIR"));
+   crate::tests::runner::run_test_case(&path, 0, 0).unwrap();
+}
+#[test]
+fn test_dynamic_type_0_1() {
+   let path = format!("{}/fixtures/dynamicType.json", env!("CARGO_MANIFEST_DIR"));
+   crate::tests::runner::run_test_case(&path, 0, 1).unwrap();
+}
+#[test]
+fn test_dynamic_type_0_2() {
+   let path = format!("{}/fixtures/dynamicType.json", env!("CARGO_MANIFEST_DIR"));
+   crate::tests::runner::run_test_case(&path, 0, 2).unwrap();
+}
+#[test]
+fn test_dynamic_type_0_3() {
+   let path = format!("{}/fixtures/dynamicType.json", env!("CARGO_MANIFEST_DIR"));
+   crate::tests::runner::run_test_case(&path, 0, 3).unwrap();
+}
+#[test]
+fn test_dynamic_type_0_4() {
+   let path = format!("{}/fixtures/dynamicType.json", env!("CARGO_MANIFEST_DIR"));
+   crate::tests::runner::run_test_case(&path, 0, 4).unwrap();
+}
#[test]
fn test_property_names_0_0() {
    let path = format!("{}/fixtures/propertyNames.json", env!("CARGO_MANIFEST_DIR"));

View File

@ -160,7 +160,7 @@ fn test_library_api() {
"target": { "type": ["target_schema.filter", "null"] }, "target": { "type": ["target_schema.filter", "null"] },
"type": { "type": ["string.condition", "null"] } "type": { "type": ["string.condition", "null"] }
}, },
"type": "filter" "type": "object"
} }
}, },
"sensitive": false, "sensitive": false,
@ -211,7 +211,7 @@ fn test_library_api() {
}, },
"value": { "type": ["number.condition", "null"] } "value": { "type": ["number.condition", "null"] }
}, },
"type": "filter" "type": "object"
} }
}, },
"sensitive": false, "sensitive": false,

View File

@ -35,12 +35,7 @@ impl Expect {
            if expected_val.is_object() && expected_val.as_object().unwrap().is_empty() {
                continue; // A `{}` means we just wanted to test it was collected/promoted, skip deep match
            }
-           let schema_realm = if key.ends_with(".request") || key.ends_with(".response") {
-               crate::database::realm::SchemaRealm::Punc
-           } else {
-               crate::database::realm::SchemaRealm::Type
-           };
-           let actual_ast = db.get_scoped_schema(schema_realm, key).unwrap();
+           let actual_ast = db.schemas.get(key).cloned().unwrap();
            let actual_val = serde_json::to_value(actual_ast).unwrap();
            if actual_val != *expected_val {

View File

@ -15,6 +15,7 @@ pub struct ValidationContext<'a> {
    pub extensible: bool,
    pub reporter: bool,
    pub overrides: HashSet<String>,
+   pub parent: Option<&'a serde_json::Value>,
}
impl<'a> ValidationContext<'a> {
@ -38,6 +39,7 @@
            extensible: effective_extensible,
            reporter,
            overrides,
+           parent: None,
        }
    }
@ -57,6 +59,7 @@
        overrides: HashSet<String>,
        extensible: bool,
        reporter: bool,
+       parent_instance: Option<&'a serde_json::Value>,
    ) -> Self {
        let effective_extensible = schema.extensible.unwrap_or(extensible);
@ -70,6 +73,7 @@
            extensible: effective_extensible,
            reporter,
            overrides,
+           parent: parent_instance,
        }
    }
@ -81,6 +85,7 @@
            HashSet::new(),
            self.extensible,
            reporter,
+           self.parent,
        )
    }

View File

@ -10,7 +10,6 @@ pub use error::ValidationError;
pub use result::ValidationResult;
use crate::database::Database;
-use crate::database::realm::SchemaRealm;
use crate::validator::rules::util::is_integer;
use serde_json::Value;
use std::sync::Arc;
@ -43,11 +42,7 @@ impl Validator {
    }
    pub fn validate(&self, schema_id: &str, instance: &Value) -> crate::drop::Drop {
-       let schema_opt = if schema_id.ends_with(".request") || schema_id.ends_with(".response") {
-           self.db.get_scoped_schema(SchemaRealm::Punc, schema_id)
-       } else {
-           self.db.get_scoped_schema(SchemaRealm::Type, schema_id)
-       };
+       let schema_opt = self.db.schemas.get(schema_id);
        if let Some(schema) = schema_opt {
            let ctx = ValidationContext::new(

View File

@ -57,6 +57,7 @@ impl<'a> ValidationContext<'a> {
            HashSet::new(),
            self.extensible,
            false,
+           Some(self.instance),
        );
        let check = derived.validate()?;
@ -108,6 +109,7 @@
            HashSet::new(),
            self.extensible,
            false,
+           Some(self.instance),
        );
        let item_res = derived.validate()?;
        result.merge(item_res);
@ -137,6 +139,7 @@
            HashSet::new(),
            self.extensible,
            false,
+           Some(self.instance),
        );
        let item_res = derived.validate()?;
        result.merge(item_res);

View File

@ -12,6 +12,7 @@ pub mod numeric;
pub mod object;
pub mod polymorphism;
pub mod string;
+pub mod r#type;
pub mod util;
impl<'a> ValidationContext<'a> {
@ -28,7 +29,7 @@ impl<'a> ValidationContext<'a> {
        if !self.validate_family(&mut result)? {
            return Ok(result);
        }
-       if !self.validate_type_inheritance(&mut result)? {
+       if !self.validate_type(&mut result)? {
            return Ok(result);
        }

View File

@ -191,6 +191,7 @@ impl<'a> ValidationContext<'a> {
            HashSet::new(),
            next_extensible,
            false,
+           Some(self.instance),
        );
        let item_res = derived.validate()?;
@ -220,6 +221,7 @@
            HashSet::new(),
            next_extensible,
            false,
+           Some(self.instance),
        );
        let item_res = derived.validate()?;
        result.merge(item_res);
@ -265,6 +267,7 @@
            HashSet::new(),
            next_extensible,
            false,
+           Some(self.instance),
        );
        let item_res = derived.validate()?;
        result.merge(item_res);

View File

@@ -1,7 +1,6 @@
 use crate::validator::context::ValidationContext;
 use crate::validator::error::ValidationError;
 use crate::validator::result::ValidationResult;
-use crate::database::realm::SchemaRealm;

 impl<'a> ValidationContext<'a> {
     pub(crate) fn validate_family(

@@ -100,8 +99,8 @@ impl<'a> ValidationContext<'a> {
         if let Some(val) = instance_val {
             if let Some((idx_opt, target_id_opt)) = options.get(&val) {
                 if let Some(target_id) = target_id_opt {
-                    if let Some(target_schema) = self.db.get_scoped_schema(SchemaRealm::Type, target_id) {
-                        let derived = self.derive_for_schema(&target_schema, false);
+                    if let Some(target_schema) = self.db.schemas.get(target_id) {
+                        let derived = self.derive_for_schema(target_schema, false);
                         let sub_res = derived.validate()?;
                         let is_valid = sub_res.is_valid();
                         result.merge(sub_res);
@@ -177,78 +176,4 @@ impl<'a> ValidationContext<'a> {
             return Ok(false);
         }
     }
-
-    pub(crate) fn validate_type_inheritance(
-        &self,
-        result: &mut ValidationResult,
-    ) -> Result<bool, ValidationError> {
-        // Core inheritance logic replaces legacy routing
-        let payload_primitive = match self.instance {
-            serde_json::Value::Null => "null",
-            serde_json::Value::Bool(_) => "boolean",
-            serde_json::Value::Number(n) => {
-                if n.is_i64() || n.is_u64() {
-                    "integer"
-                } else {
-                    "number"
-                }
-            }
-            serde_json::Value::String(_) => "string",
-            serde_json::Value::Array(_) => "array",
-            serde_json::Value::Object(_) => "object",
-        };
-
-        let mut custom_types = Vec::new();
-        match &self.schema.type_ {
-            Some(crate::database::object::SchemaTypeOrArray::Single(t)) => {
-                if !crate::database::object::is_primitive_type(t) {
-                    custom_types.push(t.clone());
-                }
-            }
-            Some(crate::database::object::SchemaTypeOrArray::Multiple(arr)) => {
-                if arr.contains(&payload_primitive.to_string())
-                    || (payload_primitive == "integer" && arr.contains(&"number".to_string()))
-                {
-                    // It natively matched a primitive in the array options, skip forcing custom proxy fallback
-                } else {
-                    for t in arr {
-                        if !crate::database::object::is_primitive_type(t) {
-                            custom_types.push(t.clone());
-                        }
-                    }
-                }
-            }
-            None => {}
-        }
-
-        for t in custom_types {
-            if let Some(global_schema) = self.db.get_scoped_schema(SchemaRealm::Type, &t) {
-                let mut new_overrides = self.overrides.clone();
-                if let Some(props) = &self.schema.properties {
-                    new_overrides.extend(props.keys().map(|k| k.to_string()));
-                }
-                let mut shadow = self.derive(
-                    &global_schema,
-                    self.instance,
-                    &self.path,
-                    new_overrides,
-                    self.extensible,
-                    true, // Reporter mode
-                );
-                shadow.root = &global_schema;
-                result.merge(shadow.validate()?);
-            } else {
-                result.errors.push(ValidationError {
-                    code: "INHERITANCE_RESOLUTION_FAILED".to_string(),
-                    message: format!(
-                        "Inherited entity pointer '{}' was not found in schema registry",
-                        t
-                    ),
-                    path: self.path.to_string(),
-                });
-            }
-        }
-
-        Ok(true)
-    }
 }
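The primitive acceptance rule retained from this removed function into its `validate_type` replacement deserves a note: an `integer` payload also satisfies a declared `"number"`, mirroring JSON Schema's numeric widening. A standalone sketch of just that check (the `Prim` enum and helper names are illustrative; the real code matches directly on `serde_json::Value`):

```rust
// Sketch of the payload-primitive acceptance rule: a payload matches a
// declared type array when its name appears there, and an integer payload
// additionally widens to a declared "number".
#[derive(PartialEq)]
enum Prim { Null, Bool, Integer, Number, Str, Array, Object }

fn primitive_name(p: &Prim) -> &'static str {
    match p {
        Prim::Null => "null",
        Prim::Bool => "boolean",
        Prim::Integer => "integer",
        Prim::Number => "number",
        Prim::Str => "string",
        Prim::Array => "array",
        Prim::Object => "object",
    }
}

fn matches_declared(payload: &Prim, declared: &[&str]) -> bool {
    declared.contains(&primitive_name(payload))
        // An integer instance is also a valid "number".
        || (*payload == Prim::Integer && declared.contains(&"number"))
}

fn main() {
    // An integer payload satisfies a schema declaring only ["number"].
    assert!(matches_declared(&Prim::Integer, &["number"]));
    // The widening is one-way: a number payload does not satisfy ["integer"].
    assert!(!matches_declared(&Prim::Number, &["integer"]));
    // A string payload against ["number", "null"] falls through to the
    // custom-type path in the validator.
    assert!(!matches_declared(&Prim::Str, &["number", "null"]));
}
```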

src/validator/rules/type.rs (new file, 138 lines)

@@ -0,0 +1,138 @@
use crate::validator::context::ValidationContext;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;

impl<'a> ValidationContext<'a> {
    pub(crate) fn validate_type(
        &self,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        let payload_primitive = match self.instance {
            serde_json::Value::Null => "null",
            serde_json::Value::Bool(_) => "boolean",
            serde_json::Value::Number(n) => {
                if n.is_i64() || n.is_u64() {
                    "integer"
                } else {
                    "number"
                }
            }
            serde_json::Value::String(_) => "string",
            serde_json::Value::Array(_) => "array",
            serde_json::Value::Object(_) => "object",
        };

        let mut custom_types = Vec::new();
        match &self.schema.type_ {
            Some(crate::database::object::SchemaTypeOrArray::Single(t)) => {
                if !crate::database::object::is_primitive_type(t) {
                    custom_types.push(t.clone());
                }
            }
            Some(crate::database::object::SchemaTypeOrArray::Multiple(arr)) => {
                if arr.contains(&payload_primitive.to_string())
                    || (payload_primitive == "integer" && arr.contains(&"number".to_string()))
                {
                    // It natively matched a primitive in the array options, skip forcing custom proxy fallback
                } else {
                    for t in arr {
                        if !crate::database::object::is_primitive_type(t) {
                            custom_types.push(t.clone());
                        }
                    }
                }
            }
            None => {}
        }

        for t in custom_types {
            let mut target_id = t.clone();

            // 1. DYNAMIC TYPE (Composition)
            if t.starts_with('$') {
                let parts: Vec<&str> = t.split('.').collect();
                let var_name = &parts[0][1..]; // Remove the $ prefix
                let suffix = if parts.len() > 1 {
                    format!(".{}", parts[1..].join("."))
                } else {
                    String::new()
                };

                let mut resolved = false;
                if let Some(parent) = self.parent {
                    if let Some(obj) = parent.as_object() {
                        if let Some(val) = obj.get(var_name) {
                            if let Some(str_val) = val.as_str() {
                                target_id = format!("{}{}", str_val, suffix);
                                resolved = true;
                            }
                        }
                    }
                }

                if !resolved {
                    result.errors.push(ValidationError {
                        code: "DYNAMIC_TYPE_RESOLUTION_FAILED".to_string(),
                        message: format!(
                            "Dynamic type pointer '{}' could not resolve discriminator property '{}' on parent instance",
                            t, var_name
                        ),
                        path: self.path.to_string(),
                    });
                    continue;
                }
            }

            // 2. Fetch and apply
            if let Some(global_schema) = self.db.schemas.get(&target_id) {
                let mut new_overrides = self.overrides.clone();
                if let Some(props) = &self.schema.properties {
                    new_overrides.extend(props.keys().map(|k| k.to_string()));
                }
                let mut shadow = self.derive(
                    &global_schema,
                    self.instance,
                    &self.path,
                    new_overrides,
                    self.extensible,
                    true, // Reporter mode
                    self.parent,
                );
                shadow.root = &global_schema;
                result.merge(shadow.validate()?);
            } else {
                // 3. Error handling pathways
                if t.starts_with('$') {
                    result.errors.push(ValidationError {
                        code: "DYNAMIC_TYPE_RESOLUTION_FAILED".to_string(),
                        message: format!(
                            "Resolved dynamic type pointer '{}' was not found in schema registry",
                            target_id
                        ),
                        path: self.path.to_string(),
                    });
                } else if self.schema.is_proxy() {
                    result.errors.push(ValidationError {
                        code: "PROXY_TYPE_RESOLUTION_FAILED".to_string(),
                        message: format!(
                            "Composed proxy entity pointer '{}' was not found in schema registry",
                            target_id
                        ),
                        path: self.path.to_string(),
                    });
                } else {
                    result.errors.push(ValidationError {
                        code: "INHERITANCE_RESOLUTION_FAILED".to_string(),
                        message: format!(
                            "Inherited entity pointer '{}' was not found in schema registry",
                            target_id
                        ),
                        path: self.path.to_string(),
                    });
                }
            }
        }

        Ok(true)
    }
}
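The dynamic-type step in this new file can be hard to see inside the nested `if let` chain. A standalone sketch of the resolution rule (assumption: the parent instance is modeled here as a plain string map instead of `serde_json::Value`, and the helper name is hypothetical): a pointer like `$kind.profile` reads the parent's `kind` property and appends the `.profile` suffix to form the registry key.

```rust
use std::collections::HashMap;

// Sketch of dynamic type pointer resolution: "$kind.profile" + a parent
// whose "kind" property is "user" resolves to the registry key "user.profile".
fn resolve_dynamic_pointer(pointer: &str, parent: &HashMap<&str, &str>) -> Option<String> {
    let rest = pointer.strip_prefix('$')?;
    let mut parts = rest.splitn(2, '.');
    let var_name = parts.next()?;
    // Everything after the first '.' is carried over as a suffix.
    let suffix = parts.next().map(|s| format!(".{s}")).unwrap_or_default();
    let discriminator = parent.get(var_name)?;
    Some(format!("{discriminator}{suffix}"))
}

fn main() {
    let parent = HashMap::from([("kind", "user")]);
    assert_eq!(
        resolve_dynamic_pointer("$kind.profile", &parent),
        Some("user.profile".to_string())
    );
    // A missing discriminator property resolves to None; the validator
    // reports DYNAMIC_TYPE_RESOLUTION_FAILED in the same situation.
    assert_eq!(resolve_dynamic_pointer("$type", &HashMap::new()), None);
}
```

Note the two distinct failure errors in the file above: `DYNAMIC_TYPE_RESOLUTION_FAILED` covers both a missing discriminator on the parent and a resolved key absent from the registry, while non-dynamic pointers fall back to `PROXY_TYPE_RESOLUTION_FAILED` or `INHERITANCE_RESOLUTION_FAILED`.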


@@ -1,81 +0,0 @@
Finished `test` profile [unoptimized + debuginfo] target(s) in 0.43s
Running unittests src/lib.rs (target/debug/deps/jspg-d3f18ff3a7e2b386)
running 11 tests
test tests::test_minimum_0_2 ... ok
test tests::test_minimum_1_4 ... ok
test tests::test_minimum_1_0 ... FAILED
test tests::test_minimum_1_1 ... FAILED
test tests::test_minimum_0_3 ... FAILED
test tests::test_minimum_1_5 ... ok
test tests::test_minimum_1_3 ... FAILED
test tests::test_minimum_0_0 ... FAILED
test tests::test_minimum_0_1 ... FAILED
test tests::test_minimum_1_2 ... FAILED
test tests::test_minimum_1_6 ... FAILED
failures:
---- tests::test_minimum_1_0 stdout ----
TEST VALIDATE ERROR FOR 'negative above the minimum is valid': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_1_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]
thread 'tests::test_minimum_1_0' (110318318) panicked at src/tests/fixtures.rs:3503:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation with signed integer] Validate Test 'negative above the minimum is valid' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_1_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
---- tests::test_minimum_1_1 stdout ----
TEST VALIDATE ERROR FOR 'positive above the minimum is valid': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_1_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]
thread 'tests::test_minimum_1_1' (110318319) panicked at src/tests/fixtures.rs:3509:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation with signed integer] Validate Test 'positive above the minimum is valid' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_1_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"
---- tests::test_minimum_0_3 stdout ----
TEST VALIDATE ERROR FOR 'ignores non-numbers': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_0_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]
thread 'tests::test_minimum_0_3' (110318317) panicked at src/tests/fixtures.rs:3497:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation] Validate Test 'ignores non-numbers' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_0_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"
---- tests::test_minimum_1_3 stdout ----
TEST VALIDATE ERROR FOR 'boundary point with float is valid': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_1_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]
thread 'tests::test_minimum_1_3' (110318321) panicked at src/tests/fixtures.rs:3521:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation with signed integer] Validate Test 'boundary point with float is valid' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_1_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"
---- tests::test_minimum_0_0 stdout ----
TEST VALIDATE ERROR FOR 'above the minimum is valid': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_0_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]
thread 'tests::test_minimum_0_0' (110318314) panicked at src/tests/fixtures.rs:3479:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation] Validate Test 'above the minimum is valid' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_0_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"
---- tests::test_minimum_0_1 stdout ----
TEST VALIDATE ERROR FOR 'boundary point is valid': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_0_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]
thread 'tests::test_minimum_0_1' (110318315) panicked at src/tests/fixtures.rs:3485:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation] Validate Test 'boundary point is valid' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_0_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"
---- tests::test_minimum_1_2 stdout ----
TEST VALIDATE ERROR FOR 'boundary point is valid': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_1_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]
thread 'tests::test_minimum_1_2' (110318320) panicked at src/tests/fixtures.rs:3515:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation with signed integer] Validate Test 'boundary point is valid' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_1_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"
---- tests::test_minimum_1_6 stdout ----
TEST VALIDATE ERROR FOR 'ignores non-numbers': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_1_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]
thread 'tests::test_minimum_1_6' (110318324) panicked at src/tests/fixtures.rs:3539:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation with signed integer] Validate Test 'ignores non-numbers' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_1_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"
failures:
tests::test_minimum_0_0
tests::test_minimum_0_1
tests::test_minimum_0_3
tests::test_minimum_1_0
tests::test_minimum_1_1
tests::test_minimum_1_2
tests::test_minimum_1_3
tests::test_minimum_1_6
test result: FAILED. 3 passed; 8 failed; 0 ignored; 0 measured; 1347 filtered out; finished in 0.00s
error: test failed, to rerun pass `--lib`


@@ -1,23 +0,0 @@
Compiling jspg v0.1.0 (/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg)
Finished `test` profile [unoptimized + debuginfo] target(s) in 7.59s
Running unittests src/lib.rs (target/debug/deps/jspg-d3f18ff3a7e2b386)
running 1 test
test tests::test_merge_0_0 ... FAILED
failures:
---- tests::test_merge_0_0 stdout ----
TEST VALIDATE ERROR FOR 'valid with both properties': Expected success: true, Got: false. Actual Errors: [Error { code: "MISSING_TYPE", message: "Schema mechanically requires type discrimination 'base_0'", details: ErrorDetails { path: Some(""), cause: None, context: None, schema: None } }]
thread 'tests::test_merge_0_0' (110369726) panicked at src/tests/fixtures.rs:4307:54:
called `Result::unwrap()` on an `Err` value: "[merging: properties accumulate] Validate Test 'valid with both properties' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"MISSING_TYPE\", message: \"Schema mechanically requires type discrimination 'base_0'\", details: ErrorDetails { path: Some(\"\"), cause: None, context: None, schema: None } }]"
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
failures:
tests::test_merge_0_0
test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 1357 filtered out; finished in 0.00s
error: test failed, to rerun pass `--lib`


@@ -1 +1 @@
-1.0.126
+1.0.129