Compare commits
35 Commits
| SHA1 |
|---|
| 146efaa2d9 |
| d0294eec3f |
| 02ab4b6438 |
| 2a8b991269 |
| ce9c9baac9 |
| 3034406706 |
| 3d918a1acc |
| 1f9b407074 |
| 6ea6007d86 |
| c129864c89 |
| 777fc8bbf8 |
| 803d62b2fb |
| 8845dcdef2 |
| 40e08cbf09 |
| c7372891d8 |
| 952c5036be |
| 1fb378def2 |
| 6cc4f4ad86 |
| ba5079fb73 |
| 98e7f5da12 |
| 9599b4cbc3 |
| f51799f0b1 |
| c8757e1709 |
| e45265b242 |
| ec867f142f |
| e9b5c82809 |
| 628471e5d5 |
| 0093aea790 |
| 45ebc57e0c |
| 00319b570b |
| 9ab3689808 |
| bff0884ad2 |
| 4e2cb488cc |
| a1e6ac8cb0 |
| 120f488d93 |
Cargo.lock (generated): 1 change
@@ -1663,6 +1663,7 @@ version = "1.0.149"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "83fc039473c5595ace860d8c4fafa220ff474b3fc6bfdb4293327f1a37e94d86"
 dependencies = [
+ "indexmap",
 "itoa",
 "memchr",
 "serde",
Cargo.toml

@@ -6,7 +6,7 @@ edition = "2024"
 [dependencies]
 pgrx = "0.16.1"
 serde = { version = "1.0.228", features = ["derive", "rc"] }
-serde_json = "1.0.149"
+serde_json = { version = "1.0.149", features = ["preserve_order"] }
 lazy_static = "1.5.0"
 once_cell = "1.21.3"
 ahash = "0.8.12"
@@ -30,7 +30,7 @@ pgrx-tests = "0.16.1"

 [build-dependencies]
 serde = { version = "1.0.228", features = ["derive"] }
-serde_json = "1.0.149"
+serde_json = { version = "1.0.149", features = ["preserve_order"] }

 [lib]
 crate-type = ["cdylib", "lib"]
GEMINI.md: 77 changes
@@ -28,7 +28,7 @@ These functions operate on the global `GLOBAL_JSPG` engine instance and provide

 * `jspg_setup(database jsonb) -> jsonb`: Initializes the engine. Deserializes the full database schema registry (types, enums, puncs, relations) from Postgres and compiles them into memory atomically.
 * `jspg_teardown() -> jsonb`: Clears the current session's engine instance from `GLOBAL_JSPG`, resetting the cache.
-* `jspg_schemas() -> jsonb`: Exports the fully compiled AST snapshot (including all inherited dependencies) out of `GLOBAL_JSPG` into standard JSON Schema representations.
+* `jspg_database() -> jsonb`: Exports the fully compiled snapshot of the database registry (including Types, Puncs, Enums, and Relations) out of `GLOBAL_JSPG` into standard JSON Schema representations.

 ---

@@ -36,6 +36,17 @@ These functions operate on the global `GLOBAL_JSPG` engine instance and provide

 JSPG augments standard JSON Schema 2020-12 to provide an opinionated, strict, and highly ergonomic Object-Oriented paradigm. Developers defining Punc Data Models should follow these conventions.

+### Realms (Topological Boundaries)
+JSPG strictly organizes schemas into three distinct topological boundaries called **Realms** to prevent cross-contamination and ensure secure API generation:
+* **Type Realm (`database.types`)**: Represents physical Postgres tables or structural JSONB bubbles. Table-backed entities here are strictly evaluated for their `type` or `kind` discriminators if they possess polymorphic variations.
+* **Punc Realm (`database.puncs`)**: Represents API endpoint Contracts (functions). Contains strictly `.request` and `.response` shapes. These cannot be inherited by standard data models.
+* **Enum Realm (`database.enums`)**: Represents simple restricted value lists. Handled universally across all lookups.
+
+The core execution engines natively enforce these boundaries:
+* **Validator**: Routes dynamically using a single schema key, transparently switching domains to validate Punc requests/responses from the `Punc` realm, or raw instance payloads from the `Type` realm.
+* **Merger**: Strictly bounded to the `Type` Realm. It is philosophically impossible and mathematically illegal to attempt to UPSERT an API endpoint.
+* **Queryer**: Routes recursively. Safely evaluates API boundary inputs directly from the `Punc` realm, while tracing underlying table targets back through the `Type` realm to physically compile SQL `SELECT` statements.
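The three-realm routing described above can be sketched as a simple lookup cascade. This is an illustrative, stdlib-only sketch; the `Realm` enum, the `route` function, and checking Punc keys first are assumptions, not the engine's actual API.

```rust
use std::collections::HashMap;

// Hypothetical sketch: routing one schema key across the three Realms.
#[derive(Debug, PartialEq)]
enum Realm {
    Type,
    Punc,
    Enum,
}

fn route(
    key: &str,
    types: &HashMap<&str, ()>,
    puncs: &HashMap<&str, ()>,
    enums: &HashMap<&str, ()>,
) -> Option<Realm> {
    // Punc keys carry `.request` / `.response` shapes, so check them first.
    if puncs.contains_key(key) {
        return Some(Realm::Punc);
    }
    if enums.contains_key(key) {
        return Some(Realm::Enum);
    }
    if types.contains_key(key) {
        return Some(Realm::Type);
    }
    None
}

fn main() {
    let types = HashMap::from([("person", ())]);
    let puncs = HashMap::from([("create_person.request", ())]);
    let enums = HashMap::from([("gender", ())]);
    assert_eq!(route("person", &types, &puncs, &enums), Some(Realm::Type));
    assert_eq!(route("gender", &types, &puncs, &enums), Some(Realm::Enum));
    assert_eq!(route("missing", &types, &puncs, &enums), None);
}
```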

 ### Types of Types
 * **Table-Backed (Entity Types)**: Primarily defined in root `types` schemas. These represent physical Postgres tables.
   * They are implicitly registered in the Global Registry using their precise key name mapped from the database compilation phase.
@@ -45,12 +56,14 @@ JSPG augments standard JSON Schema 2020-12 to provide an opinionated, strict, an
 * **Global Schema Registration**: Roots must be attached to the top-level keys mapped from the `types`, `enums`, or `puncs` database tables.
 * They can re-use the standard `type` discriminator locally for `oneOf` polymorphism without conflicting with global Postgres Table constraints.

-### Discriminators & The Dot Convention (A.B)
-In Punc, polymorphic targets like explicit tagged unions or STI (Single Table Inheritance) rely on discriminators. Because Punc favors universal consistency, a schema's data contract must be explicit and mathematically identical regardless of the routing context an endpoint consumes it through.
+### Discriminators & The `<Variant>.<Base>` Convention
+In Punc, polymorphic targets like explicit tagged unions or STI (Single Table Inheritance) rely on discriminators. The system heavily leverages a standard `<Variant>.<Base>` dot-notation to enforce topological boundaries deterministically.

-**The 2-Tier Paradigm**: The system inherently prevents "God Tables" by restricting routing to exactly two dimensions, guaranteeing absolute $O(1)$ lookups without ambiguity:
-1. **Vertical Routing (`type`)**: Identifies the specific Postgres Table lineage (e.g. `person` vs `organization`).
-2. **Horizontal Routing (`kind.type`)**: Natively evaluates Single Table Inheritance. The runtime dynamically concatenates `$kind.$type` to yield the namespace-protected schema key (e.g. `light.person`), maintaining collision-free schema registration.
+**The 2-Tier Paradigm**: The system prevents "God Tables" by restricting routing to exactly two dimensions, guaranteeing absolute $O(1)$ lookups without ambiguity:
+1. **Base (Vertical Routing)**: Represents the core physical lineage or foundational structural boundary. For entities, this is the table `type` (e.g. `person` or `widget`). For composed schemas, this is the root structural archetype (e.g., `filter`).
+2. **Variant (Horizontal Routing)**: Represents the specific contextual projection or runtime mutation applied to the Base. For STI entities, this is the `kind` (e.g., `light`, `heavy`, `stock`). For composed filters, the variant identifies the entity it targets (e.g., `person`, `invoice`).
+
+When an object is evaluated for STI polymorphism, the runtime natively extracts its `$kind` and `$type` values, dynamically concatenating them as `<Variant>.<Base>` (e.g. `light.person` or `stock.widget`) to yield the namespace-protected schema key.

 Therefore, any schema that participates in polymorphic discrimination MUST explicitly define its discriminator properties natively inside its `properties` block. However, to stay DRY and maintain flexible APIs, you **DO NOT** need to hardcode `const` values, nor should you add them to your `required` array. The Punc engine treats `type` and `kind` as **magic properties**.
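The key composition described above can be sketched as follows. This is a hedged, stdlib-only sketch: `sti_schema_key` and its payload shape are illustrative names, not the engine's real API.

```rust
use std::collections::HashMap;

// Sketch: extract the magic `kind`/`type` discriminators from a payload
// and compose the <Variant>.<Base> registry key.
fn sti_schema_key(payload: &HashMap<&str, &str>) -> Option<String> {
    let kind = payload.get("kind")?;
    let base = payload.get("type")?;
    Some(format!("{kind}.{base}"))
}

fn main() {
    let row = HashMap::from([("kind", "light"), ("type", "person")]);
    assert_eq!(sti_schema_key(&row), Some("light.person".to_string()));
    // Missing discriminators yield no key, so routing can fail fast.
    assert_eq!(sti_schema_key(&HashMap::new()), None);
}
```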

@@ -80,6 +93,7 @@ Punc completely abandons the standard JSON Schema `$ref` keyword. Instead, it ov
 * **Implicit Keyword Shadowing**: Unlike standard JSON Schema inheritance, local property definitions natively override and shadow inherited properties.
 * **Primitive Array Shorthand (Optionality)**: The `type` array syntax is heavily optimized for nullable fields. Defining `"type": ["budget", "null"]` natively builds a nullable struct, generating `Budget? budget;` in Dart. You can freely mix primitives like `["string", "number", "null"]`.
 * **Strict Array Constraint**: To explicitly prevent mathematically ambiguous Multiple Inheritance, a `type` array is strictly constrained to at most **ONE** Custom Object Pointer. Defining `"type": ["person", "organization"]` will intentionally trigger a fatal database compilation error natively instructing developers to build a proper tagged union (`oneOf`) instead.
+* **Dynamic Type Bindings (`"$sibling.[suffix]"`)**: If a `type` string begins with a `$` (e.g., `"type": "$kind.filter"`), the JSPG engine treats it as a Dynamic Pointer. During compile time, it safely defers boundary checks. During runtime validation, the engine dynamically reads the literal string value of the referenced sibling property (`kind`) on the *current parent JSON object*, evaluates the substitution (e.g., `"person.filter"`), and instantly routes execution to that schema in $O(1)$ time. This enables incredibly powerful dynamic JSONB shapes (like a generic `filter` column inside a `search` table) without forcing downstream code generators to build unmaintainable unions.
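The `$sibling.[suffix]` substitution step can be sketched like this. A minimal sketch under stated assumptions: `resolve_dynamic_type` and the flat parent map are hypothetical; the real engine operates on JSONB values.

```rust
use std::collections::HashMap;

// Sketch of the "$sibling.[suffix]" Dynamic Pointer substitution.
fn resolve_dynamic_type(
    declared: &str,               // e.g. "$kind.filter"
    parent: &HashMap<&str, &str>, // the current parent JSON object
) -> Option<String> {
    let rest = declared.strip_prefix('$')?;
    let (sibling, suffix) = rest.split_once('.')?;
    // Read the literal string value of the sibling property at runtime.
    let value = parent.get(sibling)?;
    Some(format!("{value}.{suffix}"))
}

fn main() {
    let parent = HashMap::from([("kind", "person")]);
    assert_eq!(
        resolve_dynamic_type("$kind.filter", &parent),
        Some("person.filter".to_string())
    );
    // An absent sibling means resolution fails, mirroring the
    // DYNAMIC_TYPE_RESOLUTION_FAILED error in the fixtures below.
    assert_eq!(resolve_dynamic_type("$kind.filter", &HashMap::new()), None);
}
```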

 ### Polymorphism (`family` and `oneOf`)
 Polymorphism is how an object boundary can dynamically take on entirely different shapes based on the payload provided at runtime. Punc utilizes the static database metadata generated from Postgres (`db.types`) to enforce these boundaries deterministically, rather than relying on ambiguous tree-traversals.

@@ -92,11 +106,15 @@ Polymorphism is how an object boundary can dynamically take on entirely differen
 * **Scenario B: Prefixed Tables (Vertical Projection)**
   * *Setup*: `{ "family": "light.organization" }`
   * *Execution*: The engine sees the prefix `light.` and base `organization`. It queries `db.types.get("organization").variations` and dynamically prepends the prefix to discover the relevant UI schemas.
-  * *Options*: `person` -> `light.person`, `organization` -> `light.organization`. (If a projection like `light.bot` does not exist in `db.schemas`, it is safely ignored).
+  * *Options*: `person` -> `light.person`, `organization` -> `light.organization`. (If a projection like `light.bot` does not exist in the Type Registry, it is safely ignored).
+* **Scenario C: Single Table Inheritance (Horizontal Routing)**
+  * *Setup*: `{ "family": "widget" }` (Where `widget` is a table type but has no external variations).
+  * *Execution*: The engine queries `db.types.get("widget").variations` and finds only `["widget"]`. Since it lacks table inheritance, it is treated as STI. The engine scans the specific, confined `schemas` array directly under `db.types.get("widget")` for any registered key terminating in the base `.widget` (e.g., `stock.widget`). The `family` automatically uses `kind` as the discriminator.
+  * *Options*: `stock` -> `stock.widget`, `tasks` -> `tasks.widget`.
 * **Scenario D: JSONB Bubble Inheritance (Field-Backed)**
   * *Setup*: `{ "family": "panel" }` (Where `panel` is NOT a table type, but rather an isolated JSONB boundary defined within another table's `schemas`).
   * *Execution*: The engine observes `panel` is not in `db.types` (because it has no physical table). It falls back to scanning the global `db.schemas` registry for any registered key terminating in the base `.panel` (e.g., `balance.panel`, `units.panel`). The `family` automatically uses `kind` as the discriminator.
   * *Options*: `balance` -> `balance.panel`, `units` -> `units.panel`.

 * **`oneOf` (Strict Tagged Unions)**: A hardcoded list of candidate schemas. Unlike `family` which relies on global DB metadata, `oneOf` forces pure mathematical structural evaluation of the provided candidates. It strictly bans typical JSON Schema "Union of Sets" fallback searches. Every candidate MUST possess a mathematically unique discriminator payload to allow $O(1)$ routing.
   * **Disjoint Types**: `oneOf: [{ "type": "person" }, { "type": "widget" }]`. The engine succeeds because the native `type` acts as a unique discriminator (`"person"` vs `"widget"`).
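The prefixed-table expansion of Scenario B can be sketched as follows. All names here are illustrative, stdlib-only assumptions; the real engine works against the compiled `db.types` registry, not plain maps.

```rust
use std::collections::{HashMap, HashSet};

// Sketch of Scenario B: expand a prefixed family target into concrete
// schema keys, silently dropping projections that are not registered.
fn family_options(
    family: &str,                          // e.g. "light.organization"
    variations: &HashMap<&str, Vec<&str>>, // base type -> physical variations
    registry: &HashSet<&str>,              // registered schema keys
) -> Vec<String> {
    let (prefix, base) = match family.split_once('.') {
        Some((p, b)) => (Some(p), b),
        None => (None, family),
    };
    let vars = match variations.get(base) {
        Some(v) => v,
        None => return vec![],
    };
    vars.iter()
        .map(|v| match prefix {
            Some(p) => format!("{p}.{v}"),
            None => (*v).to_string(),
        })
        .filter(|key| registry.contains(key.as_str()))
        .collect()
}

fn main() {
    let variations =
        HashMap::from([("organization", vec!["organization", "person", "bot"])]);
    let registry = HashSet::from(["light.organization", "light.person"]);
    // "light.bot" is not registered, so it is safely ignored.
    assert_eq!(
        family_options("light.organization", &variations, &registry),
        vec!["light.organization".to_string(), "light.person".to_string()]
    );
}
```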
@@ -152,6 +170,17 @@ It evaluates as an **Independent Declarative Rules Engine**. Every `Case` block
 ### Format Leniency for Empty Strings
 To simplify frontend form validation, format validators specifically for `uuid`, `date-time`, and `email` explicitly allow empty strings (`""`), treating them as "present but unset".

+### Filters & Conditions
+In the Punc architecture, filters are automatically synthesized, strongly-typed JSON Schema boundaries that dictate the exact querying capabilities for any given entity or enum. They are completely generated for you; you never write them manually.
+
+* **Conditions**: A condition schema is the contract defining the mathematical operations allowed on a primitive field. For example, a `string.condition` allows `$eq`, `$ne`, `$gt`, `$gte`, `$lt`, `$lte`, `$of` (IN), and `$nof` (NOT IN).
+* **Enum Conditions**: When JSPG synthesizes an enum, it dynamically generates an `<enum>.condition` (e.g., `address_kind.condition`). This strongly-typed condition perfectly mirrors the operations of a `string.condition`, but strictly limits the arrays and inputs of `$eq`, `$ne`, `$of`, and `$nof` to the exact variations defined by that Enum. This context ensures that UI generators know exactly when to render `<Select>` dropdowns instead of generic `<Text>` boxes.
+* **Filters**: A filter schema (e.g., `person.filter`) is an object containing condition properties used to filter entities. It natively supports structural composition:
+  * **Inherited Properties**: Filters automatically inherit all valid database columns from their base type schema, immediately converting them to their respective `.condition` schemas.
+  * **Relational Proxies**: If a table has a foreign key to another table, the filter automatically generates a proxy property pointing to the related entity's filter (e.g., the `person` filter automatically gains an `organization` property that points to `organization.filter`), allowing infinitely deep nested queries natively.
+  * **Logical Operators (`$and`, `$or`)**: Every filter automatically includes `$and` and `$or` arrays, which recursively accept the exact same filter schema, allowing complex logical grouping.
+  * **Ad-Hoc Extensions (`ad_hoc`)**: Fields stored purely in JSONB bubbles that lack formal database columns can still be queried using the `ad_hoc` object, which passes standard, unvalidated string conditions.

 ---

 ## 3. Database
@@ -171,7 +200,7 @@ When compiling nested object graphs or arrays, the JSPG engine must dynamically
 ### Subschema Promotion
 To seamlessly support deeply nested Object and Array structures, JSPG aggressively promotes them to standalone topological entities during the database compilation phase.
 * **Path Generation:** While evaluating a unified graph originating from a base `types`, `enums`, or `puncs` key, the compiler tracks its exact path descent into nested objects and arrays. It dynamically calculates a localized alias string by appending a `/` pathing syntax (e.g., `base_schema_key/nested/path`) representing exactly its structural constraints.
-* **Promotion:** This nested subschema chunk is mathematically elevated to its own independent key in the `db.schemas` cache registry using its full path. This guarantees that $O(1)$ WebSockets or isolated queries can natively target any arbitrary nested sub-object of a massive database topology directly without recursively re-parsing its parent's AST block every read. Note that you cannot use the `type` attribute to statically inherit from these automatically promoted subschemas.
+* **Promotion:** This nested subschema chunk is mathematically elevated to an independent subschema entry natively within its parent's internal scope (e.g. inside `db.types.get("base").schemas`) using its full path. This guarantees that $O(1)$ WebSockets or isolated queries can natively target any arbitrary nested sub-object of a massive database topology directly without recursively re-parsing its parent's AST block every read. Note that you cannot use the `type` attribute to statically inherit from these automatically promoted subschemas.
 * **Primitive Confinement:** Purely scalar or primitive branches (like `oneOf: [{type: "string"}, {type: "null"}]`) bypass global topological promotion. They are evaluated directly within the execution engine via isolated Tuple Indexes to explicitly protect the global DB Registry and Go Mixer from memory bloat.
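The `/` path-alias generation described under Path Generation can be sketched minimally. `promoted_alias` is a hypothetical name assuming the compiler tracks the descent as a simple segment list.

```rust
// Sketch: compute the promoted alias key for a nested subschema,
// e.g. base_schema_key/nested/path.
fn promoted_alias(base_key: &str, path: &[&str]) -> String {
    if path.is_empty() {
        base_key.to_string()
    } else {
        format!("{base_key}/{}", path.join("/"))
    }
}

fn main() {
    assert_eq!(
        promoted_alias("person", &["contacts", "address"]),
        "person/contacts/address"
    );
    // The root itself keeps its bare key.
    assert_eq!(promoted_alias("person", &[]), "person");
}
```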

 ---

@@ -224,19 +253,14 @@ The Merger provides an automated, high-performance graph synchronization engine.
 The Queryer transforms Postgres into a pre-compiled Semantic Query Engine, designed to serve the exact shape of Punc responses directly via SQL.

 ### API Reference
-* `jspg_query(schema_id text, filters jsonb) -> jsonb`: Compiles the JSON Schema AST of `schema_id` directly into pre-planned, nested multi-JOIN SQL execution trees. Processes `filters` structurally.
+* `jspg_query(schema_id text, filter jsonb) -> jsonb`: Compiles the JSON Schema AST of `schema_id` directly into pre-planned, nested multi-JOIN SQL execution trees. Processes the `filter` structurally.

 ### Core Features

 * **Caching Strategy (DashMap SQL Caching)**: The Queryer securely caches its compiled, static SQL string templates per schema permutation inside the `GLOBAL_JSPG` concurrent `DashMap`. This eliminates recursive AST schema crawling on consecutive requests. Furthermore, it evaluates the strings via Postgres SPI (Server Programming Interface) Prepared Statements, leveraging native database caching of execution plans for extreme performance.
 * **Schema-to-SQL Compilation**: Compiles JSON Schema ASTs spanning deep arrays directly into static, pre-planned SQL multi-JOIN queries. This explicitly features the `Smart Merge` evaluation engine which natively translates properties through `type` inheritances, mapping JSON fields specifically to their physical database table aliases during translation.
 * **Root Null-Stripping Optimization**: Unlike traditional nested document builders, the Queryer intelligently defers Postgres' natively recursive `jsonb_strip_nulls` execution to the absolute apex of the compiled query pipeline. The compiler organically layers millions of rapid `jsonb_build_object()` sub-query allocations instantly, wrapping them in a singular overarching pass. This strips all empty optionals uniformly before exiting the database, maximizing CPU throughput.
-* **Dynamic Filtering**: Binds parameters natively through `cue.filters` objects. The queryer enforces a strict, structured, MongoDB-style operator syntax to map incoming JSON request constraints directly to their originating structural table columns. Filters support both flat path notation (e.g., `"contacts/is_primary": {...}`) and deeply nested recursive JSON structures (e.g., `{"contacts": {"is_primary": {...}}}`). The queryer recursively traverses and flattens these structures at AST compilation time.
-  * **Equality / Inequality**: `{"$eq": value}`, `{"$ne": value}` automatically map to `=` and `!=`.
-  * **Comparison**: `{"$gt": ...}`, `{"$gte": ...}`, `{"$lt": ...}`, `{"$lte": ...}` directly compile to Postgres comparison operators (`>`, `>=`, `<`, `<=`).
-  * **Array Inclusion**: `{"$of": [values]}`, `{"$nof": [values]}` use native `jsonb_array_elements_text()` bindings to enforce `IN` and `NOT IN` logic without runtime SQL injection risks.
-  * **Text Matching (ILIKE)**: Evaluates `$eq` or `$ne` against string fields containing the `%` character natively into Postgres `ILIKE` and `NOT ILIKE` partial substring matches.
-  * **Type Casting**: Safely resolves dynamic combinations by casting values instantly into the physical database types mapped in the schema (e.g. parsing `uuid` bindings to `::uuid`, formatting DateTimes to `::timestamptz`, and numbers to `::numeric`).
+* **Dynamic Filter Execution**: Evaluates the structured `filter` payload and recursively traverses and flattens its paths at AST compilation time. It safely binds parameter constraints using standard operations (e.g., mapping `$eq` to `=`, `$of` to `IN`, `$gt` to `>`) and automatically casts values instantly into the physical database types mapped in the schema (e.g. parsing `uuid` bindings to `::uuid`, formatting DateTimes to `::timestamptz`, and numbers to `::numeric`). Text matching naturally evaluates `$eq` or `$ne` against string fields containing the `%` character natively into Postgres `ILIKE` and `NOT ILIKE`.
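The operator translation described above can be sketched as a pure mapping. This is a hedged, stdlib-only sketch (`sql_operator` is an invented name); the real binding happens through SPI prepared statements, and only the `%`-triggered ILIKE upgrade for string fields is modeled here.

```rust
// Sketch: map a MongoDB-style filter operator to its SQL counterpart.
fn sql_operator(op: &str, value_has_wildcard: bool) -> Option<&'static str> {
    match op {
        // A `%` in a string payload upgrades equality to ILIKE matching.
        "$eq" => Some(if value_has_wildcard { "ILIKE" } else { "=" }),
        "$ne" => Some(if value_has_wildcard { "NOT ILIKE" } else { "!=" }),
        "$gt" => Some(">"),
        "$gte" => Some(">="),
        "$lt" => Some("<"),
        "$lte" => Some("<="),
        "$of" => Some("IN"),
        "$nof" => Some("NOT IN"),
        _ => None, // unknown operators are rejected, not passed through
    }
}

fn main() {
    assert_eq!(sql_operator("$eq", false), Some("="));
    assert_eq!(sql_operator("$eq", true), Some("ILIKE"));
    assert_eq!(sql_operator("$nof", false), Some("NOT IN"));
    assert_eq!(sql_operator("$unknown", false), None);
}
```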
 * **Polymorphic SQL Generation (`family`)**: Compiles `family` properties by analyzing the **Physical Database Variations**, *not* the schema descendants.
   * **The Dot Convention**: When a schema requests `family: "target.schema"`, the compiler extracts the base type (e.g. `schema`) and looks up its Physical Table definition.
   * **Multi-Table Branching**: If the Physical Table is a parent to other tables (e.g. `organization` has variations `["organization", "bot", "person"]`), the compiler generates a dynamic `CASE WHEN type = '...' THEN ...` query, expanding into sub-queries for each variation. To ensure safe resolution, the compiler dynamically evaluates correlation boundaries: it attempts standard Relational Edge discovery first. If no explicit relational edge exists (indicating pure Table Inheritance rather than a standard foreign-key graph relationship), it safely invokes a **Table Parity Fallback**. This generates an explicit ID correlation constraint (`AND inner.id = outer.id`), perfectly binding the structural variations back to the parent row to eliminate Cartesian products.
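The multi-table branching shape can be sketched as one `CASE WHEN` arm per physical variation. Everything here is an assumption for illustration: the `build_<variation>(inner)` sub-query placeholder and the exact SQL shape are invented, not the compiler's real output.

```rust
// Sketch: emit one CASE WHEN arm per physical table variation.
fn family_case_sql(discriminator: &str, variations: &[&str]) -> String {
    let arms: Vec<String> = variations
        .iter()
        .map(|v| format!("WHEN {discriminator} = '{v}' THEN build_{v}(inner)"))
        .collect();
    format!("CASE {} ELSE NULL END", arms.join(" "))
}

fn main() {
    let sql = family_case_sql("type", &["organization", "bot", "person"]);
    assert!(sql.starts_with("CASE WHEN type = 'organization'"));
    assert!(sql.contains("WHEN type = 'person' THEN build_person(inner)"));
    assert!(sql.ends_with("ELSE NULL END"));
}
```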

@@ -261,3 +285,26 @@ JSPG abandons the standard `cargo pgrx test` model in favor of native OS testing
 3. **Modular Test Dispatcher**: The `src/tests/types/` module deserializes the abstract JSON test payloads into `Suite`, `Case`, and `Expect` data structures.
    * The `compile` action natively asserts the exact output shape of `jspg_stems`, allowing structural and relationship mapping logic to be tested purely through JSON without writing brute-force manual tests in Rust.
 4. **Unit Context Execution**: When `cargo test` executes, the runner iterates the JSON payloads. Because the tests run natively inside the module via `#[cfg(test)]`, the Rust compiler globally erases `pgrx` C-linkage, instantiates the `MockExecutor`, and allows for pure structural evaluation of complex database logic completely in memory in parallel.
+
+### SQL Expectation Formatting & Auto-Variablization
+
+Because JSPG SQL compilation generates large, complex relational statements (often featuring dynamically generated UUIDs or timestamps), manually updating expected SQL strings in the test fixtures is error-prone and tedious. To streamline this, JSPG includes a built-in intelligent test fixture formatter.
+
+**When to use it:**
+Whenever you modify the internal SQL generation logic (in the Queryer or Merger) and need to update the expected SQL outputs across the entire test suite.
+
+**How to run it:**
+Run the test suite sequentially while passing the `UPDATE_EXPECT=1` environment variable:
+```bash
+UPDATE_EXPECT=1 cargo test -- --test-threads=1
+```
+*Note: The `--test-threads=1` flag is strictly required to prevent parallel tests from concurrently overwriting the same JSON fixture files and corrupting them.*
+
+**How it works (Intelligent Variablization):**
+The JSPG engine natively generates actual, random UUIDs in memory for records inserted during `merger` tests. To assert relational integrity without hardcoding ephemeral random strings, the formatter utilizes an intelligent variable extraction map:
+1. **Payload Extraction**: Before evaluating the SQL output, the test runner recursively scans the JSON of the `data` and `mocks` blocks for that specific test case. It maps any physical UUID it finds to its exact JSON path (e.g., `3333...` -> `mocks.0.id`).
+2. **SQL Canonicalization**: The test runner utilizes `sqlparser` to format the raw engine SQL into pristine, multi-line readable structures.
+3. **Variable Mapping**: It scans the formatted SQL using regex for UUIDs. If it encounters a UUID matching the payload extraction map, it replaces it with a template tag like `{{uuid:mocks.0.id}}` or `{{uuid:data.customer_id}}`.
+4. **Generated Fallbacks**: If it encounters a brand-new random UUID that wasn't provided in the inputs (e.g., a newly generated ID for an `INSERT`), it assigns it a sequential tracking variable like `{{uuid:generated_0}}`. Every subsequent appearance of that *exact* same random UUID in the SQL transaction will reuse the `{{uuid:generated_0}}` tag. Timestamps are naturally replaced with `{{timestamp}}`.
+
+This guarantees the `assert_pattern` execution engine can strictly validate that the exact same ID generated for a parent entity is correctly passed as a foreign key to its children across complex database transactions.
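The variablization pass can be sketched roughly as follows. This stdlib-only sketch is an assumption: the real formatter uses regex and `sqlparser`, while here a naive split on single quotes stands in for literal detection, and `is_uuid`/`variablize` are invented names.

```rust
use std::collections::HashMap;

// Sketch: check the canonical 8-4-4-4-12 hex grouping of a UUID.
fn is_uuid(s: &str) -> bool {
    let groups = [8usize, 4, 4, 4, 12];
    let parts: Vec<&str> = s.split('-').collect();
    parts.len() == 5
        && parts.iter().zip(groups).all(|(p, n)| {
            p.len() == n && p.chars().all(|c| c.is_ascii_hexdigit())
        })
}

// Sketch: known UUIDs map to their JSON paths; unseen ones receive
// sequential {{uuid:generated_N}} tags, reused on every reappearance.
fn variablize(sql: &str, known: &HashMap<&str, &str>) -> String {
    let mut generated: HashMap<String, usize> = HashMap::new();
    sql.split('\'')
        .enumerate()
        .map(|(i, chunk)| {
            // Odd chunks sit between single quotes, i.e. SQL string literals.
            if i % 2 == 1 && is_uuid(chunk) {
                if let Some(path) = known.get(chunk) {
                    format!("{{{{uuid:{path}}}}}")
                } else {
                    let next = generated.len();
                    let n = *generated.entry(chunk.to_string()).or_insert(next);
                    format!("{{{{uuid:generated_{n}}}}}")
                }
            } else {
                chunk.to_string()
            }
        })
        .collect::<Vec<_>>()
        .join("'")
}

fn main() {
    let known =
        HashMap::from([("33333333-3333-3333-3333-333333333333", "mocks.0.id")]);
    let sql = "VALUES ('33333333-3333-3333-3333-333333333333', \
               '99999999-9999-4999-8999-999999999999')";
    let out = variablize(sql, &known);
    assert!(out.contains("{{uuid:mocks.0.id}}"));
    assert!(out.contains("{{uuid:generated_0}}"));
}
```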
fixtures/dynamicType.json (new file, 155 lines)
@@ -0,0 +1,155 @@
```json
[
  {
    "description": "Dynamic type binding ($sibling.suffix) validation",
    "database": {
      "types": [
        {
          "name": "person",
          "schemas": {
            "person.filter": {
              "properties": {
                "age": { "type": "integer" }
              }
            }
          }
        },
        {
          "name": "widget",
          "schemas": {
            "widget.filter": {
              "properties": {
                "weight": { "type": "integer" }
              }
            }
          }
        },
        {
          "name": "search",
          "schemas": {
            "search": {
              "properties": {
                "kind": { "type": "string" },
                "filter": { "type": "$kind.filter" }
              }
            }
          }
        }
      ]
    },
    "tests": [
      {
        "description": "Valid person filter payload",
        "data": { "kind": "person", "filter": { "age": 30 } },
        "schema_id": "search",
        "action": "validate",
        "expect": { "success": true }
      },
      {
        "description": "Invalid person filter payload (fails constraint)",
        "data": { "kind": "person", "filter": { "age": "thirty" } },
        "schema_id": "search",
        "action": "validate",
        "expect": {
          "success": false,
          "errors": [
            { "code": "INVALID_TYPE", "details": { "path": "filter/age" } }
          ]
        }
      },
      {
        "description": "Valid widget filter payload",
        "data": { "kind": "widget", "filter": { "weight": 500 } },
        "schema_id": "search",
        "action": "validate",
        "expect": { "success": true }
      },
      {
        "description": "Fails resolution if kind doesn't match an existing schema",
        "data": { "kind": "unknown", "filter": { "weight": 500 } },
        "schema_id": "search",
        "action": "validate",
        "expect": {
          "success": false,
          "errors": [
            { "code": "DYNAMIC_TYPE_RESOLUTION_FAILED", "details": { "path": "filter" } },
            { "code": "STRICT_PROPERTY_VIOLATION", "details": { "path": "filter/weight" } }
          ]
        }
      },
      {
        "description": "Fails resolution if discriminator is missing",
        "data": { "filter": { "weight": 500 } },
        "schema_id": "search",
        "action": "validate",
        "expect": {
          "success": false,
          "errors": [
            { "code": "DYNAMIC_TYPE_RESOLUTION_FAILED", "details": { "path": "filter" } },
            { "code": "STRICT_PROPERTY_VIOLATION", "details": { "path": "filter/weight" } }
          ]
        }
      }
    ]
  }
]
```
@@ -3,7 +3,24 @@
   "description": "Filter Synthesis Object-Oriented Composition",
   "database": {
     "puncs": [],
-    "enums": [],
+    "enums": [
+      {
+        "id": "enum1",
+        "name": "gender",
+        "module": "core",
+        "source": "gender",
+        "schemas": {
+          "gender": {
+            "type": "string",
+            "enum": [
+              "male",
+              "female",
+              "other"
+            ]
+          }
+        }
+      }
+    ],
     "relations": [
       {
         "id": "rel1",
@@ -46,6 +63,9 @@
         "billing_address": {
           "type": "address"
         },
+        "gender": {
+          "type": "gender"
+        },
         "birth_date": {
           "type": "string",
           "format": "date-time"
@@ -107,11 +127,14 @@
     "search": {
       "type": "object",
       "properties": {
+        "kind": {
+          "type": "string"
+        },
         "name": {
           "type": "string"
         },
         "filter": {
-          "type": "filter"
+          "type": "$kind.filter"
         }
       }
     },
@@ -170,55 +193,117 @@
       "expect": {
         "success": true,
         "schemas": {
+          "gender": {},
+          "gender.condition": {
+            "type": "condition",
+            "compiledPropertyNames": [
+              "kind",
+              "$eq",
+              "$ne",
+              "$of",
+              "$nof"
+            ],
+            "properties": {
+              "$eq": {
+                "type": [
+                  "gender",
+                  "null"
+                ]
+              },
+              "$ne": {
+                "type": [
+                  "gender",
+                  "null"
+                ]
+              },
+              "$nof": {
+                "type": [
+                  "array",
+                  "null"
+                ],
+                "items": {
+                  "type": "gender"
+                }
+              },
+              "$of": {
+                "type": [
+                  "array",
+                  "null"
+                ],
+                "items": {
+                  "type": "gender"
+                }
+              }
+            }
+          },
           "person": {},
           "person.filter": {
             "type": "filter",
             "compiledPropertyNames": [
-              "$and",
-              "$or",
+              "first_name",
               "age",
               "billing_address",
+              "gender",
               "birth_date",
-              "first_name"
+              "tags",
+              "ad_hoc",
+              "$and",
+              "$or"
             ],
             "properties": {
               "$and": {
+                "items": {
+                  "compiledPropertyNames": [
+                    "first_name",
+                    "age",
+                    "billing_address",
+                    "gender",
+                    "birth_date",
+                    "tags",
+                    "ad_hoc",
+                    "$and",
+                    "$or"
+                  ],
+                  "type": "person.filter"
+                },
                 "type": [
                   "array",
                   "null"
                 ],
-                "items": {
-                  "compiledPropertyNames": [
-                    "$and",
-                    "$or",
-                    "age",
-                    "billing_address",
-                    "birth_date",
-                    "first_name"
-                  ],
-                  "type": "person.filter"
-                }
               },
               "$or": {
+                "items": {
+                  "compiledPropertyNames": [
+                    "first_name",
+                    "age",
+                    "billing_address",
+                    "gender",
+                    "birth_date",
+                    "tags",
+                    "ad_hoc",
+                    "$and",
+                    "$or"
+                  ],
+                  "type": "person.filter"
+                },
                 "type": [
                   "array",
                   "null"
                 ],
-                "items": {
-                  "compiledPropertyNames": [
-                    "$and",
-                    "$or",
-                    "age",
-                    "billing_address",
-                    "birth_date",
-                    "first_name"
-                  ],
-                  "type": "person.filter"
-                }
               },
-              "first_name": {
+              "ad_hoc": {
+                "compiledPropertyNames": [
+                  "foo"
+                ],
+                "properties": {
+                  "foo": {
+                    "type": [
+                      "string.condition",
+                      "null"
+                    ]
+                  }
+                },
                 "type": [
-                  "string.condition",
+                  "filter",
                   "null"
                 ]
              },
@@ -239,16 +324,35 @@
                   "date.condition",
                   "null"
                 ]
               },
+              "first_name": {
+                "type": [
+                  "string.condition",
+                  "null"
+                ]
+              },
+              "gender": {
+                "type": [
+                  "gender.condition",
+                  "null"
+                ]
+              },
+              "tags": {
+                "type": [
+                  "string.condition",
+                  "null"
+                ]
+              }
             }
           },
           "type": "filter"
         },
           "address": {},
           "address.filter": {
             "type": "filter",
             "compiledPropertyNames": [
+              "city",
               "$and",
-              "$or",
-              "city"
+              "$or"
             ],
             "properties": {
               "$and": {
@@ -258,9 +362,9 @@
                 ],
                 "items": {
                   "compiledPropertyNames": [
+                    "city",
                     "$and",
-                    "$or",
-                    "city"
+                    "$or"
                   ],
                   "type": "address.filter"
                 }
@@ -272,9 +376,9 @@
                 ],
                 "items": {
                   "compiledPropertyNames": [
+                    "city",
                     "$and",
-                    "$or",
-                    "city"
+                    "$or"
                   ],
                   "type": "address.filter"
                 }
@@ -287,8 +391,8 @@
               }
             }
           },
-          "filter": {},
+          "condition": {},
           "filter": {},
           "string.condition": {},
           "integer.condition": {},
           "date.condition": {},
@@ -296,10 +400,11 @@
           "search.filter": {
             "type": "filter",
             "compiledPropertyNames": [
-              "$and",
-              "$or",
+              "kind",
+              "name",
+              "filter",
-              "name"
+              "$and",
+              "$or"
             ],
             "properties": {
               "$and": {
@@ -309,10 +414,11 @@
                 ],
                 "items": {
                   "compiledPropertyNames": [
-                    "$and",
-                    "$or",
+                    "kind",
+                    "name",
"filter",
|
||||
"name"
|
||||
"$and",
|
||||
"$or"
|
||||
],
|
||||
"type": "search.filter"
|
||||
}
|
||||
@ -324,17 +430,24 @@
|
||||
],
|
||||
"items": {
|
||||
"compiledPropertyNames": [
|
||||
"$and",
|
||||
"$or",
|
||||
"kind",
|
||||
"name",
|
||||
"filter",
|
||||
"name"
|
||||
"$and",
|
||||
"$or"
|
||||
],
|
||||
"type": "search.filter"
|
||||
}
|
||||
},
|
||||
"filter": {
|
||||
"type": [
|
||||
"filter.filter",
|
||||
"$kind.filter.filter",
|
||||
"null"
|
||||
]
|
||||
},
|
||||
"kind": {
|
||||
"type": [
|
||||
"string.condition",
|
||||
"null"
|
||||
]
|
||||
},
|
||||
|
||||
2030
fixtures/merger.json
@ -1,6 +1,6 @@
[
{
"description": "Vertical family Routing (Across Tables)",
"description": "Vertical family Routing (Scenario A)",
"database": {
"types": [
{
@ -153,7 +153,7 @@
]
},
{
"description": "Matrix family Routing (Vertical + Horizontal Intersections)",
"description": "Matrix family Routing (Scenario B)",
"database": {
"types": [
{
@ -284,7 +284,7 @@
]
},
{
"description": "Horizontal family Routing (Virtual Variations)",
"description": "Horizontal family Routing (Scenario C)",
"database": {
"types": [
{
@ -776,5 +776,123 @@
}
}
]
},
{
"description": "JSONB Field Bubble family Routing (Scenario D)",
"database": {
"types": [
{
"name": "dashboard",
"variations": [
"dashboard"
],
"schemas": {
"dashboard": {
"type": "object",
"properties": {
"type": {
"type": "string"
}
}
},
"panel": {
"type": "object",
"required": [
"id",
"kind"
],
"properties": {
"id": {
"type": "string"
},
"kind": {
"type": "string"
}
}
},
"balance.panel": {
"type": "panel",
"properties": {
"amount": {
"type": "integer"
}
}
},
"units.panel": {
"type": "panel",
"properties": {
"count": {
"type": "integer"
}
}
}
}
},
{
"name": "family_panel",
"schemas": {
"family_panel": {
"family": "panel"
}
}
}
]
},
"tests": [
{
"description": "Successfully routes to nested balance panel",
"schema_id": "family_panel",
"data": {
"id": "123",
"kind": "balance",
"amount": 500
},
"action": "validate",
"expect": {
"success": true
}
},
{
"description": "Fails validation on routed schema due to invalid property type",
"schema_id": "family_panel",
"data": {
"id": "123",
"kind": "balance",
"amount": "not_an_int"
},
"action": "validate",
"expect": {
"success": false,
"errors": [
{
"code": "INVALID_TYPE",
"details": {
"path": "amount"
}
}
]
}
},
{
"description": "Fails when discriminator does not match any bubble schema",
"schema_id": "family_panel",
"data": {
"id": "123",
"kind": "unknown_panel"
},
"action": "validate",
"expect": {
"success": false,
"errors": [
{
"code": "NO_FAMILY_MATCH",
"details": {
"path": ""
}
}
]
}
}
]
}
]
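The Scenario D fixture above exercises discriminator-based family routing: `family_panel` declares `"family": "panel"`, and the data's `kind` field selects the concrete `<kind>.panel` schema. A minimal Python sketch of that routing rule, mirroring the fixture's three expected outcomes (the function name and simplified per-property validation are illustrative assumptions, not the engine's actual API):

```python
# Illustrative sketch of family routing: the "kind" discriminator picks a
# concrete "<kind>.panel" schema; an unknown kind yields NO_FAMILY_MATCH.
SCHEMAS = {
    "balance.panel": {"amount": int},  # assumed, simplified property types
    "units.panel": {"count": int},
}

def validate_family_panel(data):
    key = f"{data.get('kind')}.panel"
    props = SCHEMAS.get(key)
    if props is None:
        return {"success": False,
                "errors": [{"code": "NO_FAMILY_MATCH", "details": {"path": ""}}]}
    errors = [
        {"code": "INVALID_TYPE", "details": {"path": name}}
        for name, expected in props.items()
        if name in data and not isinstance(data[name], expected)
    ]
    return {"success": not errors, "errors": errors}

# The three fixture cases, in order:
assert validate_family_panel({"id": "123", "kind": "balance", "amount": 500})["success"]
assert validate_family_panel({"id": "123", "kind": "balance", "amount": "not_an_int"})["errors"][0]["code"] == "INVALID_TYPE"
assert validate_family_panel({"id": "123", "kind": "unknown_panel"})["errors"][0]["code"] == "NO_FAMILY_MATCH"
```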
2
flows
Submodule flows updated: 4d61e13e00...0d9bd8644e
23
log.txt
@ -1,23 +0,0 @@
Finished `test` profile [unoptimized + debuginfo] target(s) in 0.60s
Running unittests src/lib.rs (target/debug/deps/jspg-d3f18ff3a7e2b386)

running 1 test
test tests::test_library_api ... FAILED

failures:

---- tests::test_library_api stdout ----

thread 'tests::test_library_api' (110325727) panicked at src/tests/mod.rs:86:3:
assertion `left == right` failed
left: Object {"response": Object {"enums": Object {}, "puncs": Object {}, "relations": Object {"fk_test_target": Object {"constraint": String("fk_test_target"), "destination_columns": Array [String("id")], "destination_type": String("target_schema"), "id": String("11111111-1111-1111-1111-111111111111"), "prefix": String("target"), "source_columns": Array [String("target_id")], "source_type": String("source_schema"), "type": String("relation")}}, "types": Object {"source_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("source_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("source_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"source_schema": Object {"compiledEdges": Object {"target": Object {"constraint": String("fk_test_target"), "forward": Bool(true)}}, "compiledPropertyNames": Array [String("name"), String("target"), String("type")], "properties": Object {"name": Object {"type": String("string")}, "target": Object {"compiledPropertyNames": Array [String("value")], "type": String("target_schema")}, "type": Object {"type": String("string")}}, "required": Array [String("name")], "type": String("object")}, "source_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "properties": Object {"$and": Object {"items": Object {"type": String("source_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"type": String("source_schema.filter")}, "type": Array [String("array"), String("null")]}, "name": Object {"type": Array [String("string.condition"), String("null")]}, "target": Object {"type": Array [String("target_schema.filter"), String("null")]}, "type": Object 
{"type": Array [String("string.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("source_schema")]}, "target_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("target_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("target_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"target_schema": Object {"compiledPropertyNames": Array [String("value")], "properties": Object {"value": Object {"type": String("number")}}, "type": String("object")}, "target_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "properties": Object {"$and": Object {"items": Object {"type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "value": Object {"type": Array [String("number.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("target_schema")]}}}, "type": String("drop")}
right: Object {"response": Object {"enums": Object {}, "puncs": Object {}, "relations": Object {"fk_test_target": Object {"constraint": String("fk_test_target"), "destination_columns": Array [String("id")], "destination_type": String("target_schema"), "id": String("11111111-1111-1111-1111-111111111111"), "prefix": String("target"), "source_columns": Array [String("target_id")], "source_type": String("source_schema"), "type": String("relation")}}, "types": Object {"source_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("source_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("source_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"source_schema": Object {"compiledEdges": Object {"target": Object {"constraint": String("fk_test_target"), "forward": Bool(true)}}, "compiledPropertyNames": Array [String("name"), String("target"), String("type")], "properties": Object {"name": Object {"type": String("string")}, "target": Object {"compiledPropertyNames": Array [String("value")], "type": String("target_schema")}, "type": Object {"type": String("string")}}, "required": Array [String("name")], "type": String("object")}, "source_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "properties": Object {"$and": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "type": String("source_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "type": String("source_schema.filter")}, 
"type": Array [String("array"), String("null")]}, "name": Object {"type": Array [String("string.condition"), String("null")]}, "target": Object {"type": Array [String("target_schema.filter"), String("null")]}, "type": Object {"type": Array [String("string.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("source_schema")]}, "target_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("target_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("target_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"target_schema": Object {"compiledPropertyNames": Array [String("value")], "properties": Object {"value": Object {"type": String("number")}}, "type": String("object")}, "target_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "properties": Object {"$and": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "value": Object {"type": Array [String("number.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("target_schema")]}}}, "type": String("drop")}
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace


failures:
tests::test_library_api

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 1357 filtered out; finished in 0.00s

error: test failed, to rerun pass `--lib`
23
log_test.txt
@ -1,23 +0,0 @@
Finished `test` profile [unoptimized + debuginfo] target(s) in 0.35s
Running unittests src/lib.rs (target/debug/deps/jspg-d3f18ff3a7e2b386)

running 1 test
test tests::test_library_api ... FAILED

failures:

---- tests::test_library_api stdout ----

thread 'tests::test_library_api' (110334696) panicked at src/tests/mod.rs:86:3:
assertion `left == right` failed
left: Object {"response": Object {"enums": Object {}, "puncs": Object {}, "relations": Object {"fk_test_target": Object {"constraint": String("fk_test_target"), "destination_columns": Array [String("id")], "destination_type": String("target_schema"), "id": String("11111111-1111-1111-1111-111111111111"), "prefix": String("target"), "source_columns": Array [String("target_id")], "source_type": String("source_schema"), "type": String("relation")}}, "types": Object {"source_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("source_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("source_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"source_schema": Object {"compiledEdges": Object {"target": Object {"constraint": String("fk_test_target"), "forward": Bool(true)}}, "compiledPropertyNames": Array [String("name"), String("target"), String("type")], "properties": Object {"name": Object {"type": String("string")}, "target": Object {"compiledPropertyNames": Array [String("value")], "type": String("target_schema")}, "type": Object {"type": String("string")}}, "required": Array [String("name")], "type": String("object")}, "source_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "properties": Object {"$and": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "type": String("source_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "type": String("source_schema.filter")}, 
"type": Array [String("array"), String("null")]}, "name": Object {"type": Array [String("string.condition"), String("null")]}, "target": Object {"type": Array [String("target_schema.filter"), String("null")]}, "type": Object {"type": Array [String("string.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("source_schema")]}, "target_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("target_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("target_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"target_schema": Object {"compiledPropertyNames": Array [String("value")], "properties": Object {"value": Object {"type": String("number")}}, "type": String("object")}, "target_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "properties": Object {"$and": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "value": Object {"type": Array [String("number.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("target_schema")]}}}, "type": String("drop")}
right: Object {"response": Object {"enums": Object {}, "puncs": Object {}, "relations": Object {"fk_test_target": Object {"constraint": String("fk_test_target"), "destination_columns": Array [String("id")], "destination_type": String("target_schema"), "id": String("11111111-1111-1111-1111-111111111111"), "prefix": String("target"), "source_columns": Array [String("target_id")], "source_type": String("source_schema"), "type": String("relation")}}, "types": Object {"source_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("source_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("source_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"source_schema": Object {"compiledEdges": Object {"target": Object {"constraint": String("fk_test_target"), "forward": Bool(true)}}, "compiledPropertyNames": Array [String("name"), String("target"), String("type")], "properties": Object {"name": Object {"type": String("string")}, "target": Object {"compiledPropertyNames": Array [String("value")], "type": String("target_schema")}, "type": Object {"type": String("string")}}, "required": Array [String("name")], "type": String("object")}, "source_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("name"), String("target"), String("type")], "properties": Object {"$and": Object {"items": Object {"type": String("source_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"type": String("source_schema.filter")}, "type": Array [String("array"), String("null")]}, "name": Object {"type": Array [String("string.condition"), String("null")]}, "target": Object {"type": Array [String("target_schema.filter"), String("null")]}, "type": Object 
{"type": Array [String("string.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("source_schema")]}, "target_schema": Object {"default_fields": Array [], "field_types": Null, "fields": Array [], "grouped_fields": Null, "hierarchy": Array [String("target_schema"), String("entity")], "historical": Bool(false), "id": String(""), "longevity": Null, "lookup_fields": Array [], "module": String(""), "name": String("target_schema"), "notify": Bool(false), "null_fields": Array [], "ownable": Bool(false), "relationship": Bool(false), "schemas": Object {"target_schema": Object {"compiledPropertyNames": Array [String("value")], "properties": Object {"value": Object {"type": String("number")}}, "type": String("object")}, "target_schema.filter": Object {"compiledPropertyNames": Array [String("$and"), String("$or"), String("value")], "properties": Object {"$and": Object {"items": Object {"type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "$or": Object {"items": Object {"type": String("target_schema.filter")}, "type": Array [String("array"), String("null")]}, "value": Object {"type": Array [String("number.condition"), String("null")]}}, "type": String("filter")}}, "sensitive": Bool(false), "source": String(""), "type": String(""), "variations": Array [String("target_schema")]}}}, "type": String("drop")}
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace


failures:
tests::test_library_api

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 1357 filtered out; finished in 0.00s

error: test failed, to rerun pass `--lib`
@ -1,55 +0,0 @@
import json
import os
import glob

fixtures_dir = 'fixtures'
for filepath in glob.glob(os.path.join(fixtures_dir, '*.json')):
    try:
        with open(filepath, 'r') as f:
            data = json.load(f)
    except Exception as e:
        continue

    changed = False
    for suite in data:
        db = suite.get("database")
        if not db or "schemas" not in db:
            continue

        legacy_schemas = db["schemas"]
        # Make sure types array is ready
        if "types" not in db:
            db["types"] = []

        # Push schemas into types
        for schema_id, schema_def in legacy_schemas.items():
            base_name = schema_id.split('.')[-1]

            # Find an existing type with base_name first
            found = False
            for t in db["types"]:
                if t.get("name") == base_name:
                    if "schemas" not in t:
                        t["schemas"] = {}
                    t["schemas"][schema_id] = schema_def
                    found = True
                    break

            if not found:
                db["types"].append({
                    "name": base_name,
                    "variations": [base_name],  # Optional placeholder, shouldn't break anything
                    "hierarchy": [base_name, "entity"],
                    "schemas": {
                        schema_id: schema_def
                    }
                })

        # Clean up legacy global map
        del db["schemas"]
        changed = True

    if changed:
        with open(filepath, 'w') as f:
            json.dump(data, f, indent=2)
        print("Migrated legacy schemas to types in", filepath)
@ -1,54 +0,0 @@
import json
import os
import glob

fixtures_dir = 'fixtures'
for filepath in glob.glob(os.path.join(fixtures_dir, '*.json')):
    try:
        with open(filepath, 'r') as f:
            data = json.load(f)
    except Exception as e:
        print(f"Failed to load {filepath}: {e}")
        continue

    changed = False
    for suite in data:
        db = suite.get("database")
        if not db or "schemas" not in db:
            continue

        legacy_schemas = db["schemas"]
        # Make sure types array is ready
        if "types" not in db:
            db["types"] = []

        # Push schemas into types
        for schema_id, schema_def in legacy_schemas.items():
            base_name = schema_id.split('.')[-1]

            # Find an existing type with base_name first
            found = False
            for t in db["types"]:
                if t.get("name") == base_name:
                    if "schemas" not in t:
                        t["schemas"] = {}
                    t["schemas"][schema_id] = schema_def
                    found = True
                    break

            if not found:
                db["types"].append({
                    "name": base_name,
                    "schemas": {
                        schema_id: schema_def
                    }
                })

        # Clean up legacy global map
        del db["schemas"]
        changed = True

    if changed:
        with open(filepath, 'w') as f:
            json.dump(data, f, indent=2)
        print("Migrated legacy schemas to types properly in", filepath)
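The deleted migration script above moves a fixture's legacy top-level `schemas` map into per-type `schemas` entries keyed by the id's last dot segment. A condensed before/after round-trip on a toy fixture (the toy data shape is assumed from the script itself):

```python
# Toy fixture in the legacy layout: a top-level "schemas" map on the database.
suite = {"database": {"schemas": {"person.filter": {"type": "filter"}}}}

db = suite["database"]
db.setdefault("types", [])
for schema_id, schema_def in db["schemas"].items():
    base_name = schema_id.split('.')[-1]  # "person.filter" -> "filter"
    db["types"].append({"name": base_name, "schemas": {schema_id: schema_def}})
del db["schemas"]

# The schema now lives under a type named after the id's last segment,
# and the legacy global map is gone.
assert "schemas" not in db
assert db["types"][0]["name"] == "filter"
```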
@ -1,41 +0,0 @@
const fs = require('fs');
const path = require('path');

function updateFile(filePath) {
  let content = fs.readFileSync(filePath, 'utf8');
  let data;
  try {
    data = JSON.parse(content);
  } catch (e) {
    console.error("Failed to parse " + filePath, e);
    return;
  }

  let changed = false;
  for (let suite of data) {
    if (suite.database && suite.database.puncs && suite.database.puncs.length > 0) {
      if (!suite.database.types) suite.database.types = [];
      for (let punc of suite.database.puncs) {
        // Determine if we should push it to types.
        // Basically all of them should go to types except maybe if they are explicitly being tested as Puncs?
        // But the tests construct Queryer and Merger using these ids, which query the Type Realm.
        suite.database.types.push(punc);
      }
      delete suite.database.puncs;
      changed = true;
    }
  }

  if (changed) {
    fs.writeFileSync(filePath, JSON.stringify(data, null, 2));
    console.log("Reverted puncs to types in " + filePath);
  }
}

let fixturesDir = 'fixtures';
let files = fs.readdirSync(fixturesDir);
for (let file of files) {
  if (file.endsWith('.json')) {
    updateFile(path.join(fixturesDir, file));
  }
}
@ -1,29 +0,0 @@
import json
import os
import glob

fixtures_dir = 'fixtures'
for filepath in glob.glob(os.path.join(fixtures_dir, '*.json')):
    with open(filepath, 'r') as f:
        try:
            data = json.load(f)
        except Exception as e:
            print("Failed to parse", filepath, e)
            continue

    changed = False
    for suite in data:
        db = suite.get("database", {})
        puncs = db.get("puncs", [])
        if puncs:
            if "types" not in db:
                db["types"] = []
            for punc in puncs:
                db["types"].append(punc)
            del db["puncs"]
            changed = True

    if changed:
        with open(filepath, 'w') as f:
            json.dump(data, f, indent=2)
        print("Reverted puncs to types in", filepath)
@ -1,43 +0,0 @@
const fs = require('fs');
const path = require('path');

function updateFile(filePath) {
  let content = fs.readFileSync(filePath, 'utf8');
  let data;
  try {
    data = JSON.parse(content);
  } catch (e) {
    console.error("Failed to parse " + filePath, e);
    return;
  }

  let changed = false;
  for (let suite of data) {
    if (suite.database && suite.database.schemas) {
      if (!suite.database.puncs) suite.database.puncs = [];
      for (let id of Object.keys(suite.database.schemas)) {
        let schema = suite.database.schemas[id];
        let puncType = {
          name: id,
          schemas: { [id]: schema }
        };
        suite.database.puncs.push(puncType);
      }
      delete suite.database.schemas;
      changed = true;
    }
  }

  if (changed) {
    fs.writeFileSync(filePath, JSON.stringify(data, null, 2));
    console.log("Updated " + filePath);
  }
}

let fixturesDir = 'fixtures';
let files = fs.readdirSync(fixturesDir);
for (let file of files) {
  if (file.endsWith('.json')) {
    updateFile(path.join(fixturesDir, file));
  }
}
@ -1,33 +0,0 @@
import json
import os

fixtures_dir = 'fixtures'

for filename in os.listdir(fixtures_dir):
    if not filename.endswith('.json'):
        continue
    filepath = os.path.join(fixtures_dir, filename)
    with open(filepath, 'r') as f:
        try:
            data = json.load(f)
        except json.JSONDecodeError:
            print("Failed to parse", filepath)
            continue
    changed = False
    for suite in data:
        db = suite.get('database', {})
        if 'schemas' in db:
            if 'types' not in db:
                db['types'] = []
            for id_str, schema in db['schemas'].items():
                target_type = {
                    'name': id_str,
                    'schemas': { id_str: schema }
                }
                db['types'].append(target_type)
            del db['schemas']
            changed = True
    if changed:
        with open(filepath, 'w') as f:
            json.dump(data, f, indent=2)
        print("Updated", filepath)
@ -12,11 +12,11 @@ impl Schema {
    ) {
        #[cfg(not(test))]
        for c in id.chars() {
            if !c.is_ascii_lowercase() && !c.is_ascii_digit() && c != '_' && c != '.' {
            if !c.is_ascii_lowercase() && !c.is_ascii_digit() && c != '_' && c != '.' && c != '$' {
                errors.push(crate::drop::Error {
                    code: "INVALID_IDENTIFIER".to_string(),
                    message: format!(
                        "Invalid character '{}' in JSON Schema '{}' property: '{}'. Identifiers must exclusively contain [a-z0-9_.]",
                        "Invalid character '{}' in JSON Schema '{}' property: '{}'. Identifiers must exclusively contain [a-z0-9_.$]",
                        c, field_name, id
                    ),
                    details: crate::drop::ErrorDetails {
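The hunk above widens the allowed identifier alphabet from `[a-z0-9_.]` to `[a-z0-9_.$]`, so operator-style ids such as `$kind.filter.filter` now pass validation. The same check expressed in Python (the regex form is my paraphrase of the per-character Rust loop):

```python
import re

# Same alphabet the Rust validator now accepts:
# lowercase ASCII letters, digits, '_', '.', and '$'.
VALID_ID = re.compile(r"^[a-z0-9_.$]+$")

assert VALID_ID.match("person.filter")
assert VALID_ID.match("$kind.filter.filter")  # legal now that '$' is allowed
assert not VALID_ID.match("Person.Filter")    # uppercase is still rejected
```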
87
src/database/compile/condition.rs
Normal file
@@ -0,0 +1,87 @@
use crate::database::object::{SchemaObject, SchemaTypeOrArray};
use crate::database::schema::Schema;
use crate::database::r#enum::Enum;
use indexmap::IndexMap;
use std::sync::Arc;

impl Enum {
    pub fn compile_condition(&self) -> Schema {
        let mut props = IndexMap::new();
        let enum_name = &self.name;

        let mut eq_obj = SchemaObject::default();
        eq_obj.type_ = Some(SchemaTypeOrArray::Multiple(vec![
            enum_name.clone(),
            "null".to_string(),
        ]));
        props.insert(
            "$eq".to_string(),
            Arc::new(Schema {
                obj: eq_obj,
                always_fail: false,
            }),
        );

        let mut ne_obj = SchemaObject::default();
        ne_obj.type_ = Some(SchemaTypeOrArray::Multiple(vec![
            enum_name.clone(),
            "null".to_string(),
        ]));
        props.insert(
            "$ne".to_string(),
            Arc::new(Schema {
                obj: ne_obj,
                always_fail: false,
            }),
        );

        let mut of_obj = SchemaObject::default();
        of_obj.type_ = Some(SchemaTypeOrArray::Multiple(vec![
            "array".to_string(),
            "null".to_string(),
        ]));
        of_obj.items = Some(Arc::new(Schema {
            obj: SchemaObject {
                type_: Some(SchemaTypeOrArray::Single(enum_name.clone())),
                ..Default::default()
            },
            always_fail: false,
        }));
        props.insert(
            "$of".to_string(),
            Arc::new(Schema {
                obj: of_obj,
                always_fail: false,
            }),
        );

        let mut nof_obj = SchemaObject::default();
        nof_obj.type_ = Some(SchemaTypeOrArray::Multiple(vec![
            "array".to_string(),
            "null".to_string(),
        ]));
        nof_obj.items = Some(Arc::new(Schema {
            obj: SchemaObject {
                type_: Some(SchemaTypeOrArray::Single(enum_name.clone())),
                ..Default::default()
            },
            always_fail: false,
        }));
        props.insert(
            "$nof".to_string(),
            Arc::new(Schema {
                obj: nof_obj,
                always_fail: false,
            }),
        );

        let mut cond_obj = SchemaObject::default();
        cond_obj.type_ = Some(SchemaTypeOrArray::Single("condition".to_string()));
        cond_obj.properties = Some(props);

        Schema {
            obj: cond_obj,
            always_fail: false,
        }
    }
}
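The `$eq`/`$ne`/`$of`/`$nof` properties above are declared in a deliberate order, which is what motivates the `IndexMap` import in this file: a `BTreeMap` would iterate them in sorted key order instead. A minimal, std-only sketch of the difference — `BTreeMap` stands in for the old code path here, since `indexmap` is a third-party crate:

```rust
use std::collections::BTreeMap;

fn main() {
    // Declaration order, as written in compile_condition above.
    let declared = ["$eq", "$ne", "$of", "$nof"];

    // A BTreeMap iterates in sorted key order and forgets insertion order;
    // indexmap::IndexMap (used by this change) keeps the declared order.
    let mut map = BTreeMap::new();
    for (i, key) in declared.iter().enumerate() {
        map.insert(key.to_string(), i);
    }
    let iterated: Vec<String> = map.keys().cloned().collect();

    // "$nof" sorts before "$of", so the declared order is lost.
    assert_eq!(iterated, ["$eq", "$ne", "$nof", "$of"]);
}
```

The same reasoning explains the `preserve_order` feature enabled on `serde_json` in Cargo.toml: without it, serialized JSON objects would also come back in sorted key order.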
@@ -1,4 +1,5 @@
use crate::database::schema::Schema;
use indexmap::IndexMap;

impl Schema {
/// Dynamically infers and compiles all structural database relationships between this Schema
@@ -10,10 +11,10 @@ impl Schema {
db: &crate::database::Database,
root_id: &str,
path: &str,
props: &std::collections::BTreeMap<String, std::sync::Arc<Schema>>,
props: &IndexMap<String, std::sync::Arc<Schema>>,
errors: &mut Vec<crate::drop::Error>,
) -> std::collections::BTreeMap<String, crate::database::edge::Edge> {
let mut schema_edges = std::collections::BTreeMap::new();
) -> IndexMap<String, crate::database::edge::Edge> {
let mut schema_edges = IndexMap::new();

// Determine the physical Database Table Name this schema structurally represents
// Plucks the polymorphic discriminator via dot-notation (e.g. extracting "person" from "full.person")
@@ -1,7 +1,7 @@
use crate::database::Database;
use crate::database::object::{SchemaObject, SchemaTypeOrArray};
use crate::database::schema::Schema;
use std::collections::BTreeMap;
use indexmap::IndexMap;
use std::sync::Arc;

impl Schema {
@@ -12,9 +12,37 @@ impl Schema {
_errors: &mut Vec<crate::drop::Error>,
) -> Option<Schema> {
if let Some(props) = self.obj.compiled_properties.get() {
let mut filter_props = BTreeMap::new();
let mut filter_props = IndexMap::new();
for (key, child) in props {
if let Some(mut filter_type) = Self::resolve_filter_type(child) {
let mut structural_filter = None;

let is_array = match &child.obj.type_ {
Some(SchemaTypeOrArray::Single(t)) => t == "array",
Some(SchemaTypeOrArray::Multiple(types)) => types.contains(&"array".to_string()),
None => false,
};

if is_array {
if let Some(items) = &child.obj.items {
if !items.is_proxy() {
structural_filter = items.compile_filter(_db, "", _errors);
}
}
} else if !child.is_proxy() {
structural_filter = child.compile_filter(_db, "", _errors);
}

if let Some(mut inline_schema) = structural_filter {
inline_schema.obj.type_ = Some(SchemaTypeOrArray::Multiple(vec![
"filter".to_string(),
"null".to_string(),
]));

filter_props.insert(
key.clone(),
Arc::new(inline_schema),
);
} else if let Some(mut filter_type) = Self::resolve_filter_type(child, _db) {
filter_type.push("null".to_string());

let mut child_obj = SchemaObject::default();
@@ -31,50 +59,52 @@ impl Schema {
}

if !filter_props.is_empty() {
let root_filter_type = format!("{}.filter", root_id);
if !root_id.is_empty() {
let root_filter_type = format!("{}.filter", root_id);

let mut and_obj = SchemaObject::default();
and_obj.type_ = Some(SchemaTypeOrArray::Multiple(vec![
"array".to_string(),
"null".to_string(),
]));
and_obj.items = Some(Arc::new(Schema {
obj: SchemaObject {
type_: Some(SchemaTypeOrArray::Single(root_filter_type.clone())),
..Default::default()
},
always_fail: false,
}));
filter_props.insert(
"$and".to_string(),
Arc::new(Schema {
obj: and_obj,
let mut and_obj = SchemaObject::default();
and_obj.type_ = Some(SchemaTypeOrArray::Multiple(vec![
"array".to_string(),
"null".to_string(),
]));
and_obj.items = Some(Arc::new(Schema {
obj: SchemaObject {
type_: Some(SchemaTypeOrArray::Single(root_filter_type.clone())),
..Default::default()
},
always_fail: false,
}),
);
}));
filter_props.insert(
"$and".to_string(),
Arc::new(Schema {
obj: and_obj,
always_fail: false,
}),
);

let mut or_obj = SchemaObject::default();
or_obj.type_ = Some(SchemaTypeOrArray::Multiple(vec![
"array".to_string(),
"null".to_string(),
]));
or_obj.items = Some(Arc::new(Schema {
obj: SchemaObject {
type_: Some(SchemaTypeOrArray::Single(root_filter_type.clone())),
..Default::default()
},
always_fail: false,
}));
filter_props.insert(
"$or".to_string(),
Arc::new(Schema {
obj: or_obj,
let mut or_obj = SchemaObject::default();
or_obj.type_ = Some(SchemaTypeOrArray::Multiple(vec![
"array".to_string(),
"null".to_string(),
]));
or_obj.items = Some(Arc::new(Schema {
obj: SchemaObject {
type_: Some(SchemaTypeOrArray::Single(root_filter_type.clone())),
..Default::default()
},
always_fail: false,
}),
);
}));
filter_props.insert(
"$or".to_string(),
Arc::new(Schema {
obj: or_obj,
always_fail: false,
}),
);
}

let mut wrapper_obj = SchemaObject::default();
// Conceptually link this directly into the STI lineage of the base `filter` object
// Filters now inherit from the base 'filter' type
wrapper_obj.type_ = Some(SchemaTypeOrArray::Single("filter".to_string()));
wrapper_obj.properties = Some(filter_props);

@@ -87,16 +117,16 @@ impl Schema {
None
}

fn resolve_filter_type(schema: &Arc<Schema>) -> Option<Vec<String>> {
fn resolve_filter_type(schema: &Arc<Schema>, db: &Database) -> Option<Vec<String>> {
if let Some(type_) = &schema.obj.type_ {
match type_ {
SchemaTypeOrArray::Single(t) => {
return Self::map_filter_string(t, schema);
return Self::map_filter_string(t, schema, db);
}
SchemaTypeOrArray::Multiple(types) => {
for t in types {
if t != "null" {
return Self::map_filter_string(t, schema);
return Self::map_filter_string(t, schema, db);
}
}
}
@@ -105,7 +135,7 @@ impl Schema {
None
}

fn map_filter_string(t: &str, schema: &Arc<Schema>) -> Option<Vec<String>> {
fn map_filter_string(t: &str, schema: &Arc<Schema>, db: &Database) -> Option<Vec<String>> {
match t {
"string" => {
if let Some(fmt) = &schema.obj.format {
@@ -119,11 +149,20 @@ impl Schema {
"number" => Some(vec!["number.condition".to_string()]),
"boolean" => Some(vec!["boolean.condition".to_string()]),
"object" => None, // Inline structures are ignored in Composed References
"array" => None, // We don't filter primitive arrays or map complex arrays yet
"array" => {
if let Some(items) = &schema.obj.items {
return Self::resolve_filter_type(items, db);
}
None
},
"null" => None,
custom => {
// Assume anything else is a Relational cross-boundary that already has its own .filter dynamically built
Some(vec![format!("{}.filter", custom)])
if db.enums.contains_key(custom) {
Some(vec![format!("{}.condition", custom)])
} else {
// Assume anything else is a Relational cross-boundary that already has its own .filter dynamically built
Some(vec![format!("{}.filter", custom)])
}
}
}
}
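The new `custom` arm above routes enum-backed fields to the synthesized `<enum>.condition` schema and everything else to a relational `<type>.filter`. A hedged, std-only sketch of that routing decision — `filter_ref` is a hypothetical helper condensing the arm, not a function in the codebase:

```rust
use std::collections::HashSet;

// Hypothetical stand-in for the `custom =>` arm of map_filter_string:
// known enum names get a ".condition" reference, anything else is assumed
// to be a relational boundary with its own dynamically built ".filter".
fn filter_ref(custom: &str, enums: &HashSet<String>) -> String {
    if enums.contains(custom) {
        format!("{}.condition", custom)
    } else {
        format!("{}.filter", custom)
    }
}

fn main() {
    let enums: HashSet<String> = ["color".to_string()].into_iter().collect();
    assert_eq!(filter_ref("color", &enums), "color.condition");
    assert_eq!(filter_ref("person", &enums), "person.filter");
}
```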
@@ -1,9 +1,11 @@
pub mod collection;
pub mod condition;
pub mod edges;
pub mod filter;
pub mod polymorphism;

use crate::database::schema::Schema;
use indexmap::IndexMap;

impl Schema {
pub fn compile(
@@ -47,12 +49,12 @@ impl Schema {
}
}

let mut props = std::collections::BTreeMap::new();
let mut props = IndexMap::new();

// 1. Resolve INHERITANCE dependencies first
if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &self.obj.type_ {
if !crate::database::object::is_primitive_type(t) {
if let Some(parent) = db.get_scoped_schema(crate::database::realm::SchemaRealm::Type, t) {
if !crate::database::object::is_primitive_type(t) && !t.starts_with('$') {
if let Some(parent) = db.schemas.get(t).cloned() {
parent.as_ref().compile(db, t, t.clone(), errors);
if let Some(p_props) = parent.obj.compiled_properties.get() {
props.extend(p_props.clone());
@@ -85,8 +87,8 @@ impl Schema {
}

for t in types {
if !crate::database::object::is_primitive_type(t) {
if let Some(parent) = db.get_scoped_schema(crate::database::realm::SchemaRealm::Type, t) {
if !crate::database::object::is_primitive_type(t) && !t.starts_with('$') {
if let Some(parent) = db.schemas.get(t).cloned() {
parent.as_ref().compile(db, t, t.clone(), errors);
}
}
@@ -123,8 +125,7 @@ impl Schema {

// 4. Set the OnceLock!
let _ = self.obj.compiled_properties.set(props.clone());
let mut names: Vec<String> = props.keys().cloned().collect();
names.sort();
let names: Vec<String> = props.keys().cloned().collect();
let _ = self.obj.compiled_property_names.set(names);

// 5. Compute Edges natively
@@ -1,3 +1,4 @@
use indexmap::IndexSet;
use crate::database::schema::Schema;

impl Schema {
@@ -8,11 +9,14 @@ impl Schema {
path: &str,
errors: &mut Vec<crate::drop::Error>,
) {
let mut options = std::collections::BTreeMap::new();
let mut strategy = String::new();
let mut options = indexmap::IndexMap::new();
let strategy: &str;

if let Some(family) = &self.obj.family {
// Formalize the <Variant>.<Base> topology
// family_base extracts the 'Base' (e.g. 'widget', 'person')
let family_base = family.split('.').next_back().unwrap_or(family).to_string();
// family_prefix extracts the 'Variant' (e.g. 'stock', 'light')
let family_prefix = family
.strip_suffix(&family_base)
.unwrap_or("")
@@ -21,7 +25,7 @@ impl Schema {
if let Some(type_def) = db.types.get(&family_base) {
if type_def.variations.len() > 1 && type_def.variations.iter().any(|v| v != &family_base) {
// Scenario A / B: Table Variations
strategy = "type".to_string();
strategy = "type";
for var in &type_def.variations {
let target_id = if family_prefix.is_empty() {
var.to_string()
@@ -29,13 +33,13 @@ impl Schema {
format!("{}.{}", family_prefix, var)
};

if db.get_scoped_schema(crate::database::realm::SchemaRealm::Type, &target_id).is_some() {
if db.schemas.get(&target_id).is_some() {
options.insert(var.to_string(), (None, Some(target_id)));
}
}
} else {
// Scenario C: Single Table Inheritance (Horizontal)
strategy = "kind".to_string();
strategy = "kind";

let suffix = format!(".{}", family_base);

@@ -47,12 +51,25 @@ impl Schema {
}
}
}
} else {
// Scenario D: Field-Backed JSONB Bubble STI (No explicit table representation)
strategy = "kind";
let suffix = format!(".{}", family_base);

// Scan the entire database schemas registry for matching suffixes
for (id, schema) in &db.schemas {
if id.ends_with(&suffix) || id == &family_base {
if let Some(kind_val) = schema.obj.get_discriminator_value("kind", id) {
options.insert(kind_val, (None, Some(id.to_string())));
}
}
}
}
} else if let Some(one_of) = &self.obj.one_of {
let mut type_vals = std::collections::HashSet::new();
let mut kind_vals = std::collections::HashSet::new();
let mut type_vals = IndexSet::new();
let mut kind_vals = IndexSet::new();
let mut disjoint_base = true;
let mut structural_types = std::collections::HashSet::new();
let mut structural_types = IndexSet::new();

for c in one_of {
let mut child_id = String::new();
@@ -81,7 +98,7 @@ impl Schema {
}

if disjoint_base && structural_types.len() == one_of.len() {
strategy = "".to_string();
strategy = "";
for (i, c) in one_of.iter().enumerate() {
if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &c.obj.type_ {
if crate::database::object::is_primitive_type(t) {
@@ -93,11 +110,11 @@ impl Schema {
}
} else {
strategy = if type_vals.len() > 1 && type_vals.len() == one_of.len() {
"type".to_string()
"type"
} else if kind_vals.len() > 1 && kind_vals.len() == one_of.len() {
"kind".to_string()
"kind"
} else {
"".to_string()
""
};

if strategy.is_empty() {
@@ -145,7 +162,7 @@ impl Schema {

if !options.is_empty() {
if !strategy.is_empty() {
let _ = self.obj.compiled_discriminator.set(strategy);
let _ = self.obj.compiled_discriminator.set(strategy.to_string());
}
let _ = self.obj.compiled_options.set(options);
}
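The discriminator selection above reduces to a simple rule: prefer `"type"` when every `one_of` branch carries a distinct `type` value, fall back to `"kind"`, otherwise use no discriminator. A minimal sketch of that decision — `pick_strategy` is a hypothetical condensation of the diff's logic, taking the distinct-value counts directly:

```rust
// Hypothetical condensation of the strategy selection in the polymorphism
// compile pass: type_count / kind_count are the numbers of distinct
// discriminator values collected from the one_of branches, n is the number
// of branches. A discriminator only works when every branch has its own value.
fn pick_strategy(type_count: usize, kind_count: usize, n: usize) -> &'static str {
    if type_count > 1 && type_count == n {
        "type"
    } else if kind_count > 1 && kind_count == n {
        "kind"
    } else {
        ""
    }
}

fn main() {
    assert_eq!(pick_strategy(3, 0, 3), "type"); // every branch has a unique `type`
    assert_eq!(pick_strategy(1, 4, 4), "kind"); // fall back to `kind` values
    assert_eq!(pick_strategy(2, 2, 3), "");     // ambiguous: no discriminator
}
```

The switch from `String` to `&'static str` for `strategy` in this hunk works because the chosen value is always one of these three literals; `to_string()` is deferred to the single `OnceLock::set` call.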
@@ -1,4 +1,5 @@
use crate::database::schema::Schema;
use indexmap::IndexMap;
use serde::{Deserialize, Serialize};
use std::sync::Arc;

@@ -10,5 +11,5 @@ pub struct Enum {
pub source: String,
pub values: Vec<String>,
#[serde(default)]
pub schemas: std::collections::BTreeMap<String, Arc<Schema>>,
pub schemas: IndexMap<String, Arc<Schema>>,
}
@@ -6,7 +6,6 @@ pub mod formats;
pub mod object;
pub mod page;
pub mod punc;
pub mod realm;
pub mod relation;
pub mod schema;
pub mod r#type;
@@ -21,20 +20,21 @@ use executors::pgrx::SpiExecutor;
use executors::mock::MockExecutor;

use punc::Punc;
use realm::SchemaRealm;
use relation::Relation;
use schema::Schema;
use serde_json::Value;
use std::collections::HashMap;
use indexmap::IndexMap;
use std::sync::Arc;
use r#type::Type;

#[derive(serde::Serialize)]
pub struct Database {
pub enums: HashMap<String, Enum>,
pub types: HashMap<String, Type>,
pub puncs: HashMap<String, Punc>,
pub relations: HashMap<String, Relation>,
pub enums: IndexMap<String, Enum>,
pub types: IndexMap<String, Type>,
pub puncs: IndexMap<String, Punc>,
pub relations: IndexMap<String, Relation>,
#[serde(skip)]
pub schemas: IndexMap<String, Arc<Schema>>,
#[serde(skip)]
pub executor: Box<dyn DatabaseExecutor + Send + Sync>,
}
@@ -42,10 +42,11 @@ pub struct Database {
impl Database {
pub fn new(val: &serde_json::Value) -> (Self, crate::drop::Drop) {
let mut db = Self {
enums: HashMap::new(),
types: HashMap::new(),
relations: HashMap::new(),
puncs: HashMap::new(),
enums: IndexMap::new(),
types: IndexMap::new(),
relations: IndexMap::new(),
puncs: IndexMap::new(),
schemas: IndexMap::new(),
#[cfg(not(test))]
executor: Box::new(SpiExecutor::new()),
#[cfg(test)]
@@ -190,29 +191,54 @@ impl Database {
}

pub fn compile(&mut self, errors: &mut Vec<crate::drop::Error>) {
// Phase 1: Registration
self.collect_schemas(errors);

// Formally evaluate properties with strict 3-pass Ordered Graph execution natively
// Phase 2: Formally evaluate properties with strict 3-pass Ordered Graph execution natively
for (_, enum_def) in &self.enums {
for (schema_id, schema_arc) in &enum_def.schemas {
let root_id = schema_id.split('/').next().unwrap_or(schema_id);
schema_arc.as_ref().compile(self, root_id, schema_id.clone(), errors);
}
for (schema_id, schema_arc) in &enum_def.schemas {
let root_id = schema_id.split('/').next().unwrap_or(schema_id);
schema_arc
.as_ref()
.compile(self, root_id, schema_id.clone(), errors);
}
}
for (_, type_def) in &self.types {
for (schema_id, schema_arc) in &type_def.schemas {
let root_id = schema_id.split('/').next().unwrap_or(schema_id);
schema_arc.as_ref().compile(self, root_id, schema_id.clone(), errors);
}
for (schema_id, schema_arc) in &type_def.schemas {
let root_id = schema_id.split('/').next().unwrap_or(schema_id);
schema_arc
.as_ref()
.compile(self, root_id, schema_id.clone(), errors);
}
}
for (_, punc_def) in &self.puncs {
for (schema_id, schema_arc) in &punc_def.schemas {
let root_id = schema_id.split('/').next().unwrap_or(schema_id);
schema_arc.as_ref().compile(self, root_id, schema_id.clone(), errors);
}
for (schema_id, schema_arc) in &punc_def.schemas {
let root_id = schema_id.split('/').next().unwrap_or(schema_id);
schema_arc
.as_ref()
.compile(self, root_id, schema_id.clone(), errors);
}
}

// Phase 2: Synthesize Composed Filter References
// Phase 3: Synthesize Virtual Boundaries
let mut compile_ids = self.compile_filters(errors);
let mut condition_ids = self.compile_conditions();
compile_ids.append(&mut condition_ids);

// Phase 4: Compile Virtual Boundaries
// Now actively compile the newly injected schemas to lock all nested compose references natively
for (_, id) in compile_ids {
if let Some(schema_arc) = self.schemas.get(&id).cloned() {
let root_id = id.split('/').next().unwrap_or(&id);
schema_arc
.as_ref()
.compile(self, root_id, id.clone(), errors);
}
}
}
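Every compile pass above derives `root_id` by taking the segment before the first `/` of the schema id. A quick sketch of that convention — the ids shown are illustrative, not taken from a real registry:

```rust
// Mirrors the repeated `schema_id.split('/').next().unwrap_or(schema_id)`
// pattern in Database::compile: the root id is everything before the first
// '/' separator. Note split('/').next() always yields Some for a &str, so
// the unwrap_or fallback is defensive.
fn root_id(schema_id: &str) -> &str {
    schema_id.split('/').next().unwrap_or(schema_id)
}

fn main() {
    assert_eq!(root_id("person/properties/address"), "person"); // nested node
    assert_eq!(root_id("person"), "person");                    // already a root
}
```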
/// Synthesizes Composed Filter References for all table-backed boundaries.
fn compile_filters(&mut self, errors: &mut Vec<crate::drop::Error>) -> Vec<(String, String)> {
let mut filter_schemas = Vec::new();
for (type_name, type_def) in &self.types {
for (id, schema_arc) in &type_def.schemas {
@@ -234,20 +260,35 @@ impl Database {
let mut filter_ids = Vec::new();
for (type_name, id, filter_arc) in filter_schemas {
filter_ids.push((type_name.clone(), id.clone()));
self.schemas.insert(id.clone(), filter_arc.clone());
if let Some(t) = self.types.get_mut(&type_name) {
t.schemas.insert(id, filter_arc);
}
}
filter_ids
}

// Now actively compile the newly injected filters to lock all nested compose references natively
for (type_name, id) in filter_ids {
if let Some(filter_arc) = self.types.get(&type_name).and_then(|t| t.schemas.get(&id)).cloned() {
let root_id = id.split('/').next().unwrap_or(&id);
filter_arc
.as_ref()
.compile(self, root_id, id.clone(), errors);
/// Synthesizes strong Enum Conditions mirroring the string.condition capabilities.
fn compile_conditions(&mut self) -> Vec<(String, String)> {
let mut enum_conditions = Vec::new();
for (enum_name, enum_def) in &self.enums {
let cond_schema = enum_def.compile_condition();
enum_conditions.push((
enum_name.clone(),
format!("{}.condition", enum_name),
Arc::new(cond_schema),
));
}

let mut condition_ids = Vec::new();
for (enum_name, id, cond_arc) in enum_conditions {
condition_ids.push((enum_name.clone(), id.clone()));
self.schemas.insert(id.clone(), cond_arc.clone());
if let Some(e) = self.enums.get_mut(&enum_name) {
e.schemas.insert(id.clone(), cond_arc.clone());
}
}
condition_ids
}

fn collect_schemas(&mut self, errors: &mut Vec<crate::drop::Error>) {
@@ -259,6 +300,7 @@ impl Database {
// Validate every node recursively via string filters natively!
for (type_name, type_def) in &self.types {
for (id, schema_arc) in &type_def.schemas {
self.schemas.insert(id.clone(), Arc::clone(schema_arc));
let mut local_insert = Vec::new();
crate::database::schema::Schema::collect_schemas(
schema_arc,
@@ -275,6 +317,7 @@ impl Database {

for (punc_name, punc_def) in &self.puncs {
for (id, schema_arc) in &punc_def.schemas {
self.schemas.insert(id.clone(), Arc::clone(schema_arc));
let mut local_insert = Vec::new();
crate::database::schema::Schema::collect_schemas(
schema_arc,
@@ -291,6 +334,7 @@ impl Database {

for (enum_name, enum_def) in &self.enums {
for (id, schema_arc) in &enum_def.schemas {
self.schemas.insert(id.clone(), Arc::clone(schema_arc));
let mut local_insert = Vec::new();
crate::database::schema::Schema::collect_schemas(
schema_arc,
@@ -305,57 +349,27 @@ impl Database {
}
}

// Apply local scopes
// Apply local scopes and global schema map
for (origin_name, id, schema_arc) in type_insert {
self.schemas.insert(id.clone(), schema_arc.clone());
if let Some(t) = self.types.get_mut(&origin_name) {
t.schemas.insert(id, schema_arc);
}
}
for (origin_name, id, schema_arc) in punc_insert {
self.schemas.insert(id.clone(), schema_arc.clone());
if let Some(p) = self.puncs.get_mut(&origin_name) {
p.schemas.insert(id, schema_arc);
}
}
for (origin_name, id, schema_arc) in enum_insert {
self.schemas.insert(id.clone(), schema_arc.clone());
if let Some(e) = self.enums.get_mut(&origin_name) {
e.schemas.insert(id, schema_arc);
}
}
}

pub fn get_scoped_schema(&self, realm: SchemaRealm, schema_id: &str) -> Option<Arc<Schema>> {
// Punc Realm natively maps mathematically to `.request` and `.response` shapes
if realm == SchemaRealm::Punc {
if schema_id.ends_with(".request") || schema_id.ends_with(".response") {
let punc_name = schema_id
.trim_end_matches(".request")
.trim_end_matches(".response");
return self.puncs.get(punc_name).and_then(|p| p.schemas.get(schema_id).cloned());
}
}

let clean_id = schema_id.trim_end_matches(".filter");
let root_id = clean_id.split('/').next().unwrap_or(clean_id);
let base_name = root_id.split('.').next_back().unwrap_or(root_id);

// Puncs and Types can lookup Table boundaries
if realm == SchemaRealm::Type || realm == SchemaRealm::Punc {
if let Some(type_def) = self.types.get(base_name) {
if let Some(schema) = type_def.schemas.get(schema_id) {
return Some(schema.clone());
}
}
}

// All realms can intrinsically look up enumerations
if let Some(enum_def) = self.enums.get(base_name) {
if let Some(schema) = enum_def.schemas.get(schema_id) {
return Some(schema.clone());
}
}

None
}

/// Inspects the Postgres pg_constraint relations catalog to securely identify
/// the precise Foreign Key connecting a parent and child hierarchy path.
@@ -1,7 +1,7 @@
use crate::database::schema::Schema;
use serde::{Deserialize, Serialize};
use serde_json::Value;
use std::collections::BTreeMap;
use indexmap::IndexMap;
use std::sync::Arc;
use std::sync::OnceLock;

@@ -30,10 +30,10 @@ pub struct SchemaObject {

// Object Keywords
#[serde(skip_serializing_if = "Option::is_none")]
pub properties: Option<BTreeMap<String, Arc<Schema>>>,
pub properties: Option<IndexMap<String, Arc<Schema>>>,
#[serde(rename = "patternProperties")]
#[serde(skip_serializing_if = "Option::is_none")]
pub pattern_properties: Option<BTreeMap<String, Arc<Schema>>>,
pub pattern_properties: Option<IndexMap<String, Arc<Schema>>>,
#[serde(rename = "additionalProperties")]
#[serde(skip_serializing_if = "Option::is_none")]
pub additional_properties: Option<Arc<Schema>>,
@@ -46,7 +46,7 @@ pub struct SchemaObject {

// dependencies can be schema dependencies or property dependencies
#[serde(skip_serializing_if = "Option::is_none")]
pub dependencies: Option<BTreeMap<String, Dependency>>,
pub dependencies: Option<IndexMap<String, Dependency>>,

// Array Keywords
#[serde(rename = "items")]
@@ -147,7 +147,7 @@ pub struct SchemaObject {
#[serde(skip_serializing_if = "Option::is_none")]
pub control: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub actions: Option<BTreeMap<String, Action>>,
pub actions: Option<IndexMap<String, Action>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub computer: Option<String>,
#[serde(default)]
@@ -164,7 +164,7 @@ pub struct SchemaObject {

// Internal structural representation caching active AST Node maps. Unlike the Go framework counterpart, the JSPG implementation DOES natively include ALL ancestral inheritance boundary schemas because it compiles locally against the raw database graph.
#[serde(skip)]
pub compiled_properties: OnceLock<BTreeMap<String, Arc<Schema>>>,
pub compiled_properties: OnceLock<IndexMap<String, Arc<Schema>>>,

#[serde(rename = "compiledDiscriminator")]
#[serde(skip_deserializing)]
@@ -176,13 +176,13 @@ pub struct SchemaObject {
#[serde(skip_deserializing)]
#[serde(skip_serializing_if = "crate::database::object::is_once_lock_map_empty")]
#[serde(serialize_with = "crate::database::object::serialize_once_lock")]
pub compiled_options: OnceLock<BTreeMap<String, (Option<usize>, Option<String>)>>,
pub compiled_options: OnceLock<IndexMap<String, (Option<usize>, Option<String>)>>,

#[serde(rename = "compiledEdges")]
#[serde(skip_deserializing)]
#[serde(skip_serializing_if = "crate::database::object::is_once_lock_map_empty")]
#[serde(serialize_with = "crate::database::object::serialize_once_lock")]
pub compiled_edges: OnceLock<BTreeMap<String, crate::database::edge::Edge>>,
pub compiled_edges: OnceLock<IndexMap<String, crate::database::edge::Edge>>,

#[serde(skip)]
pub compiled_format: OnceLock<CompiledFormat>,
@@ -245,7 +245,7 @@ pub fn serialize_once_lock<T: serde::Serialize, S: serde::Serializer>(
}
}

pub fn is_once_lock_map_empty<K, V>(lock: &OnceLock<std::collections::BTreeMap<K, V>>) -> bool {
pub fn is_once_lock_map_empty<K, V>(lock: &OnceLock<indexmap::IndexMap<K, V>>) -> bool {
lock.get().map_or(true, |m| m.is_empty())
}
@@ -1,5 +1,6 @@
use crate::database::page::Page;
use crate::database::schema::Schema;
use indexmap::IndexMap;
use serde::{Deserialize, Serialize};
use std::sync::Arc;

@@ -18,5 +19,5 @@ pub struct Punc {
pub save: Option<String>,
pub page: Option<Page>,
#[serde(default)]
pub schemas: std::collections::BTreeMap<String, Arc<Schema>>,
pub schemas: IndexMap<String, Arc<Schema>>,
}
@@ -1,6 +0,0 @@
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum SchemaRealm {
Enum,
Type,
Punc,
}
@@ -22,6 +22,27 @@ impl std::ops::DerefMut for Schema {
}
}

impl Schema {
/// Returns true if the schema acts purely as a type pointer (composition without overriding constraints)
pub fn is_proxy(&self) -> bool {
self.obj.properties.is_none()
&& self.obj.pattern_properties.is_none()
&& self.obj.additional_properties.is_none()
&& self.obj.required.is_none()
&& self.obj.dependencies.is_none()
&& self.obj.items.is_none()
&& self.obj.prefix_items.is_none()
&& self.obj.contains.is_none()
&& self.obj.format.is_none()
&& self.obj.enum_.is_none()
&& self.obj.const_.is_none()
&& self.obj.cases.is_none()
&& self.obj.one_of.is_none()
&& self.obj.not.is_none()
&& self.obj.family.is_none()
}
}

impl<'de> Deserialize<'de> for Schema {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
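`is_proxy` drives the filter synthesis earlier in this change: children that merely point at another type keep a `<type>.filter` reference, while children that carry their own constraints get an inline filter. A simplified, hypothetical mirror of the check — the real `Schema` has many more optional keywords:

```rust
// Hypothetical three-field reduction of Schema::is_proxy: a node that sets
// none of its own constraints is a pure pointer to the type it names.
#[derive(Default)]
struct Node {
    properties: Option<Vec<String>>,
    items: Option<Box<Node>>,
    format: Option<String>,
}

impl Node {
    fn is_proxy(&self) -> bool {
        self.properties.is_none() && self.items.is_none() && self.format.is_none()
    }
}

fn main() {
    assert!(Node::default().is_proxy()); // bare type pointer
    let constrained = Node {
        format: Some("email".to_string()),
        ..Default::default()
    };
    assert!(!constrained.is_proxy()); // adds its own constraint
}
```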
@@ -1,4 +1,4 @@
use std::collections::HashSet;
use indexmap::{IndexMap, IndexSet};

use crate::database::schema::Schema;
use serde::{Deserialize, Serialize};
@@ -25,7 +25,7 @@ pub struct Type {
#[serde(default)]
pub hierarchy: Vec<String>,
#[serde(default)]
pub variations: HashSet<String>,
pub variations: IndexSet<String>,
#[serde(default)]
pub relationship: bool,
#[serde(default)]
@@ -39,5 +39,5 @@ pub struct Type {
pub default_fields: Vec<String>,
pub field_types: Option<Value>,
#[serde(default)]
pub schemas: std::collections::BTreeMap<String, Arc<Schema>>,
pub schemas: IndexMap<String, Arc<Schema>>,
}
11
src/lib.rs
@@ -7,6 +7,9 @@ pg_module_magic!();
#[cfg(test)]
pub struct JsonB(pub serde_json::Value);

#[cfg(test)]
pub struct Json(pub serde_json::Value);

pub mod database;
pub mod drop;
pub mod jspg;
@@ -41,7 +44,7 @@ fn jspg_failure() -> JsonB {
}

#[cfg_attr(not(test), pg_extern(strict))]
pub fn jspg_setup(database: JsonB) -> JsonB {
pub fn jspg_setup(database: Json) -> JsonB {
let (new_jspg, drop) = crate::jspg::Jspg::new(&database.0);
let new_arc = Arc::new(new_jspg);

@@ -109,7 +112,7 @@ pub fn jspg_validate(schema_id: &str, instance: JsonB) -> JsonB {
}

#[cfg_attr(not(test), pg_extern)]
pub fn jspg_database() -> JsonB {
pub fn jspg_database() -> Json {
let engine_opt = {
let lock = GLOBAL_JSPG.read().unwrap();
lock.clone()
@@ -120,9 +123,9 @@ pub fn jspg_database() -> JsonB {
let database_json = serde_json::to_value(&engine.database)
.unwrap_or(serde_json::Value::Object(serde_json::Map::new()));
let drop = crate::drop::Drop::success_with_val(database_json);
JsonB(serde_json::to_value(drop).unwrap())
Json(serde_json::to_value(drop).unwrap())
}
None => jspg_failure(),
None => Json(jspg_failure().0),
}
}
@@ -4,7 +4,6 @@
 pub mod cache;
 
 use crate::database::Database;
-use crate::database::realm::SchemaRealm;
 use crate::database::r#type::Type;
 use serde_json::Value;
 use std::sync::Arc;
@@ -25,7 +24,7 @@ impl Merger {
     pub fn merge(&self, schema_id: &str, data: Value) -> crate::drop::Drop {
         let mut notifications_queue = Vec::new();
 
-        let target_schema = match self.db.get_scoped_schema(SchemaRealm::Type, schema_id) {
+        let target_schema = match self.db.schemas.get(schema_id) {
             Some(s) => Arc::clone(&s),
             None => {
                 return crate::drop::Drop::with_errors(vec![crate::drop::Error {
@@ -41,7 +40,7 @@ impl Merger {
             }
         };
 
-        let result = self.merge_internal(target_schema, data, &mut notifications_queue);
+        let result = self.merge_internal(target_schema, data, &mut notifications_queue, None, false);
 
         let val_resolved = match result {
             Ok(val) => val,
@@ -135,9 +134,11 @@ impl Merger {
         mut schema: Arc<crate::database::schema::Schema>,
         data: Value,
         notifications: &mut Vec<String>,
+        parent_org_id: Option<String>,
+        is_child: bool,
     ) -> Result<Value, String> {
         match data {
-            Value::Array(items) => self.merge_array(schema, items, notifications),
+            Value::Array(items) => self.merge_array(schema, items, notifications, parent_org_id, is_child),
             Value::Object(map) => {
                 if let Some(options) = schema.obj.compiled_options.get() {
                     if let Some(disc) = schema.obj.compiled_discriminator.get() {
@@ -145,9 +146,7 @@ impl Merger {
                         if let Some(v) = val {
                             if let Some((idx_opt, target_id_opt)) = options.get(v) {
                                 if let Some(target_id) = target_id_opt {
-                                    if let Some(target_schema) =
-                                        self.db.get_scoped_schema(SchemaRealm::Type, target_id)
-                                    {
+                                    if let Some(target_schema) = self.db.schemas.get(target_id) {
                                         schema = target_schema.clone();
                                     } else {
                                         return Err(format!(
@@ -186,7 +185,7 @@ impl Merger {
                         }
                     }
                 }
-                self.merge_object(schema, map, notifications)
+                self.merge_object(schema, map, notifications, parent_org_id, is_child)
             }
             _ => Err("Invalid merge payload: root must be an Object or Array".to_string()),
         }
@@ -197,6 +196,8 @@ impl Merger {
         schema: Arc<crate::database::schema::Schema>,
         items: Vec<Value>,
         notifications: &mut Vec<String>,
+        parent_org_id: Option<String>,
+        is_child: bool,
     ) -> Result<Value, String> {
         let mut item_schema = schema.clone();
         if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &schema.obj.type_ {
@@ -209,7 +210,7 @@ impl Merger {
 
         let mut resolved_items = Vec::new();
         for item in items {
-            let resolved = self.merge_internal(item_schema.clone(), item, notifications)?;
+            let resolved = self.merge_internal(item_schema.clone(), item, notifications, parent_org_id.clone(), is_child)?;
             resolved_items.push(resolved);
         }
         Ok(Value::Array(resolved_items))
@@ -220,6 +221,8 @@ impl Merger {
         schema: Arc<crate::database::schema::Schema>,
         obj: serde_json::Map<String, Value>,
         notifications: &mut Vec<String>,
+        parent_org_id: Option<String>,
+        is_child: bool,
     ) -> Result<Value, String> {
         let queue_start = notifications.len();
 
@@ -279,6 +282,20 @@ impl Merger {
             }
         }
 
+        let mut current_org_id = None;
+        if let Some(compiled_props) = schema.obj.compiled_properties.get() {
+            if let Some(org_schema) = compiled_props.get("organization_id") {
+                if let Some(c) = &org_schema.obj.const_ {
+                    if let Some(c_str) = c.as_str() {
+                        current_org_id = Some(c_str.to_string());
+                    }
+                }
+            }
+        }
+        if current_org_id.is_none() {
+            current_org_id = parent_org_id.clone();
+        }
+
         let user_id = self.db.auth_user_id()?;
         let timestamp = self.db.timestamp()?;
 
@@ -293,6 +310,16 @@ impl Merger {
             entity_change_kind = kind;
             entity_fetched = fetched;
             entity_replaces = replaces;
+
+            if entity_change_kind.as_deref() == Some("create") {
+                if is_child {
+                    if !entity_fields.contains_key("organization_id") {
+                        if let Some(ref org_id) = current_org_id {
+                            entity_fields.insert("organization_id".to_string(), Value::String(org_id.clone()));
+                        }
+                    }
+                }
+            }
         }
 
         let mut entity_response = serde_json::Map::new();
@@ -313,17 +340,14 @@ impl Merger {
         if let Some(relation) = self.db.relations.get(&edge.constraint) {
             let parent_is_source = edge.forward;
 
+            let org_id_to_pass = entity_fields.get("organization_id").and_then(|v| v.as_str()).map(|s| s.to_string());
             if parent_is_source {
-                if !relative.contains_key("organization_id") {
-                    if let Some(org_id) = entity_fields.get("organization_id") {
-                        relative.insert("organization_id".to_string(), org_id.clone());
-                    }
-                }
-
                 let mut merged_relative = match self.merge_internal(
                     rel_schema.clone(),
                     Value::Object(relative),
                     notifications,
+                    org_id_to_pass.clone(),
+                    true,
                 )? {
                     Value::Object(m) => m,
                     _ => continue,
@@ -339,12 +363,6 @@ impl Merger {
                 );
                 entity_response.insert(relation_name, Value::Object(merged_relative));
             } else {
-                if !relative.contains_key("organization_id") {
-                    if let Some(org_id) = entity_fields.get("organization_id") {
-                        relative.insert("organization_id".to_string(), org_id.clone());
-                    }
-                }
-
                 Self::apply_entity_relation(
                     &mut relative,
                     &relation.source_columns,
@@ -356,6 +374,8 @@ impl Merger {
                     rel_schema.clone(),
                     Value::Object(relative),
                     notifications,
+                    org_id_to_pass.clone(),
+                    true,
                 )? {
                     Value::Object(m) => m,
                     _ => continue,
@@ -375,6 +395,16 @@ impl Merger {
             entity_change_kind = kind;
             entity_fetched = fetched;
             entity_replaces = replaces;
+
+            if entity_change_kind.as_deref() == Some("create") {
+                if is_child {
+                    if !entity_fields.contains_key("organization_id") {
+                        if let Some(ref org_id) = current_org_id {
+                            entity_fields.insert("organization_id".to_string(), Value::String(org_id.clone()));
+                        }
+                    }
+                }
+            }
         }
 
         self.merge_entity_fields(
@@ -402,15 +432,21 @@ impl Merger {
         if let Some(compiled_edges) = schema.obj.compiled_edges.get() {
             if let Some(edge) = compiled_edges.get(&relation_name) {
                 if let Some(relation) = self.db.relations.get(&edge.constraint) {
+                    let mut item_schema = rel_schema.clone();
+                    if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) =
+                        &rel_schema.obj.type_
+                    {
+                        if t == "array" {
+                            if let Some(items_def) = &rel_schema.obj.items {
+                                item_schema = items_def.clone();
+                            }
+                        }
+                    }
+
+                    let org_id_to_pass = entity_fields.get("organization_id").and_then(|v| v.as_str()).map(|s| s.to_string());
                     let mut relative_responses = Vec::new();
                     for relative_item_val in relative_arr {
                         if let Value::Object(mut relative_item) = relative_item_val {
-                            if !relative_item.contains_key("organization_id") {
-                                if let Some(org_id) = entity_fields.get("organization_id") {
-                                    relative_item.insert("organization_id".to_string(), org_id.clone());
-                                }
-                            }
-
                             Self::apply_entity_relation(
                                 &mut relative_item,
                                 &relation.source_columns,
@@ -418,21 +454,12 @@ impl Merger {
                                 &entity_fields,
                             );
 
-                            let mut item_schema = rel_schema.clone();
-                            if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) =
-                                &rel_schema.obj.type_
-                            {
-                                if t == "array" {
-                                    if let Some(items_def) = &rel_schema.obj.items {
-                                        item_schema = items_def.clone();
-                                    }
-                                }
-                            }
-
                            let merged_relative = match self.merge_internal(
-                                item_schema,
+                                item_schema.clone(),
                                 Value::Object(relative_item),
                                 notifications,
+                                org_id_to_pass.clone(),
+                                true,
                             )? {
                                 Value::Object(m) => m,
                                 _ => continue,
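The merger changes above thread `parent_org_id` and `is_child` through `merge_internal` so that a child entity being created inherits `organization_id`, either from a `const` property on its own schema or from the parent entity, while never overwriting a value the caller supplied. A condensed, std-only sketch of that rule (`resolve_org_id` is an illustrative name, not a function in this diff):

```rust
use std::collections::HashMap;

// Sketch of the inheritance rule added in merge_object: only a *child* entity
// being *created* inherits organization_id, preferring a schema const over the
// parent's value, and an explicitly supplied value is left untouched.
fn resolve_org_id(
    entity_fields: &mut HashMap<String, String>,
    const_org_id: Option<&str>,
    parent_org_id: Option<&str>,
    change_kind: &str,
    is_child: bool,
) {
    let current = const_org_id.or(parent_org_id);
    if change_kind == "create" && is_child && !entity_fields.contains_key("organization_id") {
        if let Some(org) = current {
            entity_fields.insert("organization_id".to_string(), org.to_string());
        }
    }
}

fn main() {
    // A created child with no explicit value inherits from the parent.
    let mut child = HashMap::new();
    resolve_org_id(&mut child, None, Some("org-1"), "create", true);
    assert_eq!(child.get("organization_id").map(String::as_str), Some("org-1"));

    // A top-level entity is left untouched.
    let mut top = HashMap::new();
    resolve_org_id(&mut top, None, Some("org-1"), "create", false);
    assert!(top.get("organization_id").is_none());
    println!("ok");
}
```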
@@ -1,5 +1,5 @@
 use crate::database::Database;
-use crate::database::realm::SchemaRealm;
+use indexmap::IndexMap;
 use std::sync::Arc;
 
 pub struct Compiler<'a> {
@@ -25,15 +25,11 @@ pub struct Node<'a> {
 impl<'a> Compiler<'a> {
     /// Compiles a JSON schema into a nested PostgreSQL query returning JSONB
     pub fn compile(&self, schema_id: &str, filter_keys: &[String]) -> Result<String, String> {
-        let realm = if schema_id.ends_with(".request") || schema_id.ends_with(".response") {
-            SchemaRealm::Punc
-        } else {
-            SchemaRealm::Type
-        };
-
         let schema = self
             .db
-            .get_scoped_schema(realm, schema_id)
+            .schemas
+            .get(schema_id)
             .cloned()
             .ok_or_else(|| format!("Schema not found: {}", schema_id))?;
 
         let target_schema = schema;
@@ -157,7 +153,7 @@ impl<'a> Compiler<'a> {
         if let Some(crate::database::object::SchemaTypeOrArray::Single(t)) = &node.schema.obj.type_ {
             if !crate::database::object::is_primitive_type(t) {
                 // If it's just an ad-hoc struct ref, we should resolve it
-                if let Some(target_schema) = self.db.get_scoped_schema(SchemaRealm::Type, t) {
+                if let Some(target_schema) = self.db.schemas.get(t).cloned() {
                     let mut ref_node = node.clone();
                     ref_node.schema = target_schema.clone();
                     ref_node.schema_id = Some(t.clone());
@@ -261,7 +257,7 @@ impl<'a> Compiler<'a> {
 
     fn compile_object(
         &mut self,
-        props: &std::collections::BTreeMap<String, std::sync::Arc<crate::database::schema::Schema>>,
+        props: &IndexMap<String, std::sync::Arc<crate::database::schema::Schema>>,
         node: Node<'a>,
     ) -> Result<(String, String), String> {
         let mut build_args = Vec::new();
@@ -312,7 +308,7 @@ impl<'a> Compiler<'a> {
 
         for (disc_val, (idx_opt, target_id_opt)) in options {
             if let Some(target_id) = target_id_opt {
-                if let Some(target_schema) = self.db.get_scoped_schema(SchemaRealm::Type, target_id) {
+                if let Some(target_schema) = self.db.schemas.get(target_id).cloned() {
                     let mut child_node = node.clone();
                     child_node.schema = target_schema.clone();
                     child_node.schema_id = Some(target_id.clone());
@@ -382,10 +378,7 @@ impl<'a> Compiler<'a> {
             return Ok(("NULL".to_string(), "string".to_string()));
         }
 
-        case_statements.sort();
-
         let sql = format!("CASE {} ELSE NULL END", case_statements.join(" "));
 
         Ok((sql, "object".to_string()))
     }
 
@@ -422,7 +415,7 @@ impl<'a> Compiler<'a> {
     ) -> Result<Vec<String>, String> {
         let mut select_args = Vec::new();
         let grouped_fields = r#type.grouped_fields.as_ref().and_then(|v| v.as_object());
-        let default_props = std::collections::BTreeMap::new();
+        let default_props = IndexMap::new();
         let merged_props = node
             .schema
             .obj
@@ -466,10 +459,24 @@ impl<'a> Compiler<'a> {
             .cloned()
             .unwrap_or_else(|| format!("{}_t_err", node.parent_alias));
 
+        let mut lookup_key = prop_key.as_str();
+
+        if let Some(edges) = node.schema.obj.compiled_edges.get() {
+            if let Some(edge) = edges.get(prop_key) {
+                if let Some(relation) = self.db.relations.get(&edge.constraint) {
+                    if edge.forward {
+                        lookup_key = &relation.source_columns[0];
+                    } else {
+                        lookup_key = &relation.destination_columns[0];
+                    }
+                }
+            }
+        }
+
         if let Some(gf) = grouped_fields {
             for (t_name, fields_val) in gf {
                 if let Some(fields_arr) = fields_val.as_array() {
-                    if fields_arr.iter().any(|v| v.as_str() == Some(prop_key)) {
+                    if fields_arr.iter().any(|v| v.as_str() == Some(lookup_key)) {
                         owner_alias = table_aliases
                             .get(t_name)
                             .cloned()
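The `lookup_key` hunk above fixes table-ownership resolution for relation edges: `grouped_fields` lists the underlying database column (for example a foreign-key column), not the edge's property name, so the lookup must translate the edge name to its column first. A std-only sketch of that translation (types and names here are illustrative, modeled on the diff's `edge.forward` / `source_columns` / `destination_columns`):

```rust
// Minimal stand-in for the relation metadata used by the compiler.
struct Relation {
    source_columns: Vec<&'static str>,
    destination_columns: Vec<&'static str>,
}

// For a forward edge the FK column lives on the source side; for a reverse
// edge it lives on the destination side; a plain column keeps its own name.
fn lookup_key<'a>(prop_key: &'a str, edge: Option<(bool, &'a Relation)>) -> &'a str {
    match edge {
        Some((true, rel)) => rel.source_columns[0],
        Some((false, rel)) => rel.destination_columns[0],
        None => prop_key,
    }
}

fn main() {
    let rel = Relation {
        source_columns: vec!["target_id"],
        destination_columns: vec!["id"],
    };
    assert_eq!(lookup_key("target", Some((true, &rel))), "target_id");
    assert_eq!(lookup_key("target", Some((false, &rel))), "id");
    assert_eq!(lookup_key("name", None), "name");
    println!("ok");
}
```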
@@ -1247,6 +1247,36 @@ fn test_const_17_1() {
     crate::tests::runner::run_test_case(&path, 17, 1).unwrap();
 }
 
+#[test]
+fn test_dynamic_type_0_0() {
+    let path = format!("{}/fixtures/dynamicType.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 0).unwrap();
+}
+
+#[test]
+fn test_dynamic_type_0_1() {
+    let path = format!("{}/fixtures/dynamicType.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 1).unwrap();
+}
+
+#[test]
+fn test_dynamic_type_0_2() {
+    let path = format!("{}/fixtures/dynamicType.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 2).unwrap();
+}
+
+#[test]
+fn test_dynamic_type_0_3() {
+    let path = format!("{}/fixtures/dynamicType.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 3).unwrap();
+}
+
+#[test]
+fn test_dynamic_type_0_4() {
+    let path = format!("{}/fixtures/dynamicType.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 4).unwrap();
+}
+
 #[test]
 fn test_property_names_0_0() {
     let path = format!("{}/fixtures/propertyNames.json", env!("CARGO_MANIFEST_DIR"));
@@ -1589,6 +1619,24 @@ fn test_polymorphism_5_2() {
     crate::tests::runner::run_test_case(&path, 5, 2).unwrap();
 }
 
+#[test]
+fn test_polymorphism_6_0() {
+    let path = format!("{}/fixtures/polymorphism.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 6, 0).unwrap();
+}
+
+#[test]
+fn test_polymorphism_6_1() {
+    let path = format!("{}/fixtures/polymorphism.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 6, 1).unwrap();
+}
+
+#[test]
+fn test_polymorphism_6_2() {
+    let path = format!("{}/fixtures/polymorphism.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 6, 2).unwrap();
+}
+
 #[test]
 fn test_not_0_0() {
     let path = format!("{}/fixtures/not.json", env!("CARGO_MANIFEST_DIR"));
@@ -8140,3 +8188,9 @@ fn test_merger_0_14() {
     let path = format!("{}/fixtures/merger.json", env!("CARGO_MANIFEST_DIR"));
     crate::tests::runner::run_test_case(&path, 0, 14).unwrap();
 }
+
+#[test]
+fn test_merger_0_15() {
+    let path = format!("{}/fixtures/merger.json", env!("CARGO_MANIFEST_DIR"));
+    crate::tests::runner::run_test_case(&path, 0, 15).unwrap();
+}
393	src/tests/formatter.rs	Normal file
@@ -0,0 +1,393 @@
use sqlparser::ast::{
    BinaryOperator, Expr, Function, FunctionArg, Join, JoinConstraint, JoinOperator,
    Query, Select, SelectItem, SetExpr, Statement, TableWithJoins, Value
};
use sqlparser::dialect::PostgreSqlDialect;
use sqlparser::parser::Parser;

pub struct SqlFormatter {
    pub lines: Vec<String>,
    pub indent: usize,
}

impl SqlFormatter {
    pub fn new() -> Self {
        Self {
            lines: Vec::new(),
            indent: 0,
        }
    }

    pub fn format(sql: &str) -> Vec<String> {
        let dialect = PostgreSqlDialect {};
        let ast = match Parser::parse_sql(&dialect, sql) {
            Ok(ast) => ast,
            Err(e) => {
                println!("DEBUG PARSE SQL ERROR: {:?}", e);
                return vec![sql.to_string()];
            }
        };

        if ast.is_empty() {
            return vec![sql.to_string()];
        }

        let mut formatter = SqlFormatter::new();
        formatter.format_statement(&ast[0]);
        formatter.lines
    }

    fn push_str(&mut self, s: &str) {
        if self.lines.is_empty() {
            self.lines.push(format!("{}{}", " ".repeat(self.indent), s.replace("JSONB", "jsonb")));
        } else {
            let last = self.lines.last_mut().unwrap();
            last.push_str(&s.replace("JSONB", "jsonb"));
        }
    }

    fn push_line(&mut self, s: &str) {
        self.lines.push(format!("{}{}", " ".repeat(self.indent), s.replace("JSONB", "jsonb")));
    }

    fn format_statement(&mut self, stmt: &Statement) {
        match stmt {
            Statement::Query(query) => {
                self.push_line("(");
                self.format_query(query);
                self.push_str(")");
            }
            Statement::Update(_update) => {
                let sql = stmt.to_string();
                self.format_update_fallback(&sql);
            }
            _ => {
                let sql = stmt.to_string();
                if sql.starts_with("INSERT") {
                    self.format_insert_fallback(&sql);
                } else {
                    self.push_line(&sql);
                }
            }
        }
    }

    fn format_insert_fallback(&mut self, sql: &str) {
        let s = sql.to_string();
        if let Some(values_idx) = s.find(" VALUES (") {
            let prefix = &s[..values_idx];
            let suffix = &s[values_idx + 9..];

            if let Some(paren_idx) = prefix.find(" (") {
                self.push_line(&format!("{} (", &prefix[..paren_idx]));
                self.indent += 2;
                let cols = &prefix[paren_idx + 2..prefix.len() - 1];
                let cols_split: Vec<&str> = cols.split(", ").collect();
                for (i, col) in cols_split.iter().enumerate() {
                    let comma = if i < cols_split.len() - 1 { "," } else { "" };
                    let c = col.replace("\"", "");
                    self.push_line(&format!("\"{}\"{}", c, comma));
                }
                self.indent -= 2;
                self.push_line(")");
            } else {
                self.push_line(prefix);
            }

            self.push_line("VALUES (");
            self.indent += 2;

            let vals = if suffix.ends_with(")") { &suffix[..suffix.len() - 1] } else { suffix };
            let mut val_tokens = Vec::new();
            let mut curr = String::new();
            let mut in_str = false;
            for c in vals.chars() {
                if c == '\'' {
                    in_str = !in_str;
                    curr.push(c);
                } else if c == ',' && !in_str {
                    val_tokens.push(curr.trim().to_string());
                    curr = String::new();
                } else {
                    curr.push(c);
                }
            }
            if !curr.trim().is_empty() {
                val_tokens.push(curr.trim().to_string());
            }

            for (i, val) in val_tokens.iter().enumerate() {
                let comma = if i < val_tokens.len() - 1 { "," } else { "" };

                if val.starts_with("'{") && val.ends_with("}'") {
                    let inner = &val[1..val.len() - 1];
                    // Unescape single quotes from SQL strings
                    let unescaped = inner.replace("''", "'");
                    if let Ok(json) = serde_json::from_str::<serde_json::Value>(&unescaped) {
                        if let Ok(pretty) = serde_json::to_string_pretty(&json) {
                            let lines: Vec<&str> = pretty.split('\n').collect();
                            self.push_line("'{");
                            self.indent += 2;
                            for (j, line) in lines.iter().skip(1).enumerate() {
                                if j == lines.len() - 2 {
                                    self.indent -= 2;
                                    // re-escape single quotes for SQL
                                    self.push_line(&format!("{}'{}", line.replace("'", "''"), comma));
                                } else {
                                    self.push_line(&line.replace("'", "''"));
                                }
                            }
                            continue;
                        }
                    }
                }

                self.push_line(&format!("{}{}", val, comma));
            }
            self.indent -= 2;
            self.push_line(")");
        } else {
            self.push_line(&s);
        }
    }

    fn format_update_fallback(&mut self, sql: &str) {
        let s = sql.to_string();
        if let Some(set_idx) = s.find(" SET ") {
            self.push_line(&format!("{} SET", &s[..set_idx]));
            self.indent += 2;

            let after_set = &s[set_idx + 5..];
            let where_idx = after_set.find(" WHERE ");
            let assigns = if let Some(w) = where_idx { &after_set[..w] } else { after_set };
            let assigns_split: Vec<&str> = assigns.split(", ").collect();
            for (i, assign) in assigns_split.iter().enumerate() {
                let comma = if i < assigns_split.len() - 1 { "," } else { "" };
                self.push_line(&format!("{}{}", assign.replace("\"", ""), comma));
            }
            self.indent -= 2;

            if let Some(w) = where_idx {
                self.push_line("WHERE");
                self.indent += 2;
                self.push_line(&after_set[w + 7..]);
                self.indent -= 2;
            }
        } else {
            self.push_line(&s);
        }
    }

    fn format_query(&mut self, query: &Query) {
        match &*query.body {
            SetExpr::Select(select) => self.format_select(select),
            SetExpr::Query(inner_query) => {
                self.push_str("(");
                self.format_query(inner_query);
                self.push_str(")");
            }
            _ => self.push_str(&query.to_string()),
        }
    }

    fn format_select(&mut self, select: &Select) {
        self.push_str("SELECT ");
        for (i, p) in select.projection.iter().enumerate() {
            let comma = if i < select.projection.len() - 1 { ", " } else { "" };
            self.format_select_item(p);
            self.push_str(comma);
        }

        if !select.from.is_empty() {
            self.push_line("FROM ");
            for (i, table) in select.from.iter().enumerate() {
                let comma = if i < select.from.len() - 1 { ", " } else { "" };
                self.format_table_with_joins(table);
                self.push_str(comma);
            }

            if let Some(selection) = &select.selection {
                self.push_line("WHERE");
                self.indent += 2;
                self.push_line(""); // new line for where clauses
                self.format_expr(selection);
                self.indent -= 2;
            }
        }
    }

    fn format_select_item(&mut self, item: &SelectItem) {
        match item {
            SelectItem::UnnamedExpr(expr) => self.format_expr(expr),
            SelectItem::ExprWithAlias { expr, alias } => {
                self.format_expr(expr);
                self.push_str(&format!(" AS {}", alias));
            }
            _ => self.push_str(&item.to_string()),
        }
    }

    fn format_table_with_joins(&mut self, table: &TableWithJoins) {
        self.push_str(&table.relation.to_string());
        for join in &table.joins {
            self.push_line("");
            self.format_join(join);
        }
    }

    fn format_join(&mut self, join: &Join) {
        let op = match &join.join_operator {
            JoinOperator::Inner(_) => "JOIN",
            JoinOperator::LeftOuter(_) => "LEFT JOIN",
            _ => "JOIN",
        };
        self.push_str(&format!("{} {} ON ", op, join.relation));

        match &join.join_operator {
            JoinOperator::Inner(JoinConstraint::On(expr)) => self.format_expr(expr),
            JoinOperator::LeftOuter(JoinConstraint::On(expr)) => self.format_expr(expr),
            JoinOperator::Join(JoinConstraint::On(expr)) => self.format_expr(expr),
            _ => {
                println!("FALLBACK JOIN OP: {:?}", join.join_operator);
            }
        }
    }

    fn format_expr(&mut self, expr: &Expr) {
        match expr {
            Expr::Function(func) => self.format_function(func),
            Expr::BinaryOp { left, op, right } => {
                if *op == BinaryOperator::And || *op == BinaryOperator::Or {
                    self.format_expr(left);
                    self.push_line(&format!("{} ", op));
                    self.format_expr(right);
                } else {
                    self.format_expr(left);
                    self.push_str(&format!(" {} ", op));
                    self.format_expr(right);
                }
            }
            Expr::Nested(inner) => {
                self.push_str("(");
                self.format_expr(inner);
                self.push_str(")");
            }
            Expr::IsNull(inner) => {
                self.format_expr(inner);
                self.push_str(" IS NULL");
            }
            Expr::IsNotNull(inner) => {
                self.format_expr(inner);
                self.push_str(" IS NOT NULL");
            }
            Expr::Subquery(query) => {
                self.push_str("(");
                self.indent += 2;
                self.push_line("");
                self.format_query(query);
                self.indent -= 2;
                self.push_line(")");
            }
            Expr::Case { operand, conditions, else_result, .. } => {
                self.push_str("CASE");
                if let Some(op) = operand {
                    self.push_str(" ");
                    self.format_expr(op);
                }
                self.indent += 2;
                for when in conditions {
                    self.push_line("WHEN ");
                    self.format_expr(&when.condition);
                    self.push_str(" THEN ");
                    self.format_expr(&when.result);
                }
                if let Some(els) = else_result {
                    self.push_line("ELSE ");
                    self.format_expr(els);
                }
                self.indent -= 2;
                self.push_line("END");
            }
            Expr::UnaryOp { op, expr: inner } => {
                self.push_str(&format!("{} ", op));
                self.format_expr(inner);
            }

            Expr::Value(sqlparser::ast::ValueWithSpan { value: Value::SingleQuotedString(s), .. }) | Expr::Value(sqlparser::ast::ValueWithSpan { value: Value::EscapedStringLiteral(s), .. }) => {
                if s.starts_with('{') && s.ends_with('}') {
                    if let Ok(json) = serde_json::from_str::<serde_json::Value>(s) {
                        if let Ok(pretty) = serde_json::to_string_pretty(&json) {
                            let lines: Vec<&str> = pretty.split('\n').collect();
                            self.push_str("'{");
                            self.indent += 2;
                            for (j, line) in lines.iter().skip(1).enumerate() {
                                if j == lines.len() - 2 {
                                    self.indent -= 2;
                                    self.push_line(&format!("{}'", line.replace("'", "''")));
                                } else {
                                    self.push_line(&line.replace("'", "''"));
                                }
                            }
                            return;
                        }
                    }
                }
                self.push_str(&expr.to_string());
            }
            _ => {
                self.push_str(&expr.to_string());
            }
        }
    }

    fn format_function(&mut self, func: &Function) {
        let name = func.name.to_string();
        self.push_str(&format!("{}(", name));

        if let sqlparser::ast::FunctionArguments::List(list) = &func.args {
            if name == "jsonb_build_object" {
                self.indent += 2;
                self.push_line("");
                let mut i = 0;
                while i < list.args.len() {
                    let arg_key = &list.args[i];
                    let arg_val = if i + 1 < list.args.len() { Some(&list.args[i+1]) } else { None };

                    self.format_function_arg(arg_key);
                    self.push_str(", ");
                    if let Some(val) = arg_val {
                        self.format_function_arg(val);
                    }

                    if i + 2 < list.args.len() {
                        self.push_str(",");
                        self.push_line("");
                    }
                    i += 2;
                }
                self.indent -= 2;
                self.push_line(")");
            } else {
                for (i, arg) in list.args.iter().enumerate() {
                    let comma = if i < list.args.len() - 1 { ", " } else { "" };
                    self.format_function_arg(arg);
                    self.push_str(comma);
                }
                self.push_str(")");
            }
        } else {
            self.push_str(")");
        }
    }

    fn format_function_arg(&mut self, arg: &FunctionArg) {
        match arg {
            FunctionArg::Unnamed(sqlparser::ast::FunctionArgExpr::Expr(expr)) => self.format_expr(expr),
            _ => {
                println!("FALLBACK ARG: {:?}", arg);
                self.push_str(&arg.to_string());
            }
        }
    }
}
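The INSERT fallback in `format_insert_fallback` above splits the VALUES list on commas, but only while outside a single-quoted SQL string, so embedded JSON payloads with commas stay intact. That tokenizer, extracted as a standalone std-only sketch (the name `split_values` is illustrative):

```rust
// Split a SQL VALUES list on commas that are not inside a '...' literal.
fn split_values(vals: &str) -> Vec<String> {
    let mut tokens = Vec::new();
    let mut curr = String::new();
    let mut in_str = false;
    for c in vals.chars() {
        if c == '\'' {
            in_str = !in_str; // toggle on every quote, as the formatter does
            curr.push(c);
        } else if c == ',' && !in_str {
            tokens.push(curr.trim().to_string());
            curr.clear();
        } else {
            curr.push(c);
        }
    }
    if !curr.trim().is_empty() {
        tokens.push(curr.trim().to_string());
    }
    tokens
}

fn main() {
    // The comma inside the quoted literal does not split the token.
    let toks = split_values("1, 'a, b', NULL");
    assert_eq!(toks, vec!["1", "'a, b'", "NULL"]);
    println!("{:?}", toks);
}
```

Note that toggling on every `'` means SQL-escaped quotes (`''`) flip the state twice, which happens to keep the tokenizer consistent for balanced literals.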
@@ -1,4 +1,5 @@
 use crate::*;
+pub mod formatter;
 pub mod runner;
 pub mod types;
 use serde_json::json;
@@ -72,7 +73,7 @@ fn test_library_api() {
         ]
     });
 
-    let cache_drop = jspg_setup(JsonB(db_json));
+    let cache_drop = jspg_setup(Json(db_json));
     assert_eq!(
         cache_drop.0,
         json!({
@@ -127,7 +128,7 @@ fn test_library_api() {
                 "forward": true
             }
         },
-        "compiledPropertyNames": ["name", "target", "type"],
+        "compiledPropertyNames": ["type", "name", "target"],
         "properties": {
             "name": { "type": "string" },
             "target": {
@@ -140,19 +141,19 @@ fn test_library_api() {
             "type": "object"
         },
         "source_schema.filter": {
-            "compiledPropertyNames": ["$and", "$or", "name", "target", "type"],
+            "compiledPropertyNames": ["type", "name", "target", "$and", "$or"],
             "properties": {
                 "$and": {
                     "type": ["array", "null"],
                     "items": {
-                        "compiledPropertyNames": ["$and", "$or", "name", "target", "type"],
+                        "compiledPropertyNames": ["type", "name", "target", "$and", "$or"],
                         "type": "source_schema.filter"
                     }
                 },
                 "$or": {
                     "type": ["array", "null"],
                     "items": {
-                        "compiledPropertyNames": ["$and", "$or", "name", "target", "type"],
+                        "compiledPropertyNames": ["type", "name", "target", "$and", "$or"],
                         "type": "source_schema.filter"
                     }
                 },
@@ -193,19 +194,19 @@ fn test_library_api() {
             "type": "object"
         },
         "target_schema.filter": {
-            "compiledPropertyNames": ["$and", "$or", "value"],
+            "compiledPropertyNames": ["value", "$and", "$or"],
             "properties": {
                 "$and": {
                     "type": ["array", "null"],
                     "items": {
-                        "compiledPropertyNames": ["$and", "$or", "value"],
+                        "compiledPropertyNames": ["value", "$and", "$or"],
                         "type": "target_schema.filter"
                     }
                 },
                 "$or": {
                     "type": ["array", "null"],
                     "items": {
-                        "compiledPropertyNames": ["$and", "$or", "value"],
+                        "compiledPropertyNames": ["value", "$and", "$or"],
                        "type": "target_schema.filter"
                    }
                },
@@ -127,7 +127,7 @@ pub fn run_test_case(path: &str, suite_idx: usize, case_idx: usize) -> Result<()
                 }
             }
             "merge" => {
-                let result = test.run_merge(db_unwrapped.unwrap());
+                let result = test.run_merge(db_unwrapped.unwrap(), path, suite_idx, case_idx);
                 if let Err(e) = result {
                     println!("TEST MERGE ERROR FOR '{}': {}", test.description, e);
                     failures.push(format!(
@@ -137,7 +137,7 @@ pub fn run_test_case(path: &str, suite_idx: usize, case_idx: usize) -> Result<()
                 }
             }
             "query" => {
-                let result = test.run_query(db_unwrapped.unwrap());
+                let result = test.run_query(db_unwrapped.unwrap(), path, suite_idx, case_idx);
                 if let Err(e) = result {
                     println!("TEST QUERY ERROR FOR '{}': {}", test.description, e);
                     failures.push(format!(
@@ -160,3 +160,83 @@ pub fn run_test_case(path: &str, suite_idx: usize, case_idx: usize) -> Result<()
 
     Ok(())
 }
+
+pub fn extract_uuids(val: &Value, path: &str, map: &mut HashMap<String, String>) {
+    let uuid_re = regex::Regex::new(r"^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$").unwrap();
+
+    match val {
+        Value::Object(obj) => {
+            for (k, v) in obj {
+                let new_path = if path.is_empty() { k.clone() } else { format!("{}.{}", path, k) };
+                extract_uuids(v, &new_path, map);
+            }
+        }
+        Value::Array(arr) => {
+            for (i, v) in arr.iter().enumerate() {
+                let new_path = if path.is_empty() { i.to_string() } else { format!("{}.{}", path, i) };
+                extract_uuids(v, &new_path, map);
+            }
+        }
+        Value::String(s) => {
+            if s != "00000000-0000-0000-0000-000000000000" && uuid_re.is_match(s) {
+                map.insert(s.clone(), path.to_string());
+            }
+        }
+        _ => {}
+    }
+}
+
+pub fn canonicalize_with_map(s: &str, uuid_map: &HashMap<String, String>, gen_map: &mut HashMap<String, usize>) -> String {
+    let uuid_re = regex::Regex::new(r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}").unwrap();
+    let s1 = uuid_re.replace_all(s, |caps: &regex::Captures| {
+        let val = &caps[0];
+        if val == "00000000-0000-0000-0000-000000000000" {
+            val.to_string()
+        } else if let Some(path) = uuid_map.get(val) {
+            format!("{{{{uuid:{}}}}}", path)
+        } else {
+            let next_idx = gen_map.len();
+            let idx = *gen_map.entry(val.to_string()).or_insert(next_idx);
+            format!("{{{{uuid:generated_{}}}}}", idx)
+        }
+    });
+
+    let ts_re = regex::Regex::new(r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:\.\d{1,6})?(?:Z|\+\d{2}(?::\d{2})?)?").unwrap();
+    ts_re.replace_all(&s1, "{{timestamp}}").to_string()
+}
pub fn update_sql_fixture(path: &str, suite_idx: usize, case_idx: usize, queries: &[String]) {
|
||||
use crate::tests::formatter::SqlFormatter;
|
||||
let content = fs::read_to_string(path).unwrap();
|
||||
let mut file_data: Value = serde_json::from_str(&content).unwrap();
|
||||
|
||||
let mut uuid_map = HashMap::new();
|
||||
if let Some(test_case) = file_data.get(suite_idx).and_then(|s| s.get("tests")).and_then(|t| t.get(case_idx)) {
|
||||
if let Some(data) = test_case.get("data") {
|
||||
extract_uuids(data, "data", &mut uuid_map);
|
||||
}
|
||||
if let Some(mocks) = test_case.get("mocks") {
|
||||
extract_uuids(mocks, "mocks", &mut uuid_map);
|
||||
}
|
||||
}
|
||||
|
||||
let mut gen_map = HashMap::new();
|
||||
|
||||
let mut formatted_sql = Vec::new();
|
||||
for q in queries {
|
||||
let res = SqlFormatter::format(q);
|
||||
let mapped_res: Vec<String> = res.into_iter().map(|l| canonicalize_with_map(&l, &uuid_map, &mut gen_map)).collect();
|
||||
formatted_sql.push(mapped_res);
|
||||
}
|
||||
|
||||
if let Some(expect) = file_data[suite_idx]["tests"][case_idx].get_mut("expect") {
|
||||
if let Some(obj) = expect.as_object_mut() {
|
||||
obj.remove("pattern");
|
||||
obj.insert("sql".to_string(), serde_json::json!(formatted_sql));
|
||||
}
|
||||
}
|
||||
|
||||
// To preserve original formatting, we just use serde_json pretty output
|
||||
let formatted_json = serde_json::to_string_pretty(&file_data).unwrap();
|
||||
fs::write(path, formatted_json).unwrap();
|
||||
}
|
||||
|
||||
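Two small idioms in the helpers above are worth isolating: the dotted-path rule `extract_uuids` uses while walking the JSON tree, and the way `canonicalize_with_map` assigns a stable index to each previously unseen UUID by reading `len()` *before* the `entry` call. A minimal std-only sketch (the helper names `child_path` and `generated_index` are illustrative, not part of the codebase):

```rust
use std::collections::HashMap;

// Path rule: the root passes an empty path, children append with a dot.
fn child_path(path: &str, key: &str) -> String {
    if path.is_empty() { key.to_string() } else { format!("{}.{}", path, key) }
}

// Index rule: len() is captured before the entry call, so a first-seen value
// gets the next free index and a repeated value keeps its original index.
fn generated_index(gen_map: &mut HashMap<String, usize>, val: &str) -> usize {
    let next_idx = gen_map.len();
    *gen_map.entry(val.to_string()).or_insert(next_idx)
}

fn main() {
    assert_eq!(child_path("", "data"), "data");
    assert_eq!(child_path("data", "0"), "data.0");
    assert_eq!(child_path("data.0", "id"), "data.0.id");

    let mut gen_map = HashMap::new();
    assert_eq!(generated_index(&mut gen_map, "aaa"), 0);
    assert_eq!(generated_index(&mut gen_map, "bbb"), 1);
    assert_eq!(generated_index(&mut gen_map, "aaa"), 0); // stable on repeat
}
```

Because `or_insert` only writes when the key is absent, re-canonicalizing the same SQL always yields the same `{{uuid:generated_N}}` placeholders.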
@@ -75,7 +75,7 @@ impl Case {
        Ok(())
    }

    pub fn run_merge(&self, db: Arc<Database>) -> Result<(), String> {
    pub fn run_merge(&self, db: Arc<Database>, path: &str, suite_idx: usize, case_idx: usize) -> Result<(), String> {
        if let Some(mocks) = &self.mocks {
            if let Some(arr) = mocks.as_array() {
                db.executor.set_mocks(arr.clone());
@@ -94,7 +94,10 @@ impl Case {
            } else if result.errors.is_empty() {
                // Only assert SQL if merge succeeded
                let queries = db.executor.get_queries();
                expect.assert_pattern(&queries).and_then(|_| expect.assert_sql(&queries))
                if std::env::var("UPDATE_EXPECT").is_ok() {
                    crate::tests::runner::update_sql_fixture(path, suite_idx, case_idx, &queries);
                }
                expect.assert_sql(&queries)
            } else {
                Ok(())
            }
@@ -106,7 +109,7 @@ impl Case {
        return_val
    }

    pub fn run_query(&self, db: Arc<Database>) -> Result<(), String> {
    pub fn run_query(&self, db: Arc<Database>, path: &str, suite_idx: usize, case_idx: usize) -> Result<(), String> {
        if let Some(mocks) = &self.mocks {
            if let Some(arr) = mocks.as_array() {
                db.executor.set_mocks(arr.clone());
@@ -123,7 +126,10 @@ impl Case {
            Err(format!("Query {}", e))
        } else if result.errors.is_empty() {
            let queries = db.executor.get_queries();
            expect.assert_pattern(&queries).and_then(|_| expect.assert_sql(&queries))
            if std::env::var("UPDATE_EXPECT").is_ok() {
                crate::tests::runner::update_sql_fixture(path, suite_idx, case_idx, &queries);
            }
            expect.assert_sql(&queries)
        } else {
            Ok(())
        }
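Both `run_merge` and `run_query` gate fixture rewriting on the mere presence of the `UPDATE_EXPECT` environment variable: `env::var(...).is_ok()` is true for any value, including an empty string. A minimal sketch of that gate (`should_update` is an illustrative helper, not a function in the codebase):

```rust
use std::env;

// True whenever UPDATE_EXPECT is set at all; the value itself is ignored.
fn should_update() -> bool {
    env::var("UPDATE_EXPECT").is_ok()
}

fn main() {
    // set_var/remove_var are `unsafe fn` on the Rust 2024 edition; on older
    // editions the blocks merely trigger an unused_unsafe warning.
    unsafe { env::remove_var("UPDATE_EXPECT") };
    assert!(!should_update());

    unsafe { env::set_var("UPDATE_EXPECT", "") }; // even an empty value counts
    assert!(should_update());
}
```

This is the usual golden-file workflow: run the suite once with `UPDATE_EXPECT=1` to regenerate `expect.sql`, then rerun without it so `assert_sql` verifies against the committed fixtures.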
@@ -35,12 +35,7 @@ impl Expect {
            if expected_val.is_object() && expected_val.as_object().unwrap().is_empty() {
                continue; // A `{}` means we just wanted to test it was collected/promoted, skip deep match
            }
            let schema_realm = if key.ends_with(".request") || key.ends_with(".response") {
                crate::database::realm::SchemaRealm::Punc
            } else {
                crate::database::realm::SchemaRealm::Type
            };
            let actual_ast = db.get_scoped_schema(schema_realm, key).unwrap();
            let actual_ast = db.schemas.get(key).cloned().unwrap();
            let actual_val = serde_json::to_value(actual_ast).unwrap();

            if actual_val != *expected_val {
@@ -15,6 +15,7 @@ pub struct ValidationContext<'a> {
    pub extensible: bool,
    pub reporter: bool,
    pub overrides: HashSet<String>,
    pub parent: Option<&'a serde_json::Value>,
}

impl<'a> ValidationContext<'a> {
@@ -38,6 +39,7 @@ impl<'a> ValidationContext<'a> {
            extensible: effective_extensible,
            reporter,
            overrides,
            parent: None,
        }
    }

@@ -57,6 +59,7 @@ impl<'a> ValidationContext<'a> {
        overrides: HashSet<String>,
        extensible: bool,
        reporter: bool,
        parent_instance: Option<&'a serde_json::Value>,
    ) -> Self {
        let effective_extensible = schema.extensible.unwrap_or(extensible);

@@ -70,6 +73,7 @@ impl<'a> ValidationContext<'a> {
            extensible: effective_extensible,
            reporter,
            overrides,
            parent: parent_instance,
        }
    }

@@ -81,6 +85,7 @@ impl<'a> ValidationContext<'a> {
            HashSet::new(),
            self.extensible,
            reporter,
            self.parent,
        )
    }
@@ -10,7 +10,6 @@ pub use error::ValidationError;
pub use result::ValidationResult;

use crate::database::Database;
use crate::database::realm::SchemaRealm;
use crate::validator::rules::util::is_integer;
use serde_json::Value;
use std::sync::Arc;
@@ -43,11 +42,7 @@ impl Validator {
    }

    pub fn validate(&self, schema_id: &str, instance: &Value) -> crate::drop::Drop {
        let schema_opt = if schema_id.ends_with(".request") || schema_id.ends_with(".response") {
            self.db.get_scoped_schema(SchemaRealm::Punc, schema_id)
        } else {
            self.db.get_scoped_schema(SchemaRealm::Type, schema_id)
        };
        let schema_opt = self.db.schemas.get(schema_id);

        if let Some(schema) = schema_opt {
            let ctx = ValidationContext::new(
@@ -57,6 +57,7 @@ impl<'a> ValidationContext<'a> {
            HashSet::new(),
            self.extensible,
            false,
            Some(self.instance),
        );

        let check = derived.validate()?;
@@ -108,6 +109,7 @@ impl<'a> ValidationContext<'a> {
                HashSet::new(),
                self.extensible,
                false,
                Some(self.instance),
            );
            let item_res = derived.validate()?;
            result.merge(item_res);
@@ -137,6 +139,7 @@ impl<'a> ValidationContext<'a> {
                HashSet::new(),
                self.extensible,
                false,
                Some(self.instance),
            );
            let item_res = derived.validate()?;
            result.merge(item_res);
@@ -12,6 +12,7 @@ pub mod numeric;
pub mod object;
pub mod polymorphism;
pub mod string;
pub mod r#type;
pub mod util;

impl<'a> ValidationContext<'a> {
@@ -28,7 +29,7 @@ impl<'a> ValidationContext<'a> {
        if !self.validate_family(&mut result)? {
            return Ok(result);
        }
        if !self.validate_type_inheritance(&mut result)? {
        if !self.validate_type(&mut result)? {
            return Ok(result);
        }
@@ -191,6 +191,7 @@ impl<'a> ValidationContext<'a> {
                HashSet::new(),
                next_extensible,
                false,
                Some(self.instance),
            );
            let item_res = derived.validate()?;

@@ -220,6 +221,7 @@ impl<'a> ValidationContext<'a> {
                HashSet::new(),
                next_extensible,
                false,
                Some(self.instance),
            );
            let item_res = derived.validate()?;
            result.merge(item_res);
@@ -265,6 +267,7 @@ impl<'a> ValidationContext<'a> {
                HashSet::new(),
                next_extensible,
                false,
                Some(self.instance),
            );
            let item_res = derived.validate()?;
            result.merge(item_res);
@@ -1,7 +1,7 @@
use crate::validator::context::ValidationContext;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;
use crate::database::realm::SchemaRealm;
use indexmap::IndexMap;

impl<'a> ValidationContext<'a> {
    pub(crate) fn validate_family(
@@ -66,7 +66,7 @@ impl<'a> ValidationContext<'a> {

    pub(crate) fn execute_polymorph(
        &self,
        options: &std::collections::BTreeMap<String, (Option<usize>, Option<String>)>,
        options: &IndexMap<String, (Option<usize>, Option<String>)>,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        // 1. O(1) Fast-Path Router & Extractor
@@ -100,8 +100,8 @@ impl<'a> ValidationContext<'a> {
        if let Some(val) = instance_val {
            if let Some((idx_opt, target_id_opt)) = options.get(&val) {
                if let Some(target_id) = target_id_opt {
                    if let Some(target_schema) = self.db.get_scoped_schema(SchemaRealm::Type, target_id) {
                        let derived = self.derive_for_schema(&target_schema, false);
                    if let Some(target_schema) = self.db.schemas.get(target_id) {
                        let derived = self.derive_for_schema(target_schema, false);
                        let sub_res = derived.validate()?;
                        let is_valid = sub_res.is_valid();
                        result.merge(sub_res);
@@ -177,78 +177,4 @@ impl<'a> ValidationContext<'a> {
            return Ok(false);
        }
    }

    pub(crate) fn validate_type_inheritance(
        &self,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        // Core inheritance logic replaces legacy routing
        let payload_primitive = match self.instance {
            serde_json::Value::Null => "null",
            serde_json::Value::Bool(_) => "boolean",
            serde_json::Value::Number(n) => {
                if n.is_i64() || n.is_u64() {
                    "integer"
                } else {
                    "number"
                }
            }
            serde_json::Value::String(_) => "string",
            serde_json::Value::Array(_) => "array",
            serde_json::Value::Object(_) => "object",
        };

        let mut custom_types = Vec::new();
        match &self.schema.type_ {
            Some(crate::database::object::SchemaTypeOrArray::Single(t)) => {
                if !crate::database::object::is_primitive_type(t) {
                    custom_types.push(t.clone());
                }
            }
            Some(crate::database::object::SchemaTypeOrArray::Multiple(arr)) => {
                if arr.contains(&payload_primitive.to_string())
                    || (payload_primitive == "integer" && arr.contains(&"number".to_string()))
                {
                    // It natively matched a primitive in the array options, skip forcing custom proxy fallback
                } else {
                    for t in arr {
                        if !crate::database::object::is_primitive_type(t) {
                            custom_types.push(t.clone());
                        }
                    }
                }
            }
            None => {}
        }

        for t in custom_types {
            if let Some(global_schema) = self.db.get_scoped_schema(SchemaRealm::Type, &t) {
                let mut new_overrides = self.overrides.clone();
                if let Some(props) = &self.schema.properties {
                    new_overrides.extend(props.keys().map(|k| k.to_string()));
                }

                let mut shadow = self.derive(
                    &global_schema,
                    self.instance,
                    &self.path,
                    new_overrides,
                    self.extensible,
                    true, // Reporter mode
                );
                shadow.root = &global_schema;
                result.merge(shadow.validate()?);
            } else {
                result.errors.push(ValidationError {
                    code: "INHERITANCE_RESOLUTION_FAILED".to_string(),
                    message: format!(
                        "Inherited entity pointer '{}' was not found in schema registry",
                        t
                    ),
                    path: self.path.to_string(),
                });
            }
        }
        Ok(true)
    }
}
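The `BTreeMap` → `IndexMap` swap in `execute_polymorph` is about iteration order: `BTreeMap` always yields keys in sorted order, discarding the order in which the polymorph options were declared, while `IndexMap` (an external crate, so only the `BTreeMap` half is demonstrated here) iterates in insertion order. A small std-only sketch of the behaviour being replaced:

```rust
use std::collections::BTreeMap;

fn main() {
    let mut opts: BTreeMap<&str, usize> = BTreeMap::new();
    // Options declared in this order...
    opts.insert("zebra", 0);
    opts.insert("apple", 1);

    // ...but BTreeMap iterates in key-sorted order, losing declaration order.
    let keys: Vec<&str> = opts.keys().copied().collect();
    assert_eq!(keys, vec!["apple", "zebra"]);
}
```

With `IndexMap` the same two inserts would iterate as `["zebra", "apple"]`, which matters whenever option precedence follows declaration order. (This also matches the `preserve_order` feature enabled on `serde_json` in this changeset, which swaps its internal map to `indexmap`.)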
138  src/validator/rules/type.rs  Normal file
@@ -0,0 +1,138 @@
use crate::validator::context::ValidationContext;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;

impl<'a> ValidationContext<'a> {
    pub(crate) fn validate_type(
        &self,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        let payload_primitive = match self.instance {
            serde_json::Value::Null => "null",
            serde_json::Value::Bool(_) => "boolean",
            serde_json::Value::Number(n) => {
                if n.is_i64() || n.is_u64() {
                    "integer"
                } else {
                    "number"
                }
            }
            serde_json::Value::String(_) => "string",
            serde_json::Value::Array(_) => "array",
            serde_json::Value::Object(_) => "object",
        };

        let mut custom_types = Vec::new();
        match &self.schema.type_ {
            Some(crate::database::object::SchemaTypeOrArray::Single(t)) => {
                if !crate::database::object::is_primitive_type(t) {
                    custom_types.push(t.clone());
                }
            }
            Some(crate::database::object::SchemaTypeOrArray::Multiple(arr)) => {
                if arr.contains(&payload_primitive.to_string())
                    || (payload_primitive == "integer" && arr.contains(&"number".to_string()))
                {
                    // It natively matched a primitive in the array options, skip forcing custom proxy fallback
                } else {
                    for t in arr {
                        if !crate::database::object::is_primitive_type(t) {
                            custom_types.push(t.clone());
                        }
                    }
                }
            }
            None => {}
        }

        for t in custom_types {
            let mut target_id = t.clone();

            // 1. DYNAMIC TYPE (Composition)
            if t.starts_with('$') {
                let parts: Vec<&str> = t.split('.').collect();
                let var_name = &parts[0][1..]; // Remove the $ prefix
                let suffix = if parts.len() > 1 {
                    format!(".{}", parts[1..].join("."))
                } else {
                    String::new()
                };

                let mut resolved = false;
                if let Some(parent) = self.parent {
                    if let Some(obj) = parent.as_object() {
                        if let Some(val) = obj.get(var_name) {
                            if let Some(str_val) = val.as_str() {
                                target_id = format!("{}{}", str_val, suffix);
                                resolved = true;
                            }
                        }
                    }
                }

                if !resolved {
                    result.errors.push(ValidationError {
                        code: "DYNAMIC_TYPE_RESOLUTION_FAILED".to_string(),
                        message: format!(
                            "Dynamic type pointer '{}' could not resolve discriminator property '{}' on parent instance",
                            t, var_name
                        ),
                        path: self.path.to_string(),
                    });
                    continue;
                }
            }

            // 2. Fetch and apply
            if let Some(global_schema) = self.db.schemas.get(&target_id) {
                let mut new_overrides = self.overrides.clone();
                if let Some(props) = &self.schema.properties {
                    new_overrides.extend(props.keys().map(|k| k.to_string()));
                }

                let mut shadow = self.derive(
                    &global_schema,
                    self.instance,
                    &self.path,
                    new_overrides,
                    self.extensible,
                    true, // Reporter mode
                    self.parent,
                );
                shadow.root = &global_schema;
                result.merge(shadow.validate()?);
            } else {
                // 3. Error handling pathways
                if t.starts_with('$') {
                    result.errors.push(ValidationError {
                        code: "DYNAMIC_TYPE_RESOLUTION_FAILED".to_string(),
                        message: format!(
                            "Resolved dynamic type pointer '{}' was not found in schema registry",
                            target_id
                        ),
                        path: self.path.to_string(),
                    });
                } else if self.schema.is_proxy() {
                    result.errors.push(ValidationError {
                        code: "PROXY_TYPE_RESOLUTION_FAILED".to_string(),
                        message: format!(
                            "Composed proxy entity pointer '{}' was not found in schema registry",
                            target_id
                        ),
                        path: self.path.to_string(),
                    });
                } else {
                    result.errors.push(ValidationError {
                        code: "INHERITANCE_RESOLUTION_FAILED".to_string(),
                        message: format!(
                            "Inherited entity pointer '{}' was not found in schema registry",
                            target_id
                        ),
                        path: self.path.to_string(),
                    });
                }
            }
        }
        Ok(true)
    }
}
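The dynamic-type branch of `validate_type` parses a pointer like `$kind.request` into the discriminator property name (`kind`) and a suffix (`.request`) that is re-appended to the value read off the parent instance. A minimal stand-alone sketch of that split (`split_pointer` is an illustrative helper, not the project's API):

```rust
// Split "$var.suffix" into the discriminator name and the dotted suffix.
fn split_pointer(t: &str) -> (String, String) {
    let parts: Vec<&str> = t.split('.').collect();
    let var_name = parts[0][1..].to_string(); // drop the leading '$'
    let suffix = if parts.len() > 1 {
        format!(".{}", parts[1..].join("."))
    } else {
        String::new()
    };
    (var_name, suffix)
}

fn main() {
    assert_eq!(split_pointer("$kind"), ("kind".to_string(), String::new()));
    assert_eq!(
        split_pointer("$kind.request"),
        ("kind".to_string(), ".request".to_string())
    );

    // With a parent instance of {"kind": "user"}, the resolved target id is:
    let (_, suffix) = split_pointer("$kind.request");
    assert_eq!(format!("{}{}", "user", suffix), "user.request");
}
```

If the discriminator property is missing or non-string on the parent, resolution fails and the code above reports `DYNAMIC_TYPE_RESOLUTION_FAILED` instead of guessing.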
@@ -1,81 +0,0 @@
    Finished `test` profile [unoptimized + debuginfo] target(s) in 0.43s
     Running unittests src/lib.rs (target/debug/deps/jspg-d3f18ff3a7e2b386)

running 11 tests
test tests::test_minimum_0_2 ... ok
test tests::test_minimum_1_4 ... ok
test tests::test_minimum_1_0 ... FAILED
test tests::test_minimum_1_1 ... FAILED
test tests::test_minimum_0_3 ... FAILED
test tests::test_minimum_1_5 ... ok
test tests::test_minimum_1_3 ... FAILED
test tests::test_minimum_0_0 ... FAILED
test tests::test_minimum_0_1 ... FAILED
test tests::test_minimum_1_2 ... FAILED
test tests::test_minimum_1_6 ... FAILED

failures:

---- tests::test_minimum_1_0 stdout ----
TEST VALIDATE ERROR FOR 'negative above the minimum is valid': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_1_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]

thread 'tests::test_minimum_1_0' (110318318) panicked at src/tests/fixtures.rs:3503:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation with signed integer] Validate Test 'negative above the minimum is valid' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_1_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

---- tests::test_minimum_1_1 stdout ----
TEST VALIDATE ERROR FOR 'positive above the minimum is valid': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_1_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]

thread 'tests::test_minimum_1_1' (110318319) panicked at src/tests/fixtures.rs:3509:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation with signed integer] Validate Test 'positive above the minimum is valid' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_1_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"

---- tests::test_minimum_0_3 stdout ----
TEST VALIDATE ERROR FOR 'ignores non-numbers': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_0_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]

thread 'tests::test_minimum_0_3' (110318317) panicked at src/tests/fixtures.rs:3497:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation] Validate Test 'ignores non-numbers' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_0_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"

---- tests::test_minimum_1_3 stdout ----
TEST VALIDATE ERROR FOR 'boundary point with float is valid': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_1_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]

thread 'tests::test_minimum_1_3' (110318321) panicked at src/tests/fixtures.rs:3521:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation with signed integer] Validate Test 'boundary point with float is valid' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_1_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"

---- tests::test_minimum_0_0 stdout ----
TEST VALIDATE ERROR FOR 'above the minimum is valid': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_0_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]

thread 'tests::test_minimum_0_0' (110318314) panicked at src/tests/fixtures.rs:3479:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation] Validate Test 'above the minimum is valid' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_0_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"

---- tests::test_minimum_0_1 stdout ----
TEST VALIDATE ERROR FOR 'boundary point is valid': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_0_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]

thread 'tests::test_minimum_0_1' (110318315) panicked at src/tests/fixtures.rs:3485:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation] Validate Test 'boundary point is valid' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_0_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"

---- tests::test_minimum_1_2 stdout ----
TEST VALIDATE ERROR FOR 'boundary point is valid': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_1_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]

thread 'tests::test_minimum_1_2' (110318320) panicked at src/tests/fixtures.rs:3515:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation with signed integer] Validate Test 'boundary point is valid' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_1_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"

---- tests::test_minimum_1_6 stdout ----
TEST VALIDATE ERROR FOR 'ignores non-numbers': Expected success: true, Got: false. Actual Errors: [Error { code: "SCHEMA_NOT_FOUND", message: "Schema minimum_1_0 not found", details: ErrorDetails { path: Some("/"), cause: None, context: None, schema: None } }]

thread 'tests::test_minimum_1_6' (110318324) panicked at src/tests/fixtures.rs:3539:54:
called `Result::unwrap()` on an `Err` value: "[minimum validation with signed integer] Validate Test 'ignores non-numbers' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"SCHEMA_NOT_FOUND\", message: \"Schema minimum_1_0 not found\", details: ErrorDetails { path: Some(\"/\"), cause: None, context: None, schema: None } }]"


failures:
    tests::test_minimum_0_0
    tests::test_minimum_0_1
    tests::test_minimum_0_3
    tests::test_minimum_1_0
    tests::test_minimum_1_1
    tests::test_minimum_1_2
    tests::test_minimum_1_3
    tests::test_minimum_1_6

test result: FAILED. 3 passed; 8 failed; 0 ignored; 0 measured; 1347 filtered out; finished in 0.00s

error: test failed, to rerun pass `--lib`
@@ -1,23 +0,0 @@
   Compiling jspg v0.1.0 (/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg)
    Finished `test` profile [unoptimized + debuginfo] target(s) in 7.59s
     Running unittests src/lib.rs (target/debug/deps/jspg-d3f18ff3a7e2b386)

running 1 test
test tests::test_merge_0_0 ... FAILED

failures:

---- tests::test_merge_0_0 stdout ----
TEST VALIDATE ERROR FOR 'valid with both properties': Expected success: true, Got: false. Actual Errors: [Error { code: "MISSING_TYPE", message: "Schema mechanically requires type discrimination 'base_0'", details: ErrorDetails { path: Some(""), cause: None, context: None, schema: None } }]

thread 'tests::test_merge_0_0' (110369726) panicked at src/tests/fixtures.rs:4307:54:
called `Result::unwrap()` on an `Err` value: "[merging: properties accumulate] Validate Test 'valid with both properties' failed. Error: Expected success: true, Got: false. Actual Errors: [Error { code: \"MISSING_TYPE\", message: \"Schema mechanically requires type discrimination 'base_0'\", details: ErrorDetails { path: Some(\"\"), cause: None, context: None, schema: None } }]"
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace


failures:
    tests::test_merge_0_0

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 1357 filtered out; finished in 0.00s

error: test failed, to rerun pass `--lib`