Compare commits
26 Commits
8e50d4852d ... 1.0.57
| SHA1 |
|---|
| bb263190f6 |
| 737c0b04ac |
| 2d1fca599b |
| 8d0369c672 |
| 566b599512 |
| e7f20e2cb6 |
| 3898c43742 |
| e14f53e7d9 |
| 960a99034a |
| 81388149e8 |
| b8b3f7a501 |
| bc5489b1ea |
| 7b55277116 |
| ed636b05a4 |
| 2aec2da2fd |
| ad78896f72 |
| 55b93d9957 |
| 7ec6e09ae0 |
| 9d9c6d2c06 |
| 12e952fa94 |
| 776a912098 |
| 612188a54b |
| 29c5160b49 |
| 944675d669 |
| 53a40d1099 |
| e55977c11b |
49 .agent/workflows/jspg.md Normal file
@@ -0,0 +1,49 @@
```markdown
---
description: jspg work preparation
---

This workflow will get you up to speed on JSPG, the custom JSON-Schema-based `cargo pgrx` Postgres validation extension. Everything you read will be in the jspg directory/project.

Read over this entire workflow and commit to every section of work in a task list, so that you don't stop halfway through before reviewing all of the directories and files mentioned. Do not ask for confirmation after generating this task list; proceed through all sections in your list.

Please analyze the files and directories; do not use cat, find, or the terminal to discover or read any of these files. Analyze every file mentioned. If a directory or a /* wildcard is mentioned, analyze the directory, every single file at its root, and recursively every subdirectory and every single file in every subdirectory, capturing not just critical files but the entirety of what is requested. To restate: do NOT review just a cherry-picked subset of files in any folder or wildcard specified. Review 100% of all files discovered recursively!

Section 1: Documentation

- GEMINI.md at the root

Section 2: Flow file for cmd interface

- flow at the root

Section 3: Source

- src/*

Section 4: Test Fixtures

- Just review some of the *.json files in tests/fixtures/*

Section 5: Build

- build.rs

Section 6: Cargo TOML

- Cargo.toml

Section 7: Some PUNC Syntax

Now, review some punc type and enum source in the api project (api/), specifically these files:

- punc/sql/tables.sql
- punc/sql/domains.sql
- punc/sql/indexes.sql
- punc/sql/functions/entity.sql
- punc/sql/functions/puncs.sql
- punc/sql/puncs/entity.sql
- punc/sql/puncs/persons.sql
- punc/sql/puncs/puncs.sql
- punc/sql/puncs/job.sql

Now you are ready to help me work on this extension.
```
1 Cargo.lock generated
```diff
@@ -817,6 +817,7 @@ dependencies = [
 "chrono",
 "fluent-uri",
 "idna",
+"indexmap",
 "json-pointer",
 "lazy_static",
 "once_cell",
```
Cargo.toml
```diff
@@ -19,6 +19,7 @@ percent-encoding = "2.3.2"
 uuid = { version = "1.20.0", features = ["v4", "serde"] }
 chrono = { version = "0.4.43", features = ["serde"] }
 json-pointer = "0.3.4"
+indexmap = { version = "2.13.0", features = ["serde"] }

 [dev-dependencies]
 pgrx-tests = "0.16.1"
```
94 GEMINI.md
```diff
@@ -9,7 +9,7 @@ It is designed to serve as the validation engine for the "Punc" architecture, wh
 1. **Draft 2020-12 Compliance**: Attempt to adhere to the official JSON Schema Draft 2020-12 specification.
 2. **Ultra-Fast Validation**: Compile schemas into an optimized in-memory representation for near-instant validation during high-throughput workloads.
 3. **Connection-Bound Caching**: Leverage the PostgreSQL session lifecycle to maintain a per-connection schema cache, eliminating the need for repetitive parsing.
-4. **Structural Inheritance**: Support object-oriented schema design via Implicit Keyword Shadowing and virtual `.family` schemas.
+4. **Structural Inheritance**: Support object-oriented schema design via Implicit Keyword Shadowing and virtual `$family` references.
 5. **Punc Integration**: validation is aware of the "Punc" context (request/response) and can validate `cue` objects efficiently.

 ## 🔌 API Reference
```
```diff
@@ -18,7 +18,7 @@ The extension exposes the following functions to PostgreSQL:

 ### `cache_json_schemas(enums jsonb, types jsonb, puncs jsonb) -> jsonb`

-Loads and compiles the entire schema registry into the session's memory.
+Loads and compiles the entire schema registry into the session's memory, atomically replacing the previous validator.

 * **Inputs**:
   * `enums`: Array of enum definitions.
```
```diff
@@ -27,10 +27,21 @@ Loads and compiles the entire schema registry into the session's memory.
 * **Behavior**:
   * Parses all inputs into an internal schema graph.
   * Resolves all internal references (`$ref`).
-  * Generates virtual `.family` schemas for type hierarchies.
+  * Generates virtual union schemas for type hierarchies referenced via `$family`.
   * Compiles schemas into validators.
 * **Returns**: `{"response": "success"}` or an error object.

+### `mask_json_schema(schema_id text, instance jsonb) -> jsonb`
+
+Validates a JSON instance and returns a new JSON object with unknown properties removed (pruned) based on the schema.
+
+* **Inputs**:
+  * `schema_id`: The `$id` of the schema to mask against.
+  * `instance`: The JSON data to mask.
+* **Returns**:
+  * On success: A `Drop` containing the **masked data**.
+  * On failure: A `Drop` containing validation errors.
+
```
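The pruning behavior described for `mask_json_schema` can be sketched with plain string maps. This is an illustrative model only, not JSPG's jsonb implementation; the `mask` helper and field names are hypothetical.

```rust
use std::collections::{BTreeMap, BTreeSet};

// Hypothetical sketch: keep only the keys the schema declares, drop the rest.
fn mask(
    instance: &BTreeMap<String, String>,
    declared: &BTreeSet<String>,
) -> BTreeMap<String, String> {
    instance
        .iter()
        .filter(|(k, _)| declared.contains(*k)) // undeclared keys are pruned
        .map(|(k, v)| (k.clone(), v.clone()))
        .collect()
}

fn main() {
    let declared: BTreeSet<String> = ["id", "name"].iter().map(|s| s.to_string()).collect();
    let mut instance = BTreeMap::new();
    instance.insert("id".to_string(), "42".to_string());
    instance.insert("debug_flag".to_string(), "true".to_string()); // undeclared
    let masked = mask(&instance, &declared);
    assert!(masked.contains_key("id"));
    assert!(!masked.contains_key("debug_flag"));
}
```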
```diff
 ### `validate_json_schema(schema_id text, instance jsonb) -> jsonb`

 Validates a JSON instance against a pre-compiled schema.
```
```diff
@@ -56,30 +67,64 @@ Returns a debug dump of the currently cached schemas (for development/debugging)

 ## ✨ Custom Features & Deviations

-JSPG implements specific extensions to the Draft 2020-12 standard to support the Punc architecture's object-oriented needs.
+JSPG implements specific extensions to the Draft 2020-12 standard to support the Punc architecture's object-oriented needs while heavily optimizing for zero-runtime lookups.

-### 1. Implicit Keyword Shadowing
-Standard JSON Schema composition (`allOf`) is additive (Intersection), meaning constraints can only be tightened, not replaced. However, JSPG treats `$ref` differently when it appears alongside other properties to support object-oriented inheritance.
+### 1. Polymorphism & Referencing (`$ref`, `$family`, and Native Types)

-* **Inheritance (`$ref` + `properties`)**: When a schema uses `$ref` *and* defines its own properties, JSPG implements **Smart Merge** (or Shadowing). If a property is defined in the current schema, its constraints take precedence over the inherited constraints for that specific keyword.
-  * *Example*: If `Entity` defines `type: { const: "entity" }` and `Person` (which refs Entity) defines `type: { const: "person" }`, validation passes for "person". The local `const` shadows the inherited `const`.
-  * *Granularity*: Shadowing is per-keyword. If `Entity` defined `type: { const: "entity", minLength: 5 }`, `Person` would shadow `const` but still inherit `minLength: 5`.
+JSPG replaces the complex, dynamic reference resolution logic of standard JSON Schema (e.g., `$defs`, relative URIs, `$dynamicRef`, `$dynamicAnchor`, `anyOf`) with a strict, explicitly structured global `$id` system. This powers predictable code generation and blazing-fast runtime validation.

-* **Composition (`allOf`)**: When using `allOf`, standard intersection rules apply. No shadowing occurs; all constraints from all branches must pass. This is used for mixins or interfaces.
+#### A. Global `$id` Conventions & Schema Buckets
+Every schema is part of a flat, globally addressable namespace. However, where a schema is defined in the database determines its physical boundaries:
+* **Types (Entities)**: Schemas defined within a Postgres `type` represent entities. The `$id` must be exactly the type name (`person`) or suffixed (`full.person`). All schemas in this bucket receive strict Native Type Discrimination based on the physical table hierarchy.
+* **Puncs (APIs)**: Schemas defined within a `punc` are ad-hoc containers. The `$id` must be exactly `[punc_name].request` or `[punc_name].response`. They are never entities themselves.
+* **Enums (Domains)**: Schemas defined within an `enum` represent enum definitions. The `$id` must be exactly the enum name (`job_status`) or suffixed (`short.job_status`).
```
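The Punc bucket naming rule above is mechanical enough to sketch. The helper name is hypothetical; it only illustrates the `[punc_name].request` / `[punc_name].response` convention.

```rust
// Hypothetical check for the Punc bucket `$id` convention described above.
fn is_valid_punc_id(id: &str) -> bool {
    match id.rsplit_once('.') {
        Some((name, role)) => !name.is_empty() && (role == "request" || role == "response"),
        None => false,
    }
}

fn main() {
    assert!(is_valid_punc_id("create_person.request"));
    assert!(is_valid_punc_id("create_person.response"));
    assert!(!is_valid_punc_id("person"));   // an entity id, not a punc id
    assert!(!is_valid_punc_id(".request")); // punc name must be non-empty
}
```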
```diff
-### 2. Virtual Family Schemas (`.family`)
-To support polymorphic fields (e.g., a field that accepts any "User" type), JSPG generates virtual schemas representing type hierarchies.
+#### B. Native Type Discrimination (The `variations` Property)
+Because `jspg` knows which schemas are Entities based on their origin bucket (Types), it securely and implicitly manages the `"type"` property by attaching `compiled_variations`.
+If a schema originates in the `user` bucket, the validator does *not* rigidly require `{"type": "user"}`. Instead, it queries the physical Postgres type inheritance graph (e.g. `[entity, organization, user]`) and allows the JSON to be `{"type": "person"}` or `{"type": "bot"}` automatically, enabling seamless API polymorphism.
```
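The discrimination rule reduces to a membership test against the pre-computed variations. A minimal sketch, with an illustrative hierarchy and a hypothetical helper name:

```rust
// Sketch: the instance's "type" must be a member of the compiled variations,
// rather than matching the bucket name exactly.
fn type_allowed(variations: &[&str], instance_type: &str) -> bool {
    variations.contains(&instance_type)
}

fn main() {
    let variations = ["entity", "organization", "user"]; // illustrative hierarchy
    assert!(type_allowed(&variations, "organization"));
    assert!(!type_allowed(&variations, "widget"));
}
```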
```diff
-* **Mechanism**: When caching types, if a type defines a `hierarchy` (e.g., `["entity", "organization", "person"]`), JSPG generates a schema like `organization.family` which is a `oneOf` containing refs to all valid descendants.
+#### C. Structural Inheritance & Viral Infection (`$ref`)
+`$ref` is used exclusively for structural inheritance.
+* **Viral Infection**: If an anonymous schema or an ad-hoc schema (like a Punc Request) `$ref`s a strict Entity schema (like `person.light`), it *virally inherits* the `compiled_variations` of that target. This means a Punc request instantly gains the exact polymorphic security boundaries of the Entity it points to.
+* **`$ref` never creates a Union.** When you use `$ref`, you are asking for a single, concrete struct/shape.

-### 3. Strict by Default & Extensibility
+#### D. Shape Polymorphism & Virtual Unions (`$family`)
+To support polymorphic API contracts (e.g., heterogeneous arrays of generic widgets) without manually writing massive `oneOf` blocks, JSPG provides the `$family` macro.
+While `$ref` defines rigid structure, `$family` relies on an abstract **Descendants Graph**.
+
+During compilation, `jspg` temporarily tracks every `$ref` pointer globally to build a reverse-lookup graph of "Descendants". It also calculates the **Inheritance Depth** of every schema (how far removed it is from the root entity).
+When `{"$family": "widget"}` is encountered, JSPG:
+1. Locates the `widget` schema in the Descendants graph.
+2. Expands the macro by finding *every* schema in the entire database that structurally `$ref`s `widget`, directly or indirectly (e.g., `stock.widget`, an anonymous object, etc.).
+3. Evaluates the incoming JSON against **every** descendant schema in that family *strictly*.
+
+If you request `{"$family": "light.widget"}`, it simply evaluates all schemas that `$ref` the generic abstract `light.widget` interface.
```
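The Descendants expansion can be sketched as a transitive reverse-`$ref` lookup. This is a simplified model (each schema has at most one `$ref` target here, and the names are illustrative), not JSPG's actual compiler code.

```rust
use std::collections::{HashMap, HashSet};

// Sketch: given "child $refs parent" edges, collect every direct or
// indirect descendant of `root` by iterating to a fixed point.
fn descendants(refs: &HashMap<&str, &str>, root: &str) -> HashSet<String> {
    let mut out: HashSet<String> = HashSet::new();
    let mut changed = true;
    while changed {
        changed = false;
        for (child, parent) in refs {
            let reaches = *parent == root || out.contains(*parent);
            if reaches && out.insert((*child).to_string()) {
                changed = true;
            }
        }
    }
    out
}

fn main() {
    let refs = HashMap::from([
        ("stock.widget", "widget"),
        ("fancy.widget", "stock.widget"), // indirect descendant of widget
        ("person", "entity"),
    ]);
    let fam = descendants(&refs, "widget");
    assert!(fam.contains("stock.widget"));
    assert!(fam.contains("fancy.widget"));
    assert!(!fam.contains("person"));
}
```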
```diff
+#### E. Strict Matches & The Depth Heuristic
+JSPG strictly enforces that polymorphic structures (`oneOf`, or a `$family` expansion) match **exactly one** valid schema permutation. It does not support fuzzy matching (`anyOf`).
+If a JSON payload matches more than one schema in a union (which happens frequently due to implicit inheritance where an object might technically satisfy the requirements of both `entity` and `user`), JSPG automatically applies the **Depth Heuristic Tie-Breaker**:
+* It looks up the pre-calculated Inheritance Depth for all valid passing candidates.
+* It selects the candidate that is **deepest** in the inheritance tree (the most explicitly defined descendant).
+* If multiple passing candidates tie at the exact same depth level, an `AMBIGUOUS` error is thrown, forcing the developer to supply a more precise type discriminator or payload.
+
+This cleanly separates **Database Physics** (derived from the Postgres `Types` bucket and viral `$ref` inheritance) from **Structural Polymorphism** (derived purely from the abstract `$ref` tree).
```
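The tie-breaker itself is a small selection rule. A minimal sketch with hypothetical names and illustrative depths (not the extension's real error type):

```rust
use std::collections::HashMap;

// Sketch of the Depth Heuristic Tie-Breaker: among passing candidates,
// pick the deepest; a tie at the maximum depth is AMBIGUOUS.
fn pick_deepest<'a>(
    depths: &HashMap<&'a str, u32>,
    passing: &[&'a str],
) -> Result<&'a str, &'static str> {
    let max = passing
        .iter()
        .filter_map(|s| depths.get(s))
        .copied()
        .max()
        .ok_or("NO_MATCH")?;
    let winners: Vec<&'a str> = passing
        .iter()
        .copied()
        .filter(|s| depths.get(s) == Some(&max))
        .collect();
    if winners.len() == 1 {
        Ok(winners[0])
    } else {
        Err("AMBIGUOUS")
    }
}

fn main() {
    let depths = HashMap::from([("entity", 0u32), ("organization", 1), ("user", 2)]);
    assert_eq!(pick_deepest(&depths, &["entity", "user"]), Ok("user"));
    let tied = HashMap::from([("a", 1u32), ("b", 1u32)]);
    assert_eq!(pick_deepest(&tied, &["a", "b"]), Err("AMBIGUOUS"));
}
```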
```diff
+### 2. Strict by Default & Extensibility
 JSPG enforces a "Secure by Default" philosophy. All schemas are treated as if `unevaluatedProperties: false` (and `unevaluatedItems: false`) is set, unless explicitly overridden.

-* **Strictness**: By default, any property in the instance data that is not explicitly defined in the schema causes a validation error. This prevents clients from sending undeclared fields.
-* **Extensibility (`extensible: true`)**: To allow additional, undefined properties, you must add `"extensible": true` to the schema. This is useful for types that are designed to be open for extension.
+* **Strictness**: By default, any property or array item in the instance data that is not explicitly defined in the schema causes a validation error. This prevents clients from sending undeclared fields or extra array elements.
+* **Extensibility (`extensible: true`)**: To allow a free-for-all of additional, undefined properties or extra array items, you must add `"extensible": true` to the schema. This globally disables the strictness check for that object or array, useful for types designed to be completely open.
+* **Structured Additional Properties (`additionalProperties: {...}`)**: Instead of a boolean free-for-all, you can define `additionalProperties` as a schema object (e.g., `{"type": "string"}`). This maintains strictness (no arbitrary keys) but allows any extra keys as long as their values match the defined structure.
+* **Ref Boundaries**: Strictness is reset when crossing `$ref` boundaries. The referenced schema's strictness is determined by its own definition (strict by default unless `extensible: true`), ignoring the caller's state.
+* **Inheritance**: Strictness is inherited. A schema extending a strict parent will also be strict unless it declares itself `extensible: true`. Conversely, a schema extending a loose parent will also be loose unless it declares itself `extensible: false`.
```
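The core strictness rule, ignoring the `additionalProperties`-as-schema case, can be sketched as a key check that `extensible: true` bypasses. The helper name is hypothetical:

```rust
use std::collections::HashSet;

// Sketch of "strict by default": undeclared keys fail validation unless
// the schema is marked extensible.
fn check_strict(
    declared: &HashSet<&str>,
    keys: &[&str],
    extensible: bool,
) -> Result<(), String> {
    if extensible {
        return Ok(()); // strictness check disabled for this object
    }
    for k in keys {
        if !declared.contains(k) {
            return Err(format!("undeclared property: {}", k));
        }
    }
    Ok(())
}

fn main() {
    let declared = HashSet::from(["id", "name"]);
    assert!(check_strict(&declared, &["id"], false).is_ok());
    assert!(check_strict(&declared, &["id", "rogue"], false).is_err());
    assert!(check_strict(&declared, &["id", "rogue"], true).is_ok());
}
```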
```diff
+
+### 3. Implicit Keyword Shadowing
+Standard JSON Schema composition (`allOf`) is additive (Intersection), meaning constraints can only be tightened, not replaced. However, JSPG treats `$ref` differently when it appears alongside other properties to support object-oriented inheritance.
+
+* **Inheritance (`$ref` + properties)**: When a schema uses `$ref` and defines its own properties, JSPG implements Smart Merge (or Shadowing). If a property is defined in the current schema, its constraints take precedence over the inherited constraints for that specific keyword.
+  * **Example**: If Entity defines `type: { const: "entity" }` and Person (which refs Entity) defines `type: { const: "person" }`, validation passes for "person". The local const shadows the inherited const.
+  * **Granularity**: Shadowing is per-keyword. If Entity defined `type: { const: "entity", minLength: 5 }`, Person would shadow `const` but still inherit `minLength: 5`.
+* **Composition (`allOf`)**: When using `allOf`, standard intersection rules apply. No shadowing occurs; all constraints from all branches must pass. This is used for mixins or interfaces.
```
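Per-keyword Smart Merge is essentially a map overlay: local keywords win, untouched keywords are inherited. A minimal sketch using string maps in place of real keyword values:

```rust
use std::collections::HashMap;

// Sketch of per-keyword shadowing: the child's keyword entries replace the
// parent's, keyword by keyword; everything else is inherited.
fn smart_merge<'a>(
    parent: &HashMap<&'a str, &'a str>,
    child: &HashMap<&'a str, &'a str>,
) -> HashMap<&'a str, &'a str> {
    let mut merged = parent.clone();
    for (k, v) in child {
        merged.insert(*k, *v); // local keyword shadows the inherited one
    }
    merged
}

fn main() {
    // Mirrors the Entity/Person example above.
    let entity = HashMap::from([("const", "entity"), ("minLength", "5")]);
    let person = HashMap::from([("const", "person")]);
    let merged = smart_merge(&entity, &person);
    assert_eq!(merged["const"], "person");   // shadowed
    assert_eq!(merged["minLength"], "5");    // inherited
}
```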
```diff

 ### 4. Format Leniency for Empty Strings
 To simplify frontend form logic, the format validators for `uuid`, `date-time`, and `email` explicitly allow empty strings (`""`). This treats an empty string as "present but unset" rather than "invalid format".
```
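The leniency rule amounts to short-circuiting on the empty string before the format check runs. A sketch for the `uuid` case, using a simplistic shape check as a stand-in for a real UUID validator:

```rust
// Sketch: "" passes format validation ("present but unset"); otherwise a
// basic 8-4-4-4-12 shape check stands in for real uuid validation.
fn uuid_format_ok(s: &str) -> bool {
    if s.is_empty() {
        return true;
    }
    s.len() == 36
        && s.char_indices().all(|(i, c)| match i {
            8 | 13 | 18 | 23 => c == '-',
            _ => c.is_ascii_hexdigit(),
        })
}

fn main() {
    assert!(uuid_format_ok(""));
    assert!(uuid_format_ok("550e8400-e29b-41d4-a716-446655440000"));
    assert!(!uuid_format_ok("not-a-uuid"));
}
```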
```diff
@@ -92,8 +137,23 @@ The extension is written in Rust using `pgrx` and structures its schema parser t
-* **Compiler Phase**: schema JSONs are parsed into this struct, linked (references resolved), and then compiled into an efficient validation tree.
 * **Validation Phase**: The compiled validators traverse the JSON instance using `serde_json::Value`.

+### Concurrency & Threading ("Immutable Graphs")
+
+To support high-throughput validation while allowing for runtime schema updates (e.g., during development or hot-reloading), JSPG uses an **Atomic Swap** pattern based on 100% immutable schemas.
+
+1. **Parser Phase**: Schema JSONs are parsed into ordered `Schema` structs.
+2. **Compiler Phase**: The database iterates all parsed schemas and pre-computes native optimization maps:
+   * **Descendants Map**: A reverse `$ref` lookup graph for instant `$family` resolution.
+   * **Depths Map**: The `$ref` lineage distance of every schema for heuristic tie-breaking.
+   * **Variations Map**: The Native Type inheritance hierarchy.
+3. **Immutable Validator**: The `Validator` struct immutably owns the `Database` registry and all its global maps. Once created, a validator instance (and its registry) never changes. Schemas themselves are completely frozen; `$ref` strings are resolved dynamically at runtime using the pre-computed O(1) maps, eliminating the need to physically mutate or link pointers across structures.
+4. **Global Pointer**: A global `RwLock<Option<Arc<Validator>>>` holds the current active validator.
+5. **Lock-Free Reads**: Validation requests acquire a read lock just long enough to clone the `Arc` (incrementing a reference count), then release the lock immediately. Validation proceeds on the snapshot, ensuring no blocking during schema updates.
+6. **Atomic Updates**: When schemas are reloaded (`cache_json_schemas`), a new `Registry` and `Validator` are built entirely on the stack. The global pointer is then atomically swapped to the new instance under a write lock.
```
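The Atomic Swap pattern can be sketched in a few lines of std Rust. `Validator` here is a stand-in struct, and the slot is a local rather than a true global, but the read/clone/release and swap-under-write-lock mechanics are the same:

```rust
use std::sync::{Arc, RwLock};

// Stand-in for the real compiled validator.
struct Validator {
    generation: u64,
}

// Readers hold the lock only long enough to bump the Arc refcount.
fn snapshot(slot: &RwLock<Option<Arc<Validator>>>) -> Option<Arc<Validator>> {
    slot.read().unwrap().clone()
}

// Reloads build a new validator, then atomically replace the pointer.
fn swap_in(slot: &RwLock<Option<Arc<Validator>>>, v: Validator) {
    *slot.write().unwrap() = Some(Arc::new(v));
}

fn main() {
    let slot = RwLock::new(None);
    assert!(snapshot(&slot).is_none());
    swap_in(&slot, Validator { generation: 1 });
    let snap = snapshot(&slot).unwrap();
    swap_in(&slot, Validator { generation: 2 });
    // The old snapshot keeps working after the swap; new readers see gen 2.
    assert_eq!(snap.generation, 1);
    assert_eq!(snapshot(&slot).unwrap().generation, 2);
}
```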
```diff

 ## 🧪 Testing

 Testing is driven by standard Rust unit tests that load JSON fixtures.

-The tests are located in `tests/fixtures/*.json` and are executed via `cargo test`.
+* **Isolation**: Each test file runs with its own isolated `Registry` and `Validator` instance, created on the stack. This eliminates global state interference and allows tests to run in parallel.
+* **Fixtures**: The tests are located in `tests/fixtures/*.json` and are executed via `cargo test`.
```
0 agreego.sql Normal file
99 build.rs
```diff
@@ -3,37 +3,37 @@ use std::fs::File;
 use std::io::Write;
 use std::path::Path;

+fn to_safe_identifier(name: &str) -> String {
+    let mut safe = String::new();
+    for (i, c) in name.chars().enumerate() {
+        if c.is_uppercase() {
+            if i > 0 {
+                safe.push('_');
+            }
+            safe.push(c.to_ascii_lowercase());
+        } else if c == '-' || c == '.' {
+            safe.push('_');
+        } else {
+            safe.push(c);
+        }
+    }
+    safe
+}
+
 fn main() {
     println!("cargo:rerun-if-changed=tests/fixtures");
     println!("cargo:rerun-if-changed=Cargo.toml");

-    // File 1: src/tests.rs for #[pg_test]
-    let pg_dest_path = Path::new("src/tests.rs");
-    let mut pg_file = File::create(&pg_dest_path).unwrap();
+    // File 1: src/tests/fixtures.rs for #[pg_test]
+    let pg_dest_path = Path::new("src/tests/fixtures.rs");
+    let mut pg_file = File::create(pg_dest_path).unwrap();

-    // File 2: tests/tests.rs for standard #[test] integration
-    let std_dest_path = Path::new("tests/tests.rs");
-    let mut std_file = File::create(&std_dest_path).unwrap();
+    // File 2: tests/fixtures.rs for standard #[test] integration
+    let std_dest_path = Path::new("tests/fixtures.rs");
+    let mut std_file = File::create(std_dest_path).unwrap();

     // Write headers
-    writeln!(std_file, "use jspg::util;").unwrap();
-
-    // Helper for snake_case conversion
-    // let _to_snake_case = |s: &str| -> String {
-    //     s.chars().fold(String::new(), |mut acc, c| {
-    //         if c.is_uppercase() {
-    //             if !acc.is_empty() {
-    //                 acc.push('_');
-    //             }
-    //             acc.push(c.to_ascii_lowercase());
-    //         } else if c == '-' || c == ' ' || c == '.' || c == '/' || c == ':' {
-    //             acc.push('_');
-    //         } else if c.is_alphanumeric() {
-    //             acc.push(c);
-    //         }
-    //         acc
-    //     })
-    // };
+    writeln!(std_file, "use jspg::validator::util;").unwrap();

     // Walk tests/fixtures directly
     let fixtures_path = "tests/fixtures";
```
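The extracted `to_safe_identifier` helper can be exercised standalone to confirm the naming behavior the old inline loop implemented (the function body below is copied from build.rs; the sample inputs are illustrative):

```rust
// Copied from build.rs: camelCase, '-' and '.' all normalize to snake_case.
fn to_safe_identifier(name: &str) -> String {
    let mut safe = String::new();
    for (i, c) in name.chars().enumerate() {
        if c.is_uppercase() {
            if i > 0 {
                safe.push('_');
            }
            safe.push(c.to_ascii_lowercase());
        } else if c == '-' || c == '.' {
            safe.push('_');
        } else {
            safe.push(c);
        }
    }
    safe
}

fn main() {
    assert_eq!(to_safe_identifier("dynamicRef"), "dynamic_ref");
    assert_eq!(to_safe_identifier("unique-items.extra"), "unique_items_extra");
    assert_eq!(to_safe_identifier("UniqueItems"), "unique_items");
}
```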
```diff
@@ -49,26 +49,43 @@ fn main() {
             let val: serde_json::Value = serde_json::from_reader(file).unwrap();

             if let Some(arr) = val.as_array() {
-                for (i, _item) in arr.iter().enumerate() {
-                    // Use deterministic names: test_(unknown)_{index}
-                    // We sanitize the filename to be a valid identifier
-                    // Use manual snake_case logic since we don't want to add a build-dependency just yet if not needed,
-                    // but `dynamicRef` -> `dynamic_ref` requires parsing.
-                    // Let's implement a simple camelToSnake helper.
-                    let mut safe_filename = String::new();
-                    for (i, c) in file_name.chars().enumerate() {
-                        if c.is_uppercase() {
-                            if i > 0 {
-                                safe_filename.push('_');
-                            }
-                            safe_filename.push(c.to_ascii_lowercase());
-                        } else if c == '-' || c == '.' {
-                            safe_filename.push('_');
-                        } else {
-                            safe_filename.push(c);
-                        }
-                    }
+                for (i, item) in arr.iter().enumerate() {
+                    // Enforce test suite structure
+                    let group = item.as_object().expect("Test suite must be an object");
+
+                    // Validate required suite fields
+                    if !group.contains_key("description")
+                        || !group.contains_key("database")
+                        || !group.contains_key("tests")
+                    {
+                        panic!(
+                            "File {} index {} is missing required suite fields (description, database, tests)",
+                            file_name, i
+                        );
+                    }
+
+                    // Validate required test case fields
+                    let tests = group
+                        .get("tests")
+                        .unwrap()
+                        .as_array()
+                        .expect("Tests must be an array");
+                    for (t_idx, test) in tests.iter().enumerate() {
+                        let t_obj = test.as_object().expect("Test case must be an object");
+                        if !t_obj.contains_key("description")
+                            || !t_obj.contains_key("data")
+                            || !t_obj.contains_key("valid")
+                            || !t_obj.contains_key("schema_id")
+                        {
+                            panic!(
+                                "File {} suite {} test {} is missing required case fields (description, data, valid, schema_id)",
+                                file_name, i, t_idx
+                            );
+                        }
+                    }
+
+                    // Use deterministic names: test_(unknown)_{index}
+                    let safe_filename = to_safe_identifier(file_name);
                     let fn_name = format!("test_{}_{}", safe_filename, i);

                     // Write to src/tests.rs (PG Test)
```
```diff
@@ -79,7 +96,7 @@ fn main() {
             #[pg_test]
             fn {}() {{
                 let path = format!("{{}}/tests/fixtures/{}.json", env!("CARGO_MANIFEST_DIR"));
-                crate::util::run_test_file_at_index(&path, {}).unwrap();
+                crate::validator::util::run_test_file_at_index(&path, {}).unwrap();
             }}
             "#,
             fn_name, file_name, i
```
813 debug.log
@@ -1,813 +0,0 @@
```text
warning: function `test_uniqueItems_0` should have a snake case name
  --> tests/tests.rs:52:4
   |
52 | fn test_uniqueItems_0() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_0`
   |
   = note: `#[warn(non_snake_case)]` (part of `#[warn(nonstandard_style)]`) on by default

warning: function `test_uniqueItems_1` should have a snake case name
  --> tests/tests.rs:58:4
   |
58 | fn test_uniqueItems_1() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_1`

warning: function `test_uniqueItems_2` should have a snake case name
  --> tests/tests.rs:64:4
   |
64 | fn test_uniqueItems_2() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_2`

warning: function `test_uniqueItems_3` should have a snake case name
  --> tests/tests.rs:70:4
   |
70 | fn test_uniqueItems_3() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_3`

warning: function `test_uniqueItems_4` should have a snake case name
  --> tests/tests.rs:76:4
   |
76 | fn test_uniqueItems_4() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_4`

warning: function `test_uniqueItems_5` should have a snake case name
  --> tests/tests.rs:82:4
   |
82 | fn test_uniqueItems_5() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_5`

warning: function `test_uniqueItems_6` should have a snake case name
  --> tests/tests.rs:88:4
   |
88 | fn test_uniqueItems_6() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_6`

warning: function `test_minItems_0` should have a snake case name
  --> tests/tests.rs:94:4
   |
94 | fn test_minItems_0() {
   |    ^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_items_0`

warning: function `test_minItems_1` should have a snake case name
  --> tests/tests.rs:100:4
    |
100 | fn test_minItems_1() {
    |    ^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_items_1`

warning: function `test_minItems_2` should have a snake case name
  --> tests/tests.rs:106:4
    |
106 | fn test_minItems_2() {
    |    ^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_items_2`

warning: function `test_exclusiveMinimum_0` should have a snake case name
  --> tests/tests.rs:160:4
    |
160 | fn test_exclusiveMinimum_0() {
    |    ^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_exclusive_minimum_0`

warning: function `test_anyOf_0` should have a snake case name
  --> tests/tests.rs:274:4
    |
274 | fn test_anyOf_0() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_0`

warning: function `test_anyOf_1` should have a snake case name
  --> tests/tests.rs:280:4
    |
280 | fn test_anyOf_1() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_1`

warning: function `test_anyOf_2` should have a snake case name
  --> tests/tests.rs:286:4
    |
286 | fn test_anyOf_2() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_2`

warning: function `test_anyOf_3` should have a snake case name
  --> tests/tests.rs:292:4
    |
292 | fn test_anyOf_3() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_3`

warning: function `test_anyOf_4` should have a snake case name
  --> tests/tests.rs:298:4
    |
298 | fn test_anyOf_4() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_4`

warning: function `test_anyOf_5` should have a snake case name
  --> tests/tests.rs:304:4
    |
304 | fn test_anyOf_5() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_5`

warning: function `test_anyOf_6` should have a snake case name
  --> tests/tests.rs:310:4
    |
310 | fn test_anyOf_6() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_6`

warning: function `test_anyOf_7` should have a snake case name
  --> tests/tests.rs:316:4
    |
316 | fn test_anyOf_7() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_7`

warning: function `test_anyOf_8` should have a snake case name
  --> tests/tests.rs:322:4
    |
322 | fn test_anyOf_8() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_8`

warning: function `test_anyOf_9` should have a snake case name
  --> tests/tests.rs:328:4
    |
328 | fn test_anyOf_9() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_9`

warning: function `test_propertyNames_0` should have a snake case name
  --> tests/tests.rs:334:4
    |
334 | fn test_propertyNames_0() {
    |    ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_0`

warning: function `test_propertyNames_1` should have a snake case name
  --> tests/tests.rs:340:4
    |
340 | fn test_propertyNames_1() {
    |    ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_1`

warning: function `test_propertyNames_2` should have a snake case name
  --> tests/tests.rs:346:4
    |
346 | fn test_propertyNames_2() {
    |    ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_2`

warning: function `test_propertyNames_3` should have a snake case name
  --> tests/tests.rs:352:4
    |
352 | fn test_propertyNames_3() {
    |    ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_3`

warning: function `test_propertyNames_4` should have a snake case name
  --> tests/tests.rs:358:4
    |
358 | fn test_propertyNames_4() {
    |    ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_4`

warning: function `test_propertyNames_5` should have a snake case name
  --> tests/tests.rs:364:4
    |
364 | fn test_propertyNames_5() {
    |    ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_5`

warning: function `test_propertyNames_6` should have a snake case name
  --> tests/tests.rs:370:4
    |
370 | fn test_propertyNames_6() {
    |    ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_6`

warning: function `test_minProperties_0` should have a snake case name
  --> tests/tests.rs:646:4
    |
646 | fn test_minProperties_0() {
    |    ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_properties_0`

warning: function `test_minProperties_1` should have a snake case name
  --> tests/tests.rs:652:4
    |
652 | fn test_minProperties_1() {
    |    ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_properties_1`

warning: function `test_minProperties_2` should have a snake case name
  --> tests/tests.rs:658:4
    |
658 | fn test_minProperties_2() {
    |    ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_properties_2`

warning: function `test_minContains_0` should have a snake case name
  --> tests/tests.rs:664:4
    |
664 | fn test_minContains_0() {
    |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_0`

warning: function `test_minContains_1` should have a snake case name
  --> tests/tests.rs:670:4
    |
670 | fn test_minContains_1() {
    |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_1`

warning: function `test_minContains_2` should have a snake case name
  --> tests/tests.rs:676:4
    |
676 | fn test_minContains_2() {
    |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_2`

warning: function `test_minContains_3` should have a snake case name
  --> tests/tests.rs:682:4
    |
682 | fn test_minContains_3() {
    |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_3`

warning: function `test_minContains_4` should have a snake case name
  --> tests/tests.rs:688:4
    |
688 | fn test_minContains_4() {
    |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_4`

warning: function `test_minContains_5` should have a snake case name
  --> tests/tests.rs:694:4
    |
694 | fn test_minContains_5() {
    |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_5`

warning: function `test_minContains_6` should have a snake case name
  --> tests/tests.rs:700:4
    |
700 | fn test_minContains_6() {
    |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_6`

warning: function `test_minContains_7` should have a snake case name
  --> tests/tests.rs:706:4
    |
706 | fn test_minContains_7() {
    |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_7`

warning: function `test_minContains_8` should have a snake case name
  --> tests/tests.rs:712:4
    |
712 | fn test_minContains_8() {
    |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_8`

warning: function `test_maxContains_0` should have a snake case name
```
|
||||
--> tests/tests.rs:796:4
|
||||
|
|
||||
796 | fn test_maxContains_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_contains_0`
|
||||
|
||||
warning: function `test_maxContains_1` should have a snake case name
|
||||
--> tests/tests.rs:802:4
|
||||
|
|
||||
802 | fn test_maxContains_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_contains_1`
|
||||
|
||||
warning: function `test_maxContains_2` should have a snake case name
|
||||
--> tests/tests.rs:808:4
|
||||
|
|
||||
808 | fn test_maxContains_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_contains_2`
|
||||
|
||||
warning: function `test_maxContains_3` should have a snake case name
|
||||
--> tests/tests.rs:814:4
|
||||
|
|
||||
814 | fn test_maxContains_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_contains_3`
|
||||
|
||||
warning: function `test_maxContains_4` should have a snake case name
|
||||
--> tests/tests.rs:820:4
|
||||
|
|
||||
820 | fn test_maxContains_4() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_contains_4`
|
||||
|
||||
warning: function `test_maxLength_0` should have a snake case name
|
||||
--> tests/tests.rs:826:4
|
||||
|
|
||||
826 | fn test_maxLength_0() {
|
||||
| ^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_length_0`
|
||||
|
||||
warning: function `test_maxLength_1` should have a snake case name
|
||||
--> tests/tests.rs:832:4
|
||||
|
|
||||
832 | fn test_maxLength_1() {
|
||||
| ^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_length_1`
|
||||
|
||||
warning: function `test_dependentSchemas_0` should have a snake case name
|
||||
--> tests/tests.rs:838:4
|
||||
|
|
||||
838 | fn test_dependentSchemas_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_schemas_0`
|
||||
|
||||
warning: function `test_dependentSchemas_1` should have a snake case name
|
||||
--> tests/tests.rs:844:4
|
||||
|
|
||||
844 | fn test_dependentSchemas_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_schemas_1`
|
||||
|
||||
warning: function `test_dependentSchemas_2` should have a snake case name
|
||||
--> tests/tests.rs:850:4
|
||||
|
|
||||
850 | fn test_dependentSchemas_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_schemas_2`
|
||||
|
||||
warning: function `test_dependentSchemas_3` should have a snake case name
|
||||
--> tests/tests.rs:856:4
|
||||
|
|
||||
856 | fn test_dependentSchemas_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_schemas_3`
|
||||
|
||||
warning: function `test_exclusiveMaximum_0` should have a snake case name
|
||||
--> tests/tests.rs:862:4
|
||||
|
|
||||
862 | fn test_exclusiveMaximum_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_exclusive_maximum_0`
|
||||
|
||||
warning: function `test_prefixItems_0` should have a snake case name
|
||||
--> tests/tests.rs:868:4
|
||||
|
|
||||
868 | fn test_prefixItems_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_prefix_items_0`
|
||||
|
||||
warning: function `test_prefixItems_1` should have a snake case name
|
||||
--> tests/tests.rs:874:4
|
||||
|
|
||||
874 | fn test_prefixItems_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_prefix_items_1`
|
||||
|
||||
warning: function `test_prefixItems_2` should have a snake case name
|
||||
--> tests/tests.rs:880:4
|
||||
|
|
||||
880 | fn test_prefixItems_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_prefix_items_2`
|
||||
|
||||
warning: function `test_prefixItems_3` should have a snake case name
|
||||
--> tests/tests.rs:886:4
|
||||
|
|
||||
886 | fn test_prefixItems_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_prefix_items_3`
|
||||
|
||||
warning: function `test_prefixItems_4` should have a snake case name
|
||||
--> tests/tests.rs:892:4
|
||||
|
|
||||
892 | fn test_prefixItems_4() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_prefix_items_4`
|
||||
|
||||
warning: function `test_oneOf_0` should have a snake case name
|
||||
--> tests/tests.rs:910:4
|
||||
|
|
||||
910 | fn test_oneOf_0() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_0`
|
||||
|
||||
warning: function `test_oneOf_1` should have a snake case name
|
||||
--> tests/tests.rs:916:4
|
||||
|
|
||||
916 | fn test_oneOf_1() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_1`
|
||||
|
||||
warning: function `test_oneOf_2` should have a snake case name
|
||||
--> tests/tests.rs:922:4
|
||||
|
|
||||
922 | fn test_oneOf_2() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_2`
|
||||
|
||||
warning: function `test_oneOf_3` should have a snake case name
|
||||
--> tests/tests.rs:928:4
|
||||
|
|
||||
928 | fn test_oneOf_3() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_3`
|
||||
|
||||
warning: function `test_oneOf_4` should have a snake case name
|
||||
--> tests/tests.rs:934:4
|
||||
|
|
||||
934 | fn test_oneOf_4() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_4`
|
||||
|
||||
warning: function `test_oneOf_5` should have a snake case name
|
||||
--> tests/tests.rs:940:4
|
||||
|
|
||||
940 | fn test_oneOf_5() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_5`
|
||||
|
||||
warning: function `test_oneOf_6` should have a snake case name
|
||||
--> tests/tests.rs:946:4
|
||||
|
|
||||
946 | fn test_oneOf_6() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_6`
|
||||
|
||||
warning: function `test_oneOf_7` should have a snake case name
|
||||
--> tests/tests.rs:952:4
|
||||
|
|
||||
952 | fn test_oneOf_7() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_7`
|
||||
|
||||
warning: function `test_oneOf_8` should have a snake case name
|
||||
--> tests/tests.rs:958:4
|
||||
|
|
||||
958 | fn test_oneOf_8() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_8`
|
||||
|
||||
warning: function `test_oneOf_9` should have a snake case name
|
||||
--> tests/tests.rs:964:4
|
||||
|
|
||||
964 | fn test_oneOf_9() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_9`
|
||||
|
||||
warning: function `test_oneOf_10` should have a snake case name
|
||||
--> tests/tests.rs:970:4
|
||||
|
|
||||
970 | fn test_oneOf_10() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_10`
|
||||
|
||||
warning: function `test_oneOf_11` should have a snake case name
|
||||
--> tests/tests.rs:976:4
|
||||
|
|
||||
976 | fn test_oneOf_11() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_11`
|
||||
|
||||
warning: function `test_oneOf_12` should have a snake case name
|
||||
--> tests/tests.rs:982:4
|
||||
|
|
||||
982 | fn test_oneOf_12() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_12`
|
||||
|
||||
warning: function `test_emptyString_0` should have a snake case name
|
||||
--> tests/tests.rs:1072:4
|
||||
|
|
||||
1072 | fn test_emptyString_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_empty_string_0`
|
||||
|
||||
warning: function `test_maxProperties_0` should have a snake case name
|
||||
--> tests/tests.rs:1090:4
|
||||
|
|
||||
1090 | fn test_maxProperties_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_properties_0`
|
||||
|
||||
warning: function `test_maxProperties_1` should have a snake case name
|
||||
--> tests/tests.rs:1096:4
|
||||
|
|
||||
1096 | fn test_maxProperties_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_properties_1`
|
||||
|
||||
warning: function `test_maxProperties_2` should have a snake case name
|
||||
--> tests/tests.rs:1102:4
|
||||
|
|
||||
1102 | fn test_maxProperties_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_properties_2`
|
||||
|
||||
warning: function `test_maxProperties_3` should have a snake case name
|
||||
--> tests/tests.rs:1108:4
|
||||
|
|
||||
1108 | fn test_maxProperties_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_properties_3`
|
||||
|
||||
warning: function `test_dependentRequired_0` should have a snake case name
|
||||
--> tests/tests.rs:1114:4
|
||||
|
|
||||
1114 | fn test_dependentRequired_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_required_0`
|
||||
|
||||
warning: function `test_dependentRequired_1` should have a snake case name
|
||||
--> tests/tests.rs:1120:4
|
||||
|
|
||||
1120 | fn test_dependentRequired_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_required_1`
|
||||
|
||||
warning: function `test_dependentRequired_2` should have a snake case name
|
||||
--> tests/tests.rs:1126:4
|
||||
|
|
||||
1126 | fn test_dependentRequired_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_required_2`
|
||||
|
||||
warning: function `test_dependentRequired_3` should have a snake case name
|
||||
--> tests/tests.rs:1132:4
|
||||
|
|
||||
1132 | fn test_dependentRequired_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_required_3`
|
||||
|
||||
warning: function `test_dependentRequired_4` should have a snake case name
|
||||
--> tests/tests.rs:1138:4
|
||||
|
|
||||
1138 | fn test_dependentRequired_4() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_required_4`
|
||||
|
||||
warning: function `test_multipleOf_0` should have a snake case name
|
||||
--> tests/tests.rs:1252:4
|
||||
|
|
||||
1252 | fn test_multipleOf_0() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_multiple_of_0`
|
||||
|
||||
warning: function `test_multipleOf_1` should have a snake case name
|
||||
--> tests/tests.rs:1258:4
|
||||
|
|
||||
1258 | fn test_multipleOf_1() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_multiple_of_1`
|
||||
|
||||
warning: function `test_multipleOf_2` should have a snake case name
|
||||
--> tests/tests.rs:1264:4
|
||||
|
|
||||
1264 | fn test_multipleOf_2() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_multiple_of_2`
|
||||
|
||||
warning: function `test_multipleOf_3` should have a snake case name
|
||||
--> tests/tests.rs:1270:4
|
||||
|
|
||||
1270 | fn test_multipleOf_3() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_multiple_of_3`
|
||||
|
||||
warning: function `test_patternProperties_0` should have a snake case name
|
||||
--> tests/tests.rs:1276:4
|
||||
|
|
||||
1276 | fn test_patternProperties_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_pattern_properties_0`
|
||||
|
||||
warning: function `test_patternProperties_1` should have a snake case name
|
||||
--> tests/tests.rs:1282:4
|
||||
|
|
||||
1282 | fn test_patternProperties_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_pattern_properties_1`
|
||||
|
||||
warning: function `test_patternProperties_2` should have a snake case name
|
||||
--> tests/tests.rs:1288:4
|
||||
|
|
||||
1288 | fn test_patternProperties_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_pattern_properties_2`
|
||||
|
||||
warning: function `test_patternProperties_3` should have a snake case name
|
||||
--> tests/tests.rs:1294:4
|
||||
|
|
||||
1294 | fn test_patternProperties_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_pattern_properties_3`
|
||||
|
||||
warning: function `test_patternProperties_4` should have a snake case name
|
||||
--> tests/tests.rs:1300:4
|
||||
|
|
||||
1300 | fn test_patternProperties_4() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_pattern_properties_4`
|
||||
|
||||
warning: function `test_patternProperties_5` should have a snake case name
|
||||
--> tests/tests.rs:1306:4
|
||||
|
|
||||
1306 | fn test_patternProperties_5() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_pattern_properties_5`
|
||||
|
||||
warning: function `test_allOf_0` should have a snake case name
|
||||
--> tests/tests.rs:1336:4
|
||||
|
|
||||
1336 | fn test_allOf_0() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_0`
|
||||
|
||||
warning: function `test_allOf_1` should have a snake case name
|
||||
--> tests/tests.rs:1342:4
|
||||
|
|
||||
1342 | fn test_allOf_1() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_1`
|
||||
|
||||
warning: function `test_allOf_2` should have a snake case name
|
||||
--> tests/tests.rs:1348:4
|
||||
|
|
||||
1348 | fn test_allOf_2() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_2`
|
||||
|
||||
warning: function `test_allOf_3` should have a snake case name
|
||||
--> tests/tests.rs:1354:4
|
||||
|
|
||||
1354 | fn test_allOf_3() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_3`
|
||||
|
||||
warning: function `test_allOf_4` should have a snake case name
|
||||
--> tests/tests.rs:1360:4
|
||||
|
|
||||
1360 | fn test_allOf_4() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_4`
|
||||
|
||||
warning: function `test_allOf_5` should have a snake case name
|
||||
--> tests/tests.rs:1366:4
|
||||
|
|
||||
1366 | fn test_allOf_5() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_5`
|
||||
|
||||
warning: function `test_allOf_6` should have a snake case name
|
||||
--> tests/tests.rs:1372:4
|
||||
|
|
||||
1372 | fn test_allOf_6() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_6`
|
||||
|
||||
warning: function `test_allOf_7` should have a snake case name
|
||||
--> tests/tests.rs:1378:4
|
||||
|
|
||||
1378 | fn test_allOf_7() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_7`
|
||||
|
||||
warning: function `test_allOf_8` should have a snake case name
|
||||
--> tests/tests.rs:1384:4
|
||||
|
|
||||
1384 | fn test_allOf_8() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_8`
|
||||
|
||||
warning: function `test_allOf_9` should have a snake case name
|
||||
--> tests/tests.rs:1390:4
|
||||
|
|
||||
1390 | fn test_allOf_9() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_9`
|
||||
|
||||
warning: function `test_allOf_10` should have a snake case name
|
||||
--> tests/tests.rs:1396:4
|
||||
|
|
||||
1396 | fn test_allOf_10() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_10`
|
||||
|
||||
warning: function `test_allOf_11` should have a snake case name
|
||||
--> tests/tests.rs:1402:4
|
||||
|
|
||||
1402 | fn test_allOf_11() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_11`
|
||||
|
||||
warning: function `test_allOf_12` should have a snake case name
|
||||
--> tests/tests.rs:1408:4
|
||||
|
|
||||
1408 | fn test_allOf_12() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_12`
|
||||
|
||||
warning: function `test_allOf_13` should have a snake case name
|
||||
--> tests/tests.rs:1414:4
|
||||
|
|
||||
1414 | fn test_allOf_13() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_13`
|
||||
|
||||
warning: function `test_allOf_14` should have a snake case name
|
||||
--> tests/tests.rs:1420:4
|
||||
|
|
||||
1420 | fn test_allOf_14() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_14`
|
||||
|
||||
warning: function `test_allOf_15` should have a snake case name
|
||||
--> tests/tests.rs:1426:4
|
||||
|
|
||||
1426 | fn test_allOf_15() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_15`
|
||||
|
||||
warning: function `test_minLength_0` should have a snake case name
|
||||
--> tests/tests.rs:1828:4
|
||||
|
|
||||
1828 | fn test_minLength_0() {
|
||||
| ^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_length_0`
|
||||
|
||||
warning: function `test_minLength_1` should have a snake case name
|
||||
--> tests/tests.rs:1834:4
|
||||
|
|
||||
1834 | fn test_minLength_1() {
|
||||
| ^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_length_1`
|
||||
|
||||
warning: function `test_maxItems_0` should have a snake case name
|
||||
--> tests/tests.rs:1840:4
|
||||
|
|
||||
1840 | fn test_maxItems_0() {
|
||||
| ^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_items_0`
|
||||
|
||||
warning: function `test_maxItems_1` should have a snake case name
|
||||
--> tests/tests.rs:1846:4
|
||||
|
|
||||
1846 | fn test_maxItems_1() {
|
||||
| ^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_items_1`
|
||||
|
||||
warning: function `test_maxItems_2` should have a snake case name
|
||||
--> tests/tests.rs:1852:4
|
||||
|
|
||||
1852 | fn test_maxItems_2() {
|
||||
| ^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_items_2`
|
||||
|
||||
warning: function `test_dynamicRef_0` should have a snake case name
|
||||
--> tests/tests.rs:1912:4
|
||||
|
|
||||
1912 | fn test_dynamicRef_0() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_0`
|
||||
|
||||
warning: function `test_dynamicRef_1` should have a snake case name
|
||||
--> tests/tests.rs:1918:4
|
||||
|
|
||||
1918 | fn test_dynamicRef_1() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_1`
|
||||
|
||||
warning: function `test_dynamicRef_2` should have a snake case name
|
||||
--> tests/tests.rs:1924:4
|
||||
|
|
||||
1924 | fn test_dynamicRef_2() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_2`
|
||||
|
||||
warning: function `test_dynamicRef_3` should have a snake case name
|
||||
--> tests/tests.rs:1930:4
|
||||
|
|
||||
1930 | fn test_dynamicRef_3() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_3`
|
||||
|
||||
warning: function `test_dynamicRef_4` should have a snake case name
|
||||
--> tests/tests.rs:1936:4
|
||||
|
|
||||
1936 | fn test_dynamicRef_4() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_4`
|
||||
|
||||
warning: function `test_dynamicRef_5` should have a snake case name
|
||||
--> tests/tests.rs:1942:4
|
||||
|
|
||||
1942 | fn test_dynamicRef_5() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_5`
|
||||
|
||||
warning: function `test_dynamicRef_6` should have a snake case name
|
||||
--> tests/tests.rs:1948:4
|
||||
|
|
||||
1948 | fn test_dynamicRef_6() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_6`
|
||||
|
||||
warning: function `test_dynamicRef_7` should have a snake case name
|
||||
--> tests/tests.rs:1954:4
|
||||
|
|
||||
1954 | fn test_dynamicRef_7() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_7`
|
||||
|
||||
warning: function `test_dynamicRef_8` should have a snake case name
|
||||
--> tests/tests.rs:1960:4
|
||||
|
|
||||
1960 | fn test_dynamicRef_8() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_8`
|
||||
|
||||
warning: function `test_dynamicRef_9` should have a snake case name
|
||||
--> tests/tests.rs:1966:4
|
||||
|
|
||||
1966 | fn test_dynamicRef_9() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_9`
|
||||
|
||||
warning: function `test_dynamicRef_10` should have a snake case name
|
||||
--> tests/tests.rs:1972:4
|
||||
|
|
||||
1972 | fn test_dynamicRef_10() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_10`
|
||||
|
||||
warning: function `test_dynamicRef_11` should have a snake case name
|
||||
--> tests/tests.rs:1978:4
|
||||
|
|
||||
1978 | fn test_dynamicRef_11() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_11`
|
||||
|
||||
warning: function `test_dynamicRef_12` should have a snake case name
|
||||
--> tests/tests.rs:1984:4
|
||||
|
|
||||
1984 | fn test_dynamicRef_12() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_12`
|
||||
|
||||
warning: function `test_dynamicRef_13` should have a snake case name
|
||||
--> tests/tests.rs:1990:4
|
||||
|
|
||||
1990 | fn test_dynamicRef_13() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_13`
|
||||
|
||||
warning: function `test_dynamicRef_14` should have a snake case name
|
||||
--> tests/tests.rs:1996:4
|
||||
|
|
||||
1996 | fn test_dynamicRef_14() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_14`
|
||||
|
||||
warning: function `test_dynamicRef_15` should have a snake case name
|
||||
--> tests/tests.rs:2002:4
|
||||
|
|
||||
2002 | fn test_dynamicRef_15() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_15`
|
||||
|
||||
warning: function `test_dynamicRef_16` should have a snake case name
|
||||
--> tests/tests.rs:2008:4
|
||||
|
|
||||
2008 | fn test_dynamicRef_16() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_16`
|
||||
|
||||
warning: function `test_dynamicRef_17` should have a snake case name
|
||||
--> tests/tests.rs:2014:4
|
||||
|
|
||||
2014 | fn test_dynamicRef_17() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_17`
|
||||
|
||||
warning: function `test_dynamicRef_18` should have a snake case name
|
||||
--> tests/tests.rs:2020:4
|
||||
|
|
||||
2020 | fn test_dynamicRef_18() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_18`
|
||||
|
||||
warning: function `test_dynamicRef_19` should have a snake case name
|
||||
--> tests/tests.rs:2026:4
|
||||
|
|
||||
2026 | fn test_dynamicRef_19() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_19`
|
||||
|
||||
warning: function `test_dynamicRef_20` should have a snake case name
|
||||
--> tests/tests.rs:2032:4
|
||||
|
|
||||
2032 | fn test_dynamicRef_20() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_20`
|
||||
|
||||
warning: `jspg` (test "tests") generated 132 warnings
|
||||
Finished `test` profile [unoptimized + debuginfo] target(s) in 0.42s
|
||||
Running tests/tests.rs (target/debug/deps/tests-0f6b1e496850f0af)
|
||||
|
||||
running 1 test
|
||||
|
||||
thread 'test_ref_39' (14864151) panicked at tests/tests.rs:1812:45:
|
||||
called `Result::unwrap()` on an `Err` value: "[implicit keyword shadowing] Test 'child type overrides parent type' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'age'\", details: ErrorDetails { path: \"/age\" } }]\n[implicit keyword shadowing] Test 'parent max age (20) is shadowed (replaced) by child definition' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'age'\", details: ErrorDetails { path: \"/age\" } }]"
|
||||
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
|
||||
test test_ref_39 ... FAILED
|
||||
|
||||
failures:
|
||||
|
||||
failures:
|
||||
test_ref_39
|
||||
|
||||
test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 338 filtered out; finished in 0.01s
|
||||
|
||||
error: test failed, to rerun pass `--test tests`
|
||||
44
debug_2.log
44
debug_2.log
@ -1,44 +0,0 @@
    Blocking waiting for file lock on artifact directory
   Compiling jspg v0.1.0 (/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg)
error[E0424]: expected value, found module `self`
   --> src/util.rs:162:33
    |
40  | pub fn run_test_file_at_index(path: &str, index: usi...
    |        ---------------------- this function can't have a `self` parameter
...
162 |     let mut new_overrides = self.overrides.clone();
    |                             ^^^^ `self` value is a keyword only available in methods with a `self` parameter

error[E0424]: expected value, found module `self`
   --> src/util.rs:164:31
    |
40  | pub fn run_test_file_at_index(path: &str, index: usi...
    |        ---------------------- this function can't have a `self` parameter
...
164 |     if let Some(props) = &self.schema.properties {
    |                           ^^^^ `self` value is a keyword only available in methods with a `self` parameter

error[E0282]: type annotations needed
   --> src/util.rs:166:32
    |
166 |         new_overrides.extend(props.keys().cloned());
    |                                    ^^^^^ cannot infer type

error[E0599]: no method named `is_valid` found for struct `drop::Drop` in the current scope
   --> src/util.rs:204:18
    |
204 | ...ult.is_valid(), // Use is_valid() for clear "Got"...
    |        ^^^^^^^^ method not found in `drop::Drop`
    |
   ::: src/drop.rs:5:1
    |
5   | pub struct Drop {
    | --------------- method `is_valid` not found for this struct
    |
    = help: items from traits can only be used if the trait is implemented and in scope
    = note: the following trait defines an item `is_valid`, perhaps you need to implement it:
            candidate #1: `NullLayout`

Some errors have detailed explanations: E0282, E0424, E0599.
For more information about an error, try `rustc --explain E0282`.
error: could not compile `jspg` (lib) due to 4 previous errors
815
debug_3.log
815
debug_3.log
@ -1,815 +0,0 @@
    Blocking waiting for file lock on artifact directory
   Compiling jspg v0.1.0 (/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg)
warning: function `test_uniqueItems_0` should have a snake case name
  --> tests/tests.rs:52:4
   |
52 | fn test_uniqueItems_0() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_0`
   |
   = note: `#[warn(non_snake_case)]` (part of `#[warn(nonstandard_style)]`) on by default

warning: function `test_uniqueItems_1` should have a snake case name
  --> tests/tests.rs:58:4
   |
58 | fn test_uniqueItems_1() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_1`

warning: function `test_uniqueItems_2` should have a snake case name
  --> tests/tests.rs:64:4
   |
64 | fn test_uniqueItems_2() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_2`

warning: function `test_uniqueItems_3` should have a snake case name
  --> tests/tests.rs:70:4
   |
70 | fn test_uniqueItems_3() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_3`

warning: function `test_uniqueItems_4` should have a snake case name
  --> tests/tests.rs:76:4
   |
76 | fn test_uniqueItems_4() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_4`

warning: function `test_uniqueItems_5` should have a snake case name
  --> tests/tests.rs:82:4
   |
82 | fn test_uniqueItems_5() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_5`

warning: function `test_uniqueItems_6` should have a snake case name
  --> tests/tests.rs:88:4
   |
88 | fn test_uniqueItems_6() {
   |    ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_unique_items_6`

warning: function `test_minItems_0` should have a snake case name
  --> tests/tests.rs:94:4
   |
94 | fn test_minItems_0() {
   |    ^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_items_0`

warning: function `test_minItems_1` should have a snake case name
   --> tests/tests.rs:100:4
    |
100 | fn test_minItems_1() {
    |    ^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_items_1`

warning: function `test_minItems_2` should have a snake case name
   --> tests/tests.rs:106:4
    |
106 | fn test_minItems_2() {
    |    ^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_items_2`

warning: function `test_exclusiveMinimum_0` should have a snake case name
   --> tests/tests.rs:160:4
    |
160 | fn test_exclusiveMinimum_0() {
    |    ^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_exclusive_minimum_0`

warning: function `test_anyOf_0` should have a snake case name
   --> tests/tests.rs:274:4
    |
274 | fn test_anyOf_0() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_0`

warning: function `test_anyOf_1` should have a snake case name
   --> tests/tests.rs:280:4
    |
280 | fn test_anyOf_1() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_1`

warning: function `test_anyOf_2` should have a snake case name
   --> tests/tests.rs:286:4
    |
286 | fn test_anyOf_2() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_2`

warning: function `test_anyOf_3` should have a snake case name
   --> tests/tests.rs:292:4
    |
292 | fn test_anyOf_3() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_3`

warning: function `test_anyOf_4` should have a snake case name
   --> tests/tests.rs:298:4
    |
298 | fn test_anyOf_4() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_4`

warning: function `test_anyOf_5` should have a snake case name
   --> tests/tests.rs:304:4
    |
304 | fn test_anyOf_5() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_5`

warning: function `test_anyOf_6` should have a snake case name
   --> tests/tests.rs:310:4
    |
310 | fn test_anyOf_6() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_6`

warning: function `test_anyOf_7` should have a snake case name
   --> tests/tests.rs:316:4
    |
316 | fn test_anyOf_7() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_7`

warning: function `test_anyOf_8` should have a snake case name
   --> tests/tests.rs:322:4
    |
322 | fn test_anyOf_8() {
    |    ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_8`

warning: function `test_anyOf_9` should have a snake case name
   --> tests/tests.rs:328:4
    |
328 | fn test_anyOf_9() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_any_of_9`
|
||||
|
||||
warning: function `test_propertyNames_0` should have a snake case name
|
||||
--> tests/tests.rs:334:4
|
||||
|
|
||||
334 | fn test_propertyNames_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_0`
|
||||
|
||||
warning: function `test_propertyNames_1` should have a snake case name
|
||||
--> tests/tests.rs:340:4
|
||||
|
|
||||
340 | fn test_propertyNames_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_1`
|
||||
|
||||
warning: function `test_propertyNames_2` should have a snake case name
|
||||
--> tests/tests.rs:346:4
|
||||
|
|
||||
346 | fn test_propertyNames_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_2`
|
||||
|
||||
warning: function `test_propertyNames_3` should have a snake case name
|
||||
--> tests/tests.rs:352:4
|
||||
|
|
||||
352 | fn test_propertyNames_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_3`
|
||||
|
||||
warning: function `test_propertyNames_4` should have a snake case name
|
||||
--> tests/tests.rs:358:4
|
||||
|
|
||||
358 | fn test_propertyNames_4() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_4`
|
||||
|
||||
warning: function `test_propertyNames_5` should have a snake case name
|
||||
--> tests/tests.rs:364:4
|
||||
|
|
||||
364 | fn test_propertyNames_5() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_5`
|
||||
|
||||
warning: function `test_propertyNames_6` should have a snake case name
|
||||
--> tests/tests.rs:370:4
|
||||
|
|
||||
370 | fn test_propertyNames_6() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_property_names_6`
|
||||
|
||||
warning: function `test_minProperties_0` should have a snake case name
|
||||
--> tests/tests.rs:646:4
|
||||
|
|
||||
646 | fn test_minProperties_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_properties_0`
|
||||
|
||||
warning: function `test_minProperties_1` should have a snake case name
|
||||
--> tests/tests.rs:652:4
|
||||
|
|
||||
652 | fn test_minProperties_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_properties_1`
|
||||
|
||||
warning: function `test_minProperties_2` should have a snake case name
|
||||
--> tests/tests.rs:658:4
|
||||
|
|
||||
658 | fn test_minProperties_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_properties_2`
|
||||
|
||||
warning: function `test_minContains_0` should have a snake case name
|
||||
--> tests/tests.rs:664:4
|
||||
|
|
||||
664 | fn test_minContains_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_0`
|
||||
|
||||
warning: function `test_minContains_1` should have a snake case name
|
||||
--> tests/tests.rs:670:4
|
||||
|
|
||||
670 | fn test_minContains_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_1`
|
||||
|
||||
warning: function `test_minContains_2` should have a snake case name
|
||||
--> tests/tests.rs:676:4
|
||||
|
|
||||
676 | fn test_minContains_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_2`
|
||||
|
||||
warning: function `test_minContains_3` should have a snake case name
|
||||
--> tests/tests.rs:682:4
|
||||
|
|
||||
682 | fn test_minContains_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_3`
|
||||
|
||||
warning: function `test_minContains_4` should have a snake case name
|
||||
--> tests/tests.rs:688:4
|
||||
|
|
||||
688 | fn test_minContains_4() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_4`
|
||||
|
||||
warning: function `test_minContains_5` should have a snake case name
|
||||
--> tests/tests.rs:694:4
|
||||
|
|
||||
694 | fn test_minContains_5() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_5`
|
||||
|
||||
warning: function `test_minContains_6` should have a snake case name
|
||||
--> tests/tests.rs:700:4
|
||||
|
|
||||
700 | fn test_minContains_6() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_6`
|
||||
|
||||
warning: function `test_minContains_7` should have a snake case name
|
||||
--> tests/tests.rs:706:4
|
||||
|
|
||||
706 | fn test_minContains_7() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_7`
|
||||
|
||||
warning: function `test_minContains_8` should have a snake case name
|
||||
--> tests/tests.rs:712:4
|
||||
|
|
||||
712 | fn test_minContains_8() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_contains_8`
|
||||
|
||||
warning: function `test_maxContains_0` should have a snake case name
|
||||
--> tests/tests.rs:796:4
|
||||
|
|
||||
796 | fn test_maxContains_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_contains_0`
|
||||
|
||||
warning: function `test_maxContains_1` should have a snake case name
|
||||
--> tests/tests.rs:802:4
|
||||
|
|
||||
802 | fn test_maxContains_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_contains_1`
|
||||
|
||||
warning: function `test_maxContains_2` should have a snake case name
|
||||
--> tests/tests.rs:808:4
|
||||
|
|
||||
808 | fn test_maxContains_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_contains_2`
|
||||
|
||||
warning: function `test_maxContains_3` should have a snake case name
|
||||
--> tests/tests.rs:814:4
|
||||
|
|
||||
814 | fn test_maxContains_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_contains_3`
|
||||
|
||||
warning: function `test_maxContains_4` should have a snake case name
|
||||
--> tests/tests.rs:820:4
|
||||
|
|
||||
820 | fn test_maxContains_4() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_contains_4`
|
||||
|
||||
warning: function `test_maxLength_0` should have a snake case name
|
||||
--> tests/tests.rs:826:4
|
||||
|
|
||||
826 | fn test_maxLength_0() {
|
||||
| ^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_length_0`
|
||||
|
||||
warning: function `test_maxLength_1` should have a snake case name
|
||||
--> tests/tests.rs:832:4
|
||||
|
|
||||
832 | fn test_maxLength_1() {
|
||||
| ^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_length_1`
|
||||
|
||||
warning: function `test_dependentSchemas_0` should have a snake case name
|
||||
--> tests/tests.rs:838:4
|
||||
|
|
||||
838 | fn test_dependentSchemas_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_schemas_0`
|
||||
|
||||
warning: function `test_dependentSchemas_1` should have a snake case name
|
||||
--> tests/tests.rs:844:4
|
||||
|
|
||||
844 | fn test_dependentSchemas_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_schemas_1`
|
||||
|
||||
warning: function `test_dependentSchemas_2` should have a snake case name
|
||||
--> tests/tests.rs:850:4
|
||||
|
|
||||
850 | fn test_dependentSchemas_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_schemas_2`
|
||||
|
||||
warning: function `test_dependentSchemas_3` should have a snake case name
|
||||
--> tests/tests.rs:856:4
|
||||
|
|
||||
856 | fn test_dependentSchemas_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_schemas_3`
|
||||
|
||||
warning: function `test_exclusiveMaximum_0` should have a snake case name
|
||||
--> tests/tests.rs:862:4
|
||||
|
|
||||
862 | fn test_exclusiveMaximum_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_exclusive_maximum_0`
|
||||
|
||||
warning: function `test_prefixItems_0` should have a snake case name
|
||||
--> tests/tests.rs:868:4
|
||||
|
|
||||
868 | fn test_prefixItems_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_prefix_items_0`
|
||||
|
||||
warning: function `test_prefixItems_1` should have a snake case name
|
||||
--> tests/tests.rs:874:4
|
||||
|
|
||||
874 | fn test_prefixItems_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_prefix_items_1`
|
||||
|
||||
warning: function `test_prefixItems_2` should have a snake case name
|
||||
--> tests/tests.rs:880:4
|
||||
|
|
||||
880 | fn test_prefixItems_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_prefix_items_2`
|
||||
|
||||
warning: function `test_prefixItems_3` should have a snake case name
|
||||
--> tests/tests.rs:886:4
|
||||
|
|
||||
886 | fn test_prefixItems_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_prefix_items_3`
|
||||
|
||||
warning: function `test_prefixItems_4` should have a snake case name
|
||||
--> tests/tests.rs:892:4
|
||||
|
|
||||
892 | fn test_prefixItems_4() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_prefix_items_4`
|
||||
|
||||
warning: function `test_oneOf_0` should have a snake case name
|
||||
--> tests/tests.rs:910:4
|
||||
|
|
||||
910 | fn test_oneOf_0() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_0`
|
||||
|
||||
warning: function `test_oneOf_1` should have a snake case name
|
||||
--> tests/tests.rs:916:4
|
||||
|
|
||||
916 | fn test_oneOf_1() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_1`
|
||||
|
||||
warning: function `test_oneOf_2` should have a snake case name
|
||||
--> tests/tests.rs:922:4
|
||||
|
|
||||
922 | fn test_oneOf_2() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_2`
|
||||
|
||||
warning: function `test_oneOf_3` should have a snake case name
|
||||
--> tests/tests.rs:928:4
|
||||
|
|
||||
928 | fn test_oneOf_3() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_3`
|
||||
|
||||
warning: function `test_oneOf_4` should have a snake case name
|
||||
--> tests/tests.rs:934:4
|
||||
|
|
||||
934 | fn test_oneOf_4() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_4`
|
||||
|
||||
warning: function `test_oneOf_5` should have a snake case name
|
||||
--> tests/tests.rs:940:4
|
||||
|
|
||||
940 | fn test_oneOf_5() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_5`
|
||||
|
||||
warning: function `test_oneOf_6` should have a snake case name
|
||||
--> tests/tests.rs:946:4
|
||||
|
|
||||
946 | fn test_oneOf_6() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_6`
|
||||
|
||||
warning: function `test_oneOf_7` should have a snake case name
|
||||
--> tests/tests.rs:952:4
|
||||
|
|
||||
952 | fn test_oneOf_7() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_7`
|
||||
|
||||
warning: function `test_oneOf_8` should have a snake case name
|
||||
--> tests/tests.rs:958:4
|
||||
|
|
||||
958 | fn test_oneOf_8() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_8`
|
||||
|
||||
warning: function `test_oneOf_9` should have a snake case name
|
||||
--> tests/tests.rs:964:4
|
||||
|
|
||||
964 | fn test_oneOf_9() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_9`
|
||||
|
||||
warning: function `test_oneOf_10` should have a snake case name
|
||||
--> tests/tests.rs:970:4
|
||||
|
|
||||
970 | fn test_oneOf_10() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_10`
|
||||
|
||||
warning: function `test_oneOf_11` should have a snake case name
|
||||
--> tests/tests.rs:976:4
|
||||
|
|
||||
976 | fn test_oneOf_11() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_11`
|
||||
|
||||
warning: function `test_oneOf_12` should have a snake case name
|
||||
--> tests/tests.rs:982:4
|
||||
|
|
||||
982 | fn test_oneOf_12() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_one_of_12`
|
||||
|
||||
warning: function `test_emptyString_0` should have a snake case name
|
||||
--> tests/tests.rs:1072:4
|
||||
|
|
||||
1072 | fn test_emptyString_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_empty_string_0`
|
||||
|
||||
warning: function `test_maxProperties_0` should have a snake case name
|
||||
--> tests/tests.rs:1090:4
|
||||
|
|
||||
1090 | fn test_maxProperties_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_properties_0`
|
||||
|
||||
warning: function `test_maxProperties_1` should have a snake case name
|
||||
--> tests/tests.rs:1096:4
|
||||
|
|
||||
1096 | fn test_maxProperties_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_properties_1`
|
||||
|
||||
warning: function `test_maxProperties_2` should have a snake case name
|
||||
--> tests/tests.rs:1102:4
|
||||
|
|
||||
1102 | fn test_maxProperties_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_properties_2`
|
||||
|
||||
warning: function `test_maxProperties_3` should have a snake case name
|
||||
--> tests/tests.rs:1108:4
|
||||
|
|
||||
1108 | fn test_maxProperties_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_properties_3`
|
||||
|
||||
warning: function `test_dependentRequired_0` should have a snake case name
|
||||
--> tests/tests.rs:1114:4
|
||||
|
|
||||
1114 | fn test_dependentRequired_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_required_0`
|
||||
|
||||
warning: function `test_dependentRequired_1` should have a snake case name
|
||||
--> tests/tests.rs:1120:4
|
||||
|
|
||||
1120 | fn test_dependentRequired_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_required_1`
|
||||
|
||||
warning: function `test_dependentRequired_2` should have a snake case name
|
||||
--> tests/tests.rs:1126:4
|
||||
|
|
||||
1126 | fn test_dependentRequired_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_required_2`
|
||||
|
||||
warning: function `test_dependentRequired_3` should have a snake case name
|
||||
--> tests/tests.rs:1132:4
|
||||
|
|
||||
1132 | fn test_dependentRequired_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_required_3`
|
||||
|
||||
warning: function `test_dependentRequired_4` should have a snake case name
|
||||
--> tests/tests.rs:1138:4
|
||||
|
|
||||
1138 | fn test_dependentRequired_4() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dependent_required_4`
|
||||
|
||||
warning: function `test_multipleOf_0` should have a snake case name
|
||||
--> tests/tests.rs:1252:4
|
||||
|
|
||||
1252 | fn test_multipleOf_0() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_multiple_of_0`
|
||||
|
||||
warning: function `test_multipleOf_1` should have a snake case name
|
||||
--> tests/tests.rs:1258:4
|
||||
|
|
||||
1258 | fn test_multipleOf_1() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_multiple_of_1`
|
||||
|
||||
warning: function `test_multipleOf_2` should have a snake case name
|
||||
--> tests/tests.rs:1264:4
|
||||
|
|
||||
1264 | fn test_multipleOf_2() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_multiple_of_2`
|
||||
|
||||
warning: function `test_multipleOf_3` should have a snake case name
|
||||
--> tests/tests.rs:1270:4
|
||||
|
|
||||
1270 | fn test_multipleOf_3() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_multiple_of_3`
|
||||
|
||||
warning: function `test_patternProperties_0` should have a snake case name
|
||||
--> tests/tests.rs:1276:4
|
||||
|
|
||||
1276 | fn test_patternProperties_0() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_pattern_properties_0`
|
||||
|
||||
warning: function `test_patternProperties_1` should have a snake case name
|
||||
--> tests/tests.rs:1282:4
|
||||
|
|
||||
1282 | fn test_patternProperties_1() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_pattern_properties_1`
|
||||
|
||||
warning: function `test_patternProperties_2` should have a snake case name
|
||||
--> tests/tests.rs:1288:4
|
||||
|
|
||||
1288 | fn test_patternProperties_2() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_pattern_properties_2`
|
||||
|
||||
warning: function `test_patternProperties_3` should have a snake case name
|
||||
--> tests/tests.rs:1294:4
|
||||
|
|
||||
1294 | fn test_patternProperties_3() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_pattern_properties_3`
|
||||
|
||||
warning: function `test_patternProperties_4` should have a snake case name
|
||||
--> tests/tests.rs:1300:4
|
||||
|
|
||||
1300 | fn test_patternProperties_4() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_pattern_properties_4`
|
||||
|
||||
warning: function `test_patternProperties_5` should have a snake case name
|
||||
--> tests/tests.rs:1306:4
|
||||
|
|
||||
1306 | fn test_patternProperties_5() {
|
||||
| ^^^^^^^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_pattern_properties_5`
|
||||
|
||||
warning: function `test_allOf_0` should have a snake case name
|
||||
--> tests/tests.rs:1336:4
|
||||
|
|
||||
1336 | fn test_allOf_0() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_0`
|
||||
|
||||
warning: function `test_allOf_1` should have a snake case name
|
||||
--> tests/tests.rs:1342:4
|
||||
|
|
||||
1342 | fn test_allOf_1() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_1`
|
||||
|
||||
warning: function `test_allOf_2` should have a snake case name
|
||||
--> tests/tests.rs:1348:4
|
||||
|
|
||||
1348 | fn test_allOf_2() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_2`
|
||||
|
||||
warning: function `test_allOf_3` should have a snake case name
|
||||
--> tests/tests.rs:1354:4
|
||||
|
|
||||
1354 | fn test_allOf_3() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_3`
|
||||
|
||||
warning: function `test_allOf_4` should have a snake case name
|
||||
--> tests/tests.rs:1360:4
|
||||
|
|
||||
1360 | fn test_allOf_4() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_4`
|
||||
|
||||
warning: function `test_allOf_5` should have a snake case name
|
||||
--> tests/tests.rs:1366:4
|
||||
|
|
||||
1366 | fn test_allOf_5() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_5`
|
||||
|
||||
warning: function `test_allOf_6` should have a snake case name
|
||||
--> tests/tests.rs:1372:4
|
||||
|
|
||||
1372 | fn test_allOf_6() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_6`
|
||||
|
||||
warning: function `test_allOf_7` should have a snake case name
|
||||
--> tests/tests.rs:1378:4
|
||||
|
|
||||
1378 | fn test_allOf_7() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_7`
|
||||
|
||||
warning: function `test_allOf_8` should have a snake case name
|
||||
--> tests/tests.rs:1384:4
|
||||
|
|
||||
1384 | fn test_allOf_8() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_8`
|
||||
|
||||
warning: function `test_allOf_9` should have a snake case name
|
||||
--> tests/tests.rs:1390:4
|
||||
|
|
||||
1390 | fn test_allOf_9() {
|
||||
| ^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_9`
|
||||
|
||||
warning: function `test_allOf_10` should have a snake case name
|
||||
--> tests/tests.rs:1396:4
|
||||
|
|
||||
1396 | fn test_allOf_10() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_10`
|
||||
|
||||
warning: function `test_allOf_11` should have a snake case name
|
||||
--> tests/tests.rs:1402:4
|
||||
|
|
||||
1402 | fn test_allOf_11() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_11`
|
||||
|
||||
warning: function `test_allOf_12` should have a snake case name
|
||||
--> tests/tests.rs:1408:4
|
||||
|
|
||||
1408 | fn test_allOf_12() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_12`
|
||||
|
||||
warning: function `test_allOf_13` should have a snake case name
|
||||
--> tests/tests.rs:1414:4
|
||||
|
|
||||
1414 | fn test_allOf_13() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_13`
|
||||
|
||||
warning: function `test_allOf_14` should have a snake case name
|
||||
--> tests/tests.rs:1420:4
|
||||
|
|
||||
1420 | fn test_allOf_14() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_14`
|
||||
|
||||
warning: function `test_allOf_15` should have a snake case name
|
||||
--> tests/tests.rs:1426:4
|
||||
|
|
||||
1426 | fn test_allOf_15() {
|
||||
| ^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_all_of_15`
|
||||
|
||||
warning: function `test_minLength_0` should have a snake case name
|
||||
--> tests/tests.rs:1828:4
|
||||
|
|
||||
1828 | fn test_minLength_0() {
|
||||
| ^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_length_0`
|
||||
|
||||
warning: function `test_minLength_1` should have a snake case name
|
||||
--> tests/tests.rs:1834:4
|
||||
|
|
||||
1834 | fn test_minLength_1() {
|
||||
| ^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_min_length_1`
|
||||
|
||||
warning: function `test_maxItems_0` should have a snake case name
|
||||
--> tests/tests.rs:1840:4
|
||||
|
|
||||
1840 | fn test_maxItems_0() {
|
||||
| ^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_items_0`
|
||||
|
||||
warning: function `test_maxItems_1` should have a snake case name
|
||||
--> tests/tests.rs:1846:4
|
||||
|
|
||||
1846 | fn test_maxItems_1() {
|
||||
| ^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_items_1`
|
||||
|
||||
warning: function `test_maxItems_2` should have a snake case name
|
||||
--> tests/tests.rs:1852:4
|
||||
|
|
||||
1852 | fn test_maxItems_2() {
|
||||
| ^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_max_items_2`
|
||||
|
||||
warning: function `test_dynamicRef_0` should have a snake case name
|
||||
--> tests/tests.rs:1912:4
|
||||
|
|
||||
1912 | fn test_dynamicRef_0() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_0`
|
||||
|
||||
warning: function `test_dynamicRef_1` should have a snake case name
|
||||
--> tests/tests.rs:1918:4
|
||||
|
|
||||
1918 | fn test_dynamicRef_1() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_1`
|
||||
|
||||
warning: function `test_dynamicRef_2` should have a snake case name
|
||||
--> tests/tests.rs:1924:4
|
||||
|
|
||||
1924 | fn test_dynamicRef_2() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_2`
|
||||
|
||||
warning: function `test_dynamicRef_3` should have a snake case name
|
||||
--> tests/tests.rs:1930:4
|
||||
|
|
||||
1930 | fn test_dynamicRef_3() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_3`
|
||||
|
||||
warning: function `test_dynamicRef_4` should have a snake case name
|
||||
--> tests/tests.rs:1936:4
|
||||
|
|
||||
1936 | fn test_dynamicRef_4() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_4`
|
||||
|
||||
warning: function `test_dynamicRef_5` should have a snake case name
|
||||
--> tests/tests.rs:1942:4
|
||||
|
|
||||
1942 | fn test_dynamicRef_5() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_5`
|
||||
|
||||
warning: function `test_dynamicRef_6` should have a snake case name
|
||||
--> tests/tests.rs:1948:4
|
||||
|
|
||||
1948 | fn test_dynamicRef_6() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_6`
|
||||
|
||||
warning: function `test_dynamicRef_7` should have a snake case name
|
||||
--> tests/tests.rs:1954:4
|
||||
|
|
||||
1954 | fn test_dynamicRef_7() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_7`
|
||||
|
||||
warning: function `test_dynamicRef_8` should have a snake case name
|
||||
--> tests/tests.rs:1960:4
|
||||
|
|
||||
1960 | fn test_dynamicRef_8() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_8`
|
||||
|
||||
warning: function `test_dynamicRef_9` should have a snake case name
|
||||
--> tests/tests.rs:1966:4
|
||||
|
|
||||
1966 | fn test_dynamicRef_9() {
|
||||
| ^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_9`
|
||||
|
||||
warning: function `test_dynamicRef_10` should have a snake case name
|
||||
--> tests/tests.rs:1972:4
|
||||
|
|
||||
1972 | fn test_dynamicRef_10() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_10`
|
||||
|
||||
warning: function `test_dynamicRef_11` should have a snake case name
|
||||
--> tests/tests.rs:1978:4
|
||||
|
|
||||
1978 | fn test_dynamicRef_11() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_11`
|
||||
|
||||
warning: function `test_dynamicRef_12` should have a snake case name
|
||||
--> tests/tests.rs:1984:4
|
||||
|
|
||||
1984 | fn test_dynamicRef_12() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_12`
|
||||
|
||||
warning: function `test_dynamicRef_13` should have a snake case name
|
||||
--> tests/tests.rs:1990:4
|
||||
|
|
||||
1990 | fn test_dynamicRef_13() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_13`
|
||||
|
||||
warning: function `test_dynamicRef_14` should have a snake case name
|
||||
--> tests/tests.rs:1996:4
|
||||
|
|
||||
1996 | fn test_dynamicRef_14() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_14`
|
||||
|
||||
warning: function `test_dynamicRef_15` should have a snake case name
|
||||
--> tests/tests.rs:2002:4
|
||||
|
|
||||
2002 | fn test_dynamicRef_15() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_15`
|
||||
|
||||
warning: function `test_dynamicRef_16` should have a snake case name
|
||||
--> tests/tests.rs:2008:4
|
||||
|
|
||||
2008 | fn test_dynamicRef_16() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_16`
|
||||
|
||||
warning: function `test_dynamicRef_17` should have a snake case name
|
||||
--> tests/tests.rs:2014:4
|
||||
|
|
||||
2014 | fn test_dynamicRef_17() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_17`
|
||||
|
||||
warning: function `test_dynamicRef_18` should have a snake case name
|
||||
--> tests/tests.rs:2020:4
|
||||
|
|
||||
2020 | fn test_dynamicRef_18() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_18`
|
||||
|
||||
warning: function `test_dynamicRef_19` should have a snake case name
|
||||
--> tests/tests.rs:2026:4
|
||||
|
|
||||
2026 | fn test_dynamicRef_19() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_19`
|
||||
|
||||
warning: function `test_dynamicRef_20` should have a snake case name
|
||||
--> tests/tests.rs:2032:4
|
||||
|
|
||||
2032 | fn test_dynamicRef_20() {
|
||||
| ^^^^^^^^^^^^^^^^^^ help: convert the identifier to snake case: `test_dynamic_ref_20`
|
||||
|
||||
warning: `jspg` (test "tests") generated 132 warnings
|
||||
Finished `test` profile [unoptimized + debuginfo] target(s) in 6.12s
|
||||
Running tests/tests.rs (target/debug/deps/tests-0f6b1e496850f0af)
|
||||
|
||||
running 1 test
|
||||
|
||||
thread 'test_ref_39' (14867888) panicked at tests/tests.rs:1812:45:
|
||||
called `Result::unwrap()` on an `Err` value: "[implicit keyword shadowing] Test 'child type overrides parent type' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'age'\", details: ErrorDetails { path: \"/age\" } }]\n[implicit keyword shadowing] Test 'parent max age (20) is shadowed (replaced) by child definition' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'age'\", details: ErrorDetails { path: \"/age\" } }]"
|
||||
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
|
||||
test test_ref_39 ... FAILED
|
||||
|
||||
failures:
|
||||
|
||||
failures:
|
||||
test_ref_39
|
||||
|
||||
test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 338 filtered out; finished in 0.00s
|
||||
|
||||
error: test failed, to rerun pass `--test tests`
|
||||
71
flow
@@ -15,25 +15,28 @@ CARGO_DEPENDENCIES=(cargo-pgrx==0.16.1)
 GITEA_ORGANIZATION="cellular"
 GITEA_REPOSITORY="jspg"

-pgrx-prepare() {
+pgrx-up() {
     info "Initializing pgrx..."
     # Explicitly point to the postgresql@${POSTGRES_VERSION} pg_config, don't rely on 'which'
     local POSTGRES_CONFIG_PATH="/opt/homebrew/opt/postgresql@${POSTGRES_VERSION}/bin/pg_config"

     if [ ! -x "$POSTGRES_CONFIG_PATH" ]; then
-        error "pg_config not found or not executable at $POSTGRES_CONFIG_PATH."
-        warning "Ensure postgresql@${POSTGRES_VERSION} is installed correctly via Homebrew."
-        return 2
+        abort "pg_config not found or not executable at $POSTGRES_CONFIG_PATH." 2
     fi

     if cargo pgrx init --pg"$POSTGRES_VERSION"="$POSTGRES_CONFIG_PATH"; then
-        success "pgrx initialized successfully."
-    else
-        error "Failed to initialize pgrx. Check PostgreSQL development packages are installed and $POSTGRES_CONFIG_PATH is valid."
-        return 2
+        success "pgrx initialized successfully." && return 0
     fi
+
+    abort "Failed to initialize pgrx. Check PostgreSQL development packages are installed and $POSTGRES_CONFIG_PATH is valid." 2
 }

+pgrx-down() {
+    info "Taking pgrx down..."
+}
+

 build() {
     local version
     version=$(get-version) || return $?
@@ -51,11 +54,10 @@ build() {
     info "Creating tarball: ${tarball_path}"
     # Set COPYFILE_DISABLE=1 to prevent macOS tar from including ._ metadata files
     if COPYFILE_DISABLE=1 tar --exclude='.git*' --exclude='./target' --exclude='./package' --exclude='./flows' --exclude='./flow' -czf "${tarball_path}" .; then
-        success "Successfully created source tarball: ${tarball_path}"
-    else
-        error "Failed to create source tarball."
-        return 2
+        success "Successfully created source tarball: ${tarball_path}" && return 0
     fi
+
+    abort "Failed to create source tarball." 2
 }

 install() {
@@ -66,8 +68,7 @@ install() {

     # Run the pgrx install command
     if ! cargo pgrx install; then
-        error "cargo pgrx install command failed."
-        return 2
+        abort "cargo pgrx install command failed." 2
     fi
     success "PGRX extension v$version successfully built and installed."

@@ -76,36 +77,28 @@ install() {
     pg_sharedir=$("$POSTGRES_CONFIG_PATH" --sharedir)
     local pg_config_status=$?
     if [ $pg_config_status -ne 0 ] || [ -z "$pg_sharedir" ]; then
-        error "Failed to determine PostgreSQL shared directory using pg_config."
-        return 2
+        abort "Failed to determine PostgreSQL shared directory using pg_config." 2
     fi
     local installed_control_path="${pg_sharedir}/extension/jspg.control"

     # Modify the control file
     if [ ! -f "$installed_control_path" ]; then
-        error "Installed control file not found: '$installed_control_path'"
-        return 2
+        abort "Installed control file not found: '$installed_control_path'" 2
     fi

     info "Modifying control file for non-superuser access: ${installed_control_path}"
     # Use sed -i '' for macOS compatibility
     if sed -i '' '/^superuser = false/d' "$installed_control_path" && \
        echo 'trusted = true' >> "$installed_control_path"; then
-        success "Control file modified successfully."
-    else
-        error "Failed to modify control file: ${installed_control_path}"
-        return 2
+        success "Control file modified successfully." && return 0
     fi
+
+    abort "Failed to modify control file: ${installed_control_path}" 2
 }

-test-jspg() {
+test() {
     info "Running jspg tests..."
     cargo pgrx test "pg${POSTGRES_VERSION}" "$@" || return $?
 }

 test-validator() {
     info "Running validator tests..."
-    cargo test -p boon --features "pgrx/pg${POSTGRES_VERSION}" "$@" || return $?
+    cargo test --tests "$@" || return $?
 }

 clean() {
@@ -114,27 +107,27 @@ clean() {
 }

 jspg-usage() {
-    printf "prepare\tCheck OS, Cargo, and PGRX dependencies.\n"
-    printf "install\tBuild and install the extension locally (after prepare).\n"
-    printf "reinstall\tClean, build, and install the extension locally (after prepare).\n"
-    printf "test-jspg\t\tRun pgrx integration tests.\n"
-    printf "test-validator\t\tRun validator integration tests.\n"
-    printf "clean\t\tRemove pgrx build artifacts.\n"
+    echo "up|Check OS, Cargo, and PGRX dependencies."
+    echo "install|Build and install the extension locally (after up)."
+    echo "reinstall|Clean, build, and install the extension locally (after up)."
+    echo "test-jspg|Run pgrx integration tests."
+    echo "test-validator|Run validator integration tests."
+    echo "clean|Remove pgrx build artifacts."
 }

 jspg-flow() {
     case "$1" in
-        prepare) prepare && cargo-prepare && pgrx-prepare; return $?;;
+        up) up && rust-up && pgrx-up; return $?;;
+        down) pgrx-down && rust-down && down; return $?;;
         build) build; return $?;;
         install) install; return $?;;
         reinstall) clean && install; return $?;;
-        test-jspg) test-jspg "${@:2}"; return $?;;
-        test-validator) test-validator "${@:2}"; return $?;;
+        test) test "${@:2}"; return $?;;
         clean) clean; return $?;;
-        *) return 1 ;;
+        *) return 127 ;;
     esac
 }

-register-flow "jspg-usage" "jspg-flow"
+register-flow "jspg"

 dispatch "$@"
flows
Submodule flows updated: e154758056...a7b0f5dc4d
log_root.txt
@@ -1,106 +0,0 @@
    Finished `test` profile [unoptimized + debuginfo] target(s) in 0.34s
     Running tests/tests.rs (target/debug/deps/tests-0f6b1e496850f0af)

running 1 test
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name. Keys: {}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /nested_or_super_job/job_id. Keys: {"name"}
DEBUG: validate_refs merging res from job. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'manager_id' at /nested_or_super_job/manager_id. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/type. Keys: {"job_id", "manager_id", "name"}
DEBUG: validate_refs merging res from super_job. Keys: {"job_id", "manager_id", "type", "name"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {"manager_id", "type", "job_id", "name"}
DEBUG: validate_object inserted 'nested_or_super_job' at /nested_or_super_job. Keys: {"name", "job_id", "manager_id", "type"}
DEBUG: check_strictness at /root_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /root_job/name. Keys: {}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name"}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /root_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /root_job/job_id. Keys: {"name"}
DEBUG: check_strictness at /root_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /root_job/type. Keys: {"job_id", "name"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"job_id", "name", "type"}
DEBUG: validate_refs merging res from job. Keys: {"job_id", "name", "type"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"job_id", "type", "name"}
DEBUG: validate_object inserted 'root_job' at /root_job. Keys: {"name", "job_id", "manager_id", "nested_or_super_job", "type"}
DEBUG: check_strictness at . Extensible: false. Keys: {"root_job", "name", "job_id", "manager_id", "nested_or_super_job", "type"}
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name. Keys: {}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /nested_or_super_job/job_id. Keys: {"name"}
DEBUG: validate_refs merging res from job. Keys: {"name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'manager_id' at /nested_or_super_job/manager_id. Keys: {"name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/type. Keys: {"name", "manager_id", "job_id"}
DEBUG: validate_refs merging res from super_job. Keys: {"manager_id", "name", "type", "job_id"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {"manager_id", "type", "job_id", "name"}
DEBUG: validate_object inserted 'nested_or_super_job' at /nested_or_super_job. Keys: {"manager_id", "type", "name", "job_id"}
DEBUG: check_strictness at /root_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /root_job/name. Keys: {}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name"}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /root_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /root_job/job_id. Keys: {"name"}
DEBUG: check_strictness at /root_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /root_job/type. Keys: {"name", "job_id"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name", "job_id", "type"}
DEBUG: validate_refs merging res from job. Keys: {"name", "job_id", "type"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"job_id", "type", "name"}
DEBUG: validate_object inserted 'root_job' at /root_job. Keys: {"manager_id", "type", "nested_or_super_job", "name", "job_id"}
DEBUG: check_strictness at . Extensible: false. Keys: {"manager_id", "root_job", "type", "nested_or_super_job", "name", "job_id"}
DEBUG: validate_refs merging res from entity. Keys: {}
DEBUG: validate_refs merging res from job. Keys: {}
DEBUG: validate_refs merging res from super_job. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/my_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/my_job/name. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"name"}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/my_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/my_job/type. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"type", "name"}
DEBUG: validate_refs merging res from job. Keys: {"type", "name"}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"name", "type"}
DEBUG: validate_object inserted 'my_job' at /nested_or_super_job/my_job. Keys: {"type", "name"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name. Keys: {}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /nested_or_super_job/job_id. Keys: {"name"}
DEBUG: validate_refs merging res from job. Keys: {"name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'manager_id' at /nested_or_super_job/manager_id. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/type. Keys: {"job_id", "manager_id", "name"}
DEBUG: validate_refs merging res from super_job. Keys: {"job_id", "type", "manager_id", "name"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("polymorphic_org_punc.request") ref=Some("organization.family")
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("polymorphic_org_punc.request") ref=Some("organization.family")
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("polymorphic_org_punc.request") ref=Some("organization.family")
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("strict_org_punc.request") ref=Some("organization")
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("strict_org_punc.request") ref=Some("organization")
DEBUG: check_strictness at . Extensible: false. Keys: {}

thread 'test_puncs_6' (15117383) panicked at tests/tests.rs:150:44:
called `Result::unwrap()` on an `Err` value: "[complex punc type matching with oneOf and nested refs] Test 'valid person against organization punc (polymorphic)' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'first_name'\", details: ErrorDetails { path: \"/first_name\" } }]\n[complex punc type matching with oneOf and nested refs] Test 'valid organization against organization punc (polymorphic)' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'id'\", details: ErrorDetails { path: \"/id\" } }]\n[complex punc type matching with oneOf and nested refs] Test 'valid organization against strict punc' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'id'\", details: ErrorDetails { path: \"/id\" } }]"
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
test test_puncs_6 ... FAILED

failures:

failures:
    test_puncs_6

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 338 filtered out; finished in 0.00s

error: test failed, to rerun pass `--test tests`
old_code/lib.rs
@@ -1,243 +0,0 @@
use pgrx::*;

pg_module_magic!();

// mod schema;
mod registry;
mod validator;
mod util;

use crate::registry::REGISTRY;
// use crate::schema::Schema;
use crate::validator::{Validator, ValidationOptions};
use lazy_static::lazy_static;
use serde_json::{json, Value};
use std::collections::{HashMap, HashSet};

#[derive(Clone, Copy, Debug, PartialEq)]
enum SchemaType {
    Enum,
    Type,
    Family,
    PublicPunc,
    PrivatePunc,
}

struct CachedSchema {
    t: SchemaType,
}

lazy_static! {
    static ref SCHEMA_META: std::sync::RwLock<HashMap<String, CachedSchema>> = std::sync::RwLock::new(HashMap::new());
}

#[pg_extern(strict)]
fn cache_json_schemas(enums: JsonB, types: JsonB, puncs: JsonB) -> JsonB {
    let mut meta = SCHEMA_META.write().unwrap();
    let enums_value: Value = enums.0;
    let types_value: Value = types.0;
    let puncs_value: Value = puncs.0;

    let mut schemas_to_register = Vec::new();

    // Phase 1: Enums
    if let Some(enums_array) = enums_value.as_array() {
        for enum_row in enums_array {
            if let Some(schemas_raw) = enum_row.get("schemas") {
                if let Some(schemas_array) = schemas_raw.as_array() {
                    for schema_def in schemas_array {
                        if let Some(schema_id) = schema_def.get("$id").and_then(|v| v.as_str()) {
                            schemas_to_register.push((schema_id.to_string(), schema_def.clone(), SchemaType::Enum));
                        }
                    }
                }
            }
        }
    }

    // Phase 2: Types & Hierarchy
    let mut hierarchy_map: HashMap<String, HashSet<String>> = HashMap::new();
    if let Some(types_array) = types_value.as_array() {
        for type_row in types_array {
            if let Some(schemas_raw) = type_row.get("schemas") {
                if let Some(schemas_array) = schemas_raw.as_array() {
                    for schema_def in schemas_array {
                        if let Some(schema_id) = schema_def.get("$id").and_then(|v| v.as_str()) {
                            schemas_to_register.push((schema_id.to_string(), schema_def.clone(), SchemaType::Type));
                        }
                    }
                }
            }
            if let Some(type_name) = type_row.get("name").and_then(|v| v.as_str()) {
                if let Some(hierarchy_raw) = type_row.get("hierarchy") {
                    if let Some(hierarchy_array) = hierarchy_raw.as_array() {
                        for ancestor_val in hierarchy_array {
                            if let Some(ancestor_name) = ancestor_val.as_str() {
                                hierarchy_map.entry(ancestor_name.to_string()).or_default().insert(type_name.to_string());
                            }
                        }
                    }
                }
            }
        }
    }

    for (base_type, descendant_types) in hierarchy_map {
        let family_id = format!("{}.family", base_type);
        let values: Vec<String> = descendant_types.into_iter().collect();
        let family_schema = json!({ "$id": family_id, "type": "string", "enum": values });
        schemas_to_register.push((family_id, family_schema, SchemaType::Family));
    }

    // Phase 3: Puncs
    if let Some(puncs_array) = puncs_value.as_array() {
        for punc_row in puncs_array {
            if let Some(punc_obj) = punc_row.as_object() {
                if let Some(punc_name) = punc_obj.get("name").and_then(|v| v.as_str()) {
                    let is_public = punc_obj.get("public").and_then(|v| v.as_bool()).unwrap_or(false);
                    let punc_type = if is_public { SchemaType::PublicPunc } else { SchemaType::PrivatePunc };
                    if let Some(schemas_raw) = punc_obj.get("schemas") {
                        if let Some(schemas_array) = schemas_raw.as_array() {
                            for schema_def in schemas_array {
                                if let Some(schema_id) = schema_def.get("$id").and_then(|v| v.as_str()) {
                                    let req_id = format!("{}.request", punc_name);
                                    let resp_id = format!("{}.response", punc_name);
                                    let st = if schema_id == req_id || schema_id == resp_id { punc_type } else { SchemaType::Type };
                                    schemas_to_register.push((schema_id.to_string(), schema_def.clone(), st));
                                }
                            }
                        }
                    }
                }
            }
        }
    }

    let mut all_errors = Vec::new();
    for (id, value, st) in schemas_to_register {
        // Meta-validation: Check 'type' enum if present
        if let Some(type_val) = value.get("type") {
            let types = match type_val {
                Value::String(s) => vec![s.as_str()],
                Value::Array(a) => a.iter().filter_map(|v| v.as_str()).collect(),
                _ => vec![],
            };
            let valid_primitives = ["string", "number", "integer", "boolean", "array", "object", "null"];
            for t in types {
                if !valid_primitives.contains(&t) {
                    all_errors.push(json!({ "code": "ENUM_VIOLATED", "message": format!("Invalid type: {}", t) }));
                }
            }
        }

        // Clone value for insertion since it might be consumed/moved if we were doing other things
        let value_for_registry = value.clone();

        // Validation: just ensure it is an object or boolean
        if value.is_object() || value.is_boolean() {
            REGISTRY.insert(id.clone(), value_for_registry);
            meta.insert(id, CachedSchema { t: st });
        } else {
            all_errors.push(json!({ "code": "INVALID_SCHEMA_TYPE", "message": format!("Schema {} must be an object or boolean", id) }));
        }
    }

    if !all_errors.is_empty() {
        return JsonB(json!({ "errors": all_errors }));
    }

    JsonB(json!({ "response": "success" }))
}

#[pg_extern(strict, parallel_safe)]
fn validate_json_schema(schema_id: &str, instance: JsonB) -> JsonB {
    let schema = match REGISTRY.get(schema_id) {
        Some(s) => s,
        None => return JsonB(json!({
            "errors": [{
                "code": "SCHEMA_NOT_FOUND",
                "message": format!("Schema '{}' not found", schema_id),
                "details": { "schema": schema_id }
            }]
        })),
    };

    let meta = SCHEMA_META.read().unwrap();
    let st = meta.get(schema_id).map(|m| m.t).unwrap_or(SchemaType::Type);

    let be_strict = match st {
        SchemaType::PublicPunc => true,
        _ => false,
    };

    let options = ValidationOptions {
        be_strict,
    };

    let mut validator = Validator::new(options, schema_id);
    match validator.validate(&schema, &instance.0) {
        Ok(_) => JsonB(json!({ "response": "success" })),
        Err(errors) => {
            let drop_errors: Vec<Value> = errors.into_iter().map(|e| json!({
                "code": e.code,
                "message": e.message,
                "details": {
                    "path": e.path,
                    "context": e.context,
                    "cause": e.cause,
                    "schema": e.schema_id
                }
            })).collect();

            if let Ok(mut f) = std::fs::OpenOptions::new().create(true).append(true).open("/tmp/debug_jspg_errors.log") {
                use std::io::Write;
                let _ = writeln!(f, "VALIDATION FAILED for {}: {:?}", schema_id, drop_errors);
            }

            JsonB(json!({ "errors": drop_errors }))
        }
    }
}

#[pg_extern(strict, parallel_safe)]
fn json_schema_cached(schema_id: &str) -> bool {
    REGISTRY.get(schema_id).is_some()
}

#[pg_extern(strict)]
fn clear_json_schemas() -> JsonB {
    REGISTRY.reset();
    let mut meta = SCHEMA_META.write().unwrap();
    meta.clear();
    JsonB(json!({ "response": "success" }))
}

#[pg_extern(strict, parallel_safe)]
fn show_json_schemas() -> JsonB {
    let meta = SCHEMA_META.read().unwrap();
    let ids: Vec<String> = meta.keys().cloned().collect();
    JsonB(json!({ "response": ids }))
}

/// This module is required by `cargo pgrx test` invocations.
/// It must be visible at the root of your extension crate.
#[cfg(test)]
pub mod pg_test {
    pub fn setup(_options: Vec<&str>) {
        // perform one-off initialization when the pg_test framework starts
    }

    #[must_use]
    pub fn postgresql_conf_options() -> Vec<&'static str> {
        // return any postgresql.conf settings that are required for your tests
        vec![]
    }
}

#[cfg(any(test, feature = "pg_test"))]
#[pg_schema]
mod tests {
    use pgrx::pg_test;
    include!("suite.rs");
}
@@ -1,217 +0,0 @@
use serde_json::Value;
use std::collections::HashMap;
use std::sync::RwLock;
use lazy_static::lazy_static;

lazy_static! {
    pub static ref REGISTRY: Registry = Registry::new();
}

pub struct Registry {
    schemas: RwLock<HashMap<String, Value>>,
}

impl Registry {
    pub fn new() -> Self {
        Self {
            schemas: RwLock::new(HashMap::new()),
        }
    }

    pub fn reset(&self) {
        let mut schemas = self.schemas.write().unwrap();
        schemas.clear();
    }

    pub fn insert(&self, id: String, schema: Value) {
        let mut schemas = self.schemas.write().unwrap();

        // Index the schema and its sub-resources (IDs and anchors)
        self.index_schema(&schema, &mut schemas, Some(&id));

        // Ensure the root ID is inserted (index_schema handles it, but let's be explicit)
        schemas.insert(id, schema);
    }

    fn index_schema(&self, schema: &Value, registry: &mut HashMap<String, Value>, current_scope: Option<&str>) {
        if let Value::Object(map) = schema {
            // Only strictly index $id for scope resolution
            let mut my_scope = current_scope.map(|s| s.to_string());

            if let Some(Value::String(id)) = map.get("$id") {
                if id.contains("://") {
                    my_scope = Some(id.clone());
                } else if let Some(scope) = current_scope {
                    if let Some(pos) = scope.rfind('/') {
                        my_scope = Some(format!("{}{}", &scope[..pos + 1], id));
                    } else {
                        my_scope = Some(id.clone());
                    }
                } else {
                    my_scope = Some(id.clone());
                }

                if let Some(final_id) = &my_scope {
                    registry.insert(final_id.clone(), schema.clone());
                }
            }

            // Minimal recursion only for definitions where sub-IDs often live
            // This is a tradeoff: we don't index EVERYWHERE, but we catch the 90% common case of
            // bundled definitions without full tree traversal.
            if let Some(Value::Object(defs)) = map.get("$defs").or_else(|| map.get("definitions")) {
                for (_, def_schema) in defs {
                    self.index_schema(def_schema, registry, my_scope.as_deref());
                }
            }
        }
    }

    pub fn get(&self, id: &str) -> Option<Value> {
        let schemas = self.schemas.read().unwrap();
        schemas.get(id).cloned()
    }

    pub fn resolve(&self, ref_str: &str, current_id: Option<&str>) -> Option<(Value, String)> {
        // 1. Try full lookup (Absolute or explicit ID)
        if let Some(s) = self.get(ref_str) {
            return Some((s, ref_str.to_string()));
        }

        // 2. Try Relative lookup against current scope
        if let Some(curr) = current_id {
            if let Some(pos) = curr.rfind('/') {
                let joined = format!("{}{}", &curr[..pos + 1], ref_str);
                if let Some(s) = self.get(&joined) {
                    return Some((s, joined));
                }
            }
        }

        // 3. Pointer Resolution
        // Split into Base URI + Fragment
        let (base, fragment) = match ref_str.split_once('#') {
            Some((b, f)) => (b, Some(f)),
            None => (ref_str, None),
        };

        // If base is empty, we stay in current schema.
        // If base is present, we resolve it first.
        let (root_schema, scope) = if base.is_empty() {
            if let Some(curr) = current_id {
                // If we are looking up internally, we rely on the caller having passed the correct current ID
                // But typically internal refs are just fragments.
                if let Some(s) = self.get(curr) {
                    (s, curr.to_string())
                } else {
                    return None;
                }
            } else {
                return None;
            }
        } else {
            // Resolve external base
            if let Some(s) = self.get(base) {
                (s, base.to_string())
            } else if let Some(curr) = current_id {
                // Try relative base
                if let Some(pos) = curr.rfind('/') {
                    let joined = format!("{}{}", &curr[..pos + 1], base);
                    if let Some(s) = self.get(&joined) {
                        (s, joined)
                    } else {
                        return None;
                    }
                } else {
                    return None;
                }
            } else {
                return None;
            }
        };

        if let Some(frag_raw) = fragment {
            if frag_raw.is_empty() {
                return Some((root_schema, scope));
            }
            // Decode fragment (it is URI encoded)
            let frag_cow = percent_encoding::percent_decode_str(frag_raw).decode_utf8().unwrap_or(std::borrow::Cow::Borrowed(frag_raw));
            let frag = frag_cow.as_ref();

            if frag.starts_with('/') {
                if let Some(sub) = root_schema.pointer(frag) {
                    return Some((sub.clone(), scope));
                }
            } else {
                // It is an anchor. We scan for it at runtime to avoid complex indexing at insertion.
                if let Some(sub) = self.find_anchor(&root_schema, frag) {
                    return Some((sub, scope));
                }
            }
            None
        } else {
            Some((root_schema, scope))
        }
    }

    fn find_anchor(&self, schema: &Value, anchor: &str) -> Option<Value> {
        match schema {
            Value::Object(map) => {
                // Check if this schema itself has the anchor
                if let Some(Value::String(a)) = map.get("$anchor") {
                    if a == anchor {
                        return Some(schema.clone());
                    }
                }

                // Recurse into $defs / definitions (Map of Schemas)
                if let Some(Value::Object(defs)) = map.get("$defs").or_else(|| map.get("definitions")) {
                    for val in defs.values() {
                        if let Some(found) = self.find_anchor(val, anchor) { return Some(found); }
                    }
                }

                // Recurse into properties / patternProperties / dependentSchemas (Map of Schemas)
                for key in ["properties", "patternProperties", "dependentSchemas"] {
                    if let Some(Value::Object(props)) = map.get(key) {
                        for val in props.values() {
                            if let Some(found) = self.find_anchor(val, anchor) { return Some(found); }
                        }
                    }
                }

                // Recurse into arrays of schemas
                for key in ["allOf", "anyOf", "oneOf", "prefixItems"] {
                    if let Some(Value::Array(arr)) = map.get(key) {
                        for item in arr {
                            if let Some(found) = self.find_anchor(item, anchor) { return Some(found); }
                        }
                    }
                }

                // Recurse into single sub-schemas
                for key in ["items", "contains", "additionalProperties", "unevaluatedProperties", "not", "if", "then", "else"] {
                    if let Some(val) = map.get(key) {
                        if val.is_object() || val.is_boolean() {
                            if let Some(found) = self.find_anchor(val, anchor) { return Some(found); }
                        }
                    }
                }
                None
            }
            Value::Array(arr) => {
                // Should not happen for a schema object, but if we are passed an array of schemas?
                // Standard schema is object or bool.
                // But let's be safe.
                for item in arr {
                    if let Some(found) = self.find_anchor(item, anchor) {
                        return Some(found);
                    }
                }
                None
            }
            _ => None,
        }
    }
}
@ -1,236 +0,0 @@
|
||||
// use crate::schema::Schema;
|
||||
use crate::registry::REGISTRY;
|
||||
use serde::{Deserialize, Serialize};
|
||||
use serde_json::{json, Value};
|
||||
use pgrx::JsonB;
|
||||
use std::{fs, path::Path};
|
||||
|
||||
#[derive(Debug, Serialize, Deserialize)]
|
||||
struct ExpectedError {
|
||||
code: String,
|
||||
path: String,
|
||||
message_contains: Option<String>,
|
||||
cause: Option<Value>,
|
||||
context: Option<Value>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Serialize, Deserialize)]
|
||||
struct Group {
|
||||
description: String,
|
||||
schema: Option<Value>,
|
||||
enums: Option<Value>,
|
||||
types: Option<Value>,
|
||||
puncs: Option<Value>,
|
||||
tests: Vec<TestCase>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Serialize, Deserialize)]
|
||||
struct TestCase {
|
||||
description: String,
|
||||
data: Value,
|
||||
valid: bool,
|
||||
action: Option<String>,
|
||||
schema_id: Option<String>,
|
||||
expect_errors: Option<Vec<ExpectedError>>,
|
||||
}
|
||||
|
||||
include!("tests.rs");
|
||||

fn load_remotes(dir: &Path, base_url: &str) {
    if !dir.exists() { return; }

    for entry in fs::read_dir(dir).expect("Failed to read remotes directory") {
        let entry = entry.unwrap();
        let path = entry.path();
        let file_name = path.file_name().unwrap().to_str().unwrap();

        if path.is_file() && file_name.ends_with(".json") {
            let content = fs::read_to_string(&path).expect("Failed to read remote file");
            if let Ok(schema_value) = serde_json::from_str::<Value>(&content) {
                // Only register values with a valid schema shape (object or bool)
                if schema_value.is_object() || schema_value.is_boolean() {
                    let schema_id = format!("{}{}", base_url, file_name);
                    REGISTRY.insert(schema_id, schema_value);
                }
            }
        } else if path.is_dir() {
            load_remotes(&path, &format!("{}{}/", base_url, file_name));
        }
    }

    // Mock the meta-schema for testing recursive refs
    let meta_id = "https://json-schema.org/draft/2020-12/schema";
    if REGISTRY.get(meta_id).is_none() {
        // Register a permissive stand-in so refs to the meta-schema resolve
        REGISTRY.insert(meta_id.to_string(), json!({ "$id": meta_id }));
    }
}

#[allow(dead_code)]
fn run_dir(dir: &Path, base_url: Option<&str>) -> (usize, usize) {
    let mut file_count = 0;
    let mut test_count = 0;

    for entry in fs::read_dir(dir).expect("Failed to read directory") {
        let entry = entry.unwrap();
        let path = entry.path();
        let file_name = path.file_name().unwrap().to_str().unwrap();

        if path.is_file() && file_name.ends_with(".json") {
            let count = run_file(&path, base_url);
            test_count += count;
            file_count += 1;
        } else if path.is_dir() {
            if !file_name.starts_with('.') && file_name != "optional" {
                let (f, t) = run_dir(&path, base_url);
                file_count += f;
                test_count += t;
            }
        }
    }
    (file_count, test_count)
}

fn run_file(path: &Path, base_url: Option<&str>) -> usize {
    let content = fs::read_to_string(path).expect("Failed to read file");
    let groups: Vec<Group> = serde_json::from_str(&content).expect("Failed to parse JSON");
    let filename = path.file_name().unwrap().to_str().unwrap();

    let mut test_count = 0;

    for group in groups {
        // Handle JSPG setup if any JSPG fields are present
        if group.enums.is_some() || group.types.is_some() || group.puncs.is_some() {
            let enums = group.enums.clone().unwrap_or(json!([]));
            let types = group.types.clone().unwrap_or(json!([]));
            let puncs = group.puncs.clone().unwrap_or(json!([]));
            // Use internal helper to register without clearing
            let result = crate::cache_json_schemas(JsonB(enums), JsonB(types), JsonB(puncs));
            if let Some(errors) = result.0.get("errors") {
                // If the group has a test specifically for caching failures, don't panic here
                let has_cache_test = group.tests.iter().any(|t| t.action.as_deref() == Some("cache"));
                if !has_cache_test {
                    panic!("FAILED: File: {}, Group: {}\nCache failed: {:?}", filename, group.description, errors);
                }
            }
        }

        let mut temp_id = "test_root".to_string();
        if let Some(schema_value) = &group.schema {
            temp_id = base_url.map(|b| format!("{}schema.json", b)).unwrap_or_else(|| "test_root".to_string());

            if schema_value.is_object() || schema_value.is_boolean() {
                REGISTRY.insert(temp_id.clone(), schema_value.clone());
            }
        } else {
            // Fallback for JSPG-style tests where the schema lives in puncs/types
            let get_first_id = |items: &Option<Value>| {
                items.as_ref()
                    .and_then(|v| v.as_array())
                    .and_then(|arr| arr.first())
                    .and_then(|item| item.get("schemas"))
                    .and_then(|v| v.as_array())
                    .and_then(|arr| arr.first())
                    .and_then(|sch| sch.get("$id"))
                    .and_then(|id| id.as_str())
                    .map(|s| s.to_string())
            };

            if let Some(id) = get_first_id(&group.puncs).or_else(|| get_first_id(&group.types)) {
                temp_id = id;
            }
        }

        for test in &group.tests {
            test_count += 1;
            let sid = test.schema_id.clone().unwrap_or_else(|| temp_id.clone());
            let action = test.action.as_deref().unwrap_or("validate");
            pgrx::notice!("Starting Test: {}", test.description);

            let result = if action == "cache" {
                let enums = group.enums.clone().unwrap_or(json!([]));
                let types = group.types.clone().unwrap_or(json!([]));
                let puncs = group.puncs.clone().unwrap_or(json!([]));
                crate::cache_json_schemas(JsonB(enums), JsonB(types), JsonB(puncs))
            } else {
                crate::validate_json_schema(&sid, JsonB(test.data.clone()))
            };
            let is_success = result.0.get("response").is_some();
            pgrx::notice!("TEST: file={}, group={}, test={}, valid={}, outcome={}",
                filename,
                &group.description,
                &test.description,
                test.valid,
                if is_success { "SUCCESS" } else { "ERRORS" }
            );

            if is_success != test.valid {
                if let Some(errs) = result.0.get("errors") {
                    panic!(
                        "FAILED: File: {}, Group: {}, Test: {}\nExpected valid: {}, got ERRORS: {:?}",
                        filename,
                        group.description,
                        test.description,
                        test.valid,
                        errs
                    );
                } else {
                    panic!(
                        "FAILED: File: {}, Group: {}, Test: {}\nExpected invalid, got SUCCESS",
                        filename,
                        group.description,
                        test.description
                    );
                }
            }

            // Perform detailed assertions if present
            if let Some(expectations) = &test.expect_errors {
                let actual_errors = result.0.get("errors").and_then(|e| e.as_array()).expect("Expected errors array in failure response");

                for expected in expectations {
                    let found = actual_errors.iter().any(|e| {
                        let code = e["code"].as_str().unwrap_or("");
                        let path = e["details"]["path"].as_str().unwrap_or("");
                        let message = e["message"].as_str().unwrap_or("");

                        let code_match = code == expected.code;
                        let path_match = path == expected.path;
                        let msg_match = if let Some(sub) = &expected.message_contains {
                            message.contains(sub)
                        } else {
                            true
                        };

                        let matches_cause = if let Some(expected_cause) = &expected.cause {
                            e["details"]["cause"] == *expected_cause
                        } else {
                            true
                        };

                        let matches_context = if let Some(expected_context) = &expected.context {
                            e["details"]["context"] == *expected_context
                        } else {
                            true
                        };

                        code_match && path_match && msg_match && matches_cause && matches_context
                    });

                    if !found {
                        panic!(
                            "FAILED: File: {}, Group: {}, Test: {}\nMissing expected error: code='{}', path='{}'\nActual errors: {:?}",
                            filename,
                            group.description,
                            test.description,
                            expected.code,
                            expected.path,
                            actual_errors
                        );
                    }
                }
            }
        } // end of test loop
    } // end of group loop
    test_count
}
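The expectation-matching loop in `run_file` reduces to a predicate: an expectation is satisfied when any actual error matches all of its stated criteria, and absent criteria match vacuously. The same logic can be sketched free of `serde_json` (the struct names here are illustrative stand-ins, not types from the source):

```rust
// Illustrative stand-ins for the JSON error objects run_file inspects.
struct ActualError {
    code: String,
    path: String,
    message: String,
}

struct Expectation {
    code: String,
    path: String,
    message_contains: Option<String>,
}

// An expectation is satisfied if ANY actual error matches all of its
// criteria; an absent `message_contains` matches vacuously.
fn expectation_met(actual: &[ActualError], expected: &Expectation) -> bool {
    actual.iter().any(|e| {
        e.code == expected.code
            && e.path == expected.path
            && expected
                .message_contains
                .as_deref()
                .map_or(true, |sub| e.message.contains(sub))
    })
}
```

The real code extends this with the optional `cause` and `context` JSON-equality checks, which follow the same "absent means match" pattern.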
@@ -1,482 +0,0 @@

#[pg_test]
fn test_jspg_additional_properties() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/additionalProperties.json"), None);
}

#[pg_test]
fn test_jspg_cache() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/cache.json"), None);
}

#[pg_test]
fn test_jspg_const() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/const.json"), None);
}

#[pg_test]
fn test_jspg_dependencies() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/dependencies.json"), None);
}

#[pg_test]
fn test_jspg_enum() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/enum.json"), None);
}

#[pg_test]
fn test_jspg_errors() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/errors.json"), None);
}

#[pg_test]
fn test_jspg_format() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/format.json"), None);
}

#[pg_test]
fn test_jspg_infinite_loop_detection() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/infinite-loop-detection.json"), None);
}

#[pg_test]
fn test_jspg_one_of() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/oneOf.json"), None);
}

#[pg_test]
fn test_jspg_properties() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/properties.json"), None);
}

#[pg_test]
fn test_jspg_punc() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/punc.json"), None);
}

#[pg_test]
fn test_jspg_ref() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/ref.json"), None);
}

#[pg_test]
fn test_jspg_required() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/required.json"), None);
}

#[pg_test]
fn test_jspg_simple() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/simple.json"), None);
}

#[pg_test]
fn test_jspg_strict() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/strict.json"), None);
}

#[pg_test]
fn test_jspg_title() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/title.json"), None);
}

#[pg_test]
fn test_jspg_type() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/type.json"), None);
}

#[pg_test]
fn test_jspg_unevaluated_properties() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/unevaluatedProperties.json"), None);
}

#[pg_test]
fn test_jspg_unique_items() {
    REGISTRY.reset();
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSPG-Test-Suite/uniqueItems.json"), None);
}

#[pg_test]
fn test_json_schema_additional_properties() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/additionalProperties.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_all_of() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/allOf.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_anchor() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/anchor.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_any_of() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/anyOf.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_boolean_schema() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/boolean_schema.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_const() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/const.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_contains() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/contains.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_content() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/content.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_default() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/default.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_defs() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/defs.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_dependent_required() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/dependentRequired.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_dependent_schemas() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/dependentSchemas.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_dynamic_ref() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/dynamicRef.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_enum() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/enum.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_exclusive_maximum() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/exclusiveMaximum.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_exclusive_minimum() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/exclusiveMinimum.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_format() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/format.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_if_then_else() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/if-then-else.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_infinite_loop_detection() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/infinite-loop-detection.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_items() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/items.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_max_contains() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/maxContains.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_max_items() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/maxItems.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_max_length() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/maxLength.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_max_properties() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/maxProperties.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_maximum() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/maximum.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_min_contains() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/minContains.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_min_items() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/minItems.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_min_length() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/minLength.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_min_properties() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/minProperties.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_minimum() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/minimum.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_multiple_of() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/multipleOf.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_not() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/not.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_one_of() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/oneOf.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_pattern() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/pattern.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_pattern_properties() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/patternProperties.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_prefix_items() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/prefixItems.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_properties() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/properties.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_property_names() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/propertyNames.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_ref() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/ref.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_ref_remote() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/refRemote.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_required() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/required.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_type() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/type.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_unevaluated_items() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/unevaluatedItems.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_unevaluated_properties() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/unevaluatedProperties.json"), Some("http://localhost:1234/"));
}

#[pg_test]
fn test_json_schema_unique_items() {
    REGISTRY.reset();
    let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
    load_remotes(remotes_dir, "http://localhost:1234/");
    run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/uniqueItems.json"), Some("http://localhost:1234/"));
}
#[pg_test]
|
||||
fn test_json_schema_vocabulary() {
|
||||
REGISTRY.reset();
|
||||
let remotes_dir = Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/remotes");
|
||||
load_remotes(remotes_dir, "http://localhost:1234/");
|
||||
run_file(Path::new("/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/tests/fixtures/JSON-Schema-Test-Suite/tests/draft2020-12/vocabulary.json"), Some("http://localhost:1234/"));
|
||||
}
|
||||
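The suite-runner tests above differ only in the fixture file they point at. A hypothetical helper macro (not present in the source tree) could generate them from one definition per fixture; in this sketch the generated functions just return the fixture name so the expansion is checkable without a Postgres test harness:

```rust
// Hypothetical macro sketch: in the real tests each generated body would call
// REGISTRY.reset(), load_remotes(...), and run_file(...) for the given fixture.
macro_rules! suite_test {
    ($name:ident, $file:expr) => {
        fn $name() -> &'static str {
            // Placeholder body: return the fixture name instead of running it.
            $file
        }
    };
}

suite_test!(test_required, "required.json");
suite_test!(test_type, "type.json");

fn main() {
    // The macro expands to one function per fixture file.
    assert_eq!(test_required(), "required.json");
    assert_eq!(test_type(), "type.json");
}
```

This is only a deduplication sketch; whether `#[pg_test]` attributes can be emitted from such a macro depends on pgrx's proc-macro handling.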
@@ -1,53 +0,0 @@
use serde_json::Value;

/// serde_json treats 0 and 0.0 as unequal, so we cannot simply use v1 == v2.
pub fn equals(v1: &Value, v2: &Value) -> bool {
    match (v1, v2) {
        (Value::Null, Value::Null) => true,
        (Value::Bool(b1), Value::Bool(b2)) => b1 == b2,
        (Value::Number(n1), Value::Number(n2)) => {
            if let (Some(n1), Some(n2)) = (n1.as_u64(), n2.as_u64()) {
                return n1 == n2;
            }
            if let (Some(n1), Some(n2)) = (n1.as_i64(), n2.as_i64()) {
                return n1 == n2;
            }
            if let (Some(n1), Some(n2)) = (n1.as_f64(), n2.as_f64()) {
                return (n1 - n2).abs() < f64::EPSILON;
            }
            false
        }
        (Value::String(s1), Value::String(s2)) => s1 == s2,
        (Value::Array(arr1), Value::Array(arr2)) => {
            if arr1.len() != arr2.len() {
                return false;
            }
            arr1.iter().zip(arr2).all(|(e1, e2)| equals(e1, e2))
        }
        (Value::Object(obj1), Value::Object(obj2)) => {
            if obj1.len() != obj2.len() {
                return false;
            }
            for (k1, v1) in obj1 {
                if let Some(v2) = obj2.get(k1) {
                    if !equals(v1, v2) {
                        return false;
                    }
                } else {
                    return false;
                }
            }
            true
        }
        _ => false,
    }
}

pub fn is_integer(v: &Value) -> bool {
    match v {
        Value::Number(n) => {
            n.is_i64() || n.is_u64() || n.as_f64().filter(|n| n.fract() == 0.0).is_some()
        }
        _ => false,
    }
}
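The removed `util.rs` above exists because serde_json's derived equality distinguishes integer-backed and float-backed numbers. A minimal, self-contained sketch of the same idea, using a hypothetical `Num` enum rather than the crate's `Value`, with the same fall-back-to-f64 comparison:

```rust
// Hypothetical two-variant number type: derived PartialEq would treat
// I(0) and F(0.0) as unequal because they are different variants.
enum Num {
    I(i64),
    F(f64),
}

// Numeric comparison mirroring the f64 fallback in `equals` above:
// coerce both sides to f64 and compare within f64::EPSILON.
fn num_equals(a: &Num, b: &Num) -> bool {
    let to_f = |n: &Num| match n {
        Num::I(i) => *i as f64,
        Num::F(f) => *f,
    };
    (to_f(a) - to_f(b)).abs() < f64::EPSILON
}

fn main() {
    // Integer 0 and float 0.0 compare equal numerically.
    assert!(num_equals(&Num::I(0), &Num::F(0.0)));
    assert!(!num_equals(&Num::I(1), &Num::F(1.5)));
}
```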
@@ -1,621 +0,0 @@
use crate::registry::REGISTRY;
use crate::util::{equals, is_integer};
use serde_json::{Value, json, Map};
use std::collections::HashSet;

#[derive(Debug, Clone, serde::Serialize)]
pub struct ValidationError {
    pub code: String,
    pub message: String,
    pub path: String,
    pub context: Value,
    pub cause: Value,
    pub schema_id: String,
}

#[derive(Default, Clone, Copy)]
pub struct ValidationOptions {
    pub be_strict: bool,
}

pub struct Validator<'a> {
    options: ValidationOptions,
    // The top-level root schema ID we started with
    root_schema_id: String,
    // Accumulated errors
    errors: Vec<ValidationError>,
    // Max depth to prevent stack overflow
    max_depth: usize,
    _phantom: std::marker::PhantomData<&'a ()>,
}

/// Context passed down through the recursion
#[derive(Clone)]
struct ValidationContext {
    // Current JSON pointer path in the instance (e.g. "/users/0/name")
    current_path: String,
    // The properties overridden by parent schemas (for JSPG inheritance)
    overrides: HashSet<String>,
    // Current resolution scope for $ref (changes when following refs)
    resolution_scope: String,
    // Current recursion depth
    depth: usize,
}

impl ValidationContext {
    fn append_path(&self, extra: &str) -> ValidationContext {
        let mut new_ctx = self.clone();
        // Covers both the empty path and a non-empty path without a trailing '/'.
        if !new_ctx.current_path.ends_with('/') {
            new_ctx.current_path.push('/');
        }
        new_ctx.current_path.push_str(extra);
        new_ctx
    }

    fn append_path_new_scope(&self, extra: &str) -> ValidationContext {
        let mut new_ctx = self.append_path(extra);
        // Structural recursion clears overrides
        new_ctx.overrides.clear();
        new_ctx
    }
}
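The `append_path` method above builds JSON Pointer-style instance paths. A standalone sketch of the same joining rule on plain strings (a hypothetical free function, not part of the source):

```rust
// Sketch of the path-building rule: an empty pointer gains a leading '/',
// and subsequent segments are joined with '/'.
fn append_path(current: &str, extra: &str) -> String {
    if current.ends_with('/') {
        format!("{}{}", current, extra)
    } else {
        // Handles both the empty path and a path without a trailing '/'.
        format!("{}/{}", current, extra)
    }
}

fn main() {
    assert_eq!(append_path("", "users"), "/users");
    assert_eq!(append_path("/users", "0"), "/users/0");
    assert_eq!(append_path("/users/0", "name"), "/users/0/name");
}
```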
impl<'a> Validator<'a> {
    pub fn new(options: ValidationOptions, root_schema_id: &str) -> Self {
        Self {
            options,
            root_schema_id: root_schema_id.to_string(),
            errors: Vec::new(),
            max_depth: 100,
            _phantom: std::marker::PhantomData,
        }
    }

    pub fn validate(&mut self, schema: &Value, instance: &Value) -> Result<(), Vec<ValidationError>> {
        let ctx = ValidationContext {
            current_path: String::new(),
            overrides: HashSet::new(),
            resolution_scope: self.root_schema_id.clone(),
            depth: 0,
        };

        // We treat the top-level validate as "not lax" by default, unless specific schema logic says otherwise.
        let is_lax = !self.options.be_strict;

        self.validate_node(schema, instance, ctx, is_lax, false, false);

        if self.errors.is_empty() {
            Ok(())
        } else {
            Err(self.errors.clone())
        }
    }

    fn validate_node(
        &mut self,
        schema: &Value,
        instance: &Value,
        mut ctx: ValidationContext,
        is_lax: bool,
        skip_strict: bool,
        skip_id: bool,
    ) -> HashSet<String> {
        let mut evaluated = HashSet::new();

        // Recursion limit
        if ctx.depth > self.max_depth {
            self.add_error("MAX_DEPTH_REACHED", "Maximum recursion depth exceeded".to_string(), instance, json!({ "depth": ctx.depth }), &ctx);
            return evaluated;
        }

        ctx.depth += 1;

        // Handle boolean schemas
        if let Value::Bool(b) = schema {
            if !b {
                self.add_error("FALSE_SCHEMA", "Schema is always false".to_string(), instance, Value::Null, &ctx);
            }
            return evaluated;
        }

        let schema_obj = match schema.as_object() {
            Some(o) => o,
            None => return evaluated, // Should be object or bool
        };

        // 1. Update resolution scope ($id)
        if !skip_id {
            if let Some(Value::String(id)) = schema_obj.get("$id") {
                if id.contains("://") {
                    ctx.resolution_scope = id.clone();
                } else if let Some(pos) = ctx.resolution_scope.rfind('/') {
                    let base = &ctx.resolution_scope[..pos + 1];
                    ctx.resolution_scope = format!("{}{}", base, id);
                } else {
                    ctx.resolution_scope = id.clone();
                }
            }
        }

        // 2. Identify overrides (JSPG custom logic)
        let mut inheritance_ctx = ctx.clone();
        if let Some(Value::Object(props)) = schema_obj.get("properties") {
            for (pname, pval) in props {
                if let Some(Value::Bool(true)) = pval.get("override") {
                    inheritance_ctx.overrides.insert(pname.clone());
                }
            }
        }

        // 3. Determine laxness
        let mut current_lax = is_lax;
        if let Some(Value::Bool(true)) = schema_obj.get("unevaluatedProperties") { current_lax = true; }
        if let Some(Value::Bool(true)) = schema_obj.get("additionalProperties") { current_lax = true; }

        // ======== VALIDATION KEYWORDS ========

        // type
        if let Some(type_val) = schema_obj.get("type") {
            if !self.check_type(type_val, instance) {
                let got = value_type_name(instance);
                self.add_error("TYPE_MISMATCH", format!("Expected type {:?} but got {}", type_val, got), instance, json!({ "want": type_val, "got": got }), &ctx);
            }
        }

        // enum
        if let Some(Value::Array(vals)) = schema_obj.get("enum") {
            if !vals.iter().any(|v| equals(v, instance)) {
                self.add_error("ENUM_VIOLATED", "Value not in enum".to_string(), instance, json!({ "want": vals }), &ctx);
            }
        }

        // const
        if let Some(c) = schema_obj.get("const") {
            if !equals(c, instance) {
                self.add_error("CONST_VIOLATED", "Value does not match constant".to_string(), instance, json!({ "want": c }), &ctx);
            }
        }

        // Object validation
        if let Value::Object(obj) = instance {
            let obj_eval = self.validate_object(schema_obj, obj, instance, &ctx, current_lax);
            evaluated.extend(obj_eval);
        }

        // Array validation
        if let Value::Array(arr) = instance {
            self.validate_array(schema_obj, arr, &ctx, current_lax);
        }

        // Primitive validation
        self.validate_primitives(schema_obj, instance, &ctx);

        // Combinators
        evaluated.extend(self.validate_combinators(schema_obj, instance, &inheritance_ctx, current_lax));

        // Conditionals
        evaluated.extend(self.validate_conditionals(schema_obj, instance, &inheritance_ctx, current_lax));

        // $ref
        if let Some(Value::String(ref_str)) = schema_obj.get("$ref") {
            if let Some((ref_schema, scope_uri)) = REGISTRY.resolve(ref_str, Some(&inheritance_ctx.resolution_scope)) {
                let mut new_ctx = inheritance_ctx.clone();
                new_ctx.resolution_scope = scope_uri;
                let ref_evaluated = self.validate_node(&ref_schema, instance, new_ctx, is_lax, true, true);
                evaluated.extend(ref_evaluated);
            } else {
                self.add_error("SCHEMA_NOT_FOUND", format!("Ref '{}' not found", ref_str), instance, json!({ "ref": ref_str }), &ctx);
            }
        }

        // Unevaluated / strictness check
        self.check_unevaluated(schema_obj, instance, &evaluated, &ctx, current_lax, skip_strict);

        evaluated
    }
    fn validate_object(
        &mut self,
        schema: &Map<String, Value>,
        obj: &Map<String, Value>,
        instance: &Value,
        ctx: &ValidationContext,
        is_lax: bool,
    ) -> HashSet<String> {
        let mut evaluated = HashSet::new();

        // required
        if let Some(Value::Array(req)) = schema.get("required") {
            for field_val in req {
                if let Some(field) = field_val.as_str() {
                    if !obj.contains_key(field) {
                        self.add_error("REQUIRED_FIELD_MISSING", format!("Required field '{}' is missing", field), &Value::Null, json!({ "want": [field] }), &ctx.append_path(field));
                    }
                }
            }
        }

        // properties
        if let Some(Value::Object(props)) = schema.get("properties") {
            for (pname, psch) in props {
                if obj.contains_key(pname) {
                    evaluated.insert(pname.clone());
                    // Overridden properties count as evaluated but are not revalidated.
                    if ctx.overrides.contains(pname) {
                        continue;
                    }
                    let sub_ctx = ctx.append_path_new_scope(pname);
                    self.validate_node(psch, &obj[pname], sub_ctx, is_lax, false, false);
                }
            }
        }

        // patternProperties
        if let Some(Value::Object(pprops)) = schema.get("patternProperties") {
            for (pattern, psch) in pprops {
                if let Ok(re) = regex::Regex::new(pattern) {
                    for (pname, pval) in obj {
                        if re.is_match(pname) {
                            evaluated.insert(pname.clone());
                            if ctx.overrides.contains(pname) {
                                continue;
                            }
                            let sub_ctx = ctx.append_path_new_scope(pname);
                            self.validate_node(psch, pval, sub_ctx, is_lax, false, false);
                        }
                    }
                }
            }
        }

        // additionalProperties
        if let Some(apsch) = schema.get("additionalProperties") {
            if apsch.is_object() || apsch.is_boolean() {
                for (key, val) in obj {
                    let in_props = schema.get("properties").and_then(|p| p.as_object()).map_or(false, |p| p.contains_key(key));
                    let in_patterns = schema.get("patternProperties").and_then(|p| p.as_object()).map_or(false, |pp| {
                        pp.keys().any(|k| regex::Regex::new(k).map(|re| re.is_match(key)).unwrap_or(false))
                    });

                    if !in_props && !in_patterns {
                        evaluated.insert(key.clone());
                        let sub_ctx = ctx.append_path_new_scope(key);
                        self.validate_node(apsch, val, sub_ctx, is_lax, false, false);
                    }
                }
            }
        }

        // dependentRequired
        if let Some(Value::Object(dep_req)) = schema.get("dependentRequired") {
            for (prop, required_fields_val) in dep_req {
                if obj.contains_key(prop) {
                    if let Value::Array(required_fields) = required_fields_val {
                        for req_field_val in required_fields {
                            if let Some(req_field) = req_field_val.as_str() {
                                if !obj.contains_key(req_field) {
                                    self.add_error("DEPENDENCY_FAILED", format!("Field '{}' is required when '{}' is present", req_field, prop), &Value::Null, json!({ "prop": prop, "missing": [req_field] }), &ctx.append_path(req_field));
                                }
                            }
                        }
                    }
                }
            }
        }

        // dependentSchemas
        if let Some(Value::Object(dep_sch)) = schema.get("dependentSchemas") {
            for (prop, psch) in dep_sch {
                if obj.contains_key(prop) {
                    let sub_evaluated = self.validate_node(psch, instance, ctx.clone(), is_lax, false, false);
                    evaluated.extend(sub_evaluated);
                }
            }
        }

        // legacy dependencies (Draft 4-7 compat)
        if let Some(Value::Object(deps)) = schema.get("dependencies") {
            for (prop, dep_val) in deps {
                if obj.contains_key(prop) {
                    match dep_val {
                        Value::Array(arr) => {
                            for req_val in arr {
                                if let Some(req_field) = req_val.as_str() {
                                    if !obj.contains_key(req_field) {
                                        self.add_error(
                                            "DEPENDENCY_FAILED",
                                            format!("Field '{}' is required when '{}' is present", req_field, prop),
                                            &Value::Null,
                                            json!({ "prop": prop, "missing": [req_field] }),
                                            &ctx.append_path(req_field),
                                        );
                                    }
                                }
                            }
                        }
                        Value::Object(_) => {
                            // Schema dependency
                            let sub_evaluated = self.validate_node(dep_val, instance, ctx.clone(), is_lax, false, false);
                            evaluated.extend(sub_evaluated);
                        }
                        _ => {}
                    }
                }
            }
        }

        // minProperties / maxProperties
        if let Some(min) = schema.get("minProperties").and_then(|v| v.as_u64()) {
            if (obj.len() as u64) < min {
                self.add_error("MIN_PROPERTIES_VIOLATED", format!("Object must have at least {} properties", min), &json!(obj.len()), json!({ "want": min, "got": obj.len() }), ctx);
            }
        }
        if let Some(max) = schema.get("maxProperties").and_then(|v| v.as_u64()) {
            if (obj.len() as u64) > max {
                self.add_error("MAX_PROPERTIES_VIOLATED", format!("Object must have at most {} properties", max), &json!(obj.len()), json!({ "want": max, "got": obj.len() }), ctx);
            }
        }

        evaluated
    }
    fn validate_array(
        &mut self,
        schema: &Map<String, Value>,
        arr: &Vec<Value>,
        ctx: &ValidationContext,
        is_lax: bool,
    ) {
        if let Some(min) = schema.get("minItems").and_then(|v| v.as_u64()) {
            if (arr.len() as u64) < min {
                self.add_error("MIN_ITEMS_VIOLATED", format!("Array must have at least {} items", min), &json!(arr.len()), json!({ "want": min, "got": arr.len() }), ctx);
            }
        }
        if let Some(max) = schema.get("maxItems").and_then(|v| v.as_u64()) {
            if (arr.len() as u64) > max {
                self.add_error("MAX_ITEMS_VIOLATED", format!("Array must have at most {} items", max), &json!(arr.len()), json!({ "want": max, "got": arr.len() }), ctx);
            }
        }

        let mut evaluated_index = 0;
        if let Some(Value::Array(prefix)) = schema.get("prefixItems") {
            for (i, psch) in prefix.iter().enumerate() {
                if let Some(item) = arr.get(i) {
                    let sub_ctx = ctx.append_path_new_scope(&i.to_string());
                    self.validate_node(psch, item, sub_ctx, is_lax, false, false);
                    evaluated_index = i + 1;
                }
            }
        }

        if let Some(items_val) = schema.get("items") {
            if let Value::Bool(false) = items_val {
                if arr.len() > evaluated_index {
                    self.add_error("ADDITIONAL_ITEMS_NOT_ALLOWED", "Extra items not allowed".to_string(), &json!(arr.len()), json!({ "got": arr.len() - evaluated_index }), ctx);
                }
            } else {
                // Schema or true
                for i in evaluated_index..arr.len() {
                    let sub_ctx = ctx.append_path_new_scope(&i.to_string());
                    self.validate_node(items_val, &arr[i], sub_ctx, is_lax, false, false);
                }
            }
        }

        if let Some(contains_sch) = schema.get("contains") {
            let mut matches = 0;
            for (i, item) in arr.iter().enumerate() {
                let mut sub = self.branch();
                let sub_ctx = ctx.append_path_new_scope(&i.to_string());
                sub.validate_node(contains_sch, item, sub_ctx, is_lax, false, false);
                if sub.errors.is_empty() {
                    matches += 1;
                }
            }
            if matches == 0 {
                self.add_error("CONTAINS_FAILED", "No items match 'contains' schema".to_string(), &json!(arr), json!({}), ctx);
            }
            if let Some(min) = schema.get("minContains").and_then(|v| v.as_u64()) {
                if (matches as u64) < min {
                    self.add_error("MIN_CONTAINS_VIOLATED", format!("Expected at least {} items to match 'contains'", min), &json!(arr), json!({ "want": min, "got": matches }), ctx);
                }
            }
            if let Some(max) = schema.get("maxContains").and_then(|v| v.as_u64()) {
                if (matches as u64) > max {
                    self.add_error("MAX_CONTAINS_VIOLATED", format!("Expected at most {} items to match 'contains'", max), &json!(arr), json!({ "want": max, "got": matches }), ctx);
                }
            }
        }

        // uniqueItems
        if let Some(Value::Bool(true)) = schema.get("uniqueItems") {
            for i in 0..arr.len() {
                for j in (i + 1)..arr.len() {
                    if equals(&arr[i], &arr[j]) {
                        self.add_error("UNIQUE_ITEMS_VIOLATED", format!("Array items at indices {} and {} are equal", i, j), &json!(arr), json!({ "got": [i, j] }), ctx);
                        return;
                    }
                }
            }
        }
    }
    fn validate_primitives(&mut self, schema: &Map<String, Value>, instance: &Value, ctx: &ValidationContext) {
        if let Some(s) = instance.as_str() {
            // Length keywords count Unicode scalar values, not bytes.
            if let Some(min) = schema.get("minLength").and_then(|v| v.as_u64()) {
                if (s.chars().count() as u64) < min { self.add_error("MIN_LENGTH_VIOLATED", format!("String too short (min {})", min), instance, json!({ "want": min, "got": s.chars().count() }), ctx); }
            }
            if let Some(max) = schema.get("maxLength").and_then(|v| v.as_u64()) {
                if (s.chars().count() as u64) > max { self.add_error("MAX_LENGTH_VIOLATED", format!("String too long (max {})", max), instance, json!({ "want": max, "got": s.chars().count() }), ctx); }
            }
            if let Some(Value::String(pat)) = schema.get("pattern") {
                if let Ok(re) = regex::Regex::new(pat) {
                    if !re.is_match(s) { self.add_error("PATTERN_VIOLATED", format!("String does not match pattern '{}'", pat), instance, json!({ "want": pat, "got": s }), ctx); }
                }
            }
            if let Some(Value::String(fmt)) = schema.get("format") {
                if !s.is_empty() {
                    match fmt.as_str() {
                        "uuid" => { if uuid::Uuid::parse_str(s).is_err() { self.add_error("FORMAT_INVALID", format!("Value '{}' is not a valid UUID", s), instance, json!({ "format": "uuid" }), ctx); } }
                        "date-time" => { if chrono::DateTime::parse_from_rfc3339(s).is_err() { self.add_error("FORMAT_INVALID", format!("Value '{}' is not a valid date-time", s), instance, json!({ "format": "date-time" }), ctx); } }
                        "email" => { if !s.contains('@') { self.add_error("FORMAT_INVALID", format!("Value '{}' is not a valid email", s), instance, json!({ "format": "email" }), ctx); } }
                        _ => {}
                    }
                }
            }
        }

        if let Some(n) = instance.as_f64() {
            if let Some(min) = schema.get("minimum").and_then(|v| v.as_f64()) {
                if n < min { self.add_error("MINIMUM_VIOLATED", format!("Value {} < minimum {}", n, min), instance, json!({ "want": min, "got": n }), ctx); }
            }
            if let Some(max) = schema.get("maximum").and_then(|v| v.as_f64()) {
                if n > max { self.add_error("MAXIMUM_VIOLATED", format!("Value {} > maximum {}", n, max), instance, json!({ "want": max, "got": n }), ctx); }
            }
            if let Some(min) = schema.get("exclusiveMinimum").and_then(|v| v.as_f64()) {
                if n <= min { self.add_error("EXCLUSIVE_MINIMUM_VIOLATED", format!("Value {} <= exclusive minimum {}", n, min), instance, json!({ "want": min, "got": n }), ctx); }
            }
            if let Some(max) = schema.get("exclusiveMaximum").and_then(|v| v.as_f64()) {
                if n >= max { self.add_error("EXCLUSIVE_MAXIMUM_VIOLATED", format!("Value {} >= exclusive maximum {}", n, max), instance, json!({ "want": max, "got": n }), ctx); }
            }
            if let Some(mult) = schema.get("multipleOf").and_then(|v| v.as_f64()) {
                let rem = (n / mult).fract();
                if rem.abs() > f64::EPSILON && (1.0 - rem).abs() > f64::EPSILON {
                    self.add_error("MULTIPLE_OF_VIOLATED", format!("Value {} not multiple of {}", n, mult), instance, json!({ "want": mult, "got": n }), ctx);
                }
            }
        }
    }
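The `multipleOf` check in `validate_primitives` above accepts a value when the fractional part of `n / m` is within `f64::EPSILON` of either 0.0 or 1.0, because floating-point division can land just under a whole number. A standalone sketch of that rule (hypothetical free function, not the source's API):

```rust
// Tolerant multiple-of check: accept when the fractional part of n / m
// is within f64::EPSILON of 0.0 or of 1.0.
fn is_multiple_of(n: f64, m: f64) -> bool {
    let rem = (n / m).fract();
    rem.abs() < f64::EPSILON || (1.0 - rem).abs() < f64::EPSILON
}

fn main() {
    // 10.0 / 2.5 is exactly 4.0, so the fractional part is 0.0.
    assert!(is_multiple_of(10.0, 2.5));
    // 10.0 / 3.0 leaves a fractional part far from both 0.0 and 1.0.
    assert!(!is_multiple_of(10.0, 3.0));
}
```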
    fn validate_combinators(&mut self, schema: &Map<String, Value>, instance: &Value, ctx: &ValidationContext, is_lax: bool) -> HashSet<String> {
        let mut evaluated = HashSet::new();
        if let Some(Value::Array(all_of)) = schema.get("allOf") {
            for sch in all_of { evaluated.extend(self.validate_node(sch, instance, ctx.clone(), is_lax, true, false)); }
        }
        if let Some(Value::Array(any_of)) = schema.get("anyOf") {
            let mut matched = false;
            let mut errors_acc = Vec::new();
            for sch in any_of {
                let mut sub = self.branch();
                let sub_eval = sub.validate_node(sch, instance, ctx.clone(), is_lax, false, false);
                if sub.errors.is_empty() { matched = true; evaluated.extend(sub_eval); } else { errors_acc.extend(sub.errors); }
            }
            if !matched { self.add_error("ANY_OF_VIOLATED", "Value did not match any allowed schema".to_string(), instance, json!({ "causes": errors_acc }), ctx); }
        }
        if let Some(Value::Array(one_of)) = schema.get("oneOf") {
            let mut match_count = 0;
            let mut last_eval = HashSet::new();
            let mut error_causes = Vec::new();
            for sch in one_of {
                let mut sub = self.branch();
                let sub_eval = sub.validate_node(sch, instance, ctx.clone(), is_lax, false, false);
                if sub.errors.is_empty() { match_count += 1; last_eval = sub_eval; } else { error_causes.extend(sub.errors); }
            }
            if match_count == 1 {
                evaluated.extend(last_eval);
            } else {
                self.add_error("ONE_OF_VIOLATED", format!("Value matched {} schemas, expected 1", match_count), instance, json!({ "matched": match_count, "causes": error_causes }), ctx);
            }
        }
        if let Some(not_sch) = schema.get("not") {
            let mut sub = self.branch();
            sub.validate_node(not_sch, instance, ctx.clone(), is_lax, false, false);
            if sub.errors.is_empty() { self.add_error("NOT_VIOLATED", "Value matched 'not' schema".to_string(), instance, Value::Null, ctx); }
        }
        evaluated
    }
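The `oneOf` branch in `validate_combinators` above checks each subschema against the instance in isolation (via `branch()`) and requires exactly one to succeed. A standalone sketch of that counting rule using simple predicate functions in place of subschemas (hypothetical, not the source's API):

```rust
// Two toy "subschemas" expressed as predicates over an integer instance.
fn is_even(n: &i64) -> bool { n % 2 == 0 }
fn is_small(n: &i64) -> bool { *n < 10 }

// oneOf semantics: exactly one branch may accept the instance.
fn one_of(instance: &i64, branches: &[fn(&i64) -> bool]) -> bool {
    branches.iter().filter(|&&check| check(instance)).count() == 1
}

fn main() {
    // 4 is both even and small: two matches, so oneOf fails.
    assert!(!one_of(&4, &[is_even, is_small]));
    // 12 is even but not small: exactly one match, so oneOf passes.
    assert!(one_of(&12, &[is_even, is_small]));
}
```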
    fn validate_conditionals(&mut self, schema: &Map<String, Value>, instance: &Value, ctx: &ValidationContext, is_lax: bool) -> HashSet<String> {
        let mut evaluated = HashSet::new();
        if let Some(if_sch) = schema.get("if") {
            let mut sub = self.branch();
            let sub_eval = sub.validate_node(if_sch, instance, ctx.clone(), is_lax, true, false);
            if sub.errors.is_empty() {
                evaluated.extend(sub_eval);
                if let Some(then_sch) = schema.get("then") { evaluated.extend(self.validate_node(then_sch, instance, ctx.clone(), is_lax, false, false)); }
            } else if let Some(else_sch) = schema.get("else") {
                evaluated.extend(self.validate_node(else_sch, instance, ctx.clone(), is_lax, false, false));
            }
        }
        evaluated
    }

    fn check_unevaluated(&mut self, schema: &Map<String, Value>, instance: &Value, evaluated: &HashSet<String>, ctx: &ValidationContext, is_lax: bool, skip_strict: bool) {
        if let Value::Object(obj) = instance {
            if let Some(Value::Bool(false)) = schema.get("additionalProperties") {
                for key in obj.keys() {
                    let in_props = schema.get("properties").and_then(|p| p.as_object()).map_or(false, |p| p.contains_key(key));
                    let in_pattern = schema.get("patternProperties").and_then(|p| p.as_object()).map_or(false, |pp| pp.keys().any(|k| regex::Regex::new(k).map(|re| re.is_match(key)).unwrap_or(false)));
                    if !in_props && !in_pattern {
                        if ctx.overrides.contains(key) { continue; }
                        self.add_error("ADDITIONAL_PROPERTIES_NOT_ALLOWED", format!("Property '{}' is not allowed", key), &Value::Null, json!({ "got": [key] }), &ctx.append_path(key));
                    }
                }
            }

            let explicit_opts = schema.contains_key("unevaluatedProperties") || schema.contains_key("additionalProperties");
            let should_check_strict = self.options.be_strict && !is_lax && !explicit_opts && !skip_strict;
            let check_unevaluated = matches!(schema.get("unevaluatedProperties"), Some(Value::Bool(false)));
            if should_check_strict || check_unevaluated {
                for key in obj.keys() {
                    if !evaluated.contains(key) {
                        if ctx.overrides.contains(key) { continue; }
                        self.add_error("ADDITIONAL_PROPERTIES_NOT_ALLOWED", format!("Property '{}' is not allowed (strict/unevaluated)", key), &Value::Null, json!({ "got": [key] }), &ctx.append_path(key));
                    }
                }
            }
        }
    }

    fn check_type(&self, expected: &Value, instance: &Value) -> bool {
        match expected {
            Value::String(s) => self.is_primitive_type(s, instance),
            Value::Array(arr) => arr.iter().filter_map(|v| v.as_str()).any(|pt| self.is_primitive_type(pt, instance)),
            _ => false,
        }
    }

    fn is_primitive_type(&self, pt: &str, instance: &Value) -> bool {
        match pt {
            "string" => instance.is_string(),
            "number" => instance.is_number(),
            "integer" => is_integer(instance),
            "boolean" => instance.is_boolean(),
            "array" => instance.is_array(),
            "object" => instance.is_object(),
            _ => false,
        }
    }

    fn branch(&self) -> Self {
        Self { options: self.options, root_schema_id: self.root_schema_id.clone(), errors: Vec::new(), max_depth: self.max_depth, _phantom: std::marker::PhantomData }
    }

    fn add_error(&mut self, code: &str, message: String, context: &Value, cause: Value, ctx: &ValidationContext) {
        let path = ctx.current_path.clone();
        // Deduplicate: keep only the first error for a given (code, path) pair.
        if self.errors.iter().any(|e| e.code == code && e.path == path) { return; }
        self.errors.push(ValidationError { code: code.to_string(), message, path, context: context.clone(), cause, schema_id: self.root_schema_id.clone() });
    }

    fn extend_unique(&mut self, errors: Vec<ValidationError>) {
        for e in errors { if !self.errors.iter().any(|existing| existing.code == e.code && existing.path == e.path) { self.errors.push(e); } }
    }
}

fn value_type_name(v: &Value) -> &'static str {
    match v {
        Value::Null => "null",
        Value::Bool(_) => "boolean",
        Value::Number(n) => if n.is_i64() { "integer" } else { "number" },
        Value::String(_) => "string",
        Value::Array(_) => "array",
        Value::Object(_) => "object",
    }
}
@ -1,88 +0,0 @@
|
||||
use serde_json::Value;
|
||||
use pgrx::JsonB;
|
||||
|
||||
// Simple test helpers for cleaner test code
|
||||
pub fn assert_success(result: &JsonB) {
    let json = &result.0;
    if json.get("response").is_none() || json.get("errors").is_some() {
        let pretty = serde_json::to_string_pretty(json).unwrap_or_else(|_| format!("{:?}", json));
        panic!("Expected success but got:\n{}", pretty);
    }
}

pub fn assert_failure(result: &JsonB) {
    let json = &result.0;
    if json.get("response").is_some() || json.get("errors").is_none() {
        let pretty = serde_json::to_string_pretty(json).unwrap_or_else(|_| format!("{:?}", json));
        panic!("Expected failure but got:\n{}", pretty);
    }
}

pub fn assert_error_count(result: &JsonB, expected_count: usize) {
    assert_failure(result);
    let errors = get_errors(result);
    if errors.len() != expected_count {
        let pretty = serde_json::to_string_pretty(&result.0).unwrap_or_else(|_| format!("{:?}", result.0));
        panic!("Expected {} errors, got {}:\n{}", expected_count, errors.len(), pretty);
    }
}

pub fn get_errors(result: &JsonB) -> &Vec<Value> {
    result.0["errors"].as_array().expect("errors should be an array")
}

pub fn has_error_with_code(result: &JsonB, code: &str) -> bool {
    get_errors(result).iter().any(|e| e["code"] == code)
}

pub fn has_error_with_code_and_path(result: &JsonB, code: &str, path: &str) -> bool {
    get_errors(result).iter().any(|e| e["code"] == code && e["details"]["path"] == path)
}

pub fn assert_has_error(result: &JsonB, code: &str, path: &str) {
    if !has_error_with_code_and_path(result, code, path) {
        let pretty = serde_json::to_string_pretty(&result.0).unwrap_or_else(|_| format!("{:?}", result.0));
        panic!("Expected error with code='{}' and path='{}' but not found:\n{}", code, path, pretty);
    }
}

pub fn find_error_with_code<'a>(result: &'a JsonB, code: &str) -> &'a Value {
    get_errors(result)
        .iter()
        .find(|e| e["code"] == code)
        .unwrap_or_else(|| panic!("No error found with code '{}'", code))
}

pub fn find_error_with_code_and_path<'a>(result: &'a JsonB, code: &str, path: &str) -> &'a Value {
    get_errors(result)
        .iter()
        .find(|e| e["code"] == code && e["details"]["path"] == path)
        .unwrap_or_else(|| panic!("No error found with code '{}' and path '{}'", code, path))
}

pub fn assert_error_detail(error: &Value, detail_key: &str, expected_value: &str) {
    let actual = error["details"][detail_key]
        .as_str()
        .unwrap_or_else(|| panic!("Error detail '{}' is not a string", detail_key));
    assert_eq!(actual, expected_value, "Error detail '{}' mismatch", detail_key);
}

// Additional convenience helpers for common patterns

pub fn assert_error_message_contains(error: &Value, substring: &str) {
    let message = error["message"].as_str().expect("error should have message");
    assert!(message.contains(substring), "Expected message to contain '{}', got '{}'", substring, message);
}

pub fn assert_error_cause_json(error: &Value, expected_cause: &Value) {
    let cause = &error["details"]["cause"];
    assert!(cause.is_object(), "cause should be JSON object");
    assert_eq!(cause, expected_cause, "cause mismatch");
}

pub fn assert_error_context(error: &Value, expected_context: &Value) {
    assert_eq!(&error["details"]["context"], expected_context, "context mismatch");
}

pub fn jsonb(val: Value) -> JsonB {
    JsonB(val)
}
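The helpers above all assert against one result-envelope convention: a validation result is a success when the JSON object carries a `response` key and no `errors` key, and a failure when `errors` is present and `response` is absent. The sketch below illustrates just that predicate logic; it stands in a plain `std` map for `serde_json::Value` so it is self-contained (the real helpers operate on `pgrx::JsonB`), and the function names `is_success`/`is_failure` are illustrative, not part of the actual test API.

```rust
use std::collections::HashMap;

/// Success shape checked by `assert_success`:
/// `response` present AND `errors` absent.
fn is_success(envelope: &HashMap<&str, &str>) -> bool {
    envelope.contains_key("response") && !envelope.contains_key("errors")
}

/// Failure shape checked by `assert_failure`:
/// `errors` present AND `response` absent.
fn is_failure(envelope: &HashMap<&str, &str>) -> bool {
    envelope.contains_key("errors") && !envelope.contains_key("response")
}

fn main() {
    // A success envelope holds only a `response` payload.
    let ok: HashMap<&str, &str> = [("response", "{}")].into_iter().collect();
    // A failure envelope holds only an `errors` array.
    let err: HashMap<&str, &str> = [("errors", "[]")].into_iter().collect();

    assert!(is_success(&ok) && !is_failure(&ok));
    assert!(is_failure(&err) && !is_success(&err));
    println!("envelope convention holds");
}
```

Note that because the two shapes are mutually exclusive, `assert_error_count` can safely call `assert_failure` first and then index `errors` unconditionally.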
old_tests/schemas.rs (1128 lines): file diff suppressed because it is too large.
old_tests/tests.rs (1089 lines): file diff suppressed because it is too large.
puncs_6_fix.txt (106 lines deleted)
@@ -1,106 +0,0 @@
Finished `test` profile [unoptimized + debuginfo] target(s) in 0.39s
Running tests/tests.rs (target/debug/deps/tests-0f6b1e496850f0af)

running 1 test
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name. Keys: {}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /nested_or_super_job/job_id. Keys: {"name"}
DEBUG: validate_refs merging res from job. Keys: {"name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'manager_id' at /nested_or_super_job/manager_id. Keys: {"name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/type. Keys: {"name", "job_id", "manager_id"}
DEBUG: validate_refs merging res from super_job. Keys: {"job_id", "name", "manager_id", "type"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {"name", "manager_id", "job_id", "type"}
DEBUG: validate_object inserted 'nested_or_super_job' at /nested_or_super_job. Keys: {"name", "manager_id", "job_id", "type"}
DEBUG: check_strictness at /root_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /root_job/name. Keys: {}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name"}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /root_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /root_job/job_id. Keys: {"name"}
DEBUG: check_strictness at /root_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /root_job/type. Keys: {"job_id", "name"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"job_id", "type", "name"}
DEBUG: validate_refs merging res from job. Keys: {"job_id", "type", "name"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name", "job_id", "type"}
DEBUG: validate_object inserted 'root_job' at /root_job. Keys: {"name", "manager_id", "job_id", "type", "nested_or_super_job"}
DEBUG: check_strictness at . Extensible: false. Keys: {"name", "manager_id", "job_id", "type", "nested_or_super_job", "root_job"}
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name. Keys: {}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /nested_or_super_job/job_id. Keys: {"name"}
DEBUG: validate_refs merging res from job. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'manager_id' at /nested_or_super_job/manager_id. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/type. Keys: {"job_id", "name", "manager_id"}
DEBUG: validate_refs merging res from super_job. Keys: {"job_id", "name", "type", "manager_id"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {"type", "job_id", "manager_id", "name"}
DEBUG: validate_object inserted 'nested_or_super_job' at /nested_or_super_job. Keys: {"job_id", "name", "manager_id", "type"}
DEBUG: check_strictness at /root_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /root_job/name. Keys: {}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name"}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /root_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /root_job/job_id. Keys: {"name"}
DEBUG: check_strictness at /root_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /root_job/type. Keys: {"name", "job_id"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name", "type", "job_id"}
DEBUG: validate_refs merging res from job. Keys: {"name", "type", "job_id"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"type", "name", "job_id"}
DEBUG: validate_object inserted 'root_job' at /root_job. Keys: {"job_id", "name", "manager_id", "type", "nested_or_super_job"}
DEBUG: check_strictness at . Extensible: false. Keys: {"root_job", "job_id", "name", "manager_id", "type", "nested_or_super_job"}
DEBUG: validate_refs merging res from entity. Keys: {}
DEBUG: validate_refs merging res from job. Keys: {}
DEBUG: validate_refs merging res from super_job. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/my_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/my_job/name. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"name"}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/my_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/my_job/type. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"name", "type"}
DEBUG: validate_refs merging res from job. Keys: {"name", "type"}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"name", "type"}
DEBUG: validate_object inserted 'my_job' at /nested_or_super_job/my_job. Keys: {"type", "name"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name. Keys: {}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /nested_or_super_job/job_id. Keys: {"name"}
DEBUG: validate_refs merging res from job. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'manager_id' at /nested_or_super_job/manager_id. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/type. Keys: {"job_id", "name", "manager_id"}
DEBUG: validate_refs merging res from super_job. Keys: {"job_id", "manager_id", "type", "name"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("polymorphic_org_punc.request") ref=Some("organization.family")
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("polymorphic_org_punc.request") ref=Some("organization.family")
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("polymorphic_org_punc.request") ref=Some("organization.family")
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("strict_org_punc.request") ref=Some("organization")
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("strict_org_punc.request") ref=Some("organization")
DEBUG: check_strictness at . Extensible: false. Keys: {}

thread 'test_puncs_6' (15118678) panicked at tests/tests.rs:150:44:
called `Result::unwrap()` on an `Err` value: "[complex punc type matching with oneOf and nested refs] Test 'valid person against organization punc (polymorphic)' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'first_name'\", details: ErrorDetails { path: \"/first_name\" } }]\n[complex punc type matching with oneOf and nested refs] Test 'valid organization against organization punc (polymorphic)' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'id'\", details: ErrorDetails { path: \"/id\" } }]\n[complex punc type matching with oneOf and nested refs] Test 'valid organization against strict punc' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'id'\", details: ErrorDetails { path: \"/id\" } }]"
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
test test_puncs_6 ... FAILED

failures:

failures:
    test_puncs_6

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 338 filtered out; finished in 0.01s

error: test failed, to rerun pass `--test tests`
puncs_6_full.txt (103 lines deleted)
@@ -1,103 +0,0 @@
Blocking waiting for file lock on artifact directory
Compiling jspg v0.1.0 (/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg)
Finished `test` profile [unoptimized + debuginfo] target(s) in 7.63s
Running tests/tests.rs (target/debug/deps/tests-0f6b1e496850f0af)

running 1 test
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name. Keys: {}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /nested_or_super_job/job_id. Keys: {"name"}
DEBUG: validate_refs merging res from job. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'manager_id' at /nested_or_super_job/manager_id. Keys: {"name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/type. Keys: {"name", "job_id", "manager_id"}
DEBUG: validate_refs merging res from super_job. Keys: {"job_id", "manager_id", "type", "name"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {"manager_id", "type", "name", "job_id"}
DEBUG: validate_object inserted 'nested_or_super_job' at /nested_or_super_job. Keys: {"name", "job_id", "manager_id", "type"}
DEBUG: check_strictness at /root_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /root_job/name. Keys: {}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name"}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /root_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /root_job/job_id. Keys: {"name"}
DEBUG: check_strictness at /root_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /root_job/type. Keys: {"name", "job_id"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name", "job_id", "type"}
DEBUG: validate_refs merging res from job. Keys: {"name", "job_id", "type"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"job_id", "type", "name"}
DEBUG: validate_object inserted 'root_job' at /root_job. Keys: {"name", "job_id", "nested_or_super_job", "manager_id", "type"}
DEBUG: check_strictness at . Extensible: false. Keys: {"root_job", "name", "job_id", "nested_or_super_job", "manager_id", "type"}
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name. Keys: {}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /nested_or_super_job/job_id. Keys: {"name"}
DEBUG: validate_refs merging res from job. Keys: {"name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'manager_id' at /nested_or_super_job/manager_id. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/type. Keys: {"job_id", "manager_id", "name"}
DEBUG: validate_refs merging res from super_job. Keys: {"type", "job_id", "manager_id", "name"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {"job_id", "manager_id", "type", "name"}
DEBUG: validate_object inserted 'nested_or_super_job' at /nested_or_super_job. Keys: {"name", "manager_id", "job_id", "type"}
DEBUG: check_strictness at /root_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /root_job/name. Keys: {}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name"}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /root_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /root_job/job_id. Keys: {"name"}
DEBUG: check_strictness at /root_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /root_job/type. Keys: {"name", "job_id"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name", "type", "job_id"}
DEBUG: validate_refs merging res from job. Keys: {"name", "type", "job_id"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"type", "job_id", "name"}
DEBUG: validate_object inserted 'root_job' at /root_job. Keys: {"name", "manager_id", "job_id", "type", "nested_or_super_job"}
DEBUG: check_strictness at . Extensible: false. Keys: {"name", "root_job", "manager_id", "job_id", "type", "nested_or_super_job"}
DEBUG: validate_refs merging res from entity. Keys: {}
DEBUG: validate_refs merging res from job. Keys: {}
DEBUG: validate_refs merging res from super_job. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/my_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/my_job/name. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"name"}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/my_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/my_job/type. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"name", "type"}
DEBUG: validate_refs merging res from job. Keys: {"name", "type"}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"type", "name"}
DEBUG: validate_object inserted 'my_job' at /nested_or_super_job/my_job. Keys: {"type", "name"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name. Keys: {}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /nested_or_super_job/job_id. Keys: {"name"}
DEBUG: validate_refs merging res from job. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'manager_id' at /nested_or_super_job/manager_id. Keys: {"name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/type. Keys: {"name", "job_id", "manager_id"}
DEBUG: validate_refs merging res from super_job. Keys: {"job_id", "manager_id", "name", "type"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {}
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: check_strictness at . Extensible: false. Keys: {}

thread 'test_puncs_6' (15113120) panicked at tests/tests.rs:150:44:
called `Result::unwrap()` on an `Err` value: "[complex punc type matching with oneOf and nested refs] Test 'valid person against organization punc (polymorphic)' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'first_name'\", details: ErrorDetails { path: \"/first_name\" } }]\n[complex punc type matching with oneOf and nested refs] Test 'valid organization against organization punc (polymorphic)' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'id'\", details: ErrorDetails { path: \"/id\" } }]\n[complex punc type matching with oneOf and nested refs] Test 'valid organization against strict punc' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'id'\", details: ErrorDetails { path: \"/id\" } }]"
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
test test_puncs_6 ... FAILED

failures:

failures:
    test_puncs_6

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 338 filtered out; finished in 0.01s

error: test failed, to rerun pass `--test tests`
@@ -1,57 +0,0 @@
Finished `test` profile [unoptimized + debuginfo] target(s) in 0.47s
Running tests/tests.rs (target/debug/deps/tests-0f6b1e496850f0af)

running 1 test
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_refs merging res from job. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_refs merging res from super_job. Keys: {"job_id", "type", "manager_id", "name"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {"type", "name", "manager_id", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_refs merging res from job. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_refs merging res from super_job. Keys: {"manager_id", "type", "job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {"manager_id", "job_id", "type", "name"}
DEBUG: validate_refs merging res from entity. Keys: {}
DEBUG: validate_refs merging res from job. Keys: {}
DEBUG: validate_refs merging res from super_job. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/my_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/my_job/name
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"name"}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/my_job/type. Extensible: false. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"type", "name"}
DEBUG: validate_refs merging res from job. Keys: {"type", "name"}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"name", "type"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_refs merging res from job. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_refs merging res from super_job. Keys: {"manager_id", "type", "name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {}

thread 'test_puncs_6' (15109801) panicked at tests/tests.rs:150:44:
called `Result::unwrap()` on an `Err` value: "[complex punc type matching with oneOf and nested refs] Test 'valid person against organization punc (polymorphic)' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'first_name'\", details: ErrorDetails { path: \"/first_name\" } }]\n[complex punc type matching with oneOf and nested refs] Test 'valid organization against organization punc (polymorphic)' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'id'\", details: ErrorDetails { path: \"/id\" } }]\n[complex punc type matching with oneOf and nested refs] Test 'valid organization against strict punc' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'id'\", details: ErrorDetails { path: \"/id\" } }]"
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
test test_puncs_6 ... FAILED

failures:

failures:
    test_puncs_6

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 338 filtered out; finished in 0.00s

error: test failed, to rerun pass `--test tests`
@@ -1,106 +0,0 @@
Finished `test` profile [unoptimized + debuginfo] target(s) in 0.41s
Running tests/tests.rs (target/debug/deps/tests-0f6b1e496850f0af)

running 1 test
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name. Keys: {}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /nested_or_super_job/job_id. Keys: {"name"}
DEBUG: validate_refs merging res from job. Keys: {"name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'manager_id' at /nested_or_super_job/manager_id. Keys: {"name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/type. Keys: {"name", "job_id", "manager_id"}
DEBUG: validate_refs merging res from super_job. Keys: {"name", "job_id", "manager_id", "type"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {"name", "manager_id", "job_id", "type"}
DEBUG: validate_object inserted 'nested_or_super_job' at /nested_or_super_job. Keys: {"job_id", "manager_id", "type", "name"}
DEBUG: check_strictness at /root_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /root_job/name. Keys: {}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name"}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /root_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /root_job/job_id. Keys: {"name"}
DEBUG: check_strictness at /root_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /root_job/type. Keys: {"name", "job_id"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name", "job_id", "type"}
DEBUG: validate_refs merging res from job. Keys: {"name", "job_id", "type"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"job_id", "type", "name"}
DEBUG: validate_object inserted 'root_job' at /root_job. Keys: {"job_id", "nested_or_super_job", "manager_id", "type", "name"}
DEBUG: check_strictness at . Extensible: false. Keys: {"job_id", "nested_or_super_job", "manager_id", "type", "name", "root_job"}
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name. Keys: {}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /nested_or_super_job/job_id. Keys: {"name"}
DEBUG: validate_refs merging res from job. Keys: {"name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'manager_id' at /nested_or_super_job/manager_id. Keys: {"name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/type. Keys: {"manager_id", "name", "job_id"}
DEBUG: validate_refs merging res from super_job. Keys: {"manager_id", "name", "job_id", "type"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {"job_id", "manager_id", "type", "name"}
DEBUG: validate_object inserted 'nested_or_super_job' at /nested_or_super_job. Keys: {"type", "manager_id", "job_id", "name"}
DEBUG: check_strictness at /root_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /root_job/name. Keys: {}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name"}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /root_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /root_job/job_id. Keys: {"name"}
DEBUG: check_strictness at /root_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /root_job/type. Keys: {"name", "job_id"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"name", "type", "job_id"}
DEBUG: validate_refs merging res from job. Keys: {"name", "type", "job_id"}
DEBUG: check_strictness at /root_job. Extensible: false. Keys: {"type", "name", "job_id"}
DEBUG: validate_object inserted 'root_job' at /root_job. Keys: {"type", "manager_id", "job_id", "name", "nested_or_super_job"}
DEBUG: check_strictness at . Extensible: false. Keys: {"type", "root_job", "manager_id", "job_id", "name", "nested_or_super_job"}
DEBUG: validate_refs merging res from entity. Keys: {}
DEBUG: validate_refs merging res from job. Keys: {}
DEBUG: validate_refs merging res from super_job. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/my_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/my_job/name. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"name"}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/my_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/my_job/type. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"type", "name"}
DEBUG: validate_refs merging res from job. Keys: {"type", "name"}
DEBUG: check_strictness at /nested_or_super_job/my_job. Extensible: false. Keys: {"name", "type"}
DEBUG: validate_object inserted 'my_job' at /nested_or_super_job/my_job. Keys: {"name", "type"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {}
DEBUG: check_strictness at /nested_or_super_job/name. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'name' at /nested_or_super_job/name. Keys: {}
DEBUG: validate_refs merging res from entity. Keys: {"name"}
DEBUG: check_strictness at /nested_or_super_job/job_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'job_id' at /nested_or_super_job/job_id. Keys: {"name"}
DEBUG: validate_refs merging res from job. Keys: {"name", "job_id"}
DEBUG: check_strictness at /nested_or_super_job/manager_id. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'manager_id' at /nested_or_super_job/manager_id. Keys: {"job_id", "name"}
DEBUG: check_strictness at /nested_or_super_job/type. Extensible: false. Keys: {}
DEBUG: validate_object inserted 'type' at /nested_or_super_job/type. Keys: {"manager_id", "job_id", "name"}
DEBUG: validate_refs merging res from super_job. Keys: {"job_id", "manager_id", "type", "name"}
DEBUG: check_strictness at /nested_or_super_job. Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("polymorphic_org_punc.request") ref=Some("organization.family")
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("polymorphic_org_punc.request") ref=Some("organization.family")
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("polymorphic_org_punc.request") ref=Some("organization.family")
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("strict_org_punc.request") ref=Some("organization")
DEBUG: check_strictness at . Extensible: false. Keys: {}
DEBUG: VALIDATE ROOT: id=Some("strict_org_punc.request") ref=Some("organization")
|
||||
DEBUG: check_strictness at . Extensible: false. Keys: {}
|
||||
|
||||
thread 'test_puncs_6' (15121282) panicked at tests/tests.rs:150:44:
|
||||
called `Result::unwrap()` on an `Err` value: "[complex punc type matching with oneOf and nested refs] Test 'valid person against organization punc (polymorphic)' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'first_name'\", details: ErrorDetails { path: \"/first_name\" } }]\n[complex punc type matching with oneOf and nested refs] Test 'valid organization against organization punc (polymorphic)' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'id'\", details: ErrorDetails { path: \"/id\" } }]\n[complex punc type matching with oneOf and nested refs] Test 'valid organization against strict punc' failed. Expected: true, Got: true. Errors: [Error { punc: None, code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'id'\", details: ErrorDetails { path: \"/id\" } }]"
|
||||
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
|
||||
test test_puncs_6 ... FAILED
|
||||
|
||||
failures:
|
||||
|
||||
failures:
|
||||
test_puncs_6
|
||||
|
||||
test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 338 filtered out; finished in 0.01s
|
||||
|
||||
error: test failed, to rerun pass `--test tests`
|
||||
386  src/compiler.rs
@@ -1,386 +0,0 @@
use crate::schema::Schema;
use regex::Regex;
use serde_json::Value;
// use std::collections::HashMap;
use std::error::Error;
use std::sync::Arc;

/// Represents a compiled format validator
#[derive(Debug, Clone)]
pub enum CompiledFormat {
    /// A simple function pointer validator
    Func(fn(&Value) -> Result<(), Box<dyn Error + Send + Sync>>),
    /// A regex-based validator
    Regex(Regex),
}

/// A wrapper for compiled regex patterns
#[derive(Debug, Clone)]
pub struct CompiledRegex(pub Regex);

/// The Compiler is responsible for pre-calculating high-cost schema operations
pub struct Compiler;

impl Compiler {
    /// Internal: Compiles formats and regexes in-place
    fn compile_formats_and_regexes(schema: &mut Schema) {
        // 1. Compile Format
        if let Some(format_str) = &schema.format {
            if let Some(fmt) = crate::formats::FORMATS.get(format_str.as_str()) {
                schema.compiled_format = Some(CompiledFormat::Func(fmt.func));
            }
        }

        // 2. Compile Pattern (regex)
        if let Some(pattern_str) = &schema.pattern {
            if let Ok(re) = Regex::new(pattern_str) {
                schema.compiled_pattern = Some(CompiledRegex(re));
            }
        }

        // 2.5 Compile Pattern Properties
        if let Some(pp) = &schema.pattern_properties {
            let mut compiled_pp = Vec::new();
            for (pattern, sub_schema) in pp {
                if let Ok(re) = Regex::new(pattern) {
                    compiled_pp.push((CompiledRegex(re), sub_schema.clone()));
                } else {
                    eprintln!(
                        "Invalid patternProperty regex in schema (compile time): {}",
                        pattern
                    );
                }
            }
            if !compiled_pp.is_empty() {
                schema.compiled_pattern_properties = Some(compiled_pp);
            }
        }

        // 3. Recurse
        Self::compile_recursive(schema);
    }

    fn normalize_dependencies(schema: &mut Schema) {
        if let Some(deps) = schema.dependencies.take() {
            for (key, dep) in deps {
                match dep {
                    crate::schema::Dependency::Props(props) => {
                        schema
                            .dependent_required
                            .get_or_insert_with(std::collections::BTreeMap::new)
                            .insert(key, props);
                    }
                    crate::schema::Dependency::Schema(sub_schema) => {
                        schema
                            .dependent_schemas
                            .get_or_insert_with(std::collections::BTreeMap::new)
                            .insert(key, sub_schema);
                    }
                }
            }
        }
    }

    fn compile_recursive(schema: &mut Schema) {
        Self::normalize_dependencies(schema);

        // Compile self
        if let Some(format_str) = &schema.format {
            if let Some(fmt) = crate::formats::FORMATS.get(format_str.as_str()) {
                schema.compiled_format = Some(CompiledFormat::Func(fmt.func));
            }
        }
        if let Some(pattern_str) = &schema.pattern {
            if let Ok(re) = Regex::new(pattern_str) {
                schema.compiled_pattern = Some(CompiledRegex(re));
            }
        }

        // Recurse

        if let Some(defs) = &mut schema.definitions {
            for s in defs.values_mut() {
                Self::compile_recursive(Arc::make_mut(s));
            }
        }
        if let Some(defs) = &mut schema.defs {
            for s in defs.values_mut() {
                Self::compile_recursive(Arc::make_mut(s));
            }
        }
        if let Some(props) = &mut schema.properties {
            for s in props.values_mut() {
                Self::compile_recursive(Arc::make_mut(s));
            }
        }

        // ... Recurse logic ...
        if let Some(items) = &mut schema.items {
            Self::compile_recursive(Arc::make_mut(items));
        }
        if let Some(prefix_items) = &mut schema.prefix_items {
            for s in prefix_items {
                Self::compile_recursive(Arc::make_mut(s));
            }
        }
        if let Some(not) = &mut schema.not {
            Self::compile_recursive(Arc::make_mut(not));
        }
        if let Some(all_of) = &mut schema.all_of {
            for s in all_of {
                Self::compile_recursive(Arc::make_mut(s));
            }
        }
        if let Some(any_of) = &mut schema.any_of {
            for s in any_of {
                Self::compile_recursive(Arc::make_mut(s));
            }
        }
        if let Some(one_of) = &mut schema.one_of {
            for s in one_of {
                Self::compile_recursive(Arc::make_mut(s));
            }
        }
        if let Some(s) = &mut schema.if_ {
            Self::compile_recursive(Arc::make_mut(s));
        }
        if let Some(s) = &mut schema.then_ {
            Self::compile_recursive(Arc::make_mut(s));
        }
        if let Some(s) = &mut schema.else_ {
            Self::compile_recursive(Arc::make_mut(s));
        }

        if let Some(ds) = &mut schema.dependent_schemas {
            for s in ds.values_mut() {
                Self::compile_recursive(Arc::make_mut(s));
            }
        }
        if let Some(pn) = &mut schema.property_names {
            Self::compile_recursive(Arc::make_mut(pn));
        }
    }

    /// Recursively traverses the schema tree to build the local registry index.
    fn compile_index(
        schema: &Arc<Schema>,
        registry: &mut crate::registry::Registry,
        parent_base: Option<String>,
        pointer: json_pointer::JsonPointer<String, Vec<String>>,
    ) {
        // 1. Index using Parent Base (Path from Parent)
        if let Some(base) = &parent_base {
            // We use the pointer's string representation (e.g., "/definitions/foo")
            // and append it to the base.
            let fragment = pointer.to_string();
            let ptr_uri = if fragment.is_empty() {
                base.clone()
            } else {
                format!("{}#{}", base, fragment)
            };
            registry.insert(ptr_uri, schema.clone());
        }

        // 2. Determine Current Scope... (unchanged logic)
        let mut current_base = parent_base.clone();
        let mut child_pointer = pointer.clone();

        if let Some(id) = &schema.obj.id {
            let mut new_base = None;
            if let Ok(_) = url::Url::parse(id) {
                new_base = Some(id.clone());
            } else if let Some(base) = &current_base {
                if let Ok(base_url) = url::Url::parse(base) {
                    if let Ok(joined) = base_url.join(id) {
                        new_base = Some(joined.to_string());
                    }
                }
            } else {
                new_base = Some(id.clone());
            }

            if let Some(base) = new_base {
                // println!("DEBUG: Compiling index for path: {}", base); // Added println
                registry.insert(base.clone(), schema.clone());
                current_base = Some(base);
                child_pointer = json_pointer::JsonPointer::new(vec![]); // Reset
            }
        }

        // 3. Index by Anchor
        if let Some(anchor) = &schema.obj.anchor {
            if let Some(base) = &current_base {
                let anchor_uri = format!("{}#{}", base, anchor);
                registry.insert(anchor_uri, schema.clone());
            }
        }
        // Index by Dynamic Anchor
        if let Some(d_anchor) = &schema.obj.dynamic_anchor {
            if let Some(base) = &current_base {
                let anchor_uri = format!("{}#{}", base, d_anchor);
                registry.insert(anchor_uri, schema.clone());
            }
        }

        // 4. Recurse (unchanged logic structure, just passing registry)
        if let Some(defs) = schema.defs.as_ref().or(schema.definitions.as_ref()) {
            let segment = if schema.defs.is_some() {
                "$defs"
            } else {
                "definitions"
            };
            for (key, sub_schema) in defs {
                let mut sub = child_pointer.clone();
                sub.push(segment.to_string());
                let decoded_key = percent_encoding::percent_decode_str(key).decode_utf8_lossy();
                sub.push(decoded_key.to_string());
                Self::compile_index(sub_schema, registry, current_base.clone(), sub);
            }
        }

        if let Some(props) = &schema.properties {
            for (key, sub_schema) in props {
                let mut sub = child_pointer.clone();
                sub.push("properties".to_string());
                sub.push(key.to_string());
                Self::compile_index(sub_schema, registry, current_base.clone(), sub);
            }
        }

        if let Some(items) = &schema.items {
            let mut sub = child_pointer.clone();
            sub.push("items".to_string());
            Self::compile_index(items, registry, current_base.clone(), sub);
        }

        if let Some(prefix_items) = &schema.prefix_items {
            for (i, sub_schema) in prefix_items.iter().enumerate() {
                let mut sub = child_pointer.clone();
                sub.push("prefixItems".to_string());
                sub.push(i.to_string());
                Self::compile_index(sub_schema, registry, current_base.clone(), sub);
            }
        }

        if let Some(all_of) = &schema.all_of {
            for (i, sub_schema) in all_of.iter().enumerate() {
                let mut sub = child_pointer.clone();
                sub.push("allOf".to_string());
                sub.push(i.to_string());
                Self::compile_index(sub_schema, registry, current_base.clone(), sub);
            }
        }
        if let Some(any_of) = &schema.any_of {
            for (i, sub_schema) in any_of.iter().enumerate() {
                let mut sub = child_pointer.clone();
                sub.push("anyOf".to_string());
                sub.push(i.to_string());
                Self::compile_index(sub_schema, registry, current_base.clone(), sub);
            }
        }
        if let Some(one_of) = &schema.one_of {
            for (i, sub_schema) in one_of.iter().enumerate() {
                let mut sub = child_pointer.clone();
                sub.push("oneOf".to_string());
                sub.push(i.to_string());
                Self::compile_index(sub_schema, registry, current_base.clone(), sub);
            }
        }

        if let Some(not) = &schema.not {
            let mut sub = child_pointer.clone();
            sub.push("not".to_string());
            Self::compile_index(not, registry, current_base.clone(), sub);
        }
        if let Some(if_) = &schema.if_ {
            let mut sub = child_pointer.clone();
            sub.push("if".to_string());
            Self::compile_index(if_, registry, current_base.clone(), sub);
        }
        if let Some(then_) = &schema.then_ {
            let mut sub = child_pointer.clone();
            sub.push("then".to_string());
            Self::compile_index(then_, registry, current_base.clone(), sub);
        }
        if let Some(else_) = &schema.else_ {
            let mut sub = child_pointer.clone();
            sub.push("else".to_string());
            Self::compile_index(else_, registry, current_base.clone(), sub);
        }
        if let Some(deps) = &schema.dependent_schemas {
            for (key, sub_schema) in deps {
                let mut sub = child_pointer.clone();
                sub.push("dependentSchemas".to_string());
                sub.push(key.to_string());
                Self::compile_index(sub_schema, registry, current_base.clone(), sub);
            }
        }
        if let Some(pp) = &schema.pattern_properties {
            for (key, sub_schema) in pp {
                let mut sub = child_pointer.clone();
                sub.push("patternProperties".to_string());
                sub.push(key.to_string());
                Self::compile_index(sub_schema, registry, current_base.clone(), sub);
            }
        }
        if let Some(contains) = &schema.contains {
            let mut sub = child_pointer.clone();
            sub.push("contains".to_string());
            Self::compile_index(contains, registry, current_base.clone(), sub);
        }
        if let Some(property_names) = &schema.property_names {
            let mut sub = child_pointer.clone();
            sub.push("propertyNames".to_string());
            Self::compile_index(property_names, registry, current_base.clone(), sub);
        }
    }

    pub fn compile(mut root_schema: Schema, root_id: Option<String>) -> Arc<Schema> {
        // 1. Compile in-place (formats/regexes/normalization)
        Self::compile_formats_and_regexes(&mut root_schema);

        // Apply root_id override if schema ID is missing
        if let Some(rid) = &root_id {
            if root_schema.obj.id.is_none() {
                root_schema.obj.id = Some(rid.clone());
            }
        }

        // 2. Build ID/Pointer Index
        let mut registry = crate::registry::Registry::new();

        // We need a temporary Arc to satisfy compile_index recursion
        // But we are modifying root_schema.
        // This is tricky. compile_index takes &Arc<Schema>.
        // We should build the index first, THEN attach it.

        let root = Arc::new(root_schema);

        // Default base_uri to ""
        let base_uri = root_id
            .clone()
            .or_else(|| root.obj.id.clone())
            .or(Some("".to_string()));

        Self::compile_index(
            &root,
            &mut registry,
            base_uri,
            json_pointer::JsonPointer::new(vec![]),
        );

        // Also ensure root id is indexed if present
        if let Some(rid) = root_id {
            registry.insert(rid, root.clone());
        }

        // Now we need to attach this registry to the root schema.
        // Since root is an Arc, we might need to recreate it if we can't mutate.
        // Schema struct modifications require &mut.

        let mut final_schema = Arc::try_unwrap(root).unwrap_or_else(|arc| (*arc).clone());
        final_schema.obj.compiled_schemas = Some(Arc::new(registry));

        Arc::new(final_schema)
    }
}
12  src/database/enum.rs  Normal file
@@ -0,0 +1,12 @@
use crate::database::schema::Schema;
use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Serialize, Deserialize, Default)]
#[serde(default)]
pub struct Enum {
    pub name: String,
    pub module: String,
    pub source: String,
    pub values: Vec<String>,
    pub schemas: Vec<Schema>,
}
880  src/database/formats.rs  Normal file
@@ -0,0 +1,880 @@
use std::{
|
||||
collections::HashMap,
|
||||
error::Error,
|
||||
net::{Ipv4Addr, Ipv6Addr},
|
||||
};
|
||||
|
||||
use lazy_static::lazy_static;
|
||||
use percent_encoding::percent_decode_str;
|
||||
use serde_json::Value;
|
||||
use url::Url;
|
||||
|
||||
// use crate::ecma; // Assuming ecma is not yet available, stubbing regex for now
|
||||
|
||||
/// Defines format for `format` keyword.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Format {
|
||||
/// Name of the format
|
||||
pub name: &'static str,
|
||||
|
||||
/// validates given value.
|
||||
pub func: fn(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>>, // Ensure thread safety if needed
|
||||
}
|
||||
|
||||
lazy_static! {
|
||||
pub(crate) static ref FORMATS: HashMap<&'static str, Format> = {
|
||||
let mut m = HashMap::<&'static str, Format>::new();
|
||||
// Helper to register formats
|
||||
let mut register = |name, func| m.insert(name, Format { name, func });
|
||||
|
||||
// register("regex", validate_regex); // Stubbed
|
||||
register("ipv4", validate_ipv4);
|
||||
register("ipv6", validate_ipv6);
|
||||
register("hostname", validate_hostname);
|
||||
register("idn-hostname", validate_idn_hostname);
|
||||
register("email", validate_email);
|
||||
register("idn-email", validate_idn_email);
|
||||
register("date", validate_date);
|
||||
register("time", validate_time);
|
||||
register("date-time", validate_date_time);
|
||||
register("duration", validate_duration);
|
||||
register("period", validate_period);
|
||||
register("json-pointer", validate_json_pointer);
|
||||
register("relative-json-pointer", validate_relative_json_pointer);
|
||||
register("uuid", validate_uuid);
|
||||
register("uri", validate_uri);
|
||||
register("iri", validate_iri);
|
||||
register("uri-reference", validate_uri_reference);
|
||||
register("iri-reference", validate_iri_reference);
|
||||
register("uri-template", validate_uri_template);
|
||||
m
|
||||
};
|
||||
}
|
||||
|
||||
/*
|
||||
fn validate_regex(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
// ecma::convert(s).map(|_| ())
|
||||
Ok(())
|
||||
}
|
||||
*/
|
||||
|
||||
fn validate_ipv4(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
s.parse::<Ipv4Addr>()?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_ipv6(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
s.parse::<Ipv6Addr>()?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_date(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_date(s)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn matches_char(s: &str, index: usize, ch: char) -> bool {
|
||||
s.is_char_boundary(index) && s[index..].starts_with(ch)
|
||||
}
|
||||
|
||||
// see https://datatracker.ietf.org/doc/html/rfc3339#section-5.6
|
||||
fn check_date(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
// yyyy-mm-dd
|
||||
if s.len() != 10 {
|
||||
Err("must be 10 characters long")?;
|
||||
}
|
||||
if !matches_char(s, 4, '-') || !matches_char(s, 7, '-') {
|
||||
Err("missing hyphen in correct place")?;
|
||||
}
|
||||
|
||||
let mut ymd = s.splitn(3, '-').filter_map(|t| t.parse::<usize>().ok());
|
||||
let (Some(y), Some(m), Some(d)) = (ymd.next(), ymd.next(), ymd.next()) else {
|
||||
Err("non-positive year/month/day")?
|
||||
};
|
||||
|
||||
if !matches!(m, 1..=12) {
|
||||
Err(format!("{m} months in year"))?;
|
||||
}
|
||||
if !matches!(d, 1..=31) {
|
||||
Err(format!("{d} days in month"))?;
|
||||
}
|
||||
|
||||
match m {
|
||||
2 => {
|
||||
let mut feb_days = 28;
|
||||
if y % 4 == 0 && (y % 100 != 0 || y % 400 == 0) {
|
||||
feb_days += 1; // leap year
|
||||
};
|
||||
if d > feb_days {
|
||||
Err(format!("february has {feb_days} days only"))?;
|
||||
}
|
||||
}
|
||||
4 | 6 | 9 | 11 => {
|
||||
if d > 30 {
|
||||
Err("month has 30 days only")?;
|
||||
}
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_time(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_time(s)
|
||||
}
|
||||
|
||||
fn check_time(mut str: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
// min: hh:mm:ssZ
|
||||
if str.len() < 9 {
|
||||
Err("less than 9 characters long")?
|
||||
}
|
||||
if !matches_char(str, 2, ':') || !matches_char(str, 5, ':') {
|
||||
Err("missing colon in correct place")?
|
||||
}
|
||||
|
||||
// parse hh:mm:ss
|
||||
if !str.is_char_boundary(8) {
|
||||
Err("contains non-ascii char")?
|
||||
}
|
||||
let mut hms = (str[..8])
|
||||
.splitn(3, ':')
|
||||
.filter_map(|t| t.parse::<usize>().ok());
|
||||
let (Some(mut h), Some(mut m), Some(s)) = (hms.next(), hms.next(), hms.next()) else {
|
||||
Err("non-positive hour/min/sec")?
|
||||
};
|
||||
if h > 23 || m > 59 || s > 60 {
|
||||
Err("hour/min/sec out of range")?
|
||||
}
|
||||
str = &str[8..];
|
||||
|
||||
// parse sec-frac if present
|
||||
if let Some(rem) = str.strip_prefix('.') {
|
||||
let n_digits = rem.chars().take_while(char::is_ascii_digit).count();
|
||||
if n_digits == 0 {
|
||||
Err("no digits in second fraction")?;
|
||||
}
|
||||
str = &rem[n_digits..];
|
||||
}
|
||||
|
||||
if str != "z" && str != "Z" {
|
||||
// parse time-numoffset
|
||||
if str.len() != 6 {
|
||||
Err("offset must be 6 characters long")?;
|
||||
}
|
||||
let sign: isize = match str.chars().next() {
|
||||
Some('+') => -1,
|
||||
Some('-') => 1,
|
||||
_ => return Err("offset must begin with plus/minus")?,
|
||||
};
|
||||
str = &str[1..];
|
||||
if !matches_char(str, 2, ':') {
|
||||
Err("missing colon in offset at correct place")?
|
||||
}
|
||||
|
||||
let mut zhm = str.splitn(2, ':').filter_map(|t| t.parse::<usize>().ok());
|
||||
let (Some(zh), Some(zm)) = (zhm.next(), zhm.next()) else {
|
||||
Err("non-positive hour/min in offset")?
|
||||
};
|
||||
if zh > 23 || zm > 59 {
|
||||
Err("hour/min in offset out of range")?
|
||||
}
|
||||
|
||||
// apply timezone
|
||||
let mut hm = (h * 60 + m) as isize + sign * (zh * 60 + zm) as isize;
|
||||
if hm < 0 {
|
||||
hm += 24 * 60;
|
||||
debug_assert!(hm >= 0);
|
||||
}
|
||||
let hm = hm as usize;
|
||||
(h, m) = (hm / 60, hm % 60);
|
||||
}
|
||||
|
||||
// check leap second
|
||||
if !(s < 60 || (h == 23 && m == 59)) {
|
||||
Err("invalid leap second")?
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_date_time(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_date_time(s)
|
||||
}
|
||||
|
||||
fn check_date_time(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
// min: yyyy-mm-ddThh:mm:ssZ
|
||||
if s.len() < 20 {
|
||||
Err("less than 20 characters long")?;
|
||||
}
|
||||
if !s.is_char_boundary(10) || !s[10..].starts_with(['t', 'T']) {
|
||||
Err("11th character must be t or T")?;
|
||||
}
|
||||
if let Err(e) = check_date(&s[..10]) {
|
||||
Err(format!("invalid date element: {e}"))?;
|
||||
}
|
||||
if let Err(e) = check_time(&s[11..]) {
|
||||
Err(format!("invalid time element: {e}"))?;
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_duration(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_duration(s)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
// see https://datatracker.ietf.org/doc/html/rfc3339#appendix-A
|
||||
fn check_duration(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
// must start with 'P'
|
||||
let Some(s) = s.strip_prefix('P') else {
|
||||
Err("must start with P")?
|
||||
};
|
||||
if s.is_empty() {
|
||||
Err("nothing after P")?
|
||||
}
|
||||
|
||||
// dur-week
|
||||
if let Some(s) = s.strip_suffix('W') {
|
||||
if s.is_empty() {
|
||||
Err("no number in week")?
|
||||
}
|
||||
if !s.chars().all(|c| c.is_ascii_digit()) {
|
||||
Err("invalid week")?
|
||||
}
|
||||
return Ok(());
|
||||
}
|
||||
|
||||
static UNITS: [&str; 2] = ["YMD", "HMS"];
|
||||
for (i, s) in s.split('T').enumerate() {
|
||||
let mut s = s;
|
||||
if i != 0 && s.is_empty() {
|
||||
Err("no time elements")?
|
||||
}
|
||||
let Some(mut units) = UNITS.get(i).cloned() else {
|
||||
Err("more than one T")?
|
||||
};
|
||||
while !s.is_empty() {
|
||||
let digit_count = s.chars().take_while(char::is_ascii_digit).count();
|
||||
if digit_count == 0 {
|
||||
Err("missing number")?
|
||||
}
|
||||
s = &s[digit_count..];
|
||||
let Some(unit) = s.chars().next() else {
|
||||
Err("missing unit")?
|
||||
};
|
||||
let Some(j) = units.find(unit) else {
|
||||
if UNITS[i].contains(unit) {
|
||||
Err(format!("unit {unit} out of order"))?
|
||||
}
|
||||
Err(format!("invalid unit {unit}"))?
|
||||
};
|
||||
units = &units[j + 1..];
|
||||
s = &s[1..];
|
||||
}
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
// see https://datatracker.ietf.org/doc/html/rfc3339#appendix-A
|
||||
fn validate_period(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
|
||||
let Some(slash) = s.find('/') else {
|
||||
Err("missing slash")?
|
||||
};
|
||||
|
||||
let (start, end) = (&s[..slash], &s[slash + 1..]);
|
||||
if start.starts_with('P') {
|
||||
if let Err(e) = check_duration(start) {
|
||||
Err(format!("invalid start duration: {e}"))?
|
||||
}
|
||||
if let Err(e) = check_date_time(end) {
|
||||
Err(format!("invalid end date-time: {e}"))?
|
||||
}
|
||||
} else {
|
||||
if let Err(e) = check_date_time(start) {
|
||||
Err(format!("invalid start date-time: {e}"))?
|
||||
}
|
||||
if end.starts_with('P') {
|
||||
if let Err(e) = check_duration(end) {
|
||||
Err(format!("invalid end duration: {e}"))?;
|
||||
}
|
||||
} else if let Err(e) = check_date_time(end) {
|
||||
Err(format!("invalid end date-time: {e}"))?;
|
||||
}
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_hostname(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_hostname(s)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
// see https://en.wikipedia.org/wiki/Hostname#Restrictions_on_valid_host_names
|
||||
fn check_hostname(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
// entire hostname (including the delimiting dots but not a trailing dot) has a maximum of 253 ASCII characters
|
||||
|
||||
if s.len() > 253 {
|
||||
Err("more than 253 characters long")?
|
||||
}
|
||||
|
||||
// Hostnames are composed of series of labels concatenated with dots, as are all domain names
|
||||
for label in s.split('.') {
|
||||
// Each label must be from 1 to 63 characters long
|
||||
if !matches!(label.len(), 1..=63) {
|
||||
Err("label must be 1 to 63 characters long")?;
|
||||
}
|
||||
|
||||
// labels must not start or end with a hyphen
|
||||
if label.starts_with('-') {
|
||||
Err("label starts with hyphen")?;
|
||||
}
|
||||
|
||||
if label.ends_with('-') {
|
||||
Err("label ends with hyphen")?;
|
||||
}
|
||||
|
||||
// labels may contain only the ASCII letters 'a' through 'z' (in a case-insensitive manner),
|
||||
// the digits '0' through '9', and the hyphen ('-')
|
||||
if let Some(ch) = label
|
||||
.chars()
|
||||
.find(|c| !matches!(c, 'a'..='z' | 'A'..='Z' | '0'..='9' | '-'))
|
||||
{
|
||||
Err(format!("invalid character {ch:?}"))?;
|
||||
}
|
||||
|
||||
// labels must not contain "--" in 3rd and 4th position unless they start with "xn--"
|
||||
if label.len() >= 4 && &label[2..4] == "--" {
|
||||
if !label.starts_with("xn--") {
|
||||
Err("label has -- in 3rd/4th position but does not start with xn--")?;
|
||||
} else {
|
||||
let (unicode, errors) = idna::domain_to_unicode(label);
|
||||
if errors.is_err() {
|
||||
Err("invalid punycode")?;
|
||||
}
|
||||
check_unicode_idn_constraints(&unicode)
|
||||
.map_err(|e| format!("invalid punycode/IDN: {e}"))?;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_idn_hostname(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_idn_hostname(s)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
static DISALLOWED: [char; 10] = [
|
||||
'\u{0640}', // ARABIC TATWEEL
|
||||
'\u{07FA}', // NKO LAJANYALAN
|
||||
'\u{302E}', // HANGUL SINGLE DOT TONE MARK
|
||||
'\u{302F}', // HANGUL DOUBLE DOT TONE MARK
|
||||
'\u{3031}', // VERTICAL KANA REPEAT MARK
|
||||
'\u{3032}', // VERTICAL KANA REPEAT WITH VOICED SOUND MARK
|
||||
'\u{3033}', // VERTICAL KANA REPEAT MARK UPPER HALF
|
||||
'\u{3034}', // VERTICAL KANA REPEAT WITH VOICED SOUND MARK UPPER HA
|
||||
'\u{3035}', // VERTICAL KANA REPEAT MARK LOWER HALF
|
||||
'\u{303B}', // VERTICAL IDEOGRAPHIC ITERATION MARK
|
||||
];
|
||||
|
||||
fn check_idn_hostname(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let s = idna::domain_to_ascii_strict(s).map_err(|e| format!("idna error: {:?}", e))?;
|
||||
let (unicode, errors) = idna::domain_to_unicode(&s);
|
||||
if let Err(e) = errors {
|
||||
Err(format!("idna decoding error: {:?}", e))?;
|
||||
}
|
||||
check_unicode_idn_constraints(&unicode)?;
|
||||
check_hostname(&s)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn check_unicode_idn_constraints(unicode: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
    // see https://www.rfc-editor.org/rfc/rfc5892#section-2.6
    {
        if unicode.contains(DISALLOWED) {
            Err("contains disallowed character")?;
        }
    }

    // unicode string must not contain "--" in 3rd and 4th position
    // and must not start or end with a '-'
    // see https://www.rfc-editor.org/rfc/rfc5891#section-4.2.3.1
    {
        let count: usize = unicode
            .chars()
            .skip(2)
            .take(2)
            .map(|c| if c == '-' { 1 } else { 0 })
            .sum();
        if count == 2 {
            Err("unicode string must not contain '--' in 3rd and 4th position")?;
        }
    }

    // MIDDLE DOT is allowed between 'l' characters only
    // see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.3
    {
        let middle_dot = '\u{00b7}';
        let mut s = unicode;
        while let Some(i) = s.find(middle_dot) {
            let prefix = &s[..i];
            let suffix = &s[i + middle_dot.len_utf8()..];
            // the character immediately before must be 'l' and the character
            // immediately after must be 'l' (checking `ends_with` on the suffix
            // would accept any later 'l')
            if !prefix.ends_with('l') || !suffix.starts_with('l') {
                Err("MIDDLE DOT is allowed between 'l' characters only")?;
            }
            s = suffix;
        }
    }

    // Greek KERAIA must be followed by Greek character
    // see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.4
    {
        let keraia = '\u{0375}';
        let greek = '\u{0370}'..='\u{03FF}';
        let mut s = unicode;
        while let Some(i) = s.find(keraia) {
            let suffix = &s[i + keraia.len_utf8()..];
            if !suffix.starts_with(|c| greek.contains(&c)) {
                Err("Greek KERAIA must be followed by Greek character")?;
            }
            s = suffix;
        }
    }

    // Hebrew GERESH must be preceded by Hebrew character
    // see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.5
    //
    // Hebrew GERSHAYIM must be preceded by Hebrew character
    // see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.6
    {
        let geresh = '\u{05F3}';
        let gershayim = '\u{05F4}';
        let hebrew = '\u{0590}'..='\u{05FF}';
        for ch in [geresh, gershayim] {
            let mut s = unicode;
            while let Some(i) = s.find(ch) {
                let prefix = &s[..i];
                if !prefix.ends_with(|c| hebrew.contains(&c)) {
                    if ch == geresh {
                        Err("Hebrew GERESH must be preceded by Hebrew character")?;
                    } else {
                        Err("Hebrew GERSHAYIM must be preceded by Hebrew character")?;
                    }
                }
                let suffix = &s[i + ch.len_utf8()..];
                s = suffix;
            }
        }
    }

    // KATAKANA MIDDLE DOT must be with Hiragana, Katakana, or Han
    // see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.7
    {
        let katakana_middle_dot = '\u{30FB}';
        if unicode.contains(katakana_middle_dot) {
            let hiragana = '\u{3040}'..='\u{309F}';
            let katakana = '\u{30A0}'..='\u{30FF}';
            let han = '\u{4E00}'..='\u{9FFF}'; // https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_(Unicode_block): is this range correct??
            if unicode.contains(|c| hiragana.contains(&c))
                || unicode.contains(|c| c != katakana_middle_dot && katakana.contains(&c))
                || unicode.contains(|c| han.contains(&c))
            {
                // ok
            } else {
                Err("KATAKANA MIDDLE DOT must be with Hiragana, Katakana, or Han")?;
            }
        }
    }

    // ARABIC-INDIC DIGITS and Extended Arabic-Indic Digits cannot be mixed
    // see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.8
    // see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.9
    {
        let arabic_indic_digits = '\u{0660}'..='\u{0669}';
        let extended_arabic_indic_digits = '\u{06F0}'..='\u{06F9}';
        if unicode.contains(|c| arabic_indic_digits.contains(&c))
            && unicode.contains(|c| extended_arabic_indic_digits.contains(&c))
        {
            Err("ARABIC-INDIC DIGITS and Extended Arabic-Indic Digits cannot be mixed")?;
        }
    }

    // ZERO WIDTH JOINER must be preceded by Virama
    // see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.2
    {
        let zero_width_joiner = '\u{200D}';
        static VIRAMA: [char; 61] = [
            '\u{094D}', '\u{09CD}', '\u{0A4D}', '\u{0ACD}', '\u{0B4D}', '\u{0BCD}', '\u{0C4D}',
            '\u{0CCD}', '\u{0D3B}', '\u{0D3C}', '\u{0D4D}', '\u{0DCA}', '\u{0E3A}', '\u{0EBA}',
            '\u{0F84}', '\u{1039}', '\u{103A}', '\u{1714}', '\u{1734}', '\u{17D2}', '\u{1A60}',
            '\u{1B44}', '\u{1BAA}', '\u{1BAB}', '\u{1BF2}', '\u{1BF3}', '\u{2D7F}', '\u{A806}',
            '\u{A82C}', '\u{A8C4}', '\u{A953}', '\u{A9C0}', '\u{AAF6}', '\u{ABED}', '\u{10A3F}',
            '\u{11046}', '\u{1107F}', '\u{110B9}', '\u{11133}', '\u{11134}', '\u{111C0}',
            '\u{11235}', '\u{112EA}', '\u{1134D}', '\u{11442}', '\u{114C2}', '\u{115BF}',
            '\u{1163F}', '\u{116B6}', '\u{1172B}', '\u{11839}', '\u{1193D}', '\u{1193E}',
            '\u{119E0}', '\u{11A34}', '\u{11A47}', '\u{11A99}', '\u{11C3F}', '\u{11D44}',
            '\u{11D45}', '\u{11D97}',
        ]; // https://www.compart.com/en/unicode/combining/9
        let mut s = unicode;
        while let Some(i) = s.find(zero_width_joiner) {
            let prefix = &s[..i];
            if !prefix.ends_with(VIRAMA) {
                Err("ZERO WIDTH JOINER must be preceded by Virama")?;
            }
            let suffix = &s[i + zero_width_joiner.len_utf8()..];
            s = suffix;
        }
    }

    Ok(())
}

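As a hedged aside, the RFC 5891 §4.2.3.1 hyphen rule above can be isolated into a tiny standalone predicate (the helper name is ours, not part of the extension):

```rust
// Reject strings whose 3rd and 4th characters are both '-',
// mirroring the `skip(2).take(2)` count in the block above.
fn has_hyphens_in_positions_3_and_4(s: &str) -> bool {
    s.chars().skip(2).take(2).filter(|&c| c == '-').count() == 2
}

fn main() {
    assert!(has_hyphens_in_positions_3_and_4("xn--example"));
    assert!(!has_hyphens_in_positions_3_and_4("a-b-cd"));
    assert!(!has_hyphens_in_positions_3_and_4("ab")); // too short to trigger
}
```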
fn validate_email(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };
    check_email(s)?;
    Ok(())
}

// see https://en.wikipedia.org/wiki/Email_address
fn check_email(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
    // the entire email address must be no more than 254 characters long
    if s.len() > 254 {
        Err("more than 254 characters long")?
    }

    // an email address is generally recognized as having two parts joined with an at-sign
    let Some(at) = s.rfind('@') else {
        Err("missing @")?
    };
    let (local, domain) = (&s[..at], &s[at + 1..]);

    // the local part may be up to 64 characters long
    if local.len() > 64 {
        Err("local part more than 64 characters long")?
    }

    if local.len() > 1 && local.starts_with('"') && local.ends_with('"') {
        // quoted
        let local = &local[1..local.len() - 1];
        if local.contains(['\\', '"']) {
            Err("backslash and quote not allowed within quoted local part")?
        }
    } else {
        // unquoted

        if local.starts_with('.') {
            Err("starts with dot")?
        }
        if local.ends_with('.') {
            Err("ends with dot")?
        }

        // consecutive dots not allowed
        if local.contains("..") {
            Err("consecutive dots")?
        }

        // check allowed chars
        if let Some(ch) = local
            .chars()
            .find(|c| !(c.is_ascii_alphanumeric() || ".!#$%&'*+-/=?^_`{|}~".contains(*c)))
        {
            Err(format!("invalid character {ch:?}"))?
        }
    }

    // the domain, if enclosed in brackets, must match an IP address
    if domain.starts_with('[') && domain.ends_with(']') {
        let s = &domain[1..domain.len() - 1];
        if let Some(s) = s.strip_prefix("IPv6:") {
            if let Err(e) = s.parse::<Ipv6Addr>() {
                Err(format!("invalid ipv6 address: {e}"))?
            }
            return Ok(());
        }
        if let Err(e) = s.parse::<Ipv4Addr>() {
            Err(format!("invalid ipv4 address: {e}"))?
        }
        return Ok(());
    }

    // the domain must match the requirements for a hostname
    if let Err(e) = check_hostname(domain) {
        Err(format!("invalid domain: {e}"))?
    }

    Ok(())
}

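The unquoted local-part rules in `check_email` can be restated as a standalone sketch (hypothetical helper; the real function inlines these checks and also handles quoted local parts and bracketed IP domains):

```rust
// Sketch of the unquoted local-part rules: length, dot placement,
// and the allowed ASCII character set from the function above.
fn local_part_ok(local: &str) -> bool {
    if local.is_empty() || local.len() > 64 {
        return false; // the empty-string guard is extra, not in the original
    }
    if local.starts_with('.') || local.ends_with('.') || local.contains("..") {
        return false;
    }
    local
        .chars()
        .all(|c| c.is_ascii_alphanumeric() || ".!#$%&'*+-/=?^_`{|}~".contains(c))
}

fn main() {
    assert!(local_part_ok("first.last"));
    assert!(!local_part_ok(".leading-dot"));
    assert!(!local_part_ok("double..dot"));
    assert!(!local_part_ok("no spaces"));
}
```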
fn validate_idn_email(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };

    let Some(at) = s.rfind('@') else {
        Err("missing @")?
    };
    let (local, domain) = (&s[..at], &s[at + 1..]);

    let local = idna::domain_to_ascii_strict(local).map_err(|e| format!("idna error: {:?}", e))?;
    let domain = idna::domain_to_ascii_strict(domain).map_err(|e| format!("idna error: {:?}", e))?;
    if let Err(e) = check_idn_hostname(&domain) {
        Err(format!("invalid domain: {e}"))?
    }
    check_email(&format!("{local}@{domain}"))
}

fn validate_json_pointer(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };
    check_json_pointer(s)?;
    Ok(())
}

// see https://www.rfc-editor.org/rfc/rfc6901#section-3
fn check_json_pointer(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
    if s.is_empty() {
        return Ok(());
    }
    if !s.starts_with('/') {
        Err("not starting with slash")?;
    }
    for token in s.split('/').skip(1) {
        let mut chars = token.chars();
        while let Some(ch) = chars.next() {
            if ch == '~' {
                if !matches!(chars.next(), Some('0' | '1')) {
                    Err("~ must be followed by 0 or 1")?;
                }
            } else if !matches!(ch, '\x00'..='\x2E' | '\x30'..='\x7D' | '\x7F'..='\u{10FFFF}') {
                Err("contains disallowed character")?;
            }
        }
    }
    Ok(())
}

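The `~` escape rule is the only subtle part of `check_json_pointer`; a self-contained restatement (hypothetical helper names):

```rust
// RFC 6901: a pointer is "" or starts with '/', and within each
// '/'-separated token a '~' must be followed by '0' or '1'.
fn token_ok(token: &str) -> bool {
    let mut chars = token.chars();
    while let Some(ch) = chars.next() {
        if ch == '~' && !matches!(chars.next(), Some('0' | '1')) {
            return false;
        }
    }
    true
}

fn pointer_ok(s: &str) -> bool {
    s.is_empty() || (s.starts_with('/') && s.split('/').skip(1).all(token_ok))
}

fn main() {
    assert!(pointer_ok(""));           // whole document
    assert!(pointer_ok("/a~1b/m~0n")); // tokens "a/b" and "m~n" after unescaping
    assert!(!pointer_ok("a/b"));       // must start with '/'
    assert!(!pointer_ok("/bad~2"));    // '~' must be followed by '0' or '1'
}
```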
// see https://tools.ietf.org/html/draft-handrews-relative-json-pointer-01#section-3
fn validate_relative_json_pointer(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };

    // must start with a non-negative integer
    let num_digits = s.chars().take_while(char::is_ascii_digit).count();
    if num_digits == 0 {
        Err("must start with non-negative integer")?;
    }
    if num_digits > 1 && s.starts_with('0') {
        Err("starts with zero")?;
    }
    let s = &s[num_digits..];

    // followed by either a json-pointer or '#'
    if s == "#" {
        return Ok(());
    }
    if let Err(e) = check_json_pointer(s) {
        Err(format!("invalid json-pointer element: {e}"))?;
    }
    Ok(())
}

// see https://datatracker.ietf.org/doc/html/rfc4122#page-4
fn validate_uuid(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };

    static HEX_GROUPS: [usize; 5] = [8, 4, 4, 4, 12];
    let mut i = 0;
    for group in s.split('-') {
        if i >= HEX_GROUPS.len() {
            Err("more than 5 elements")?;
        }
        if group.len() != HEX_GROUPS[i] {
            Err(format!(
                "element {} must be {} characters long",
                i + 1,
                HEX_GROUPS[i]
            ))?;
        }
        if let Some(ch) = group.chars().find(|c| !c.is_ascii_hexdigit()) {
            Err(format!("non-hex character {ch:?}"))?;
        }
        i += 1;
    }
    if i != HEX_GROUPS.len() {
        Err("must have 5 elements")?;
    }
    Ok(())
}

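`validate_uuid` enforces the canonical 8-4-4-4-12 layout; equivalently, as a compact standalone predicate (hypothetical helper; like the original, it checks only the shape, not version or variant bits):

```rust
// Accept exactly five '-'-separated groups of hex digits with
// lengths 8, 4, 4, 4, 12 — the same shape check as validate_uuid.
fn uuid_shape_ok(s: &str) -> bool {
    const GROUPS: [usize; 5] = [8, 4, 4, 4, 12];
    let parts: Vec<&str> = s.split('-').collect();
    parts.len() == GROUPS.len()
        && parts
            .iter()
            .zip(GROUPS)
            .all(|(p, n)| p.len() == n && p.chars().all(|c| c.is_ascii_hexdigit()))
}

fn main() {
    assert!(uuid_shape_ok("123e4567-e89b-12d3-a456-426614174000"));
    assert!(!uuid_shape_ok("123e4567e89b12d3a456426614174000")); // no dashes
    assert!(!uuid_shape_ok("123e4567-e89b-12d3-a456-42661417400g")); // 'g' not hex
}
```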
fn validate_uri(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };
    if fluent_uri::UriRef::parse(s.as_str())
        .map_err(|e| e.to_string())?
        .scheme()
        .is_none()
    {
        Err("relative url")?;
    };
    Ok(())
}

fn validate_iri(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };
    match Url::parse(s) {
        Ok(_) => Ok(()),
        Err(url::ParseError::RelativeUrlWithoutBase) => Err("relative url")?,
        Err(e) => Err(e)?,
    }
}

lazy_static! {
    static ref TEMP_URL: Url = Url::parse("http://temp.com").unwrap();
}

fn parse_uri_reference(s: &str) -> Result<Url, Box<dyn Error + Send + Sync>> {
    if s.contains('\\') {
        Err("contains \\\\")?;
    }
    Ok(TEMP_URL.join(s)?)
}

fn validate_uri_reference(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };
    fluent_uri::UriRef::parse(s.as_str()).map_err(|e| e.to_string())?;
    Ok(())
}

fn validate_iri_reference(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };
    parse_uri_reference(s)?;
    Ok(())
}

fn validate_uri_template(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };

    let url = parse_uri_reference(s)?;

    let path = url.path();
    // the path we got has its curly braces percent-encoded
    let path = percent_decode_str(path).decode_utf8()?;

    // ensure curly brackets are balanced and not nested
    for part in path.as_ref().split('/') {
        let mut want = true;
        for got in part
            .chars()
            .filter(|c| matches!(c, '{' | '}'))
            .map(|c| c == '{')
        {
            if got != want {
                Err("nested curly braces")?;
            }
            want = !want;
        }
        if !want {
            Err("no matching closing brace")?
        }
    }
    Ok(())
}
187 src/database/mod.rs Normal file
@@ -0,0 +1,187 @@
pub mod r#enum;
pub mod formats;
pub mod page;
pub mod punc;
pub mod schema;
pub mod r#type;

use crate::database::r#enum::Enum;
use crate::database::punc::Punc;
use crate::database::schema::Schema;
use crate::database::r#type::Type;
use std::collections::{HashMap, HashSet};

pub struct Database {
    pub enums: HashMap<String, Enum>,
    pub types: HashMap<String, Type>,
    pub puncs: HashMap<String, Punc>,
    pub schemas: HashMap<String, Schema>,
    pub descendants: HashMap<String, Vec<String>>,
    pub depths: HashMap<String, usize>,
}

impl Database {
    pub fn new(val: &serde_json::Value) -> Self {
        let mut db = Self {
            enums: HashMap::new(),
            types: HashMap::new(),
            puncs: HashMap::new(),
            schemas: HashMap::new(),
            descendants: HashMap::new(),
            depths: HashMap::new(),
        };

        if let Some(arr) = val.get("enums").and_then(|v| v.as_array()) {
            for item in arr {
                if let Ok(def) = serde_json::from_value::<Enum>(item.clone()) {
                    db.enums.insert(def.name.clone(), def);
                }
            }
        }

        if let Some(arr) = val.get("types").and_then(|v| v.as_array()) {
            for item in arr {
                if let Ok(def) = serde_json::from_value::<Type>(item.clone()) {
                    db.types.insert(def.name.clone(), def);
                }
            }
        }

        if let Some(arr) = val.get("puncs").and_then(|v| v.as_array()) {
            for item in arr {
                if let Ok(def) = serde_json::from_value::<Punc>(item.clone()) {
                    db.puncs.insert(def.name.clone(), def);
                }
            }
        }

        if let Some(arr) = val.get("schemas").and_then(|v| v.as_array()) {
            for (i, item) in arr.iter().enumerate() {
                if let Ok(mut schema) = serde_json::from_value::<Schema>(item.clone()) {
                    let id = schema
                        .obj
                        .id
                        .clone()
                        .unwrap_or_else(|| format!("schema_{}", i));
                    schema.obj.id = Some(id.clone());
                    db.schemas.insert(id, schema);
                }
            }
        }

        let _ = db.compile();
        db
    }

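For orientation, `Database::new` expects a JSON value shaped roughly like the sketch below (the top-level keys come from the accessors above; the individual definition payloads are illustrative, and entries that fail to deserialize are silently skipped):

```json
{
  "enums": [{ "name": "status" }],
  "types": [{ "name": "account", "schemas": [] }],
  "puncs": [{ "name": "create_account", "schemas": [] }],
  "schemas": [
    { "$id": "base", "type": "object" },
    { "$id": "derived", "$ref": "base" }
  ]
}
```

Schemas without a `$id` are assigned a synthetic `schema_{index}` identifier before insertion.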
    /// Organizes the database graph: compiles regexes and format functions and caches relationships.
    fn compile(&mut self) -> Result<(), String> {
        self.collect_schemas();
        self.collect_depths();
        self.collect_descendants();
        self.compile_schemas();

        Ok(())
    }

    fn collect_schemas(&mut self) {
        let mut to_insert = Vec::new();

        // Pass 1: recursively harvest every schema attached to the top-level
        // definitions into the master registry.
        for type_def in self.types.values() {
            for mut schema in type_def.schemas.clone() {
                schema.harvest(&mut to_insert);
            }
        }
        for punc_def in self.puncs.values() {
            for mut schema in punc_def.schemas.clone() {
                schema.harvest(&mut to_insert);
            }
        }
        for enum_def in self.enums.values() {
            for mut schema in enum_def.schemas.clone() {
                schema.harvest(&mut to_insert);
            }
        }

        for (id, schema) in to_insert {
            self.schemas.insert(id, schema);
        }
    }

    fn collect_depths(&mut self) {
        let mut depths: HashMap<String, usize> = HashMap::new();
        let schema_ids: Vec<String> = self.schemas.keys().cloned().collect();

        for id in schema_ids {
            let mut current_id = id.clone();
            let mut depth = 0;
            let mut visited = HashSet::new();

            while let Some(schema) = self.schemas.get(&current_id) {
                if !visited.insert(current_id.clone()) {
                    break; // Cycle detected
                }
                if let Some(ref_str) = &schema.obj.r#ref {
                    current_id = ref_str.clone();
                    depth += 1;
                } else {
                    break;
                }
            }
            depths.insert(id, depth);
        }
        self.depths = depths;
    }

    fn collect_descendants(&mut self) {
        let mut direct_refs: HashMap<String, Vec<String>> = HashMap::new();
        for (id, schema) in &self.schemas {
            if let Some(ref_str) = &schema.obj.r#ref {
                direct_refs
                    .entry(ref_str.clone())
                    .or_default()
                    .push(id.clone());
            }
        }

        // Cache generic descendants for $family runtime lookups
        let mut descendants = HashMap::new();
        for (id, schema) in &self.schemas {
            if let Some(family_target) = &schema.obj.family {
                let mut desc_set = HashSet::new();
                Self::collect_descendants_recursively(family_target, &direct_refs, &mut desc_set);
                let mut desc_vec: Vec<String> = desc_set.into_iter().collect();
                desc_vec.sort();

                // By placing all descendants directly onto the ID-mapped location of the $family declaration,
                // we can look up descendants natively in ValidationContext without AST replacement overrides.
                descendants.insert(id.clone(), desc_vec);
            }
        }
        self.descendants = descendants;
    }

    fn collect_descendants_recursively(
        target: &str,
        direct_refs: &HashMap<String, Vec<String>>,
        descendants: &mut HashSet<String>,
    ) {
        if let Some(children) = direct_refs.get(target) {
            for child in children {
                if descendants.insert(child.clone()) {
                    Self::collect_descendants_recursively(child, direct_refs, descendants);
                }
            }
        }
    }

    fn compile_schemas(&mut self) {
        // Pass 3: compile the internals (regexes and format functions) of every registered schema.
        let schema_ids: Vec<String> = self.schemas.keys().cloned().collect();
        for id in schema_ids {
            if let Some(schema) = self.schemas.get_mut(&id) {
                schema.compile_internals();
            }
        }
    }
}
35 src/database/page.rs Normal file
@@ -0,0 +1,35 @@
use indexmap::IndexMap;
use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Serialize, Deserialize, Default)]
#[serde(default)]
pub struct Page {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub path: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub title: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub sidebar: Option<Sidebar>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub actions: Option<IndexMap<String, Action>>,
}

#[derive(Debug, Clone, Serialize, Deserialize, Default)]
#[serde(default)]
pub struct Sidebar {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub category: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub priority: Option<i32>,
}

#[derive(Debug, Clone, Serialize, Deserialize, Default)]
#[serde(default)]
pub struct Action {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub punc: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub navigate: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub present: Option<String>,
}
20 src/database/punc.rs Normal file
@@ -0,0 +1,20 @@
use crate::database::page::Page;
use crate::database::schema::Schema;
use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Serialize, Deserialize, Default)]
#[serde(default)]
pub struct Punc {
    pub id: String,
    pub r#type: String,
    pub name: String,
    pub module: String,
    pub source: String,
    pub description: Option<String>,
    pub public: bool,
    pub form: bool,
    pub get: Option<String>,
    pub page: Option<Page>,
    #[serde(default)]
    pub schemas: Vec<Schema>,
}
src/database/schema.rs
@@ -11,13 +11,7 @@ pub struct SchemaObject {
    #[serde(rename = "$id")]
    pub id: Option<String>,
    #[serde(rename = "$ref")]
    pub ref_string: Option<String>,
    #[serde(rename = "$anchor")]
    pub anchor: Option<String>,
    #[serde(rename = "$dynamicAnchor")]
    pub dynamic_anchor: Option<String>,
    #[serde(rename = "$dynamicRef")]
    pub dynamic_ref: Option<String>,
    pub r#ref: Option<String>,
    /*
    Note: The `Ref` field in the Go struct is a pointer populated by the linker.
    In Rust, we might handle this differently (e.g., separate lookup or Rc/Arc),
@@ -33,17 +27,16 @@ pub struct SchemaObject {
    pub properties: Option<BTreeMap<String, Arc<Schema>>>,
    #[serde(rename = "patternProperties")]
    pub pattern_properties: Option<BTreeMap<String, Arc<Schema>>>,
    #[serde(rename = "additionalProperties")]
    pub additional_properties: Option<Arc<Schema>>,
    #[serde(rename = "$family")]
    pub family: Option<String>,

    pub required: Option<Vec<String>>,

    // dependencies can be schema dependencies or property dependencies
    pub dependencies: Option<BTreeMap<String, Dependency>>,

    // Definitions (for $ref resolution)
    #[serde(rename = "$defs")]
    pub defs: Option<BTreeMap<String, Arc<Schema>>>,
    #[serde(rename = "definitions")]
    pub definitions: Option<BTreeMap<String, Arc<Schema>>>,

    // Array Keywords
    #[serde(rename = "items")]
    pub items: Option<Arc<Schema>>,
@@ -78,10 +71,6 @@ pub struct SchemaObject {
    pub max_properties: Option<f64>,
    #[serde(rename = "propertyNames")]
    pub property_names: Option<Arc<Schema>>,
    #[serde(rename = "dependentRequired")]
    pub dependent_required: Option<BTreeMap<String, Vec<String>>>,
    #[serde(rename = "dependentSchemas")]
    pub dependent_schemas: Option<BTreeMap<String, Arc<Schema>>>,

    // Numeric Validation
    pub format: Option<String>,
@@ -90,7 +79,7 @@ pub struct SchemaObject {
    #[serde(
        default,
        rename = "const",
        deserialize_with = "crate::util::deserialize_some"
        deserialize_with = "crate::validator::util::deserialize_some"
    )]
    pub const_: Option<Value>,

@@ -131,18 +120,35 @@ pub struct SchemaObject {
    #[serde(default)]
    pub extensible: Option<bool>,

    // Compiled Fields (Hidden from JSON/Serde)
    #[serde(skip)]
    pub compiled_format: Option<crate::compiler::CompiledFormat>,
    pub compiled_format: Option<CompiledFormat>,
    #[serde(skip)]
    pub compiled_pattern: Option<crate::compiler::CompiledRegex>,
    pub compiled_pattern: Option<CompiledRegex>,
    #[serde(skip)]
    pub compiled_pattern_properties: Option<Vec<(crate::compiler::CompiledRegex, Arc<Schema>)>>,
    #[serde(skip)]
    pub compiled_schemas: Option<Arc<crate::registry::Registry>>,
    pub compiled_pattern_properties: Option<Vec<(CompiledRegex, Arc<Schema>)>>,
}

#[derive(Debug, Clone, Serialize)]
/// Represents a compiled format validator
#[derive(Clone)]
pub enum CompiledFormat {
    Func(fn(&serde_json::Value) -> Result<(), Box<dyn std::error::Error + Send + Sync>>),
    Regex(regex::Regex),
}

impl std::fmt::Debug for CompiledFormat {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        match self {
            CompiledFormat::Func(_) => write!(f, "CompiledFormat::Func(...)"),
            CompiledFormat::Regex(r) => write!(f, "CompiledFormat::Regex({:?})", r),
        }
    }
}

/// A wrapper for compiled regex patterns
#[derive(Debug, Clone)]
pub struct CompiledRegex(pub regex::Regex);

#[derive(Debug, Clone, Serialize, Default)]
pub struct Schema {
    #[serde(flatten)]
    pub obj: SchemaObject,
@@ -150,15 +156,6 @@ pub struct Schema {
    pub always_fail: bool,
}

impl Default for Schema {
    fn default() -> Self {
        Schema {
            obj: SchemaObject::default(),
            always_fail: false,
        }
    }
}

impl std::ops::Deref for Schema {
    type Target = SchemaObject;
    fn deref(&self) -> &Self::Target {
@@ -171,6 +168,102 @@ impl std::ops::DerefMut for Schema {
    }
}

impl Schema {
    pub fn compile_internals(&mut self) {
        self.map_children(|child| child.compile_internals());

        if let Some(format_str) = &self.obj.format
            && let Some(fmt) = crate::database::formats::FORMATS.get(format_str.as_str())
        {
            self.obj.compiled_format = Some(crate::database::schema::CompiledFormat::Func(fmt.func));
        }

        if let Some(pattern_str) = &self.obj.pattern
            && let Ok(re) = regex::Regex::new(pattern_str)
        {
            self.obj.compiled_pattern = Some(crate::database::schema::CompiledRegex(re));
        }

        if let Some(pattern_props) = &self.obj.pattern_properties {
            let mut compiled = Vec::new();
            for (k, v) in pattern_props {
                if let Ok(re) = regex::Regex::new(k) {
                    compiled.push((crate::database::schema::CompiledRegex(re), v.clone()));
                }
            }
            if !compiled.is_empty() {
                self.obj.compiled_pattern_properties = Some(compiled);
            }
        }
    }

    pub fn harvest(&mut self, to_insert: &mut Vec<(String, Schema)>) {
        if let Some(id) = &self.obj.id {
            to_insert.push((id.clone(), self.clone()));
        }
        self.map_children(|child| child.harvest(to_insert));
    }

    pub fn map_children<F>(&mut self, mut f: F)
    where
        F: FnMut(&mut Schema),
    {
        if let Some(props) = &mut self.obj.properties {
            for v in props.values_mut() {
                let mut inner = (**v).clone();
                f(&mut inner);
                *v = Arc::new(inner);
            }
        }

        if let Some(pattern_props) = &mut self.obj.pattern_properties {
            for v in pattern_props.values_mut() {
                let mut inner = (**v).clone();
                f(&mut inner);
                *v = Arc::new(inner);
            }
        }

        let mut map_arr = |arr: &mut Vec<Arc<Schema>>| {
            for v in arr.iter_mut() {
                let mut inner = (**v).clone();
                f(&mut inner);
                *v = Arc::new(inner);
            }
        };

        if let Some(arr) = &mut self.obj.prefix_items {
            map_arr(arr);
        }
        if let Some(arr) = &mut self.obj.all_of {
            map_arr(arr);
        }
        if let Some(arr) = &mut self.obj.any_of {
            map_arr(arr);
        }
        if let Some(arr) = &mut self.obj.one_of {
            map_arr(arr);
        }

        let mut map_opt = |opt: &mut Option<Arc<Schema>>| {
            if let Some(v) = opt {
                let mut inner = (**v).clone();
                f(&mut inner);
                *v = Arc::new(inner);
            }
        };

        map_opt(&mut self.obj.additional_properties);
        map_opt(&mut self.obj.items);
        map_opt(&mut self.obj.contains);
        map_opt(&mut self.obj.property_names);
        map_opt(&mut self.obj.not);
        map_opt(&mut self.obj.if_);
        map_opt(&mut self.obj.then_);
        map_opt(&mut self.obj.else_);
    }
}

impl<'de> Deserialize<'de> for Schema {
    fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
    where
@@ -188,7 +281,37 @@ impl<'de> Deserialize<'de> for Schema {
                always_fail: !b,
            });
        }
        let obj: SchemaObject = serde_json::from_value(v.clone()).map_err(serde::de::Error::custom)?;
        let mut obj: SchemaObject =
            serde_json::from_value(v.clone()).map_err(serde::de::Error::custom)?;

        // If a schema is effectively empty (except for potentially carrying an ID),
        // it functions as a boolean `true` schema in Draft 2020-12, which means it
        // should not natively restrict additional properties
        let is_empty = obj.type_.is_none()
            && obj.properties.is_none()
            && obj.pattern_properties.is_none()
            && obj.additional_properties.is_none()
            && obj.required.is_none()
            && obj.dependencies.is_none()
            && obj.items.is_none()
            && obj.prefix_items.is_none()
            && obj.contains.is_none()
            && obj.format.is_none()
            && obj.enum_.is_none()
            && obj.const_.is_none()
            && obj.all_of.is_none()
            && obj.any_of.is_none()
            && obj.one_of.is_none()
            && obj.not.is_none()
            && obj.if_.is_none()
            && obj.then_.is_none()
            && obj.else_.is_none()
            && obj.r#ref.is_none()
            && obj.family.is_none();

        if is_empty && obj.extensible.is_none() {
            obj.extensible = Some(true);
        }

        Ok(Schema {
            obj,
39 src/database/type.rs Normal file
@@ -0,0 +1,39 @@
use std::collections::HashSet;

use crate::database::schema::Schema;
use serde::{Deserialize, Serialize};
use serde_json::Value;

#[derive(Debug, Clone, Serialize, Deserialize, Default)]
#[serde(default)]
pub struct Type {
    pub id: String,
    pub r#type: String,
    pub name: String,
    pub module: String,
    pub source: String,
    #[serde(default)]
    pub historical: bool,
    #[serde(default)]
    pub sensitive: bool,
    #[serde(default)]
    pub ownable: bool,
    pub longevity: Option<i32>,
    #[serde(default)]
    pub hierarchy: Vec<String>,
    #[serde(default)]
    pub variations: HashSet<String>,
    pub relationship: Option<bool>,
    #[serde(default)]
    pub fields: Vec<String>,
    pub grouped_fields: Option<Value>,
    #[serde(default)]
    pub lookup_fields: Vec<String>,
    #[serde(default)]
    pub null_fields: Vec<String>,
    #[serde(default)]
    pub default_fields: Vec<String>,
    pub field_types: Option<Value>,
    #[serde(default)]
    pub schemas: Vec<Schema>,
}
21 src/drop.rs
@@ -7,17 +7,22 @@ pub struct Drop {
    // as they are added by the SQL wrapper. We just need to conform to the structure.
    // The user said "Validator::validate always needs to return this drop type".
    // So we should match it as closely as possible.

    #[serde(rename = "type")]
    pub type_: String, // "drop"

    #[serde(skip_serializing_if = "Option::is_none")]
    pub response: Option<Value>,

    #[serde(default)]
    #[serde(default, skip_serializing_if = "Vec::is_empty")]
    pub errors: Vec<Error>,
}

impl Default for Drop {
    fn default() -> Self {
        Self::new()
    }
}

impl Drop {
    pub fn new() -> Self {
        Self {
@@ -30,7 +35,15 @@ impl Drop {
    pub fn success() -> Self {
        Self {
            type_: "drop".to_string(),
            response: Some(serde_json::json!({ "result": "success" })), // Or appropriate success response
            response: Some(serde_json::json!("success")),
            errors: vec![],
        }
    }

    pub fn success_with_val(val: Value) -> Self {
        Self {
            type_: "drop".to_string(),
            response: Some(val),
            errors: vec![],
        }
    }
@@ -46,8 +59,6 @@ impl Drop {

#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Error {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub punc: Option<String>,
    pub code: String,
    pub message: String,
    pub details: ErrorDetails,

79 src/entity/GEMINI.md (new file)
@@ -0,0 +1,79 @@

# Entity Engine (jspg)

## Overview

This document outlines the architecture for moving the complex, CPU-bound row merging (`merge_entity`) and dynamic querying (`query_entity`) functionality out of PL/pgSQL and directly into the Rust-based `jspg` extension.

By treating the `jspg` schema registry as the absolute Single Source of Truth, we can leverage Rust and the Postgres query planner (via SPI) to achieve near O(1) execution planning for deeply nested reads, complex relational writes, and partial hydration beats.

## The Problem

Historically, `agreego.merge_entity` (PL/pgSQL) handled nested writes by segmenting JSON, resolving types, searching hierarchies, and dynamically concatenating `INSERT`/`UPDATE` statements. `agreego.query_entity` was conceived to do the same for reads (handling base security, inheritance JOINs, and filtering automatically).

However, this design hits three major limitations:

1. **CPU-Bound Operations**: PL/pgSQL is comparatively slow at complex string concatenation and massive JSON graph traversals.
2. **Query-Plan Cache Busting**: Generating massive, dynamic SQL strings prevents Postgres from caching query plans. `EXECUTE dynamic_sql` forces the planner to re-evaluate statistics and execution paths on every function call, leading to extreme latency spikes at scale.
3. **The Hydration Beat Problem**: The Punc framework requires fetching specific UI "fragments" (e.g. just the `target` of a specific `contact` array element) to feed WebSockets. Hand-rolling CTEs for every possible sub-tree permutation to serve beats will quickly become unmaintainable.

## The Solution: Semantic Engine Database

By migrating `merge_entity` and `query_entity` to `jspg`, we turn the database into a pre-compiled Semantic Engine.

1. **Schema-to-SQL Compilation**: During the connection lifecycle (`cache_json_schemas()`), `jspg` statically analyzes the JSON Schema AST. It acts as a compiler, translating the schema layout into perfectly optimized, multi-JOIN SQL query strings for *every* node/fragment in the schema.
2. **Prepared Statements (SPI)**: `jspg` feeds these compiled SQL strings into the Postgres SPI (Server Programming Interface) using `Spi::prepare()`. Postgres calculates the query execution plan *once* and caches it in memory.
3. **Instant Execution**: When a Punc needs data, `jspg` retrieves the cached prepared statement, securely binds binary parameters, and executes the pre-planned query instantly.
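The compile-once/execute-many idea can be sketched with a plain `HashMap` and raw SQL strings standing in for real SPI prepared-statement handles (the names here are illustrative, not the extension's actual API):

```rust
use std::collections::HashMap;

// Hypothetical sketch of the statement cache described above. In the real
// extension the value would be an SPI prepared-statement handle; a plain
// SQL string stands in here so the shape of the lookup is visible.
struct StatementCache {
    // key: (schema_id, fragment_path) -> pre-compiled SQL
    plans: HashMap<(String, String), String>,
}

impl StatementCache {
    fn new() -> Self {
        Self { plans: HashMap::new() }
    }

    // Called once per connection during cache_json_schemas(): every schema
    // node/fragment gets its SQL compiled and stored up front.
    fn compile(&mut self, schema_id: &str, fragment_path: &str, sql: String) {
        self.plans
            .insert((schema_id.to_string(), fragment_path.to_string()), sql);
    }

    // Called on every read: no string building, just a map lookup.
    fn lookup(&self, schema_id: &str, fragment_path: &str) -> Option<&String> {
        self.plans
            .get(&(schema_id.to_string(), fragment_path.to_string()))
    }
}

fn main() {
    let mut cache = StatementCache::new();
    cache.compile(
        "invoice",
        "/contacts/0/target",
        "SELECT jsonb_build_object('id', t1.id) FROM agreego.contact t1 WHERE t1.id = $1"
            .to_string(),
    );
    assert!(cache.lookup("invoice", "/contacts/0/target").is_some());
    assert!(cache.lookup("invoice", "/missing").is_none());
}
```

The point of the structure is that all string work happens at compile time; the per-request hot path is a hash lookup plus parameter binding.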
## Architecture

### 1. The `cache_json_schemas()` Expansion

The initialization function must now ingest `types` and `agreego.relation` data so the internal `Registry` holds the full Relational Graph.

During schema compilation, if a schema is associated with a database Type, it triggers the **SQL Compiler Phase**:

- It builds a table-resolution AST mapping to `JOIN` clauses based on foreign keys.
- It translates JSON Schema properties to `SELECT jsonb_build_object(...)`.
- It generates static SQL for `INSERT`, `UPDATE`, and `SELECT` (including path-based fragment SELECTs).
- It calls `Spi::prepare()` to cache these plans inside the Session Context.

### 2. `agreego.query_entity` (Reads)

* **API**: `agreego.query_entity(schema_id TEXT, fragment_path TEXT, cue JSONB)`
* **Execution**:
  * Rust locates the target Schema in memory.
  * It uses the `fragment_path` (e.g., `/` for a full read, or `/contacts/0/target` for a hydration beat) to fetch the exact prepared statement.
  * It binds variables (Row Level Security IDs, filtering, pagination limit/offset) parsed from the `cue`.
  * SPI returns the heavily nested, pre-aggregated `JSONB` instantly.
### 3. Unified Aggregations & Computeds (Schema `query` objects)

We replace the concept of a complex string parser (PEL) with native structured JSON objects using the `query` keyword.

A structured `query` block in the schema:

```json
"total": {
  "type": "number",
  "readOnly": true,
  "query": {
    "aggregate": "sum",
    "source": "lines",
    "field": "amount"
  }
}
```

* **Frontend (Dart)**: The Go generator parses the JSON object directly and emits the native UI aggregation code (e.g. `lines.fold(...)`) for instant UI updates before the server responds.
* **Backend (jspg)**: The Rust SQL compiler natively deserializes the `query` object into an internal struct. It recognizes the `aggregate` instruction and outputs a Postgres-native aggregation, e.g. `(SELECT SUM(amount) FROM agreego.invoice_line WHERE invoice_id = t1.id)`, as a column in the prepared `SELECT` statement.
* **Unification**: The database-calculated value acts as the authoritative truth, synchronizing and correcting the client automatically on the resulting `beat`.
### 4. `agreego.merge_entity` (Writes)

* **API**: `agreego.merge_entity(cue JSONB)`
* **Execution**:
  * Parses the incoming `cue` JSON via `serde_json` at C-like speeds.
  * Recursively validates and *constructively masks* the tree against the strict schema.
  * Traverses the relational graph (which is fully loaded in the `jspg` registry).
  * Binds the new values directly into the cached `INSERT` or `UPDATE` SPI prepared statements for each table in the hierarchy.
  * Evaluates field differences and natively uses `pg_notify` to fire atomic row-level changes for the Go Beat framework.

## Roadmap

1. **Relational Ingestion**: Update `cache_json_schemas` to pass relational metadata (`agreego.relation` rows) into the `jspg` registry cache.
2. **The SQL Compiler**: Build the AST-to-string compiler in Rust that reads properties, `$ref`s, and `$family` trees to piece together generic SQL.
3. **SPI Caching**: Integrate `Spi::prepare` into the `Validator` creation phase.
4. **Rust `merge_entity`**: Port the constructive structural extraction loop from PL/pgSQL to Rust.
5. **Rust `query_entity`**: Abstract the query runtime, mapping Punc JSON `filters` arrays to SPI-bound parameters safely.
875 src/formats.rs
@ -1,875 +0,0 @@
|
||||
use std::{
|
||||
collections::HashMap,
|
||||
error::Error,
|
||||
net::{Ipv4Addr, Ipv6Addr},
|
||||
};
|
||||
|
||||
use lazy_static::lazy_static;
|
||||
use percent_encoding::percent_decode_str;
|
||||
use serde_json::Value;
|
||||
use url::Url;
|
||||
|
||||
// use crate::ecma; // Assuming ecma is not yet available, stubbing regex for now
|
||||
|
||||
/// Defines format for `format` keyword.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Format {
|
||||
/// Name of the format
|
||||
pub name: &'static str,
|
||||
|
||||
/// validates given value.
|
||||
pub func: fn(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>>, // Ensure thread safety if needed
|
||||
}
|
||||
|
||||
lazy_static! {
|
||||
pub(crate) static ref FORMATS: HashMap<&'static str, Format> = {
|
||||
let mut m = HashMap::<&'static str, Format>::new();
|
||||
// Helper to register formats
|
||||
let mut register = |name, func| m.insert(name, Format { name, func });
|
||||
|
||||
// register("regex", validate_regex); // Stubbed
|
||||
register("ipv4", validate_ipv4);
|
||||
register("ipv6", validate_ipv6);
|
||||
register("hostname", validate_hostname);
|
||||
register("idn-hostname", validate_idn_hostname);
|
||||
register("email", validate_email);
|
||||
register("idn-email", validate_idn_email);
|
||||
register("date", validate_date);
|
||||
register("time", validate_time);
|
||||
register("date-time", validate_date_time);
|
||||
register("duration", validate_duration);
|
||||
register("period", validate_period);
|
||||
register("json-pointer", validate_json_pointer);
|
||||
register("relative-json-pointer", validate_relative_json_pointer);
|
||||
register("uuid", validate_uuid);
|
||||
register("uri", validate_uri);
|
||||
register("iri", validate_iri);
|
||||
register("uri-reference", validate_uri_reference);
|
||||
register("iri-reference", validate_iri_reference);
|
||||
register("uri-template", validate_uri_template);
|
||||
m
|
||||
};
|
||||
}
|
||||
|
||||
/*
|
||||
fn validate_regex(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
// ecma::convert(s).map(|_| ())
|
||||
Ok(())
|
||||
}
|
||||
*/
|
||||
|
||||
fn validate_ipv4(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
s.parse::<Ipv4Addr>()?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_ipv6(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
s.parse::<Ipv6Addr>()?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_date(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_date(s)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn matches_char(s: &str, index: usize, ch: char) -> bool {
|
||||
s.is_char_boundary(index) && s[index..].starts_with(ch)
|
||||
}
|
||||
|
||||
// see https://datatracker.ietf.org/doc/html/rfc3339#section-5.6
|
||||
fn check_date(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
// yyyy-mm-dd
|
||||
if s.len() != 10 {
|
||||
Err("must be 10 characters long")?;
|
||||
}
|
||||
if !matches_char(s, 4, '-') || !matches_char(s, 7, '-') {
|
||||
Err("missing hyphen in correct place")?;
|
||||
}
|
||||
|
||||
let mut ymd = s.splitn(3, '-').filter_map(|t| t.parse::<usize>().ok());
|
||||
let (Some(y), Some(m), Some(d)) = (ymd.next(), ymd.next(), ymd.next()) else {
|
||||
Err("non-positive year/month/day")?
|
||||
};
|
||||
|
||||
if !matches!(m, 1..=12) {
|
||||
Err(format!("{m} months in year"))?;
|
||||
}
|
||||
if !matches!(d, 1..=31) {
|
||||
Err(format!("{d} days in month"))?;
|
||||
}
|
||||
|
||||
match m {
|
||||
2 => {
|
||||
let mut feb_days = 28;
|
||||
if y % 4 == 0 && (y % 100 != 0 || y % 400 == 0) {
|
||||
feb_days += 1; // leap year
|
||||
};
|
||||
if d > feb_days {
|
||||
Err(format!("february has {feb_days} days only"))?;
|
||||
}
|
||||
}
|
||||
4 | 6 | 9 | 11 => {
|
||||
if d > 30 {
|
||||
Err("month has 30 days only")?;
|
||||
}
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_time(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_time(s)
|
||||
}
|
||||
|
||||
fn check_time(mut str: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
// min: hh:mm:ssZ
|
||||
if str.len() < 9 {
|
||||
Err("less than 9 characters long")?
|
||||
}
|
||||
if !matches_char(str, 2, ':') || !matches_char(str, 5, ':') {
|
||||
Err("missing colon in correct place")?
|
||||
}
|
||||
|
||||
// parse hh:mm:ss
|
||||
if !str.is_char_boundary(8) {
|
||||
Err("contains non-ascii char")?
|
||||
}
|
||||
let mut hms = (str[..8])
|
||||
.splitn(3, ':')
|
||||
.filter_map(|t| t.parse::<usize>().ok());
|
||||
let (Some(mut h), Some(mut m), Some(s)) = (hms.next(), hms.next(), hms.next()) else {
|
||||
Err("non-positive hour/min/sec")?
|
||||
};
|
||||
if h > 23 || m > 59 || s > 60 {
|
||||
Err("hour/min/sec out of range")?
|
||||
}
|
||||
str = &str[8..];
|
||||
|
||||
// parse sec-frac if present
|
||||
if let Some(rem) = str.strip_prefix('.') {
|
||||
let n_digits = rem.chars().take_while(char::is_ascii_digit).count();
|
||||
if n_digits == 0 {
|
||||
Err("no digits in second fraction")?;
|
||||
}
|
||||
str = &rem[n_digits..];
|
||||
}
|
||||
|
||||
if str != "z" && str != "Z" {
|
||||
// parse time-numoffset
|
||||
if str.len() != 6 {
|
||||
Err("offset must be 6 characters long")?;
|
||||
}
|
||||
let sign: isize = match str.chars().next() {
|
||||
Some('+') => -1,
|
||||
Some('-') => 1,
|
||||
_ => return Err("offset must begin with plus/minus")?,
|
||||
};
|
||||
str = &str[1..];
|
||||
if !matches_char(str, 2, ':') {
|
||||
Err("missing colon in offset at correct place")?
|
||||
}
|
||||
|
||||
let mut zhm = str.splitn(2, ':').filter_map(|t| t.parse::<usize>().ok());
|
||||
let (Some(zh), Some(zm)) = (zhm.next(), zhm.next()) else {
|
||||
Err("non-positive hour/min in offset")?
|
||||
};
|
||||
if zh > 23 || zm > 59 {
|
||||
Err("hour/min in offset out of range")?
|
||||
}
|
||||
|
||||
// apply timezone
|
||||
let mut hm = (h * 60 + m) as isize + sign * (zh * 60 + zm) as isize;
|
||||
if hm < 0 {
|
||||
hm += 24 * 60;
|
||||
debug_assert!(hm >= 0);
|
||||
}
|
||||
let hm = hm as usize;
|
||||
(h, m) = (hm / 60, hm % 60);
|
||||
}
|
||||
|
||||
// check leap second
|
||||
if !(s < 60 || (h == 23 && m == 59)) {
|
||||
Err("invalid leap second")?
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_date_time(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_date_time(s)
|
||||
}
|
||||
|
||||
fn check_date_time(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
// min: yyyy-mm-ddThh:mm:ssZ
|
||||
if s.len() < 20 {
|
||||
Err("less than 20 characters long")?;
|
||||
}
|
||||
if !s.is_char_boundary(10) || !s[10..].starts_with(['t', 'T']) {
|
||||
Err("11th character must be t or T")?;
|
||||
}
|
||||
if let Err(e) = check_date(&s[..10]) {
|
||||
Err(format!("invalid date element: {e}"))?;
|
||||
}
|
||||
if let Err(e) = check_time(&s[11..]) {
|
||||
Err(format!("invalid time element: {e}"))?;
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_duration(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_duration(s)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
// see https://datatracker.ietf.org/doc/html/rfc3339#appendix-A
|
||||
fn check_duration(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
// must start with 'P'
|
||||
let Some(s) = s.strip_prefix('P') else {
|
||||
Err("must start with P")?
|
||||
};
|
||||
if s.is_empty() {
|
||||
Err("nothing after P")?
|
||||
}
|
||||
|
||||
// dur-week
|
||||
if let Some(s) = s.strip_suffix('W') {
|
||||
if s.is_empty() {
|
||||
Err("no number in week")?
|
||||
}
|
||||
if !s.chars().all(|c| c.is_ascii_digit()) {
|
||||
Err("invalid week")?
|
||||
}
|
||||
return Ok(());
|
||||
}
|
||||
|
||||
static UNITS: [&str; 2] = ["YMD", "HMS"];
|
||||
for (i, s) in s.split('T').enumerate() {
|
||||
let mut s = s;
|
||||
if i != 0 && s.is_empty() {
|
||||
Err("no time elements")?
|
||||
}
|
||||
let Some(mut units) = UNITS.get(i).cloned() else {
|
||||
Err("more than one T")?
|
||||
};
|
||||
while !s.is_empty() {
|
||||
let digit_count = s.chars().take_while(char::is_ascii_digit).count();
|
||||
if digit_count == 0 {
|
||||
Err("missing number")?
|
||||
}
|
||||
s = &s[digit_count..];
|
||||
let Some(unit) = s.chars().next() else {
|
||||
Err("missing unit")?
|
||||
};
|
||||
let Some(j) = units.find(unit) else {
|
||||
if UNITS[i].contains(unit) {
|
||||
Err(format!("unit {unit} out of order"))?
|
||||
}
|
||||
Err(format!("invalid unit {unit}"))?
|
||||
};
|
||||
units = &units[j + 1..];
|
||||
s = &s[1..];
|
||||
}
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
// see https://datatracker.ietf.org/doc/html/rfc3339#appendix-A
|
||||
fn validate_period(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
|
||||
let Some(slash) = s.find('/') else {
|
||||
Err("missing slash")?
|
||||
};
|
||||
|
||||
let (start, end) = (&s[..slash], &s[slash + 1..]);
|
||||
if start.starts_with('P') {
|
||||
if let Err(e) = check_duration(start) {
|
||||
Err(format!("invalid start duration: {e}"))?
|
||||
}
|
||||
if let Err(e) = check_date_time(end) {
|
||||
Err(format!("invalid end date-time: {e}"))?
|
||||
}
|
||||
} else {
|
||||
if let Err(e) = check_date_time(start) {
|
||||
Err(format!("invalid start date-time: {e}"))?
|
||||
}
|
||||
if end.starts_with('P') {
|
||||
if let Err(e) = check_duration(end) {
|
||||
Err(format!("invalid end duration: {e}"))?;
|
||||
}
|
||||
} else if let Err(e) = check_date_time(end) {
|
||||
Err(format!("invalid end date-time: {e}"))?;
|
||||
}
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_hostname(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_hostname(s)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
// see https://en.wikipedia.org/wiki/Hostname#Restrictions_on_valid_host_names
|
||||
fn check_hostname(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
// entire hostname (including the delimiting dots but not a trailing dot) has a maximum of 253 ASCII characters
|
||||
|
||||
if s.len() > 253 {
|
||||
Err("more than 253 characters long")?
|
||||
}
|
||||
|
||||
// Hostnames are composed of series of labels concatenated with dots, as are all domain names
|
||||
for label in s.split('.') {
|
||||
// Each label must be from 1 to 63 characters long
|
||||
if !matches!(label.len(), 1..=63) {
|
||||
Err("label must be 1 to 63 characters long")?;
|
||||
}
|
||||
|
||||
// labels must not start or end with a hyphen
|
||||
if label.starts_with('-') {
|
||||
Err("label starts with hyphen")?;
|
||||
}
|
||||
|
||||
if label.ends_with('-') {
|
||||
Err("label ends with hyphen")?;
|
||||
}
|
||||
|
||||
// labels may contain only the ASCII letters 'a' through 'z' (in a case-insensitive manner),
|
||||
// the digits '0' through '9', and the hyphen ('-')
|
||||
if let Some(ch) = label
|
||||
.chars()
|
||||
.find(|c| !matches!(c, 'a'..='z' | 'A'..='Z' | '0'..='9' | '-'))
|
||||
{
|
||||
Err(format!("invalid character {ch:?}"))?;
|
||||
}
|
||||
|
||||
// labels must not contain "--" in 3rd and 4th position unless they start with "xn--"
|
||||
if label.len() >= 4 && &label[2..4] == "--" {
|
||||
if !label.starts_with("xn--") {
|
||||
Err("label has -- in 3rd/4th position but does not start with xn--")?;
|
||||
} else {
|
||||
let (unicode, errors) = idna::domain_to_unicode(label);
|
||||
if let Err(_) = errors {
|
||||
Err("invalid punycode")?;
|
||||
}
|
||||
check_unicode_idn_constraints(&unicode).map_err(|e| format!("invalid punycode/IDN: {e}"))?;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_idn_hostname(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_idn_hostname(s)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
static DISALLOWED: [char; 10] = [
|
||||
'\u{0640}', // ARABIC TATWEEL
|
||||
'\u{07FA}', // NKO LAJANYALAN
|
||||
'\u{302E}', // HANGUL SINGLE DOT TONE MARK
|
||||
'\u{302F}', // HANGUL DOUBLE DOT TONE MARK
|
||||
'\u{3031}', // VERTICAL KANA REPEAT MARK
|
||||
'\u{3032}', // VERTICAL KANA REPEAT WITH VOICED SOUND MARK
|
||||
'\u{3033}', // VERTICAL KANA REPEAT MARK UPPER HALF
|
||||
'\u{3034}', // VERTICAL KANA REPEAT WITH VOICED SOUND MARK UPPER HA
|
||||
'\u{3035}', // VERTICAL KANA REPEAT MARK LOWER HALF
|
||||
'\u{303B}', // VERTICAL IDEOGRAPHIC ITERATION MARK
|
||||
];
|
||||
|
||||
fn check_idn_hostname(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let s = idna::domain_to_ascii_strict(s).map_err(|e| format!("idna error: {:?}", e))?;
|
||||
let (unicode, errors) = idna::domain_to_unicode(&s);
|
||||
if let Err(e) = errors {
|
||||
Err(format!("idna decoding error: {:?}", e))?;
|
||||
}
|
||||
check_unicode_idn_constraints(&unicode)?;
|
||||
check_hostname(&s)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn check_unicode_idn_constraints(unicode: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
// see https://www.rfc-editor.org/rfc/rfc5892#section-2.6
|
||||
{
|
||||
if unicode.contains(DISALLOWED) {
|
||||
Err("contains disallowed character")?;
|
||||
}
|
||||
}
|
||||
|
||||
// unicode string must not contain "--" in 3rd and 4th position
|
||||
// and must not start and end with a '-'
|
||||
// see https://www.rfc-editor.org/rfc/rfc5891#section-4.2.3.1
|
||||
{
|
||||
let count: usize = unicode
|
||||
.chars()
|
||||
.skip(2)
|
||||
.take(2)
|
||||
.map(|c| if c == '-' { 1 } else { 0 })
|
||||
.sum();
|
||||
if count == 2 {
|
||||
Err("unicode string must not contain '--' in 3rd and 4th position")?;
|
||||
}
|
||||
}
|
||||
|
||||
// MIDDLE DOT is allowed between 'l' characters only
|
||||
// see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.3
|
||||
{
|
||||
let middle_dot = '\u{00b7}';
|
||||
let mut s = unicode;
|
||||
while let Some(i) = s.find(middle_dot) {
|
||||
let prefix = &s[..i];
|
||||
let suffix = &s[i + middle_dot.len_utf8()..];
|
||||
if !prefix.ends_with('l') || !suffix.ends_with('l') {
|
||||
Err("MIDDLE DOT is allowed between 'l' characters only")?;
|
||||
}
|
||||
s = suffix;
|
||||
}
|
||||
}
|
||||
|
||||
// Greek KERAIA must be followed by Greek character
|
||||
// see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.4
|
||||
{
|
||||
let keralia = '\u{0375}';
|
||||
let greek = '\u{0370}'..='\u{03FF}';
|
||||
let mut s = unicode;
|
||||
while let Some(i) = s.find(keralia) {
|
||||
let suffix = &s[i + keralia.len_utf8()..];
|
||||
if !suffix.starts_with(|c| greek.contains(&c)) {
|
||||
Err("Greek KERAIA must be followed by Greek character")?;
|
||||
}
|
||||
s = suffix;
|
||||
}
|
||||
}
|
||||
|
||||
// Hebrew GERESH must be preceded by Hebrew character
|
||||
// see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.5
|
||||
//
|
||||
// Hebrew GERSHAYIM must be preceded by Hebrew character
|
||||
// see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.6
|
||||
{
|
||||
let geresh = '\u{05F3}';
|
||||
let gereshayim = '\u{05F4}';
|
||||
let hebrew = '\u{0590}'..='\u{05FF}';
|
||||
for ch in [geresh, gereshayim] {
|
||||
let mut s = unicode;
|
||||
while let Some(i) = s.find(ch) {
|
||||
let prefix = &s[..i];
|
||||
if !prefix.ends_with(|c| hebrew.contains(&c)) {
|
||||
if i == 0 {
|
||||
Err("Hebrew GERESH must be preceded by Hebrew character")?;
|
||||
} else {
|
||||
Err("Hebrew GERESHYIM must be preceded by Hebrew character")?;
|
||||
}
|
||||
}
|
||||
let suffix = &s[i + ch.len_utf8()..];
|
||||
s = suffix;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// KATAKANA MIDDLE DOT must be with Hiragana, Katakana, or Han
|
||||
// see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.7
|
||||
{
|
||||
let katakana_middle_dot = '\u{30FB}';
|
||||
if unicode.contains(katakana_middle_dot) {
|
||||
let hiragana = '\u{3040}'..='\u{309F}';
|
||||
let katakana = '\u{30A0}'..='\u{30FF}';
|
||||
let han = '\u{4E00}'..='\u{9FFF}'; // https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_(Unicode_block): is this range correct??
|
||||
if unicode.contains(|c| hiragana.contains(&c))
|
||||
|| unicode.contains(|c| c != katakana_middle_dot && katakana.contains(&c))
|
||||
|| unicode.contains(|c| han.contains(&c))
|
||||
{
|
||||
// ok
|
||||
} else {
|
||||
Err("KATAKANA MIDDLE DOT must be with Hiragana, Katakana, or Han")?;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// ARABIC-INDIC DIGITS and Extended Arabic-Indic Digits cannot be mixed
|
||||
// see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.8
|
||||
// see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.9
|
||||
{
|
||||
let arabic_indic_digits = '\u{0660}'..='\u{0669}';
|
||||
let extended_arabic_indic_digits = '\u{06F0}'..='\u{06F9}';
|
||||
if unicode.contains(|c| arabic_indic_digits.contains(&c))
|
||||
&& unicode.contains(|c| extended_arabic_indic_digits.contains(&c))
|
||||
{
|
||||
Err("ARABIC-INDIC DIGITS and Extended Arabic-Indic Digits cannot be mixed")?;
|
||||
}
|
||||
}
|
||||
|
||||
// ZERO WIDTH JOINER must be preceded by Virama
|
||||
// see https://www.rfc-editor.org/rfc/rfc5892#appendix-A.2
|
||||
{
|
||||
let zero_width_jointer = '\u{200D}';
|
||||
static VIRAMA: [char; 61] = [
|
||||
'\u{094D}',
|
||||
'\u{09CD}',
|
||||
'\u{0A4D}',
|
||||
'\u{0ACD}',
|
||||
'\u{0B4D}',
|
||||
'\u{0BCD}',
|
||||
'\u{0C4D}',
|
||||
'\u{0CCD}',
|
||||
'\u{0D3B}',
|
||||
'\u{0D3C}',
|
||||
'\u{0D4D}',
|
||||
'\u{0DCA}',
|
||||
'\u{0E3A}',
|
||||
'\u{0EBA}',
|
||||
'\u{0F84}',
|
||||
'\u{1039}',
|
||||
'\u{103A}',
|
||||
'\u{1714}',
|
||||
'\u{1734}',
|
||||
'\u{17D2}',
|
||||
'\u{1A60}',
|
||||
'\u{1B44}',
|
||||
'\u{1BAA}',
|
||||
'\u{1BAB}',
|
||||
'\u{1BF2}',
|
||||
'\u{1BF3}',
|
||||
'\u{2D7F}',
|
||||
'\u{A806}',
|
||||
'\u{A82C}',
|
||||
'\u{A8C4}',
|
||||
'\u{A953}',
|
||||
'\u{A9C0}',
|
||||
'\u{AAF6}',
|
||||
'\u{ABED}',
|
||||
'\u{10A3F}',
|
||||
'\u{11046}',
|
||||
'\u{1107F}',
|
||||
'\u{110B9}',
|
||||
'\u{11133}',
|
||||
'\u{11134}',
|
||||
'\u{111C0}',
|
||||
'\u{11235}',
|
||||
'\u{112EA}',
|
||||
'\u{1134D}',
|
||||
'\u{11442}',
|
||||
'\u{114C2}',
|
||||
'\u{115BF}',
|
||||
'\u{1163F}',
|
||||
'\u{116B6}',
|
||||
'\u{1172B}',
|
||||
'\u{11839}',
|
||||
'\u{1193D}',
|
||||
'\u{1193E}',
|
||||
'\u{119E0}',
|
||||
'\u{11A34}',
|
||||
'\u{11A47}',
|
||||
'\u{11A99}',
|
||||
'\u{11C3F}',
|
||||
'\u{11D44}',
|
||||
'\u{11D45}',
|
||||
'\u{11D97}',
|
||||
]; // https://www.compart.com/en/unicode/combining/9
|
||||
let mut s = unicode;
|
||||
while let Some(i) = s.find(zero_width_jointer) {
|
||||
let prefix = &s[..i];
|
||||
if !prefix.ends_with(VIRAMA) {
|
||||
Err("ZERO WIDTH JOINER must be preceded by Virama")?;
|
||||
}
|
||||
let suffix = &s[i + zero_width_jointer.len_utf8()..];
|
||||
s = suffix;
|
||||
}
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_email(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_email(s)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
// see https://en.wikipedia.org/wiki/Email_address
|
||||
fn check_email(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
// entire email address to be no more than 254 characters long
|
||||
if s.len() > 254 {
|
||||
Err("more than 254 characters long")?
|
||||
}
|
||||
|
||||
// email address is generally recognized as having two parts joined with an at-sign
|
||||
let Some(at) = s.rfind('@') else {
|
||||
Err("missing @")?
|
||||
};
|
||||
let (local, domain) = (&s[..at], &s[at + 1..]);
|
||||
|
||||
// local part may be up to 64 characters long
|
||||
if local.len() > 64 {
|
||||
Err("local part more than 64 characters long")?
|
||||
}
|
||||
|
||||
if local.len() > 1 && local.starts_with('"') && local.ends_with('"') {
|
||||
// quoted
|
||||
let local = &local[1..local.len() - 1];
|
||||
if local.contains(['\\', '"']) {
|
||||
Err("backslash and quote not allowed within quoted local part")?
|
||||
}
|
||||
} else {
|
||||
// unquoted
|
||||
|
||||
if local.starts_with('.') {
|
||||
Err("starts with dot")?
|
||||
}
|
||||
if local.ends_with('.') {
|
||||
Err("ends with dot")?
|
||||
}
|
||||
|
||||
// consecutive dots not allowed
|
||||
if local.contains("..") {
|
||||
Err("consecutive dots")?
|
||||
}
|
||||
|
||||
// check allowd chars
|
||||
if let Some(ch) = local
|
||||
.chars()
|
||||
.find(|c| !(c.is_ascii_alphanumeric() || ".!#$%&'*+-/=?^_`{|}~".contains(*c)))
|
||||
{
|
||||
Err(format!("invalid character {ch:?}"))?
|
||||
}
|
||||
}
|
||||
|
||||
// domain if enclosed in brackets, must match an IP address
|
||||
if domain.starts_with('[') && domain.ends_with(']') {
|
||||
let s = &domain[1..domain.len() - 1];
|
||||
if let Some(s) = s.strip_prefix("IPv6:") {
|
||||
if let Err(e) = s.parse::<Ipv6Addr>() {
|
||||
Err(format!("invalid ipv6 address: {e}"))?
|
||||
}
|
||||
return Ok(());
|
||||
}
|
||||
if let Err(e) = s.parse::<Ipv4Addr>() {
|
||||
Err(format!("invalid ipv4 address: {e}"))?
|
||||
}
|
||||
return Ok(());
|
||||
}
|
||||
|
||||
// domain must match the requirements for a hostname
|
||||
if let Err(e) = check_hostname(domain) {
|
||||
Err(format!("invalid domain: {e}"))?
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn validate_idn_email(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
|
||||
let Some(at) = s.rfind('@') else {
|
||||
Err("missing @")?
|
||||
};
|
||||
let (local, domain) = (&s[..at], &s[at + 1..]);
|
||||
|
||||
let local = idna::domain_to_ascii_strict(local).map_err(|e| format!("idna error: {:?}", e))?;
|
||||
let domain = idna::domain_to_ascii_strict(domain).map_err(|e| format!("idna error: {:?}", e))?;
|
||||
if let Err(e) = check_idn_hostname(&domain) {
|
||||
Err(format!("invalid domain: {e}"))?
|
||||
}
|
||||
check_email(&format!("{local}@{domain}"))
|
||||
}
|
||||
|
||||
fn validate_json_pointer(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
check_json_pointer(s)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
// see https://www.rfc-editor.org/rfc/rfc6901#section-3
|
||||
fn check_json_pointer(s: &str) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
if s.is_empty() {
|
||||
return Ok(());
|
||||
}
|
||||
if !s.starts_with('/') {
|
||||
Err("not starting with slash")?;
|
||||
}
|
||||
for token in s.split('/').skip(1) {
|
||||
let mut chars = token.chars();
|
||||
while let Some(ch) = chars.next() {
|
||||
if ch == '~' {
|
||||
if !matches!(chars.next(), Some('0' | '1')) {
|
||||
Err("~ must be followed by 0 or 1")?;
|
||||
}
|
||||
} else if !matches!(ch, '\x00'..='\x2E' | '\x30'..='\x7D' | '\x7F'..='\u{10FFFF}') {
|
||||
Err("contains disallowed character")?;
|
||||
}
|
||||
}
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
// see https://tools.ietf.org/html/draft-handrews-relative-json-pointer-01#section-3
|
||||
fn validate_relative_json_pointer(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
|
||||
let Value::String(s) = v else {
|
||||
return Ok(());
|
||||
};
|
||||
|
||||
// start with non-negative-integer
|
||||
let num_digits = s.chars().take_while(char::is_ascii_digit).count();
|
||||
if num_digits == 0 {
|
||||
Err("must start with non-negative integer")?;
|
||||
}
|
||||
if num_digits > 1 && s.starts_with('0') {
|
||||
Err("starts with zero")?;
|
||||
}
|
||||
let s = &s[num_digits..];
|
||||
|
||||
// followed by either json-pointer or '#'
|
||||
if s == "#" {
|
||||
return Ok(());
|
||||
}
|
||||
if let Err(e) = check_json_pointer(s) {
|
||||
Err(format!("invalid json-pointer element: {e}"))?;
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
// see https://datatracker.ietf.org/doc/html/rfc4122#page-4
fn validate_uuid(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };

    static HEX_GROUPS: [usize; 5] = [8, 4, 4, 4, 12];
    let mut i = 0;
    for group in s.split('-') {
        if i >= HEX_GROUPS.len() {
            Err("more than 5 elements")?;
        }
        if group.len() != HEX_GROUPS[i] {
            Err(format!(
                "element {} must be {} characters long",
                i + 1,
                HEX_GROUPS[i]
            ))?;
        }
        if let Some(ch) = group.chars().find(|c| !c.is_ascii_hexdigit()) {
            Err(format!("non-hex character {ch:?}"))?;
        }
        i += 1;
    }
    if i != HEX_GROUPS.len() {
        Err("must have 5 elements")?;
    }
    Ok(())
}
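The 8-4-4-4-12 group shape checked above can be sketched standalone; `looks_like_uuid` is a hypothetical helper (std only) that collapses the same rules into a boolean, whereas the real validator reports which group failed:

```rust
// A UUID is five hyphen-separated groups of 8, 4, 4, 4, and 12 hex digits.
fn looks_like_uuid(s: &str) -> bool {
    const GROUPS: [usize; 5] = [8, 4, 4, 4, 12];
    let parts: Vec<&str> = s.split('-').collect();
    parts.len() == GROUPS.len()
        && parts
            .iter()
            .zip(GROUPS)
            .all(|(p, n)| p.len() == n && p.chars().all(|c| c.is_ascii_hexdigit()))
}

fn main() {
    assert!(looks_like_uuid("123e4567-e89b-12d3-a456-426614174000"));
    assert!(!looks_like_uuid("123e4567-e89b-12d3-a456")); // only 4 groups
    assert!(!looks_like_uuid("123e4567-e89b-12d3-a456-42661417400g")); // non-hex
}
```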
fn validate_uri(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };
    if fluent_uri::UriRef::parse(s.as_str()).map_err(|e| e.to_string())?.scheme().is_none() {
        Err("relative url")?;
    }
    Ok(())
}

fn validate_iri(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };
    match Url::parse(s) {
        Ok(_) => Ok(()),
        Err(url::ParseError::RelativeUrlWithoutBase) => Err("relative url")?,
        Err(e) => Err(e)?,
    }
}

lazy_static! {
    static ref TEMP_URL: Url = Url::parse("http://temp.com").unwrap();
}

fn parse_uri_reference(s: &str) -> Result<Url, Box<dyn Error + Send + Sync>> {
    if s.contains('\\') {
        Err("contains \\\\")?;
    }
    Ok(TEMP_URL.join(s)?)
}

fn validate_uri_reference(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };
    fluent_uri::UriRef::parse(s.as_str()).map_err(|e| e.to_string())?;
    Ok(())
}

fn validate_iri_reference(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };
    parse_uri_reference(s)?;
    Ok(())
}

fn validate_uri_template(v: &Value) -> Result<(), Box<dyn Error + Send + Sync>> {
    let Value::String(s) = v else {
        return Ok(());
    };

    let url = parse_uri_reference(s)?;

    let path = url.path();
    // the path we got back has its curly braces percent-encoded
    let path = percent_decode_str(path).decode_utf8()?;

    // ensure curly braces are balanced and not nested
    for part in path.as_ref().split('/') {
        let mut want = true;
        for got in part
            .chars()
            .filter(|c| matches!(c, '{' | '}'))
            .map(|c| c == '{')
        {
            if got != want {
                Err("nested curly braces")?;
            }
            want = !want;
        }
        if !want {
            Err("no matching closing brace")?
        }
    }
    Ok(())
}
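The brace rule enforced at the end of `validate_uri_template` — `{` and `}` must strictly alternate within each path segment, starting with `{`, and every opener must be closed — can be sketched standalone (`braces_ok` is a hypothetical helper, std only):

```rust
// Check that braces in one path segment alternate { } { } ... and end closed.
fn braces_ok(part: &str) -> Result<(), &'static str> {
    let mut want_open = true;
    for is_open in part.chars().filter(|c| matches!(c, '{' | '}')).map(|c| c == '{') {
        if is_open != want_open {
            return Err("nested curly braces");
        }
        want_open = !want_open;
    }
    if !want_open {
        return Err("no matching closing brace");
    }
    Ok(())
}

fn main() {
    assert!(braces_ok("users/{id}").is_ok());
    assert!(braces_ok("{a}{b}").is_ok());
    assert!(braces_ok("{{a}}").is_err()); // nested
    assert!(braces_ok("{a").is_err()); // unclosed
}
```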
src/jspg.rs (Normal file, 29 lines)
@@ -0,0 +1,29 @@
use crate::database::Database;
use crate::merger::Merger;
use crate::queryer::Queryer;
use crate::validator::Validator;
use std::sync::Arc;

pub struct Jspg {
    pub database: Arc<Database>,
    pub validator: Validator,
    pub queryer: Queryer,
    pub merger: Merger,
}

impl Jspg {
    pub fn new(database_val: &serde_json::Value) -> Self {
        let database_instance = Database::new(database_val);
        let database = Arc::new(database_instance);
        let validator = Validator::new(database.clone());
        let queryer = Queryer::new();
        let merger = Merger::new();

        Self {
            database,
            validator,
            queryer,
            merger,
        }
    }
}
src/lib.rs (202 lines)
@@ -2,134 +2,134 @@ use pgrx::*;
pg_module_magic!();

pub mod compiler;
pub mod database;
pub mod drop;
pub mod formats;
pub mod jspg;
pub mod merger;
pub mod queryer;
pub mod validator;

pub mod registry;
mod schema;
pub mod util;
mod validator;
use serde_json::json;
use std::sync::{Arc, RwLock};

use crate::registry::REGISTRY;
use crate::schema::Schema;
use serde_json::{Value, json};
lazy_static::lazy_static! {
    // Global Atomic Swap Container:
    // - RwLock: To protect the SWAP of the Option.
    // - Option: Because it starts empty.
    // - Arc: Because multiple running threads might hold the OLD engine while we swap.
    // - Jspg: The root semantic engine encapsulating the database metadata, validator, queryer, and merger.
    static ref GLOBAL_JSPG: RwLock<Option<Arc<jspg::Jspg>>> = RwLock::new(None);
}
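The swap-container pattern described in the comment can be sketched with std types alone; the `Engine`, `swap_in`, and `snapshot` names below are stand-ins for illustration, not part of the extension:

```rust
use std::sync::{Arc, RwLock};

struct Engine {
    version: u32,
}

// Writer: build the new engine off-lock, hold the write lock only for the swap.
fn swap_in(global: &RwLock<Option<Arc<Engine>>>, engine: Engine) {
    *global.write().unwrap() = Some(Arc::new(engine));
}

// Reader: clone the Arc out and drop the lock immediately,
// so any work done afterwards is lock-free even if a swap happens.
fn snapshot(global: &RwLock<Option<Arc<Engine>>>) -> Option<Arc<Engine>> {
    global.read().unwrap().clone()
}

fn main() {
    let global: RwLock<Option<Arc<Engine>>> = RwLock::new(None);
    assert!(snapshot(&global).is_none()); // starts empty
    swap_in(&global, Engine { version: 1 });
    assert_eq!(snapshot(&global).unwrap().version, 1);
}
```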
#[pg_extern(strict)]
fn cache_json_schemas(enums: JsonB, types: JsonB, puncs: JsonB) -> JsonB {
let mut registry = REGISTRY.write().unwrap();
registry.clear();
pub fn jspg_cache_database(database: JsonB) -> JsonB {
let new_jspg = crate::jspg::Jspg::new(&database.0);
let new_arc = Arc::new(new_jspg);

// Generate Family Schemas from Types
// 3. ATOMIC SWAP
{
let mut family_map: std::collections::HashMap<String, std::collections::HashSet<String>> =
std::collections::HashMap::new();
if let Value::Array(arr) = &types.0 {
for item in arr {
if let Some(name) = item.get("name").and_then(|v| v.as_str()) {
if let Some(hierarchy) = item.get("hierarchy").and_then(|v| v.as_array()) {
for ancestor in hierarchy {
if let Some(anc_str) = ancestor.as_str() {
family_map
.entry(anc_str.to_string())
.or_default()
.insert(name.to_string());
}
}
}
}
}
let mut lock = GLOBAL_JSPG.write().unwrap();
*lock = Some(new_arc);
}

for (family_name, members) in family_map {
let id = format!("{}.family", family_name);
let drop = crate::drop::Drop::success();
JsonB(serde_json::to_value(drop).unwrap())
}
// `mask_json_schema` has been removed as the mask architecture is fully replaced by Spi string queries during DB interactions.

// Object Union (for polymorphic object validation)
// This allows the schema to match ANY of the types in the family hierarchy
let object_refs: Vec<Value> = members.iter().map(|s| json!({ "$ref": s })).collect();

let schema_json = json!({
"$id": id,
"oneOf": object_refs
});

if let Ok(schema) = serde_json::from_value::<Schema>(schema_json) {
let compiled = crate::compiler::Compiler::compile(schema, Some(id.clone()));
registry.insert(id, compiled);
}
}

// Helper to parse and cache a list of items
let mut cache_items = |items: JsonB| {
if let Value::Array(arr) = items.0 {
for item in arr {
// For now, we assume the item structure matches what the generator expects
// or what `json_schemas.sql` sends.
// The `Schema` struct in `schema.rs` is designed to deserialize standard JSON Schema.
// However, the input here is an array of objects that *contain* a `schemas` array.
// We need to extract those inner schemas.

if let Some(schemas_val) = item.get("schemas") {
if let Value::Array(schemas) = schemas_val {
for schema_val in schemas {
// Deserialize into our robust Schema struct to ensure validity/parsing
if let Ok(schema) = serde_json::from_value::<Schema>(schema_val.clone()) {
if let Some(id) = &schema.obj.id {
let id_clone = id.clone();
// Store the compiled Schema in the registry.
// The registry.insert method now handles simple insertion of CompiledSchema
let compiled =
crate::compiler::Compiler::compile(schema, Some(id_clone.clone()));
registry.insert(id_clone, compiled);
}
}
}
}
}
}
}
#[pg_extern(strict, parallel_safe)]
pub fn validate_json_schema(schema_id: &str, instance: JsonB) -> JsonB {
// 1. Acquire Snapshot
let jspg_arc = {
let lock = GLOBAL_JSPG.read().unwrap();
lock.clone()
};

cache_items(enums);
cache_items(types);
cache_items(puncs); // public/private distinction logic to come later
// 2. Validate (Lock-Free)
if let Some(engine) = jspg_arc {
match engine.validator.validate(schema_id, &instance.0) {
Ok(result) => {
if result.is_valid() {
let drop = crate::drop::Drop::success();
JsonB(serde_json::to_value(drop).unwrap())
} else {
let errors: Vec<crate::drop::Error> = result
.errors
.into_iter()
.map(|e| crate::drop::Error {
code: e.code,
message: e.message,
details: crate::drop::ErrorDetails { path: e.path },
})
.collect();
let drop = crate::drop::Drop::with_errors(errors);
JsonB(serde_json::to_value(drop).unwrap())
}
}
Err(e) => {
let error = crate::drop::Error {
code: e.code,
message: e.message,
details: crate::drop::ErrorDetails { path: e.path },
};
let drop = crate::drop::Drop::with_errors(vec![error]);
JsonB(serde_json::to_value(drop).unwrap())
}
}
} else {
let error = crate::drop::Error {
code: "VALIDATOR_NOT_INITIALIZED".to_string(),
message: "The JSPG database has not been cached yet. Run jspg_cache_database()".to_string(),
details: crate::drop::ErrorDetails {
path: "".to_string(),
},
};
let drop = crate::drop::Drop::with_errors(vec![error]);
JsonB(serde_json::to_value(drop).unwrap())
}
JsonB(json!({ "response": "success" }))
}

#[pg_extern(strict, parallel_safe)]
fn validate_json_schema(schema_id: &str, instance: JsonB) -> JsonB {
let drop = validator::Validator::validate(schema_id, &instance.0);
pub fn json_schema_cached(schema_id: &str) -> bool {
if let Some(engine) = GLOBAL_JSPG.read().unwrap().as_ref() {
match engine
.validator
.validate(schema_id, &serde_json::Value::Null)
{
Err(e) if e.code == "SCHEMA_NOT_FOUND" => false,
_ => true,
}
} else {
false
}
}

#[pg_extern(strict)]
pub fn clear_json_schemas() -> JsonB {
let mut lock = GLOBAL_JSPG.write().unwrap();
*lock = None;
let drop = crate::drop::Drop::success();
JsonB(serde_json::to_value(drop).unwrap())
}

#[pg_extern(strict, parallel_safe)]
fn json_schema_cached(schema_id: &str) -> bool {
let registry = REGISTRY.read().unwrap();
registry.get(schema_id).is_some()
pub fn show_json_schemas() -> JsonB {
if let Some(engine) = GLOBAL_JSPG.read().unwrap().as_ref() {
let mut keys = engine.validator.get_schema_ids();
keys.sort();
let drop = crate::drop::Drop::success_with_val(json!(keys));
JsonB(serde_json::to_value(drop).unwrap())
} else {
let drop = crate::drop::Drop::success_with_val(json!([]));
JsonB(serde_json::to_value(drop).unwrap())
}

#[pg_extern(strict)]
fn clear_json_schemas() -> JsonB {
let mut registry = REGISTRY.write().unwrap();
registry.clear();
JsonB(json!({ "response": "success" }))
}

#[pg_extern(strict, parallel_safe)]
fn show_json_schemas() -> JsonB {
let registry = REGISTRY.read().unwrap();
// Debug dump
// In a real scenario we might return the whole map, but for now just success
// or maybe a list of keys
JsonB(json!({ "response": "success", "count": registry.len() }))
}

#[cfg(any(test, feature = "pg_test"))]
#[pg_schema]
mod tests {
use pgrx::prelude::*;
include!("tests.rs");
include!("tests/fixtures.rs");
}

#[cfg(test)]
src/merger/mod.rs (Normal file, 15 lines)
@@ -0,0 +1,15 @@
pub struct Merger {
    // To be implemented
}

impl Default for Merger {
    fn default() -> Self {
        Self::new()
    }
}

impl Merger {
    pub fn new() -> Self {
        Self {}
    }
}
src/queryer/mod.rs (Normal file, 15 lines)
@@ -0,0 +1,15 @@
pub struct Queryer {
    // To be implemented
}

impl Default for Queryer {
    fn default() -> Self {
        Self::new()
    }
}

impl Queryer {
    pub fn new() -> Self {
        Self {}
    }
}
@@ -1,40 +0,0 @@
use crate::schema::Schema;
use lazy_static::lazy_static;
use std::collections::HashMap;
use std::sync::RwLock;

lazy_static! {
    pub static ref REGISTRY: RwLock<Registry> = RwLock::new(Registry::new());
}

use std::sync::Arc;

#[derive(Debug, Clone, Default)]
pub struct Registry {
    pub schemas: HashMap<String, Arc<Schema>>,
}

impl Registry {
    pub fn new() -> Self {
        Registry {
            schemas: HashMap::new(),
        }
    }

    pub fn insert(&mut self, id: String, schema: Arc<Schema>) {
        // We allow overwriting for now to support re-compilation in tests/dev
        self.schemas.insert(id, schema);
    }

    pub fn get(&self, id: &str) -> Option<Arc<Schema>> {
        self.schemas.get(id).cloned()
    }

    pub fn clear(&mut self) {
        self.schemas.clear();
    }

    pub fn len(&self) -> usize {
        self.schemas.len()
    }
}
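One property of the removed Registry worth noting: `get` returned a cloned `Arc`, so callers kept a working reference even if the map was later cleared or the entry overwritten. A minimal sketch with a plain `HashMap` (`get_cloned` is a hypothetical helper):

```rust
use std::collections::HashMap;
use std::sync::Arc;

// `get` hands the caller its own Arc, so a later clear() or re-insert
// cannot invalidate references already handed out.
fn get_cloned(map: &HashMap<String, Arc<String>>, id: &str) -> Option<Arc<String>> {
    map.get(id).cloned()
}

fn main() {
    let mut map = HashMap::new();
    map.insert("a".to_string(), Arc::new("schema-a".to_string()));
    let held = get_cloned(&map, "a").unwrap();
    map.clear(); // the held reference survives the clear
    assert_eq!(*held, "schema-a");
    assert!(get_cloned(&map, "a").is_none());
}
```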
File diff suppressed because it is too large
src/util.rs (413 lines)
@@ -1,413 +0,0 @@
use serde::Deserialize;
use std::fs;

#[derive(Debug, Deserialize)]
struct TestSuite {
    #[allow(dead_code)]
    description: String,
    schema: Option<serde_json::Value>,
    // Support JSPG-style test suites with explicit types/enums/puncs
    types: Option<serde_json::Value>,
    enums: Option<serde_json::Value>,
    puncs: Option<serde_json::Value>,
    tests: Vec<TestCase>,
}

#[derive(Debug, Deserialize)]
struct TestCase {
    description: String,
    data: serde_json::Value,
    valid: bool,
    // Support an explicit schema ID target for the test case
    schema_id: Option<String>,
    // Expected output for masking tests
    #[allow(dead_code)]
    expected: Option<serde_json::Value>,
}

use crate::registry::REGISTRY;
use crate::validator::Validator;
use serde_json::Value;

pub fn deserialize_some<'de, D>(deserializer: D) -> Result<Option<Value>, D::Error>
where
    D: serde::Deserializer<'de>,
{
    let v = Value::deserialize(deserializer)?;
    Ok(Some(v))
}

pub fn run_test_file_at_index(path: &str, index: usize) -> Result<(), String> {
    // Clear registry to ensure isolation
    // {
    //     let mut registry = REGISTRY.write().unwrap();
    //     registry.clear();
    // }

    let content =
        fs::read_to_string(path).unwrap_or_else(|_| panic!("Failed to read file: {}", path));
    let suite: Vec<TestSuite> = serde_json::from_str(&content)
        .unwrap_or_else(|e| panic!("Failed to parse JSON in {}: {}", path, e));

    if index >= suite.len() {
        panic!("Index {} out of bounds for file {}", index, path);
    }

    let group = &suite[index];
    let mut failures = Vec::<String>::new();

    let mut registry = crate::registry::Registry::new();

    // Helper to register items with 'schemas'
    let register_schemas = |registry: &mut crate::registry::Registry, items_val: Option<&Value>| {
        if let Some(val) = items_val {
            if let Value::Array(arr) = val {
                for item in arr {
                    if let Some(schemas_val) = item.get("schemas") {
                        if let Value::Array(schemas) = schemas_val {
                            for schema_val in schemas {
                                if let Ok(schema) =
                                    serde_json::from_value::<crate::schema::Schema>(schema_val.clone())
                                {
                                    // Clone ID upfront to avoid borrow issues
                                    if let Some(id_clone) = schema.obj.id.clone() {
                                        let compiled =
                                            crate::compiler::Compiler::compile(schema, Some(id_clone.clone()));
                                        registry.insert(id_clone, compiled);
                                    }
                                }
                            }
                        }
                    }
                }
            }
        }
    };

    // 1. Register Family Schemas if 'types' is present
    if let Some(types_val) = &group.types {
        if let Value::Array(arr) = types_val {
            let mut family_map: std::collections::HashMap<String, std::collections::HashSet<String>> =
                std::collections::HashMap::new();

            for item in arr {
                if let Some(name) = item.get("name").and_then(|v| v.as_str()) {
                    if let Some(hierarchy) = item.get("hierarchy").and_then(|v| v.as_array()) {
                        for ancestor in hierarchy {
                            if let Some(anc_str) = ancestor.as_str() {
                                family_map
                                    .entry(anc_str.to_string())
                                    .or_default()
                                    .insert(name.to_string());
                            }
                        }
                    }
                }
            }

            for (family_name, members) in family_map {
                let id = format!("{}.family", family_name);
                let object_refs: Vec<Value> = members
                    .iter()
                    .map(|s| serde_json::json!({ "$ref": s }))
                    .collect();

                let schema_json = serde_json::json!({
                    "$id": id,
                    "oneOf": object_refs
                });

                if let Ok(schema) = serde_json::from_value::<crate::schema::Schema>(schema_json) {
                    let compiled = crate::compiler::Compiler::compile(schema, Some(id.clone()));
                    registry.insert(id, compiled);
                }
            }
        }
    }

    // 2. Register items directly
    register_schemas(&mut registry, group.enums.as_ref());
    register_schemas(&mut registry, group.types.as_ref());
    register_schemas(&mut registry, group.puncs.as_ref());

    // 3. Register root 'schemas' if present (generic test support)
    // Some tests use a raw 'schema' or 'schemas' field at the group level
    if let Some(schema_val) = &group.schema {
        match serde_json::from_value::<crate::schema::Schema>(schema_val.clone()) {
            Ok(schema) => {
                let id = schema
                    .obj
                    .id
                    .clone()
                    .or_else(|| {
                        // Fallback ID if none provided in schema
                        Some(format!("test:{}:{}", path, index))
                    })
                    .unwrap();

                let registry_ref = &mut registry;
                let compiled = crate::compiler::Compiler::compile(schema, Some(id.clone()));
                registry_ref.insert(id, compiled);
            }
            Err(e) => {
                eprintln!(
                    "DEBUG: FAILED to deserialize group schema for index {}: {}",
                    index, e
                );
            }
        }
    }

    // 4. Run Tests
    for (_test_index, test) in group.tests.iter().enumerate() {
        let mut schema_id = test.schema_id.clone();

        // If no explicit schema_id, try to infer it from the single schema in the group
        if schema_id.is_none() {
            if let Some(s) = &group.schema {
                // If 'schema' is a single object, use its ID or "root"
                if let Some(obj) = s.as_object() {
                    if let Some(id_val) = obj.get("$id") {
                        schema_id = id_val.as_str().map(|s| s.to_string());
                    }
                }
                if schema_id.is_none() {
                    schema_id = Some(format!("test:{}:{}", path, index));
                }
            }
        }

        // Default to the first punc if present (for puncs.json style)
        if schema_id.is_none() {
            if let Some(Value::Array(puncs)) = &group.puncs {
                if let Some(first_punc) = puncs.first() {
                    if let Some(Value::Array(schemas)) = first_punc.get("schemas") {
                        if let Some(first_schema) = schemas.first() {
                            if let Some(id) = first_schema.get("$id").and_then(|v| v.as_str()) {
                                schema_id = Some(id.to_string());
                            }
                        }
                    }
                }
            }
        }

        if let Some(sid) = schema_id {
            let result = Validator::validate_with_registry(&sid, &test.data, &registry);

            if !result.errors.is_empty() != !test.valid {
                failures.push(format!(
                    "[{}] Test '{}' failed. Expected: {}, Got: {}. Errors: {:?}",
                    group.description,
                    test.description,
                    test.valid,
                    !result.errors.is_empty(), // "Got Invalid?"
                    result.errors
                ));
            }
        } else {
            failures.push(format!(
                "[{}] Test '{}' skipped: No schema ID found.",
                group.description, test.description
            ));
        }
    }

    if !failures.is_empty() {
        return Err(failures.join("\n"));
    }

    Ok(())
}
pub fn run_test_file(path: &str) -> Result<(), String> {
    let content =
        fs::read_to_string(path).unwrap_or_else(|_| panic!("Failed to read file: {}", path));
    let suite: Vec<TestSuite> = serde_json::from_str(&content)
        .unwrap_or_else(|e| panic!("Failed to parse JSON in {}: {}", path, e));

    let mut failures = Vec::<String>::new();
    for (group_index, group) in suite.into_iter().enumerate() {
        // Helper to register items with 'schemas'
        let register_schemas = |items_val: Option<Value>| {
            if let Some(val) = items_val {
                if let Value::Array(arr) = val {
                    for item in arr {
                        if let Some(schemas_val) = item.get("schemas") {
                            if let Value::Array(schemas) = schemas_val {
                                for schema_val in schemas {
                                    if let Ok(schema) =
                                        serde_json::from_value::<crate::schema::Schema>(schema_val.clone())
                                    {
                                        // Clone ID upfront to avoid borrow issues
                                        if let Some(id_clone) = schema.obj.id.clone() {
                                            let mut registry = REGISTRY.write().unwrap();
                                            // Utilize the new compile method which handles strictness
                                            let compiled =
                                                crate::compiler::Compiler::compile(schema, Some(id_clone.clone()));
                                            registry.insert(id_clone, compiled);
                                        }
                                    }
                                }
                            }
                        }
                    }
                }
            }
        };

        // 1. Register Family Schemas if 'types' is present
        if let Some(types_val) = &group.types {
            if let Value::Array(arr) = types_val {
                let mut family_map: std::collections::HashMap<String, std::collections::HashSet<String>> =
                    std::collections::HashMap::new();

                for item in arr {
                    if let Some(name) = item.get("name").and_then(|v| v.as_str()) {
                        // Default hierarchy contains self if not specified?
                        // Usually hierarchy is explicit in these tests.
                        if let Some(hierarchy) = item.get("hierarchy").and_then(|v| v.as_array()) {
                            for ancestor in hierarchy {
                                if let Some(anc_str) = ancestor.as_str() {
                                    family_map
                                        .entry(anc_str.to_string())
                                        .or_default()
                                        .insert(name.to_string());
                                }
                            }
                        }
                    }
                }

                for (family_name, members) in family_map {
                    let id = format!("{}.family", family_name);
                    let object_refs: Vec<Value> = members
                        .into_iter()
                        .map(|s| serde_json::json!({ "$ref": s }))
                        .collect();

                    let schema_json = serde_json::json!({
                        "$id": id,
                        "oneOf": object_refs
                    });

                    if let Ok(schema) = serde_json::from_value::<crate::schema::Schema>(schema_json) {
                        let mut registry = REGISTRY.write().unwrap();
                        let compiled = crate::compiler::Compiler::compile(schema, Some(id.clone()));
                        registry.insert(id, compiled);
                    }
                }
            }
        }

        // Register 'types', 'enums', and 'puncs' if present (JSPG style)
        register_schemas(group.types);
        register_schemas(group.enums);
        register_schemas(group.puncs);

        // Ensure the ID is a valid URI to avoid Url::parse errors in the Compiler
        let unique_id = format!("test:{}:{}", path, group_index);

        // Register main 'schema' if present (Standard style)
        if let Some(ref schema_val) = group.schema {
            let mut registry = REGISTRY.write().unwrap();
            let schema: crate::schema::Schema =
                serde_json::from_value(schema_val.clone()).expect("Failed to parse test schema");
            let compiled = crate::compiler::Compiler::compile(schema, Some(unique_id.clone()));
            registry.insert(unique_id.clone(), compiled);
        }

        for test in group.tests {
            // Use the explicit schema_id from the test, or default to unique_id
            let schema_id = test.schema_id.as_deref().unwrap_or(&unique_id).to_string();

            let drop = Validator::validate(&schema_id, &test.data);

            if test.valid {
                if !drop.errors.is_empty() {
                    let msg = format!(
                        "Test failed (expected valid): {}\nSchema: {:?}\nData: {:?}\nErrors: {:?}",
                        test.description,
                        group.schema, // We might need to find the actual schema used if schema_id is custom
                        test.data,
                        drop.errors
                    );
                    eprintln!("{}", msg);
                    failures.push(msg);
                }
            } else if drop.errors.is_empty() {
                let msg = format!(
                    "Test failed (expected invalid): {}\nSchema: {:?}\nData: {:?}\nErrors: (Empty)",
                    test.description, group.schema, test.data
                );
                println!("{}", msg);
                failures.push(msg);
            }
        }
    }

    if !failures.is_empty() {
        return Err(format!(
            "{} tests failed in file {}:\n\n{}",
            failures.len(),
            path,
            failures.join("\n\n")
        ));
    }
    Ok(())
}

pub fn is_integer(v: &Value) -> bool {
    match v {
        Value::Number(n) => {
            n.is_i64() || n.is_u64() || n.as_f64().filter(|n| n.fract() == 0.0).is_some()
        }
        _ => false,
    }
}

/// serde_json treats 0 and 0.0 as unequal, so we cannot simply use v1 == v2
pub fn equals(v1: &Value, v2: &Value) -> bool {
    // eprintln!("Comparing {:?} with {:?}", v1, v2);
    match (v1, v2) {
        (Value::Null, Value::Null) => true,
        (Value::Bool(b1), Value::Bool(b2)) => b1 == b2,
        (Value::Number(n1), Value::Number(n2)) => {
            if let (Some(n1), Some(n2)) = (n1.as_u64(), n2.as_u64()) {
                return n1 == n2;
            }
            if let (Some(n1), Some(n2)) = (n1.as_i64(), n2.as_i64()) {
                return n1 == n2;
            }
            if let (Some(n1), Some(n2)) = (n1.as_f64(), n2.as_f64()) {
                return (n1 - n2).abs() < f64::EPSILON;
            }
            false
        }
        (Value::String(s1), Value::String(s2)) => s1 == s2,
        (Value::Array(arr1), Value::Array(arr2)) => {
            if arr1.len() != arr2.len() {
                return false;
            }
            arr1.iter().zip(arr2).all(|(e1, e2)| equals(e1, e2))
        }
        (Value::Object(obj1), Value::Object(obj2)) => {
            if obj1.len() != obj2.len() {
                return false;
            }
            for (k1, v1) in obj1 {
                if let Some(v2) = obj2.get(k1) {
                    if !equals(v1, v2) {
                        return false;
                    }
                } else {
                    return false;
                }
            }
            true
        }
        _ => false,
    }
}
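The numeric branch of `equals` can be illustrated standalone: exact float equality fails on values that are numerically equal only up to rounding, which is why the fallthrough compares with an epsilon. A sketch with std floats only (`num_eq` is a hypothetical helper):

```rust
// Epsilon comparison, as used by the f64 fallthrough in `equals`.
fn num_eq(a: f64, b: f64) -> bool {
    (a - b).abs() < f64::EPSILON
}

fn main() {
    assert!(0.1 + 0.2 != 0.3); // exact comparison fails on rounding error
    assert!(num_eq(0.1 + 0.2, 0.3)); // epsilon comparison tolerates it
    assert!(!num_eq(1.0, 2.0));
}
```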
src/validator.rs (1321 lines)
File diff suppressed because it is too large
src/validator/context.rs (Normal file, 82 lines)
@@ -0,0 +1,82 @@
use crate::database::Database;
use crate::database::schema::Schema;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;
use std::collections::HashSet;
use std::sync::Arc;

pub struct ValidationContext<'a> {
    pub db: &'a Arc<Database>,
    pub root: &'a Schema,
    pub schema: &'a Schema,
    pub instance: &'a serde_json::Value,
    pub path: String,
    pub depth: usize,
    pub extensible: bool,
    pub reporter: bool,
    pub overrides: HashSet<String>,
}

impl<'a> ValidationContext<'a> {
    pub fn new(
        db: &'a Arc<Database>,
        root: &'a Schema,
        schema: &'a Schema,
        instance: &'a serde_json::Value,
        overrides: HashSet<String>,
        extensible: bool,
        reporter: bool,
    ) -> Self {
        let effective_extensible = schema.extensible.unwrap_or(extensible);
        Self {
            db,
            root,
            schema,
            instance,
            path: String::new(),
            depth: 0,
            extensible: effective_extensible,
            reporter,
            overrides,
        }
    }

    pub fn derive(
        &self,
        schema: &'a Schema,
        instance: &'a serde_json::Value,
        path: &str,
        overrides: HashSet<String>,
        extensible: bool,
        reporter: bool,
    ) -> Self {
        let effective_extensible = schema.extensible.unwrap_or(extensible);

        Self {
            db: self.db,
            root: self.root,
            schema,
            instance,
            path: path.to_string(),
            depth: self.depth + 1,
            extensible: effective_extensible,
            reporter,
            overrides,
        }
    }

    pub fn derive_for_schema(&self, schema: &'a Schema, reporter: bool) -> Self {
        self.derive(
            schema,
            self.instance,
            &self.path,
            HashSet::new(),
            self.extensible,
            reporter,
        )
    }

    pub fn validate(&self) -> Result<ValidationResult, ValidationError> {
        self.validate_scoped()
    }
}
src/validator/error.rs (Normal file, 6 lines)
@@ -0,0 +1,6 @@
#[derive(Debug, Clone, serde::Serialize)]
pub struct ValidationError {
    pub code: String,
    pub message: String,
    pub path: String,
}
src/validator/instance.rs (Normal file, 98 lines)
@@ -0,0 +1,98 @@
use serde_json::Value;
use std::collections::HashSet;
use std::ptr::NonNull;

pub trait ValidationInstance<'a>: Copy + Clone {
    fn as_value(&self) -> &'a Value;
    fn child_at_key(&self, key: &str) -> Option<Self>;
    fn child_at_index(&self, idx: usize) -> Option<Self>;
    fn prune_object(&self, _keys: &HashSet<String>) {}
    fn prune_array(&self, _indices: &HashSet<usize>) {}
}

#[derive(Clone, Copy)]
pub struct ReadOnlyInstance<'a>(pub &'a Value);

impl<'a> ValidationInstance<'a> for ReadOnlyInstance<'a> {
    fn as_value(&self) -> &'a Value {
        self.0
    }

    fn child_at_key(&self, key: &str) -> Option<Self> {
        self.0.get(key).map(ReadOnlyInstance)
    }

    fn child_at_index(&self, idx: usize) -> Option<Self> {
        self.0.get(idx).map(ReadOnlyInstance)
    }
}

#[derive(Clone, Copy)]
pub struct MutableInstance {
    ptr: NonNull<Value>,
}

impl MutableInstance {
    pub fn new(val: &mut Value) -> Self {
        Self {
            ptr: NonNull::from(val),
        }
    }
}

impl<'a> ValidationInstance<'a> for MutableInstance {
    fn as_value(&self) -> &'a Value {
        unsafe { self.ptr.as_ref() }
    }

    fn child_at_key(&self, key: &str) -> Option<Self> {
        unsafe {
            if let Some(obj) = self.ptr.as_ref().as_object() {
                if obj.contains_key(key) {
                    let parent_mut = &mut *self.ptr.as_ptr();
                    if let Some(child_val) = parent_mut.get_mut(key) {
                        return Some(MutableInstance::new(child_val));
                    }
                }
            }
            None
        }
    }

    fn child_at_index(&self, idx: usize) -> Option<Self> {
        unsafe {
            if let Some(arr) = self.ptr.as_ref().as_array() {
                if idx < arr.len() {
                    let parent_mut = &mut *self.ptr.as_ptr();
                    if let Some(child_val) = parent_mut.get_mut(idx) {
                        return Some(MutableInstance::new(child_val));
                    }
                }
            }
            None
        }
    }

    fn prune_object(&self, keys: &HashSet<String>) {
        unsafe {
            let val_mut = &mut *self.ptr.as_ptr();
            if let Some(obj) = val_mut.as_object_mut() {
                obj.retain(|k, _| keys.contains(k));
            }
        }
    }

    fn prune_array(&self, indices: &HashSet<usize>) {
        unsafe {
            let val_mut = &mut *self.ptr.as_ptr();
            if let Some(arr) = val_mut.as_array_mut() {
                let mut i = 0;
                arr.retain(|_| {
                    let keep = indices.contains(&i);
                    i += 1;
                    keep
                });
            }
        }
    }
}
73 src/validator/mod.rs Normal file
@@ -0,0 +1,73 @@
use std::collections::HashSet;

pub mod context;
pub mod error;
pub mod result;
pub mod rules;
pub mod util;

pub use context::ValidationContext;
pub use error::ValidationError;
pub use result::ValidationResult;

use crate::database::Database;
use crate::validator::rules::util::is_integer;
use serde_json::Value;
use std::sync::Arc;

pub struct Validator {
    pub db: Arc<Database>,
}

impl Validator {
    pub fn new(db: Arc<Database>) -> Self {
        Self { db }
    }

    pub fn get_schema_ids(&self) -> Vec<String> {
        self.db.schemas.keys().cloned().collect()
    }

    pub fn check_type(t: &str, val: &Value) -> bool {
        if let Value::String(s) = val
            && s.is_empty()
        {
            return true;
        }
        match t {
            "null" => val.is_null(),
            "boolean" => val.is_boolean(),
            "string" => val.is_string(),
            "number" => val.is_number(),
            "integer" => is_integer(val),
            "object" => val.is_object(),
            "array" => val.is_array(),
            _ => true,
        }
    }

    pub fn validate(
        &self,
        schema_id: &str,
        instance: &Value,
    ) -> Result<ValidationResult, ValidationError> {
        if let Some(schema) = self.db.schemas.get(schema_id) {
            let ctx = ValidationContext::new(
                &self.db,
                schema,
                schema,
                instance,
                HashSet::new(),
                false,
                false,
            );
            ctx.validate_scoped()
        } else {
            Err(ValidationError {
                code: "SCHEMA_NOT_FOUND".to_string(),
                message: format!("Schema {} not found", schema_id),
                path: "".to_string(),
            })
        }
    }
}
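One behavior worth noting in `check_type` above: an empty string passes *every* declared type before the match on `t` is even consulted, and unrecognized type names are not enforced. A self-contained sketch of that carve-out (the tiny `Val` enum stands in for `serde_json::Value` so the snippet compiles on its own; the names are illustrative, not the crate's API):

```rust
// Minimal stand-in for serde_json::Value, just enough for the sketch.
enum Val {
    Null,
    Bool(bool),
    Str(String),
    Num(f64),
}

fn check_type(t: &str, val: &Val) -> bool {
    // The carve-out: an empty string satisfies any declared type.
    if matches!(val, Val::Str(s) if s.is_empty()) {
        return true;
    }
    match t {
        "null" => matches!(val, Val::Null),
        "boolean" => matches!(val, Val::Bool(_)),
        "string" => matches!(val, Val::Str(_)),
        "number" => matches!(val, Val::Num(_)),
        _ => true, // unrecognized type names are not enforced
    }
}

fn main() {
    println!("{}", check_type("number", &Val::Str(String::new()))); // empty string: passes
    println!("{}", check_type("number", &Val::Str("x".into())));    // non-empty string: fails
}
```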
28 src/validator/result.rs Normal file
@@ -0,0 +1,28 @@
use std::collections::HashSet;

use crate::validator::error::ValidationError;

#[derive(Debug, Default, Clone, serde::Serialize)]
pub struct ValidationResult {
    pub errors: Vec<ValidationError>,
    #[serde(skip)]
    pub evaluated_keys: HashSet<String>,
    #[serde(skip)]
    pub evaluated_indices: HashSet<usize>,
}

impl ValidationResult {
    pub fn new() -> Self {
        Self::default()
    }

    pub fn merge(&mut self, other: ValidationResult) {
        self.errors.extend(other.errors);
        self.evaluated_keys.extend(other.evaluated_keys);
        self.evaluated_indices.extend(other.evaluated_indices);
    }

    pub fn is_valid(&self) -> bool {
        self.errors.is_empty()
    }
}
135 src/validator/rules/array.rs Normal file
@@ -0,0 +1,135 @@
use std::collections::HashSet;

use serde_json::Value;

use crate::validator::context::ValidationContext;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;

impl<'a> ValidationContext<'a> {
    pub(crate) fn validate_array(
        &self,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        let current = self.instance;
        if let Some(arr) = current.as_array() {
            if let Some(min) = self.schema.min_items
                && (arr.len() as f64) < min
            {
                result.errors.push(ValidationError {
                    code: "MIN_ITEMS".to_string(),
                    message: "Too few items".to_string(),
                    path: self.path.to_string(),
                });
            }
            if let Some(max) = self.schema.max_items
                && (arr.len() as f64) > max
            {
                result.errors.push(ValidationError {
                    code: "MAX_ITEMS".to_string(),
                    message: "Too many items".to_string(),
                    path: self.path.to_string(),
                });
            }

            if self.schema.unique_items.unwrap_or(false) {
                let mut seen: Vec<&Value> = Vec::new();
                for item in arr {
                    if seen.contains(&item) {
                        result.errors.push(ValidationError {
                            code: "UNIQUE_ITEMS_VIOLATED".to_string(),
                            message: "Array has duplicate items".to_string(),
                            path: self.path.to_string(),
                        });
                        break;
                    }
                    seen.push(item);
                }
            }

            if let Some(ref contains_schema) = self.schema.contains {
                let mut match_count = 0;
                for (i, child_instance) in arr.iter().enumerate() {
                    let derived = self.derive(
                        contains_schema,
                        child_instance,
                        &self.path,
                        HashSet::new(),
                        self.extensible,
                        false,
                    );

                    let check = derived.validate()?;
                    if check.is_valid() {
                        match_count += 1;
                        result.evaluated_indices.insert(i);
                    }
                }

                let min = self.schema.min_contains.unwrap_or(1.0) as usize;
                if match_count < min {
                    result.errors.push(ValidationError {
                        code: "CONTAINS_VIOLATED".to_string(),
                        message: format!("Contains matches {} < min {}", match_count, min),
                        path: self.path.to_string(),
                    });
                }
                if let Some(max) = self.schema.max_contains
                    && match_count > max as usize
                {
                    result.errors.push(ValidationError {
                        code: "CONTAINS_VIOLATED".to_string(),
                        message: format!("Contains matches {} > max {}", match_count, max),
                        path: self.path.to_string(),
                    });
                }
            }

            let len = arr.len();
            let mut validation_index = 0;

            if let Some(ref prefix) = self.schema.prefix_items {
                for (i, sub_schema) in prefix.iter().enumerate() {
                    if i < len {
                        let path = format!("{}/{}", self.path, i);
                        if let Some(child_instance) = arr.get(i) {
                            let derived = self.derive(
                                sub_schema,
                                child_instance,
                                &path,
                                HashSet::new(),
                                self.extensible,
                                false,
                            );
                            let item_res = derived.validate()?;
                            result.merge(item_res);
                            result.evaluated_indices.insert(i);
                            validation_index += 1;
                        }
                    }
                }
            }

            if let Some(ref items_schema) = self.schema.items {
                for i in validation_index..len {
                    let path = format!("{}/{}", self.path, i);
                    if let Some(child_instance) = arr.get(i) {
                        let derived = self.derive(
                            items_schema,
                            child_instance,
                            &path,
                            HashSet::new(),
                            self.extensible,
                            false,
                        );
                        let item_res = derived.validate()?;
                        result.merge(item_res);
                        result.evaluated_indices.insert(i);
                    }
                }
            }
        }

        Ok(true)
    }
}
92 src/validator/rules/combinators.rs Normal file
@@ -0,0 +1,92 @@
use crate::validator::context::ValidationContext;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;

impl<'a> ValidationContext<'a> {
    pub(crate) fn validate_combinators(
        &self,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        if let Some(ref all_of) = self.schema.all_of {
            for sub in all_of {
                let derived = self.derive_for_schema(sub, true);
                let res = derived.validate()?;
                result.merge(res);
            }
        }

        if let Some(ref one_of) = self.schema.one_of {
            let mut passed_candidates: Vec<(Option<String>, usize, ValidationResult)> = Vec::new();

            for sub in one_of {
                let derived = self.derive_for_schema(sub, true);
                let sub_res = derived.validate()?;
                if sub_res.is_valid() {
                    let child_id = sub.id.clone();
                    let depth = child_id
                        .as_ref()
                        .and_then(|id| self.db.depths.get(id).copied())
                        .unwrap_or(0);
                    passed_candidates.push((child_id, depth, sub_res));
                }
            }

            if passed_candidates.len() == 1 {
                result.merge(passed_candidates.pop().unwrap().2);
            } else if passed_candidates.is_empty() {
                result.errors.push(ValidationError {
                    code: "NO_ONEOF_MATCH".to_string(),
                    message: "Matches none of oneOf schemas".to_string(),
                    path: self.path.to_string(),
                });
            } else {
                // Apply depth heuristic tie-breaker
                let mut best_depth: Option<usize> = None;
                let mut ambiguous = false;
                let mut best_res = None;

                for (_, depth, res) in passed_candidates.into_iter() {
                    if let Some(current_best) = best_depth {
                        if depth > current_best {
                            best_depth = Some(depth);
                            best_res = Some(res);
                            ambiguous = false;
                        } else if depth == current_best {
                            ambiguous = true;
                        }
                    } else {
                        best_depth = Some(depth);
                        best_res = Some(res);
                    }
                }

                if !ambiguous {
                    if let Some(res) = best_res {
                        result.merge(res);
                        return Ok(true);
                    }
                }

                result.errors.push(ValidationError {
                    code: "AMBIGUOUS_ONEOF_MATCH".to_string(),
                    message: "Matches multiple oneOf schemas without a clear depth winner".to_string(),
                    path: self.path.to_string(),
                });
            }
        }

        if let Some(ref not_schema) = self.schema.not {
            let derived = self.derive_for_schema(not_schema, true);
            let sub_res = derived.validate()?;
            if sub_res.is_valid() {
                result.errors.push(ValidationError {
                    code: "NOT_VIOLATED".to_string(),
                    message: "Matched 'not' schema".to_string(),
                    path: self.path.to_string(),
                });
            }
        }

        Ok(true)
    }
}
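When several `oneOf` branches validate, the code above departs from standard JSON Schema semantics and breaks the tie by schema depth (`db.depths`): the single deepest candidate wins, and a tie at the maximum depth stays ambiguous. The tie-breaking loop in isolation, as a self-contained sketch (function and variable names are illustrative, not the crate's API):

```rust
/// Pick the single deepest candidate; returns None when no candidate exists
/// or when two candidates tie at the maximum depth (ambiguous match).
fn pick_by_depth<T>(candidates: Vec<(usize, T)>) -> Option<T> {
    let mut best_depth: Option<usize> = None;
    let mut best: Option<T> = None;
    let mut ambiguous = false;

    for (depth, value) in candidates {
        match best_depth {
            Some(current) if depth > current => {
                best_depth = Some(depth);
                best = Some(value);
                ambiguous = false; // a strictly deeper schema breaks the tie
            }
            Some(current) if depth == current => ambiguous = true,
            Some(_) => {} // shallower candidate: ignored
            None => {
                best_depth = Some(depth);
                best = Some(value);
            }
        }
    }

    if ambiguous { None } else { best }
}

fn main() {
    // Base (depth 0) and derived (depth 2) both match: derived wins.
    println!("{:?}", pick_by_depth(vec![(0, "base"), (2, "derived")])); // Some("derived")
    // Two siblings at depth 1: ambiguous, no winner.
    println!("{:?}", pick_by_depth(vec![(1, "a"), (1, "b")])); // None
}
```

Note that a tie is only fatal if it happens at the running maximum and is never superseded by a deeper candidate, which matches the `ambiguous = false` reset in the rule above.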
67 src/validator/rules/conditionals.rs Normal file
@@ -0,0 +1,67 @@
use crate::validator::context::ValidationContext;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;

impl<'a> ValidationContext<'a> {
    pub(crate) fn validate_conditionals(
        &self,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        if let Some(ref if_schema) = self.schema.if_ {
            let derived_if = self.derive_for_schema(if_schema, true);
            let if_res = derived_if.validate()?;

            result.evaluated_keys.extend(if_res.evaluated_keys.clone());
            result
                .evaluated_indices
                .extend(if_res.evaluated_indices.clone());

            if if_res.is_valid() {
                if let Some(ref then_schema) = self.schema.then_ {
                    let derived_then = self.derive_for_schema(then_schema, true);
                    result.merge(derived_then.validate()?);
                }
            } else if let Some(ref else_schema) = self.schema.else_ {
                let derived_else = self.derive_for_schema(else_schema, true);
                result.merge(derived_else.validate()?);
            }
        }

        Ok(true)
    }

    pub(crate) fn validate_strictness(
        &self,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        if self.extensible || self.reporter {
            return Ok(true);
        }

        if let Some(obj) = self.instance.as_object() {
            for key in obj.keys() {
                if !result.evaluated_keys.contains(key) && !self.overrides.contains(key) {
                    result.errors.push(ValidationError {
                        code: "STRICT_PROPERTY_VIOLATION".to_string(),
                        message: format!("Unexpected property '{}'", key),
                        path: format!("{}/{}", self.path, key),
                    });
                }
            }
        }

        if let Some(arr) = self.instance.as_array() {
            for i in 0..arr.len() {
                if !result.evaluated_indices.contains(&i) {
                    result.errors.push(ValidationError {
                        code: "STRICT_ITEM_VIOLATION".to_string(),
                        message: format!("Unexpected item at index {}", i),
                        path: format!("{}/{}", self.path, i),
                    });
                }
            }
        }

        Ok(true)
    }
}
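`validate_strictness` inverts the usual JSON Schema default: any object key that no rule marked as evaluated (and that is not in the override set) is an error, unless the context is extensible or running as a reporter pass. The key filter on its own, as a self-contained sketch (names are illustrative, not the crate's API):

```rust
use std::collections::HashSet;

/// Return the instance keys that would trigger STRICT_PROPERTY_VIOLATION:
/// present on the object, never marked evaluated, and not overridden.
fn strict_violations(
    instance_keys: &[&str],
    evaluated: &HashSet<&str>,
    overrides: &HashSet<&str>,
    extensible: bool,
) -> Vec<String> {
    if extensible {
        return Vec::new(); // extensible contexts accept unknown keys
    }
    instance_keys
        .iter()
        .filter(|k| !evaluated.contains(*k) && !overrides.contains(*k))
        .map(|k| k.to_string())
        .collect()
}

fn main() {
    let evaluated: HashSet<&str> = ["name"].into_iter().collect();
    let overrides: HashSet<&str> = HashSet::new();
    let keys = ["name", "extra"];
    println!("{:?}", strict_violations(&keys, &evaluated, &overrides, false)); // ["extra"]
}
```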
81 src/validator/rules/core.rs Normal file
@@ -0,0 +1,81 @@
use crate::validator::Validator;
use crate::validator::context::ValidationContext;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;
use crate::validator::rules::util::equals;

impl<'a> ValidationContext<'a> {
    pub(crate) fn validate_core(
        &self,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        let current = self.instance;

        if let Some(ref type_) = self.schema.type_ {
            match type_ {
                crate::database::schema::SchemaTypeOrArray::Single(t) => {
                    if !Validator::check_type(t, current) {
                        result.errors.push(ValidationError {
                            code: "INVALID_TYPE".to_string(),
                            message: format!("Expected type '{}'", t),
                            path: self.path.to_string(),
                        });
                    }
                }
                crate::database::schema::SchemaTypeOrArray::Multiple(types) => {
                    let mut valid = false;
                    for t in types {
                        if Validator::check_type(t, current) {
                            valid = true;
                            break;
                        }
                    }
                    if !valid {
                        result.errors.push(ValidationError {
                            code: "INVALID_TYPE".to_string(),
                            message: format!("Expected one of types {:?}", types),
                            path: self.path.to_string(),
                        });
                    }
                }
            }
        }

        if let Some(ref const_val) = self.schema.const_ {
            if !equals(current, const_val) {
                result.errors.push(ValidationError {
                    code: "CONST_VIOLATED".to_string(),
                    message: "Value does not match const".to_string(),
                    path: self.path.to_string(),
                });
            } else if let Some(obj) = current.as_object() {
                result.evaluated_keys.extend(obj.keys().cloned());
            } else if let Some(arr) = current.as_array() {
                result.evaluated_indices.extend(0..arr.len());
            }
        }

        if let Some(ref enum_vals) = self.schema.enum_ {
            let mut found = false;
            for val in enum_vals {
                if equals(current, val) {
                    found = true;
                    break;
                }
            }
            if !found {
                result.errors.push(ValidationError {
                    code: "ENUM_MISMATCH".to_string(),
                    message: "Value is not in enum".to_string(),
                    path: self.path.to_string(),
                });
            } else if let Some(obj) = current.as_object() {
                result.evaluated_keys.extend(obj.keys().cloned());
            } else if let Some(arr) = current.as_array() {
                result.evaluated_indices.extend(0..arr.len());
            }
        }

        Ok(true)
    }
}
42 src/validator/rules/format.rs Normal file
@@ -0,0 +1,42 @@
use crate::validator::context::ValidationContext;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;

impl<'a> ValidationContext<'a> {
    pub(crate) fn validate_format(
        &self,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        let current = self.instance;
        if let Some(ref compiled_fmt) = self.schema.compiled_format {
            match compiled_fmt {
                crate::database::schema::CompiledFormat::Func(f) => {
                    let should = if let Some(s) = current.as_str() {
                        !s.is_empty()
                    } else {
                        true
                    };
                    if should && let Err(e) = f(current) {
                        result.errors.push(ValidationError {
                            code: "FORMAT_MISMATCH".to_string(),
                            message: format!("Format error: {}", e),
                            path: self.path.to_string(),
                        });
                    }
                }
                crate::database::schema::CompiledFormat::Regex(re) => {
                    if let Some(s) = current.as_str()
                        && !re.is_match(s)
                    {
                        result.errors.push(ValidationError {
                            code: "FORMAT_MISMATCH".to_string(),
                            message: "Format regex mismatch".to_string(),
                            path: self.path.to_string(),
                        });
                    }
                }
            }
        }
        Ok(true)
    }
}
91 src/validator/rules/mod.rs Normal file
@@ -0,0 +1,91 @@
use crate::validator::context::ValidationContext;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;

pub mod array;
pub mod combinators;
pub mod conditionals;
pub mod core;
pub mod format;
pub mod numeric;
pub mod object;
pub mod polymorphism;
pub mod string;
pub mod util;

impl<'a> ValidationContext<'a> {
    pub(crate) fn validate_scoped(&self) -> Result<ValidationResult, ValidationError> {
        let mut result = ValidationResult::new();

        // Structural Limits
        if !self.validate_depth(&mut result)? {
            return Ok(result);
        }
        if !self.validate_always_fail(&mut result)? {
            return Ok(result);
        }
        if !self.validate_family(&mut result)? {
            return Ok(result);
        }
        if !self.validate_refs(&mut result)? {
            return Ok(result);
        }

        // Core Type Constraints
        self.validate_core(&mut result)?;
        self.validate_numeric(&mut result)?;
        self.validate_string(&mut result)?;
        self.validate_format(&mut result)?;

        // Complex Structures
        self.validate_object(&mut result)?;
        self.validate_array(&mut result)?;

        // Multipliers & Conditionals
        self.validate_combinators(&mut result)?;
        self.validate_conditionals(&mut result)?;

        // State Tracking
        self.validate_extensible(&mut result)?;
        self.validate_strictness(&mut result)?;

        Ok(result)
    }

    fn validate_depth(&self, _result: &mut ValidationResult) -> Result<bool, ValidationError> {
        if self.depth > 100 {
            Err(ValidationError {
                code: "RECURSION_LIMIT_EXCEEDED".to_string(),
                message: "Recursion limit exceeded".to_string(),
                path: self.path.to_string(),
            })
        } else {
            Ok(true)
        }
    }

    fn validate_always_fail(&self, result: &mut ValidationResult) -> Result<bool, ValidationError> {
        if self.schema.always_fail {
            result.errors.push(ValidationError {
                code: "FALSE_SCHEMA".to_string(),
                message: "Schema is false".to_string(),
                path: self.path.to_string(),
            });
            // Short-circuit
            Ok(false)
        } else {
            Ok(true)
        }
    }

    fn validate_extensible(&self, result: &mut ValidationResult) -> Result<bool, ValidationError> {
        if self.extensible {
            if let Some(obj) = self.instance.as_object() {
                result.evaluated_keys.extend(obj.keys().cloned());
            } else if let Some(arr) = self.instance.as_array() {
                result.evaluated_indices.extend(0..arr.len());
            }
        }
        Ok(true)
    }
}
61 src/validator/rules/numeric.rs Normal file
@@ -0,0 +1,61 @@
use crate::validator::context::ValidationContext;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;

impl<'a> ValidationContext<'a> {
    pub(crate) fn validate_numeric(
        &self,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        let current = self.instance;
        if let Some(num) = current.as_f64() {
            if let Some(min) = self.schema.minimum
                && num < min
            {
                result.errors.push(ValidationError {
                    code: "MINIMUM_VIOLATED".to_string(),
                    message: format!("Value {} < min {}", num, min),
                    path: self.path.to_string(),
                });
            }
            if let Some(max) = self.schema.maximum
                && num > max
            {
                result.errors.push(ValidationError {
                    code: "MAXIMUM_VIOLATED".to_string(),
                    message: format!("Value {} > max {}", num, max),
                    path: self.path.to_string(),
                });
            }
            if let Some(ex_min) = self.schema.exclusive_minimum
                && num <= ex_min
            {
                result.errors.push(ValidationError {
                    code: "EXCLUSIVE_MINIMUM_VIOLATED".to_string(),
                    message: format!("Value {} <= ex_min {}", num, ex_min),
                    path: self.path.to_string(),
                });
            }
            if let Some(ex_max) = self.schema.exclusive_maximum
                && num >= ex_max
            {
                result.errors.push(ValidationError {
                    code: "EXCLUSIVE_MAXIMUM_VIOLATED".to_string(),
                    message: format!("Value {} >= ex_max {}", num, ex_max),
                    path: self.path.to_string(),
                });
            }
            if let Some(multiple_of) = self.schema.multiple_of {
                let val: f64 = num / multiple_of;
                if (val - val.round()).abs() > f64::EPSILON {
                    result.errors.push(ValidationError {
                        code: "MULTIPLE_OF_VIOLATED".to_string(),
                        message: format!("Value {} not multiple of {}", num, multiple_of),
                        path: self.path.to_string(),
                    });
                }
            }
        }
        Ok(true)
    }
}
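The `multipleOf` rule above tests the quotient against its rounded value with an `f64::EPSILON` tolerance rather than using `%`. A standalone sketch of that check, with the boolean in the "is valid" direction; note that the tolerance is absolute rather than relative, so quotients whose rounding error exceeds one machine epsilon (as can happen with decimal divisors like `0.1`) are still rejected:

```rust
/// Standalone extraction of the multipleOf test used above: `num` counts
/// as a multiple of `divisor` when the quotient is integral within epsilon.
fn is_multiple_of(num: f64, divisor: f64) -> bool {
    let quotient = num / divisor;
    (quotient - quotient.round()).abs() <= f64::EPSILON
}

fn main() {
    println!("{}", is_multiple_of(10.0, 2.5)); // exact quotient 4.0: true
    println!("{}", is_multiple_of(10.0, 3.0)); // quotient 3.333…: false
}
```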
220 src/validator/rules/object.rs Normal file
@@ -0,0 +1,220 @@
use std::collections::HashSet;

use serde_json::Value;

use crate::validator::context::ValidationContext;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;

impl<'a> ValidationContext<'a> {
    pub(crate) fn validate_object(
        &self,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        let current = self.instance;
        if let Some(obj) = current.as_object() {
            // Entity Bound Implicit Type Validation
            if let Some(lookup_key) = self.schema.id.as_ref().or(self.schema.r#ref.as_ref()) {
                let base_type_name = lookup_key.split('.').next_back().unwrap_or("").to_string();
                if let Some(type_def) = self.db.types.get(&base_type_name)
                    && let Some(type_val) = obj.get("type")
                    && let Some(type_str) = type_val.as_str()
                {
                    if type_def.variations.contains(type_str) {
                        // Ensure it passes strict mode
                        result.evaluated_keys.insert("type".to_string());
                    } else {
                        result.errors.push(ValidationError {
                            code: "CONST_VIOLATED".to_string(), // Aligning with original const override errors
                            message: format!(
                                "Type '{}' is not a valid descendant for this entity bound schema",
                                type_str
                            ),
                            path: format!("{}/type", self.path),
                        });
                    }
                }
            }
            if let Some(min) = self.schema.min_properties
                && (obj.len() as f64) < min
            {
                result.errors.push(ValidationError {
                    code: "MIN_PROPERTIES".to_string(),
                    message: "Too few properties".to_string(),
                    path: self.path.to_string(),
                });
            }
            if let Some(max) = self.schema.max_properties
                && (obj.len() as f64) > max
            {
                result.errors.push(ValidationError {
                    code: "MAX_PROPERTIES".to_string(),
                    message: "Too many properties".to_string(),
                    path: self.path.to_string(),
                });
            }
            if let Some(ref req) = self.schema.required {
                for field in req {
                    if !obj.contains_key(field) {
                        result.errors.push(ValidationError {
                            code: "REQUIRED_FIELD_MISSING".to_string(),
                            message: format!("Missing {}", field),
                            path: format!("{}/{}", self.path, field),
                        });
                    }
                }
            }

            if let Some(ref deps) = self.schema.dependencies {
                for (prop, dep) in deps {
                    if obj.contains_key(prop) {
                        match dep {
                            crate::database::schema::Dependency::Props(required_props) => {
                                for req_prop in required_props {
                                    if !obj.contains_key(req_prop) {
                                        result.errors.push(ValidationError {
                                            code: "DEPENDENCY_MISSING".to_string(),
                                            message: format!(
                                                "Property '{}' requires property '{}'",
                                                prop, req_prop
                                            ),
                                            path: self.path.to_string(),
                                        });
                                    }
                                }
                            }
                            crate::database::schema::Dependency::Schema(dep_schema) => {
                                let derived = self.derive_for_schema(dep_schema, false);
                                let dep_res = derived.validate()?;
                                result.evaluated_keys.extend(dep_res.evaluated_keys.clone());
                                result.merge(dep_res);
                            }
                        }
                    }
                }
            }

            if let Some(props) = &self.schema.properties {
                for (key, sub_schema) in props {
                    if self.overrides.contains(key) {
                        continue; // Skip validation if exactly this property was overridden by a child
                    }

                    if let Some(child_instance) = obj.get(key) {
                        let new_path = format!("{}/{}", self.path, key);
                        let is_ref = sub_schema.r#ref.is_some();
                        let next_extensible = if is_ref { false } else { self.extensible };

                        let derived = self.derive(
                            sub_schema,
                            child_instance,
                            &new_path,
                            HashSet::new(),
                            next_extensible,
                            false,
                        );
                        let mut item_res = derived.validate()?;

                        // Entity Bound Implicit Type Interception
                        if key == "type"
                            && let Some(lookup_key) = sub_schema.id.as_ref().or(sub_schema.r#ref.as_ref())
                        {
                            let base_type_name =
                                lookup_key.split('.').next_back().unwrap_or("").to_string();
                            if let Some(type_def) = self.db.types.get(&base_type_name)
                                && let Some(instance_type) = child_instance.as_str()
                                && type_def.variations.contains(instance_type)
                            {
                                item_res
                                    .errors
                                    .retain(|e| e.code != "CONST_VIOLATED" && e.code != "ENUM_VIOLATED");
                            }
                        }

                        result.merge(item_res);
                        result.evaluated_keys.insert(key.to_string());
                    }
                }
            }

            if let Some(ref compiled_pp) = self.schema.compiled_pattern_properties {
                for (compiled_re, sub_schema) in compiled_pp {
                    for (key, child_instance) in obj {
                        if compiled_re.0.is_match(key) {
                            let new_path = format!("{}/{}", self.path, key);
                            let is_ref = sub_schema.r#ref.is_some();
                            let next_extensible = if is_ref { false } else { self.extensible };

                            let derived = self.derive(
                                sub_schema,
                                child_instance,
                                &new_path,
                                HashSet::new(),
                                next_extensible,
                                false,
                            );
                            let item_res = derived.validate()?;
                            result.merge(item_res);
                            result.evaluated_keys.insert(key.to_string());
                        }
                    }
                }
            }

            if let Some(ref additional_schema) = self.schema.additional_properties {
                for (key, child_instance) in obj {
                    let mut locally_matched = false;
                    if let Some(props) = &self.schema.properties
                        && props.contains_key(&key.to_string())
                    {
                        locally_matched = true;
                    }
                    if !locally_matched
                        && let Some(ref compiled_pp) = self.schema.compiled_pattern_properties
                    {
                        for (compiled_re, _) in compiled_pp {
                            if compiled_re.0.is_match(key) {
                                locally_matched = true;
                                break;
                            }
                        }
                    }

                    if !locally_matched {
                        let new_path = format!("{}/{}", self.path, key);
                        let is_ref = additional_schema.r#ref.is_some();
                        let next_extensible = if is_ref { false } else { self.extensible };

                        let derived = self.derive(
                            additional_schema,
                            child_instance,
                            &new_path,
                            HashSet::new(),
                            next_extensible,
                            false,
                        );
                        let item_res = derived.validate()?;
                        result.merge(item_res);
                        result.evaluated_keys.insert(key.to_string());
                    }
                }
            }

            if let Some(ref property_names) = self.schema.property_names {
                for key in obj.keys() {
                    let _new_path = format!("{}/propertyNames/{}", self.path, key);
                    let val_str = Value::String(key.to_string());

                    let ctx = ValidationContext::new(
                        self.db,
                        self.root,
                        property_names,
                        &val_str,
                        HashSet::new(),
                        self.extensible,
                        self.reporter,
                    );

                    result.merge(ctx.validate()?);
                }
            }
        }

        Ok(true)
    }
}
155
src/validator/rules/polymorphism.rs
Normal file
155
src/validator/rules/polymorphism.rs
Normal file
@ -0,0 +1,155 @@
|
||||
use crate::validator::context::ValidationContext;
|
||||
use crate::validator::error::ValidationError;
|
||||
use crate::validator::result::ValidationResult;
|
||||
|
||||
impl<'a> ValidationContext<'a> {
|
||||
pub(crate) fn validate_family(
|
||||
&self,
|
||||
result: &mut ValidationResult,
|
||||
) -> Result<bool, ValidationError> {
|
||||
if self.schema.family.is_some() {
|
||||
let conflicts = self.schema.type_.is_some()
|
||||
|| self.schema.properties.is_some()
|
||||
|| self.schema.required.is_some()
|
||||
|| self.schema.additional_properties.is_some()
|
||||
|| self.schema.items.is_some()
|
||||
|| self.schema.r#ref.is_some()
|
||||
|| self.schema.one_of.is_some()
|
||||
|| self.schema.all_of.is_some()
|
||||
|| self.schema.enum_.is_some()
|
||||
|| self.schema.const_.is_some();
|
||||
|
||||
if conflicts {
|
||||
            result.errors.push(ValidationError {
                code: "INVALID_SCHEMA".to_string(),
                message: "$family must be used exclusively without other constraints".to_string(),
                path: self.path.to_string(),
            });
            // Short-circuit: the schema formulation is broken
            return Ok(false);
        }
    }

    if let Some(family_target) = &self.schema.family {
        // The descendants map is keyed by the schema's own $id, not the target string.
        if let Some(schema_id) = &self.schema.id
            && let Some(descendants) = self.db.descendants.get(schema_id)
        {
            // Validate against all descendants, simulating strict oneOf logic
            let mut passed_candidates: Vec<(String, usize, ValidationResult)> = Vec::new();

            // The target itself is also an implicitly valid candidate
            let mut all_targets = vec![family_target.clone()];
            all_targets.extend(descendants.clone());

            for child_id in &all_targets {
                if let Some(child_schema) = self.db.schemas.get(child_id) {
                    let derived = self.derive(
                        child_schema,
                        self.instance,
                        &self.path,
                        self.overrides.clone(),
                        self.extensible,
                        self.reporter, // Inherit the parent reporter flag; do not bypass strictness!
                    );

                    // Explicitly run validate_scoped to test candidates with strictness checks enabled
                    let res = derived.validate_scoped()?;

                    if res.is_valid() {
                        let depth = self.db.depths.get(child_id).copied().unwrap_or(0);
                        passed_candidates.push((child_id.clone(), depth, res));
                    }
                }
            }

            if passed_candidates.len() == 1 {
                result.merge(passed_candidates.pop().unwrap().2);
            } else if passed_candidates.is_empty() {
                result.errors.push(ValidationError {
                    code: "NO_FAMILY_MATCH".to_string(),
                    message: format!(
                        "Payload did not match any descendants of family '{}'",
                        family_target
                    ),
                    path: self.path.to_string(),
                });
            } else {
                // Apply the depth heuristic as a tie-breaker
                let mut best_depth: Option<usize> = None;
                let mut ambiguous = false;
                let mut best_res = None;

                for (_, depth, res) in passed_candidates.into_iter() {
                    if let Some(current_best) = best_depth {
                        if depth > current_best {
                            best_depth = Some(depth);
                            best_res = Some(res);
                            ambiguous = false; // Broke the tie
                        } else if depth == current_best {
                            ambiguous = true; // Tie at the highest level
                        }
                    } else {
                        best_depth = Some(depth);
                        best_res = Some(res);
                    }
                }

                if !ambiguous {
                    if let Some(res) = best_res {
                        result.merge(res);
                        return Ok(true);
                    }
                }

                result.errors.push(ValidationError {
                    code: "AMBIGUOUS_FAMILY_MATCH".to_string(),
                    message: format!(
                        "Payload matched multiple descendants of family '{}' without a clear depth winner",
                        family_target
                    ),
                    path: self.path.to_string(),
                });
            }
        }
    }

    Ok(true)
}

    pub(crate) fn validate_refs(
        &self,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        // 1. Core $ref logic relies on the fast O(1) map to allow cycles and proper nesting
        if let Some(ref_str) = &self.schema.r#ref {
            if let Some(global_schema) = self.db.schemas.get(ref_str) {
                let mut new_overrides = self.overrides.clone();
                if let Some(props) = &self.schema.properties {
                    new_overrides.extend(props.keys().map(|k| k.to_string()));
                }

                let mut shadow = self.derive(
                    global_schema,
                    self.instance,
                    &self.path,
                    new_overrides,
                    self.extensible,
                    true,
                );
                shadow.root = global_schema;
                result.merge(shadow.validate()?);
            } else {
                result.errors.push(ValidationError {
                    code: "REF_RESOLUTION_FAILED".to_string(),
                    message: format!(
                        "Reference pointer to '{}' was not found in schema registry",
                        ref_str
                    ),
                    path: self.path.to_string(),
                });
            }
        }
        Ok(true)
    }
}

52  src/validator/rules/string.rs  Normal file
@ -0,0 +1,52 @@
use crate::validator::context::ValidationContext;
use crate::validator::error::ValidationError;
use crate::validator::result::ValidationResult;
use regex::Regex;

impl<'a> ValidationContext<'a> {
    pub(crate) fn validate_string(
        &self,
        result: &mut ValidationResult,
    ) -> Result<bool, ValidationError> {
        let current = self.instance;
        if let Some(s) = current.as_str() {
            if let Some(min) = self.schema.min_length
                && (s.chars().count() as f64) < min
            {
                result.errors.push(ValidationError {
                    code: "MIN_LENGTH_VIOLATED".to_string(),
                    message: format!("Length < min {}", min),
                    path: self.path.to_string(),
                });
            }
            if let Some(max) = self.schema.max_length
                && (s.chars().count() as f64) > max
            {
                result.errors.push(ValidationError {
                    code: "MAX_LENGTH_VIOLATED".to_string(),
                    message: format!("Length > max {}", max),
                    path: self.path.to_string(),
                });
            }
            if let Some(ref compiled_re) = self.schema.compiled_pattern {
                if !compiled_re.0.is_match(s) {
                    result.errors.push(ValidationError {
                        code: "PATTERN_VIOLATED".to_string(),
                        message: format!("Pattern mismatch {:?}", self.schema.pattern),
                        path: self.path.to_string(),
                    });
                }
            } else if let Some(ref pattern) = self.schema.pattern
                && let Ok(re) = Regex::new(pattern)
                && !re.is_match(s)
            {
                result.errors.push(ValidationError {
                    code: "PATTERN_VIOLATED".to_string(),
                    message: format!("Pattern mismatch {}", pattern),
                    path: self.path.to_string(),
                });
            }
        }
        Ok(true)
    }
}

53  src/validator/rules/util.rs  Normal file
@ -0,0 +1,53 @@
use serde_json::Value;

pub fn is_integer(v: &Value) -> bool {
    match v {
        Value::Number(n) => {
            n.is_i64() || n.is_u64() || n.as_f64().filter(|n| n.fract() == 0.0).is_some()
        }
        _ => false,
    }
}

/// serde_json treats 0 and 0.0 as not equal, so we cannot simply use v1 == v2.
pub fn equals(v1: &Value, v2: &Value) -> bool {
    match (v1, v2) {
        (Value::Null, Value::Null) => true,
        (Value::Bool(b1), Value::Bool(b2)) => b1 == b2,
        (Value::Number(n1), Value::Number(n2)) => {
            if let (Some(n1), Some(n2)) = (n1.as_u64(), n2.as_u64()) {
                return n1 == n2;
            }
            if let (Some(n1), Some(n2)) = (n1.as_i64(), n2.as_i64()) {
                return n1 == n2;
            }
            if let (Some(n1), Some(n2)) = (n1.as_f64(), n2.as_f64()) {
                return (n1 - n2).abs() < f64::EPSILON;
            }
            false
        }
        (Value::String(s1), Value::String(s2)) => s1 == s2,
        (Value::Array(arr1), Value::Array(arr2)) => {
            if arr1.len() != arr2.len() {
                return false;
            }
            arr1.iter().zip(arr2).all(|(e1, e2)| equals(e1, e2))
        }
        (Value::Object(obj1), Value::Object(obj2)) => {
            if obj1.len() != obj2.len() {
                return false;
            }
            for (k1, v1) in obj1 {
                if let Some(v2) = obj2.get(k1) {
                    if !equals(v1, v2) {
                        return false;
                    }
                } else {
                    return false;
                }
            }
            true
        }
        _ => false,
    }
}

91  src/validator/util.rs  Normal file
@ -0,0 +1,91 @@
use serde::Deserialize;
use std::fs;

#[derive(Debug, Deserialize)]
struct TestSuite {
    #[allow(dead_code)]
    description: String,
    database: serde_json::Value,
    tests: Vec<TestCase>,
}

#[derive(Debug, Deserialize)]
struct TestCase {
    description: String,
    data: serde_json::Value,
    valid: bool,
    // Support an explicit schema ID target per test case
    schema_id: String,
}

// use crate::validator::registry::REGISTRY; // No longer used directly for tests!
use crate::validator::Validator;
use serde_json::Value;

pub fn deserialize_some<'de, D>(deserializer: D) -> Result<Option<Value>, D::Error>
where
    D: serde::Deserializer<'de>,
{
    let v = Value::deserialize(deserializer)?;
    Ok(Some(v))
}

pub fn run_test_file_at_index(path: &str, index: usize) -> Result<(), String> {
    let content =
        fs::read_to_string(path).unwrap_or_else(|_| panic!("Failed to read file: {}", path));
    let suite: Vec<TestSuite> = serde_json::from_str(&content)
        .unwrap_or_else(|e| panic!("Failed to parse JSON in {}: {}", path, e));

    if index >= suite.len() {
        panic!("Index {} out of bounds for file {}", index, path);
    }

    let group = &suite[index];
    let mut failures = Vec::<String>::new();

    let db_json = group.database.clone();
    let db = crate::database::Database::new(&db_json);
    let validator = Validator::new(std::sync::Arc::new(db));

    // Run the tests
    for test in group.tests.iter() {
        let schema_id = &test.schema_id;

        if !validator.db.schemas.contains_key(schema_id) {
            failures.push(format!(
                "[{}] Missing Schema: Cannot find schema ID '{}'",
                group.description, schema_id
            ));
            continue;
        }

        let result = validator.validate(schema_id, &test.data);

        let (got_valid, _errors) = match &result {
            Ok(res) => (res.is_valid(), &res.errors),
            Err(_e) => {
                // An execution error (e.g. schema not found) is treated as a test failure.
                (false, &vec![])
            }
        };

        if got_valid != test.valid {
            let error_msg = match &result {
                Ok(res) => format!("{:?}", res.errors),
                Err(e) => format!("Execution Error: {:?}", e),
            };

            failures.push(format!(
                "[{}] Test '{}' failed. Expected: {}, Got: {}. Errors: {}",
                group.description, test.description, test.valid, got_valid, error_msg
            ));
        }
    }

    if !failures.is_empty() {
        return Err(failures.join("\n"));
    }

    Ok(())
}

62  test_err.log  Normal file
@ -0,0 +1,62 @@
   Compiling jspg v0.1.0 (/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg)
    Finished `test` profile [unoptimized + debuginfo] target(s) in 26.14s
     Running unittests src/lib.rs (target/debug/deps/jspg-99ace086c3537f5a)

running 1 test
       Using PgConfig("pg18") and `pg_config` from /opt/homebrew/opt/postgresql@18/bin/pg_config
    Building extension with features pg_test pg18
     Running command "/opt/homebrew/bin/cargo" "build" "--lib" "--features" "pg_test pg18" "--no-default-features" "--message-format=json-render-diagnostics"
   Compiling jspg v0.1.0 (/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg)
    Finished `dev` profile [unoptimized + debuginfo] target(s) in 7.10s
  Installing extension
     Copying control file to /opt/homebrew/share/postgresql@18/extension/jspg.control
     Copying shared library to /opt/homebrew/lib/postgresql@18/jspg.dylib
  Discovered 351 SQL entities: 1 schemas (1 unique), 350 functions, 0 types, 0 enums, 0 sqls, 0 ords, 0 hashes, 0 aggregates, 0 triggers
  Rebuilding pgrx_embed, in debug mode, for SQL generation with features pg_test pg18
   Compiling jspg v0.1.0 (/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg)
    Finished `dev` profile [unoptimized + debuginfo] target(s) in 10.63s
     Writing SQL entities to /opt/homebrew/share/postgresql@18/extension/jspg--0.1.0.sql
    Finished installing jspg
[2026-03-01 22:54:19.068 EST] [82952] [69a509eb.14408]: LOG: starting PostgreSQL 18.1 (Homebrew) on aarch64-apple-darwin25.2.0, compiled by Apple clang version 17.0.0 (clang-1700.6.3.2), 64-bit
[2026-03-01 22:54:19.070 EST] [82952] [69a509eb.14408]: LOG: listening on IPv6 address "::1", port 32218
[2026-03-01 22:54:19.070 EST] [82952] [69a509eb.14408]: LOG: listening on IPv4 address "127.0.0.1", port 32218
[2026-03-01 22:54:19.071 EST] [82952] [69a509eb.14408]: LOG: listening on Unix socket "/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/target/test-pgdata/.s.PGSQL.32218"
[2026-03-01 22:54:19.077 EST] [82958] [69a509eb.1440e]: LOG: database system was shut down at 2026-03-01 22:49:02 EST
    Creating database pgrx_tests

thread 'tests::pg_test_typed_refs_0' (29092254) panicked at /Users/awgneo/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/pgrx-tests-0.16.1/src/framework.rs:166:9:

Postgres Messages:
[2026-03-01 22:54:19.068 EST] [82952] [69a509eb.14408]: LOG: starting PostgreSQL 18.1 (Homebrew) on aarch64-apple-darwin25.2.0, compiled by Apple clang version 17.0.0 (clang-1700.6.3.2), 64-bit
[2026-03-01 22:54:19.070 EST] [82952] [69a509eb.14408]: LOG: listening on IPv6 address "::1", port 32218
[2026-03-01 22:54:19.070 EST] [82952] [69a509eb.14408]: LOG: listening on IPv4 address "127.0.0.1", port 32218
[2026-03-01 22:54:19.071 EST] [82952] [69a509eb.14408]: LOG: listening on Unix socket "/Users/awgneo/Repositories/thoughtpatterns/cellular/jspg/target/test-pgdata/.s.PGSQL.32218"
[2026-03-01 22:54:19.081 EST] [82952] [69a509eb.14408]: LOG: database system is ready to accept connections

Test Function Messages:
[2026-03-01 22:54:20.058 EST] [82982] [69a509ec.14426]: LOG: statement: START TRANSACTION
[2026-03-01 22:54:20.058 EST] [82982] [69a509ec.14426]: LOG: statement: SELECT "tests"."test_typed_refs_0"();
[2026-03-01 22:54:20.062 EST] [82982] [69a509ec.14426]: ERROR: called `Result::unwrap()` on an `Err` value: "[Entity inheritance and native type discrimination] Test 'Valid person against organization schema (implicit type allowance)' failed. Expected: true, Got: false. Errors: [ValidationError { code: \"CONST_VIOLATED\", message: \"Value does not match const\", path: \"/type\" }, ValidationError { code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'first_name'\", path: \"/first_name\" }, ValidationError { code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'first_name'\", path: \"/first_name\" }, ValidationError { code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'first_name'\", path: \"/first_name\" }]\n[Entity inheritance and native type discrimination] Test 'Valid organization against organization schema' failed. Expected: true, Got: false. Errors: [ValidationError { code: \"CONST_VIOLATED\", message: \"Value does not match const\", path: \"/type\" }]\n[Entity inheritance and native type discrimination] Test 'Invalid entity against organization schema (ancestor not allowed)' failed. Expected: false, Got: true. Errors: []"
[2026-03-01 22:54:20.062 EST] [82982] [69a509ec.14426]: STATEMENT: SELECT "tests"."test_typed_refs_0"();
[2026-03-01 22:54:20.062 EST] [82982] [69a509ec.14426]: LOG: statement: ROLLBACK

Client Error:
called `Result::unwrap()` on an `Err` value: "[Entity inheritance and native type discrimination] Test 'Valid person against organization schema (implicit type allowance)' failed. Expected: true, Got: false. Errors: [ValidationError { code: \"CONST_VIOLATED\", message: \"Value does not match const\", path: \"/type\" }, ValidationError { code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'first_name'\", path: \"/first_name\" }, ValidationError { code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'first_name'\", path: \"/first_name\" }, ValidationError { code: \"STRICT_PROPERTY_VIOLATION\", message: \"Unexpected property 'first_name'\", path: \"/first_name\" }]\n[Entity inheritance and native type discrimination] Test 'Valid organization against organization schema' failed. Expected: true, Got: false. Errors: [ValidationError { code: \"CONST_VIOLATED\", message: \"Value does not match const\", path: \"/type\" }]\n[Entity inheritance and native type discrimination] Test 'Invalid entity against organization schema (ancestor not allowed)' failed. Expected: false, Got: true. Errors: []"
postgres location: fixtures.rs
rust location: <unknown>

note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
test tests::pg_test_typed_refs_0 ... FAILED

failures:

failures:
    tests::pg_test_typed_refs_0

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 343 filtered out; finished in 21.82s

error: test failed, to rerun pass `--lib`
@ -1,28 +1,4 @@
use jspg::util;

#[test]
fn test_anchor_0() {
    let path = format!("{}/tests/fixtures/anchor.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 0).unwrap();
}

#[test]
fn test_anchor_1() {
    let path = format!("{}/tests/fixtures/anchor.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 1).unwrap();
}

#[test]
fn test_anchor_2() {
    let path = format!("{}/tests/fixtures/anchor.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 2).unwrap();
}

#[test]
fn test_anchor_3() {
    let path = format!("{}/tests/fixtures/anchor.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 3).unwrap();
}
use jspg::validator::util;

#[test]
fn test_content_0() {
@ -109,53 +85,89 @@ fn test_min_items_2() {
}

#[test]
fn test_puncs_0() {
    let path = format!("{}/tests/fixtures/puncs.json", env!("CARGO_MANIFEST_DIR"));
fn test_additional_properties_0() {
    let path = format!("{}/tests/fixtures/additionalProperties.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 0).unwrap();
}

#[test]
fn test_puncs_1() {
    let path = format!("{}/tests/fixtures/puncs.json", env!("CARGO_MANIFEST_DIR"));
fn test_additional_properties_1() {
    let path = format!("{}/tests/fixtures/additionalProperties.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 1).unwrap();
}

#[test]
fn test_puncs_2() {
    let path = format!("{}/tests/fixtures/puncs.json", env!("CARGO_MANIFEST_DIR"));
fn test_additional_properties_2() {
    let path = format!("{}/tests/fixtures/additionalProperties.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 2).unwrap();
}

#[test]
fn test_puncs_3() {
    let path = format!("{}/tests/fixtures/puncs.json", env!("CARGO_MANIFEST_DIR"));
fn test_dependencies_0() {
    let path = format!("{}/tests/fixtures/dependencies.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 0).unwrap();
}

#[test]
fn test_dependencies_1() {
    let path = format!("{}/tests/fixtures/dependencies.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 1).unwrap();
}

#[test]
fn test_dependencies_2() {
    let path = format!("{}/tests/fixtures/dependencies.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 2).unwrap();
}

#[test]
fn test_dependencies_3() {
    let path = format!("{}/tests/fixtures/dependencies.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 3).unwrap();
}

#[test]
fn test_puncs_4() {
    let path = format!("{}/tests/fixtures/puncs.json", env!("CARGO_MANIFEST_DIR"));
fn test_dependencies_4() {
    let path = format!("{}/tests/fixtures/dependencies.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 4).unwrap();
}

#[test]
fn test_puncs_5() {
    let path = format!("{}/tests/fixtures/puncs.json", env!("CARGO_MANIFEST_DIR"));
fn test_dependencies_5() {
    let path = format!("{}/tests/fixtures/dependencies.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 5).unwrap();
}

#[test]
fn test_puncs_6() {
    let path = format!("{}/tests/fixtures/puncs.json", env!("CARGO_MANIFEST_DIR"));
fn test_dependencies_6() {
    let path = format!("{}/tests/fixtures/dependencies.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 6).unwrap();
}

#[test]
fn test_puncs_7() {
    let path = format!("{}/tests/fixtures/puncs.json", env!("CARGO_MANIFEST_DIR"));
fn test_dependencies_7() {
    let path = format!("{}/tests/fixtures/dependencies.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 7).unwrap();
}

#[test]
fn test_dependencies_8() {
    let path = format!("{}/tests/fixtures/dependencies.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 8).unwrap();
}

#[test]
fn test_dependencies_9() {
    let path = format!("{}/tests/fixtures/dependencies.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 9).unwrap();
}

#[test]
fn test_dependencies_10() {
    let path = format!("{}/tests/fixtures/dependencies.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 10).unwrap();
}

#[test]
fn test_exclusive_minimum_0() {
    let path = format!("{}/tests/fixtures/exclusiveMinimum.json", env!("CARGO_MANIFEST_DIR"));
@ -271,65 +283,17 @@ fn test_const_17() {
}

#[test]
fn test_any_of_0() {
    let path = format!("{}/tests/fixtures/anyOf.json", env!("CARGO_MANIFEST_DIR"));
fn test_families_0() {
    let path = format!("{}/tests/fixtures/families.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 0).unwrap();
}

#[test]
fn test_any_of_1() {
    let path = format!("{}/tests/fixtures/anyOf.json", env!("CARGO_MANIFEST_DIR"));
fn test_families_1() {
    let path = format!("{}/tests/fixtures/families.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 1).unwrap();
}

#[test]
fn test_any_of_2() {
    let path = format!("{}/tests/fixtures/anyOf.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 2).unwrap();
}

#[test]
fn test_any_of_3() {
    let path = format!("{}/tests/fixtures/anyOf.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 3).unwrap();
}

#[test]
fn test_any_of_4() {
    let path = format!("{}/tests/fixtures/anyOf.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 4).unwrap();
}

#[test]
fn test_any_of_5() {
    let path = format!("{}/tests/fixtures/anyOf.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 5).unwrap();
}

#[test]
fn test_any_of_6() {
    let path = format!("{}/tests/fixtures/anyOf.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 6).unwrap();
}

#[test]
fn test_any_of_7() {
    let path = format!("{}/tests/fixtures/anyOf.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 7).unwrap();
}

#[test]
fn test_any_of_8() {
    let path = format!("{}/tests/fixtures/anyOf.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 8).unwrap();
}

#[test]
fn test_any_of_9() {
    let path = format!("{}/tests/fixtures/anyOf.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 9).unwrap();
}

#[test]
fn test_property_names_0() {
    let path = format!("{}/tests/fixtures/propertyNames.json", env!("CARGO_MANIFEST_DIR"));
@ -372,18 +336,6 @@ fn test_property_names_6() {
    util::run_test_file_at_index(&path, 6).unwrap();
}

#[test]
fn test_boolean_schema_0() {
    let path = format!("{}/tests/fixtures/boolean_schema.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 0).unwrap();
}

#[test]
fn test_boolean_schema_1() {
    let path = format!("{}/tests/fixtures/boolean_schema.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 1).unwrap();
}

#[test]
fn test_not_0() {
    let path = format!("{}/tests/fixtures/not.json", env!("CARGO_MANIFEST_DIR"));
@ -834,42 +786,6 @@ fn test_max_length_1() {
    util::run_test_file_at_index(&path, 1).unwrap();
}

#[test]
fn test_dependent_schemas_0() {
    let path = format!("{}/tests/fixtures/dependentSchemas.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 0).unwrap();
}

#[test]
fn test_dependent_schemas_1() {
    let path = format!("{}/tests/fixtures/dependentSchemas.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 1).unwrap();
}

#[test]
fn test_dependent_schemas_2() {
    let path = format!("{}/tests/fixtures/dependentSchemas.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 2).unwrap();
}

#[test]
fn test_dependent_schemas_3() {
    let path = format!("{}/tests/fixtures/dependentSchemas.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 3).unwrap();
}

#[test]
fn test_dependent_schemas_4() {
    let path = format!("{}/tests/fixtures/dependentSchemas.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 4).unwrap();
}

#[test]
fn test_dependent_schemas_5() {
    let path = format!("{}/tests/fixtures/dependentSchemas.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 5).unwrap();
}

#[test]
fn test_exclusive_maximum_0() {
    let path = format!("{}/tests/fixtures/exclusiveMaximum.json", env!("CARGO_MANIFEST_DIR"));
@ -996,6 +912,18 @@ fn test_one_of_12() {
    util::run_test_file_at_index(&path, 12).unwrap();
}

#[test]
fn test_boolean_schema_0() {
    let path = format!("{}/tests/fixtures/booleanSchema.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 0).unwrap();
}

#[test]
fn test_boolean_schema_1() {
    let path = format!("{}/tests/fixtures/booleanSchema.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 1).unwrap();
}

#[test]
fn test_if_then_else_0() {
    let path = format!("{}/tests/fixtures/if-then-else.json", env!("CARGO_MANIFEST_DIR"));
@ -1122,36 +1050,6 @@ fn test_max_properties_3() {
    util::run_test_file_at_index(&path, 3).unwrap();
}

#[test]
fn test_dependent_required_0() {
    let path = format!("{}/tests/fixtures/dependentRequired.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 0).unwrap();
}

#[test]
fn test_dependent_required_1() {
    let path = format!("{}/tests/fixtures/dependentRequired.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 1).unwrap();
}

#[test]
fn test_dependent_required_2() {
    let path = format!("{}/tests/fixtures/dependentRequired.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 2).unwrap();
}

#[test]
fn test_dependent_required_3() {
    let path = format!("{}/tests/fixtures/dependentRequired.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 3).unwrap();
}

#[test]
fn test_dependent_required_4() {
    let path = format!("{}/tests/fixtures/dependentRequired.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 4).unwrap();
}

#[test]
fn test_required_0() {
    let path = format!("{}/tests/fixtures/required.json", env!("CARGO_MANIFEST_DIR"));
@ -1434,12 +1332,6 @@ fn test_all_of_14() {
    util::run_test_file_at_index(&path, 14).unwrap();
}

#[test]
fn test_all_of_15() {
    let path = format!("{}/tests/fixtures/allOf.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 15).unwrap();
}

#[test]
fn test_format_0() {
    let path = format!("{}/tests/fixtures/format.json", env!("CARGO_MANIFEST_DIR"));
@ -1680,150 +1572,6 @@ fn test_ref_15() {
    util::run_test_file_at_index(&path, 15).unwrap();
}

#[test]
fn test_ref_16() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 16).unwrap();
}

#[test]
fn test_ref_17() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 17).unwrap();
}

#[test]
fn test_ref_18() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 18).unwrap();
}

#[test]
fn test_ref_19() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 19).unwrap();
}

#[test]
fn test_ref_20() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 20).unwrap();
}

#[test]
fn test_ref_21() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 21).unwrap();
}

#[test]
fn test_ref_22() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 22).unwrap();
}

#[test]
fn test_ref_23() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 23).unwrap();
}

#[test]
fn test_ref_24() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 24).unwrap();
}

#[test]
fn test_ref_25() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 25).unwrap();
}

#[test]
fn test_ref_26() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 26).unwrap();
}

#[test]
fn test_ref_27() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 27).unwrap();
}

#[test]
fn test_ref_28() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 28).unwrap();
}

#[test]
fn test_ref_29() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 29).unwrap();
}

#[test]
fn test_ref_30() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 30).unwrap();
}

#[test]
fn test_ref_31() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 31).unwrap();
}

#[test]
fn test_ref_32() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 32).unwrap();
}

#[test]
fn test_ref_33() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 33).unwrap();
}

#[test]
fn test_ref_34() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 34).unwrap();
}

#[test]
fn test_ref_35() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 35).unwrap();
}

#[test]
fn test_ref_36() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 36).unwrap();
}

#[test]
fn test_ref_37() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 37).unwrap();
}

#[test]
fn test_ref_38() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 38).unwrap();
}

#[test]
fn test_ref_39() {
    let path = format!("{}/tests/fixtures/ref.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 39).unwrap();
}

#[test]
fn test_maximum_0() {
    let path = format!("{}/tests/fixtures/maximum.json", env!("CARGO_MANIFEST_DIR"));
@ -1919,129 +1667,3 @@ fn test_contains_8() {
    let path = format!("{}/tests/fixtures/contains.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 8).unwrap();
}

#[test]
fn test_dynamic_ref_0() {
    let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 0).unwrap();
}

#[test]
fn test_dynamic_ref_1() {
    let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 1).unwrap();
}

#[test]
fn test_dynamic_ref_2() {
    let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 2)
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_3() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 3).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_4() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 4).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_5() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 5).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_6() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 6).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_7() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 7).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_8() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 8).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_9() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 9).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_10() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 10).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_11() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 11).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_12() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 12).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_13() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 13).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_14() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 14).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_15() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 15).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_16() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 16).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_17() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 17).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_18() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 18).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_19() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 19).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_dynamic_ref_20() {
|
||||
let path = format!("{}/tests/fixtures/dynamicRef.json", env!("CARGO_MANIFEST_DIR"));
|
||||
util::run_test_file_at_index(&path, 20).unwrap();
|
||||
}
|
||||
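The hand-written tests above all share one body and differ only in fixture name and case index. As an illustrative sketch (not the project's actual code), a declarative macro could generate them; `run_test_file_at_index` is stubbed below and merely stands in for the real `util::run_test_file_at_index`, which loads the JSON fixture and runs the case at the given index.

```rust
// Sketch only: generating the repetitive fixture tests with macro_rules!.
// The stub below is an assumption standing in for util::run_test_file_at_index.
fn run_test_file_at_index(path: &str, index: usize) -> Result<(), String> {
    // Stub: the real helper parses the fixture and validates case `index`.
    if path.ends_with(".json") {
        Ok(())
    } else {
        Err(format!("bad fixture path for case {index}: {path}"))
    }
}

// Expands to one function per (name, index) pair, each with the same body
// as the hand-written tests in the diff above.
macro_rules! fixture_tests {
    ($fixture:literal => $($name:ident : $idx:expr),+ $(,)?) => {
        $(
            fn $name() -> Result<(), String> {
                let path = format!("tests/fixtures/{}.json", $fixture);
                run_test_file_at_index(&path, $idx)
            }
        )+
    };
}

fixture_tests!("ref" => test_ref_24: 24, test_ref_25: 25);
fixture_tests!("dynamicRef" => test_dynamic_ref_0: 0);

fn main() {
    assert!(test_ref_24().is_ok());
    assert!(test_ref_25().is_ok());
    assert!(test_dynamic_ref_0().is_ok());
    println!("ok");
}
```

In the real crate the generated functions would carry `#[test]` so the cargo test harness picks them up; the plain functions here keep the sketch runnable as a standalone binary.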
152 tests/fixtures/additionalProperties.json vendored Normal file
@@ -0,0 +1,152 @@
[
  {
    "description": "additionalProperties validates properties not matched by properties",
    "database": {
      "schemas": [
        {
          "$id": "schema1",
          "properties": {
            "foo": {
              "type": "string"
            },
            "bar": {
              "type": "number"
            }
          },
          "additionalProperties": {
            "type": "boolean"
          }
        }
      ]
    },
    "tests": [
      {
        "description": "defined properties are valid",
        "data": {
          "foo": "value",
          "bar": 123
        },
        "valid": true,
        "schema_id": "schema1"
      },
      {
        "description": "additional property matching schema is valid",
        "data": {
          "foo": "value",
          "is_active": true,
          "hidden": false
        },
        "valid": true,
        "schema_id": "schema1"
      },
      {
        "description": "additional property not matching schema is invalid",
        "data": {
          "foo": "value",
          "is_active": 1
        },
        "valid": false,
        "schema_id": "schema1"
      }
    ]
  },
  {
    "description": "extensible: true with additionalProperties still validates structure",
    "database": {
      "schemas": [
        {
          "properties": {
            "foo": {
              "type": "string"
            }
          },
          "extensible": true,
          "additionalProperties": {
            "type": "integer"
          },
          "$id": "additionalProperties_1_0"
        }
      ]
    },
    "tests": [
      {
        "description": "additional property matching schema is valid",
        "data": {
          "foo": "hello",
          "count": 5,
          "age": 42
        },
        "valid": true,
        "schema_id": "additionalProperties_1_0"
      },
      {
        "description": "additional property not matching schema is invalid despite extensible: true",
        "data": {
          "foo": "hello",
          "count": "five"
        },
        "valid": false,
        "schema_id": "additionalProperties_1_0"
      }
    ]
  },
  {
    "description": "complex additionalProperties with object and array items",
    "database": {
      "schemas": [
        {
          "$id": "schema3",
          "properties": {
            "type": {
              "type": "string"
            }
          },
          "additionalProperties": {
            "type": "array",
            "items": {
              "type": "string"
            }
          }
        }
      ]
    },
    "tests": [
      {
        "description": "valid array of strings",
        "data": {
          "type": "my_type",
          "group_a": [
            "field1",
            "field2"
          ],
          "group_b": [
            "field3"
          ]
        },
        "valid": true,
        "schema_id": "schema3"
      },
      {
        "description": "invalid array of integers",
        "data": {
          "type": "my_type",
          "group_a": [
            1,
            2
          ]
        },
        "valid": false,
        "schema_id": "schema3"
      },
      {
        "description": "invalid non-array type",
        "data": {
          "type": "my_type",
          "group_a": "field1"
        },
        "valid": false,
        "schema_id": "schema3"
      }
    ]
  }
]
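Each test case in the new fixture shape names a `schema_id`, which the runner resolves against the `$id` values registered under `database.schemas` before validating the case's `data`. A minimal sketch of that lookup step follows; the struct and function names are illustrative assumptions, not the extension's real types.

```rust
// Illustrative sketch of schema_id -> $id resolution in the fixture runner.
// These types are assumptions for the example, not project code.
struct Schema {
    id: &'static str,
}

struct Case {
    schema_id: &'static str,
    valid: bool,
}

// Find the registered schema whose $id matches the case's schema_id.
fn find<'a>(schemas: &'a [Schema], id: &str) -> Option<&'a Schema> {
    schemas.iter().find(|s| s.id == id)
}

fn main() {
    let schemas = [
        Schema { id: "schema1" },
        Schema { id: "additionalProperties_1_0" },
    ];
    let case = Case { schema_id: "schema1", valid: true };

    // The runner would error out here if the fixture names an unknown id.
    let schema = find(&schemas, case.schema_id).expect("unknown schema_id");
    println!("resolved {} (expected valid = {})", schema.id, case.valid);
}
```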
283 tests/fixtures/allOf.json vendored
@@ -1,8 +1,9 @@
[
{
"description": "allOf",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"allOf": [
{
"properties": {
@@ -24,6 +25,9 @@
"foo"
]
}
],
"$id": "allOf_0_0"
}
]
},
"tests": [
@@ -33,21 +37,24 @@
"foo": "baz",
"bar": 2
},
"valid": true
"valid": true,
"schema_id": "allOf_0_0"
},
{
"description": "mismatch second",
"data": {
"foo": "baz"
},
"valid": false
"valid": false,
"schema_id": "allOf_0_0"
},
{
"description": "mismatch first",
"data": {
"bar": 2
},
"valid": false
"valid": false,
"schema_id": "allOf_0_0"
},
{
"description": "wrong type",
@@ -55,14 +62,16 @@
"foo": "baz",
"bar": "quux"
},
"valid": false
"valid": false,
"schema_id": "allOf_0_0"
}
]
},
{
"description": "allOf with base schema",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"properties": {
"bar": {
"type": "integer"
@@ -96,6 +105,9 @@
"baz"
]
}
],
"$id": "allOf_1_0"
}
]
},
"tests": [
@@ -106,7 +118,8 @@
"bar": 2,
"baz": null
},
"valid": true
"valid": true,
"schema_id": "allOf_1_0"
},
{
"description": "mismatch base schema",
@@ -114,7 +127,8 @@
"foo": "quux",
"baz": null
},
"valid": false
"valid": false,
"schema_id": "allOf_1_0"
},
{
"description": "mismatch first allOf",
@@ -122,7 +136,8 @@
"bar": 2,
"baz": null
},
"valid": false
"valid": false,
"schema_id": "allOf_1_0"
},
{
"description": "mismatch second allOf",
@@ -130,21 +145,24 @@
"foo": "quux",
"bar": 2
},
"valid": false
"valid": false,
"schema_id": "allOf_1_0"
},
{
"description": "mismatch both",
"data": {
"bar": 2
},
"valid": false
"valid": false,
"schema_id": "allOf_1_0"
}
]
},
{
"description": "allOf simple types",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"allOf": [
{
"maximum": 30
@@ -152,157 +170,200 @@
{
"minimum": 20
}
],
"$id": "allOf_2_0"
}
]
},
"tests": [
{
"description": "valid",
"data": 25,
"valid": true
"valid": true,
"schema_id": "allOf_2_0"
},
{
"description": "mismatch one",
"data": 35,
"valid": false
"valid": false,
"schema_id": "allOf_2_0"
}
]
},
{
"description": "allOf with boolean schemas, all true",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"allOf": [
true,
true
],
"$id": "allOf_3_0"
}
]
},
"tests": [
{
"description": "any value is valid",
"data": "foo",
"valid": true
"valid": true,
"schema_id": "allOf_3_0"
}
]
},
{
"description": "allOf with boolean schemas, some false",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"allOf": [
true,
false
],
"$id": "allOf_4_0"
}
]
},
"tests": [
{
"description": "any value is invalid",
"data": "foo",
"valid": false
"valid": false,
"schema_id": "allOf_4_0"
}
]
},
{
"description": "allOf with boolean schemas, all false",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"allOf": [
false,
false
],
"$id": "allOf_5_0"
}
]
},
"tests": [
{
"description": "any value is invalid",
"data": "foo",
"valid": false
"valid": false,
"schema_id": "allOf_5_0"
}
]
},
{
"description": "allOf with one empty schema",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"allOf": [
{}
],
"$id": "allOf_6_0"
}
]
},
"tests": [
{
"description": "any data is valid",
"data": 1,
"valid": true
"valid": true,
"schema_id": "allOf_6_0"
}
]
},
{
"description": "allOf with two empty schemas",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"allOf": [
{},
{}
],
"$id": "allOf_7_0"
}
]
},
"tests": [
{
"description": "any data is valid",
"data": 1,
"valid": true
"valid": true,
"schema_id": "allOf_7_0"
}
]
},
{
"description": "allOf with the first empty schema",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"allOf": [
{},
{
"type": "number"
}
],
"$id": "allOf_8_0"
}
]
},
"tests": [
{
"description": "number is valid",
"data": 1,
"valid": true
"valid": true,
"schema_id": "allOf_8_0"
},
{
"description": "string is invalid",
"data": "foo",
"valid": false
"valid": false,
"schema_id": "allOf_8_0"
}
]
},
{
"description": "allOf with the last empty schema",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"allOf": [
{
"type": "number"
},
{}
],
"$id": "allOf_9_0"
}
]
},
"tests": [
{
"description": "number is valid",
"data": 1,
"valid": true
"valid": true,
"schema_id": "allOf_9_0"
},
{
"description": "string is invalid",
"data": "foo",
"valid": false
"valid": false,
"schema_id": "allOf_9_0"
}
]
},
{
"description": "nested allOf, to check validation semantics",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"allOf": [
{
"allOf": [
@@ -311,88 +372,31 @@
}
]
}
],
"$id": "allOf_10_0"
}
]
},
"tests": [
{
"description": "null is valid",
"data": null,
"valid": true
"valid": true,
"schema_id": "allOf_10_0"
},
{
"description": "anything non-null is invalid",
"data": 123,
"valid": false
}
]
},
{
"description": "allOf combined with anyOf, oneOf",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"allOf": [
{
"multipleOf": 2
}
],
"anyOf": [
{
"multipleOf": 3
}
],
"oneOf": [
{
"multipleOf": 5
}
]
},
"tests": [
{
"description": "allOf: false, anyOf: false, oneOf: false",
"data": 1,
"valid": false
},
{
"description": "allOf: false, anyOf: false, oneOf: true",
"data": 5,
"valid": false
},
{
"description": "allOf: false, anyOf: true, oneOf: false",
"data": 3,
"valid": false
},
{
"description": "allOf: false, anyOf: true, oneOf: true",
"data": 15,
"valid": false
},
{
"description": "allOf: true, anyOf: false, oneOf: false",
"data": 2,
"valid": false
},
{
"description": "allOf: true, anyOf: false, oneOf: true",
"data": 10,
"valid": false
},
{
"description": "allOf: true, anyOf: true, oneOf: false",
"data": 6,
"valid": false
},
{
"description": "allOf: true, anyOf: true, oneOf: true",
"data": 30,
"valid": true
"valid": false,
"schema_id": "allOf_10_0"
}
]
},
{
"description": "extensible: true allows extra properties in allOf",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"allOf": [
{
"properties": {
@@ -415,7 +419,10 @@
]
}
],
"extensible": true
"extensible": true,
"$id": "allOf_12_0"
}
]
},
"tests": [
{
@@ -425,14 +432,16 @@
"bar": 2,
"qux": 3
},
"valid": true
"valid": true,
"schema_id": "allOf_12_0"
}
]
},
{
"description": "strict by default with allOf properties",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"allOf": [
{
"properties": {
@@ -448,6 +457,9 @@
}
}
}
],
"$id": "allOf_13_0"
}
]
},
"tests": [
@@ -457,7 +469,8 @@
"foo": 1,
"bar": 2
},
"valid": true
"valid": true,
"schema_id": "allOf_13_0"
},
{
"description": "fails on extra property z explicitly",
@@ -466,14 +479,16 @@
"bar": 2,
"z": 3
},
"valid": false
"valid": false,
"schema_id": "allOf_13_0"
}
]
},
{
"description": "allOf with nested extensible: true (partial looseness)",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"allOf": [
{
"properties": {
@@ -490,6 +505,9 @@
}
}
}
],
"$id": "allOf_14_0"
}
]
},
"tests": [
@@ -500,37 +518,43 @@
"bar": 2,
"z": 3
},
"valid": true
"valid": true,
"schema_id": "allOf_14_0"
}
]
},
{
"description": "strictness: allOf composition with strict refs",
"schema": {
"database": {
"schemas": [
{
"allOf": [
{
"$ref": "#/$defs/partA"
"$ref": "partA"
},
{
"$ref": "#/$defs/partB"
"$ref": "partB"
}
],
"$defs": {
"partA": {
"$id": "allOf_15_0"
},
{
"$id": "partA",
"properties": {
"id": {
"type": "string"
}
}
},
"partB": {
{
"$id": "partB",
"properties": {
"name": {
"type": "string"
}
}
}
}
]
},
"tests": [
{
@@ -539,7 +563,8 @@
"id": "1",
"name": "Me"
},
"valid": true
"valid": true,
"schema_id": "allOf_15_0"
},
{
"description": "extra property is invalid (root is strict)",
@@ -548,7 +573,8 @@
"name": "Me",
"extra": 1
},
"valid": false
"valid": false,
"schema_id": "allOf_15_0"
},
{
"description": "partA mismatch is invalid",
@@ -556,7 +582,8 @@
"id": 1,
"name": "Me"
},
"valid": false
"valid": false,
"schema_id": "allOf_15_0"
}
]
}
120 tests/fixtures/anchor.json vendored
@@ -1,120 +0,0 @@
[
  {
    "description": "Location-independent identifier",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "$ref": "#foo",
      "$defs": {
        "A": {
          "$anchor": "foo",
          "type": "integer"
        }
      }
    },
    "tests": [
      {
        "data": 1,
        "description": "match",
        "valid": true
      },
      {
        "data": "a",
        "description": "mismatch",
        "valid": false
      }
    ]
  },
  {
    "description": "Location-independent identifier with absolute URI",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "$ref": "http://localhost:1234/draft2020-12/bar#foo",
      "$defs": {
        "A": {
          "$id": "http://localhost:1234/draft2020-12/bar",
          "$anchor": "foo",
          "type": "integer"
        }
      }
    },
    "tests": [
      {
        "data": 1,
        "description": "match",
        "valid": true
      },
      {
        "data": "a",
        "description": "mismatch",
        "valid": false
      }
    ]
  },
  {
    "description": "Location-independent identifier with base URI change in subschema",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "$id": "http://localhost:1234/draft2020-12/root",
      "$ref": "http://localhost:1234/draft2020-12/nested.json#foo",
      "$defs": {
        "A": {
          "$id": "nested.json",
          "$defs": {
            "B": {
              "$anchor": "foo",
              "type": "integer"
            }
          }
        }
      }
    },
    "tests": [
      {
        "data": 1,
        "description": "match",
        "valid": true
      },
      {
        "data": "a",
        "description": "mismatch",
        "valid": false
      }
    ]
  },
  {
    "description": "same $anchor with different base uri",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "$id": "http://localhost:1234/draft2020-12/foobar",
      "$defs": {
        "A": {
          "$id": "child1",
          "allOf": [
            {
              "$id": "child2",
              "$anchor": "my_anchor",
              "type": "number"
            },
            {
              "$anchor": "my_anchor",
              "type": "string"
            }
          ]
        }
      },
      "$ref": "child1#my_anchor"
    },
    "tests": [
      {
        "description": "$ref resolves to /$defs/A/allOf/1",
        "data": "a",
        "valid": true
      },
      {
        "description": "$ref does not resolve to /$defs/A/allOf/0",
        "data": 1,
        "valid": false
      }
    ]
  }
]
295 tests/fixtures/anyOf.json vendored
@@ -1,295 +0,0 @@
[
  {
    "description": "anyOf",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "anyOf": [
        {
          "type": "integer"
        },
        {
          "minimum": 2
        }
      ]
    },
    "tests": [
      {
        "description": "first anyOf valid",
        "data": 1,
        "valid": true
      },
      {
        "description": "second anyOf valid",
        "data": 2.5,
        "valid": true
      },
      {
        "description": "both anyOf valid",
        "data": 3,
        "valid": true
      },
      {
        "description": "neither anyOf valid",
        "data": 1.5,
        "valid": false
      }
    ]
  },
  {
    "description": "anyOf with base schema",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "type": "string",
      "anyOf": [
        {
          "maxLength": 2
        },
        {
          "minLength": 4
        }
      ]
    },
    "tests": [
      {
        "description": "mismatch base schema",
        "data": 3,
        "valid": false
      },
      {
        "description": "one anyOf valid",
        "data": "foobar",
        "valid": true
      },
      {
        "description": "both anyOf invalid",
        "data": "foo",
        "valid": false
      }
    ]
  },
  {
    "description": "anyOf with boolean schemas, all true",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "anyOf": [
        true,
        true
      ]
    },
    "tests": [
      {
        "description": "any value is valid",
        "data": "foo",
        "valid": true
      }
    ]
  },
  {
    "description": "anyOf with boolean schemas, some true",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "anyOf": [
        true,
        false
      ]
    },
    "tests": [
      {
        "description": "any value is valid",
        "data": "foo",
        "valid": true
      }
    ]
  },
  {
    "description": "anyOf with boolean schemas, all false",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "anyOf": [
        false,
        false
      ]
    },
    "tests": [
      {
        "description": "any value is invalid",
        "data": "foo",
        "valid": false
      }
    ]
  },
  {
    "description": "anyOf complex types",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "anyOf": [
        {
          "properties": {
            "bar": {
              "type": "integer"
            }
          },
          "required": [
            "bar"
          ]
        },
        {
          "properties": {
            "foo": {
              "type": "string"
            }
          },
          "required": [
            "foo"
          ]
        }
      ]
    },
    "tests": [
      {
        "description": "first anyOf valid (complex)",
        "data": {
          "bar": 2
        },
        "valid": true
      },
      {
        "description": "second anyOf valid (complex)",
        "data": {
          "foo": "baz"
        },
        "valid": true
      },
      {
        "description": "both anyOf valid (complex)",
        "data": {
          "foo": "baz",
          "bar": 2
        },
        "valid": true
      },
      {
        "description": "neither anyOf valid (complex)",
        "data": {
          "foo": 2,
          "bar": "quux"
        },
        "valid": false
      }
    ]
  },
  {
    "description": "anyOf with one empty schema",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "anyOf": [
        {
          "type": "number"
        },
        {}
      ]
    },
    "tests": [
      {
        "description": "string is valid",
        "data": "foo",
        "valid": true
      },
      {
        "description": "number is valid",
        "data": 123,
        "valid": true
      }
    ]
  },
  {
    "description": "nested anyOf, to check validation semantics",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "anyOf": [
        {
          "anyOf": [
            {
              "type": "null"
            }
          ]
        }
      ]
    },
    "tests": [
      {
        "description": "null is valid",
        "data": null,
        "valid": true
      },
      {
        "description": "anything non-null is invalid",
        "data": 123,
        "valid": false
      }
    ]
  },
  {
    "description": "extensible: true allows extra properties in anyOf",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "anyOf": [
        {
          "type": "integer"
        },
        {
          "minimum": 2
        }
      ],
      "extensible": true
    },
    "tests": [
      {
        "description": "extra property is valid",
        "data": {
          "foo": 1
        },
        "valid": true
      }
    ]
  },
  {
    "description": "strict by default with anyOf properties",
    "schema": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "anyOf": [
        {
          "properties": {
            "foo": {
              "const": 1
            }
          }
        },
        {
          "properties": {
            "bar": {
              "const": 2
            }
          }
        }
      ]
    },
    "tests": [
      {
        "description": "valid match (foo)",
        "data": {
          "foo": 1
        },
        "valid": true
      },
      {
        "description": "fails on extra property z explicitly",
        "data": {
          "foo": 1,
          "z": 3
        },
        "valid": false
      }
    ]
  }
]
@@ -1,111 +1,142 @@
 [
   {
     "description": "boolean schema 'true'",
     "schema": true,
+    "database": {
+      "schemas": [
+        {
+          "$id": "booleanSchema_0_0"
+        }
+      ]
+    },
     "tests": [
       {
         "description": "number is valid",
         "data": 1,
-        "valid": true
+        "valid": true,
+        "schema_id": "booleanSchema_0_0"
       },
       {
         "description": "string is valid",
         "data": "foo",
-        "valid": true
+        "valid": true,
+        "schema_id": "booleanSchema_0_0"
       },
       {
         "description": "boolean true is valid",
         "data": true,
-        "valid": true
+        "valid": true,
+        "schema_id": "booleanSchema_0_0"
       },
       {
         "description": "boolean false is valid",
         "data": false,
-        "valid": true
+        "valid": true,
+        "schema_id": "booleanSchema_0_0"
       },
       {
         "description": "null is valid",
         "data": null,
-        "valid": true
+        "valid": true,
+        "schema_id": "booleanSchema_0_0"
       },
       {
         "description": "object is valid",
         "data": {
           "foo": "bar"
         },
-        "valid": true
+        "valid": true,
+        "schema_id": "booleanSchema_0_0"
       },
       {
         "description": "empty object is valid",
         "data": {},
-        "valid": true
+        "valid": true,
+        "schema_id": "booleanSchema_0_0"
       },
       {
         "description": "array is valid",
         "data": [
           "foo"
         ],
-        "valid": true
+        "valid": true,
+        "schema_id": "booleanSchema_0_0"
       },
       {
         "description": "empty array is valid",
         "data": [],
-        "valid": true
+        "valid": true,
+        "schema_id": "booleanSchema_0_0"
       }
     ]
   },
   {
     "description": "boolean schema 'false'",
     "schema": false,
+    "database": {
+      "schemas": [
+        {
+          "not": {},
+          "$id": "booleanSchema_1_0"
+        }
+      ]
+    },
     "tests": [
       {
         "description": "number is invalid",
         "data": 1,
-        "valid": false
+        "valid": false,
+        "schema_id": "booleanSchema_1_0"
       },
       {
         "description": "string is invalid",
         "data": "foo",
-        "valid": false
+        "valid": false,
+        "schema_id": "booleanSchema_1_0"
       },
       {
         "description": "boolean true is invalid",
         "data": true,
-        "valid": false
+        "valid": false,
+        "schema_id": "booleanSchema_1_0"
       },
       {
         "description": "boolean false is invalid",
         "data": false,
-        "valid": false
+        "valid": false,
+        "schema_id": "booleanSchema_1_0"
       },
       {
         "description": "null is invalid",
         "data": null,
-        "valid": false
+        "valid": false,
+        "schema_id": "booleanSchema_1_0"
       },
       {
         "description": "object is invalid",
         "data": {
           "foo": "bar"
         },
-        "valid": false
+        "valid": false,
+        "schema_id": "booleanSchema_1_0"
       },
       {
         "description": "empty object is invalid",
         "data": {},
-        "valid": false
+        "valid": false,
+        "schema_id": "booleanSchema_1_0"
       },
       {
         "description": "array is invalid",
         "data": [
           "foo"
         ],
-        "valid": false
+        "valid": false,
+        "schema_id": "booleanSchema_1_0"
       },
       {
         "description": "empty array is invalid",
         "data": [],
-        "valid": false
+        "valid": false,
+        "schema_id": "booleanSchema_1_0"
       }
     ]
   }
360
tests/fixtures/const.json
vendored
@@ -1,32 +1,40 @@
|
||||
[
|
||||
{
|
||||
"description": "const validation",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"const": 2
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": 2,
|
||||
"$id": "const_0_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "same value is valid",
|
||||
"data": 2,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_0_0"
|
||||
},
|
||||
{
|
||||
"description": "another value is invalid",
|
||||
"data": 5,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_0_0"
|
||||
},
|
||||
{
|
||||
"description": "another type is invalid",
|
||||
"data": "a",
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_0_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "const with object",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": {
|
||||
"foo": "bar",
|
||||
"baz": "bax"
|
||||
@@ -34,7 +42,10 @@
|
||||
"properties": {
|
||||
"foo": {},
|
||||
"baz": {}
|
||||
},
|
||||
"$id": "const_1_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@@ -43,7 +54,8 @@
|
||||
"foo": "bar",
|
||||
"baz": "bax"
|
||||
},
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_1_0"
|
||||
},
|
||||
{
|
||||
"description": "same object with different property order is valid",
|
||||
@@ -51,14 +63,16 @@
|
||||
"baz": "bax",
|
||||
"foo": "bar"
|
||||
},
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_1_0"
|
||||
},
|
||||
{
|
||||
"description": "another object is invalid",
|
||||
"data": {
|
||||
"foo": "bar"
|
||||
},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_1_0"
|
||||
},
|
||||
{
|
||||
"description": "another type is invalid",
|
||||
@@ -66,18 +80,23 @@
|
||||
1,
|
||||
2
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_1_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "const with array",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": [
|
||||
{
|
||||
"foo": "bar"
|
||||
}
|
||||
],
|
||||
"$id": "const_2_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
@@ -88,14 +107,16 @@
|
||||
"foo": "bar"
|
||||
}
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_2_0"
|
||||
},
|
||||
{
|
||||
"description": "another array item is invalid",
|
||||
"data": [
|
||||
2
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_2_0"
|
||||
},
|
||||
{
|
||||
"description": "array with additional items is invalid",
|
||||
@@ -104,83 +125,108 @@
|
||||
2,
|
||||
3
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_2_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "const with null",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"const": null
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": null,
|
||||
"$id": "const_3_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "null is valid",
|
||||
"data": null,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_3_0"
|
||||
},
|
||||
{
|
||||
"description": "not null is invalid",
|
||||
"data": 0,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_3_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "const with false does not match 0",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"const": false
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": false,
|
||||
"$id": "const_4_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "false is valid",
|
||||
"data": false,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_4_0"
|
||||
},
|
||||
{
|
||||
"description": "integer zero is invalid",
|
||||
"data": 0,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_4_0"
|
||||
},
|
||||
{
|
||||
"description": "float zero is invalid",
|
||||
"data": 0.0,
|
||||
"valid": false
|
||||
"data": 0,
|
||||
"valid": false,
|
||||
"schema_id": "const_4_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "const with true does not match 1",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"const": true
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": true,
|
||||
"$id": "const_5_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "true is valid",
|
||||
"data": true,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_5_0"
|
||||
},
|
||||
{
|
||||
"description": "integer one is invalid",
|
||||
"data": 1,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_5_0"
|
||||
},
|
||||
{
|
||||
"description": "float one is invalid",
|
||||
"data": 1.0,
|
||||
"valid": false
|
||||
"data": 1,
|
||||
"valid": false,
|
||||
"schema_id": "const_5_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "const with [false] does not match [0]",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": [
|
||||
false
|
||||
],
|
||||
"$id": "const_6_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
@@ -189,30 +235,37 @@
|
||||
"data": [
|
||||
false
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_6_0"
|
||||
},
|
||||
{
|
||||
"description": "[0] is invalid",
|
||||
"data": [
|
||||
0
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_6_0"
|
||||
},
|
||||
{
|
||||
"description": "[0.0] is invalid",
|
||||
"data": [
|
||||
0.0
|
||||
0
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_6_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "const with [true] does not match [1]",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": [
|
||||
true
|
||||
],
|
||||
"$id": "const_7_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
@@ -221,31 +274,38 @@
|
||||
"data": [
|
||||
true
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_7_0"
|
||||
},
|
||||
{
|
||||
"description": "[1] is invalid",
|
||||
"data": [
|
||||
1
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_7_0"
|
||||
},
|
||||
{
|
||||
"description": "[1.0] is invalid",
|
||||
"data": [
|
||||
1.0
|
||||
1
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_7_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "const with {\"a\": false} does not match {\"a\": 0}",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": {
|
||||
"a": false
|
||||
},
|
||||
"$id": "const_8_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@@ -253,31 +313,38 @@
|
||||
"data": {
|
||||
"a": false
|
||||
},
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_8_0"
|
||||
},
|
||||
{
|
||||
"description": "{\"a\": 0} is invalid",
|
||||
"data": {
|
||||
"a": 0
|
||||
},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_8_0"
|
||||
},
|
||||
{
|
||||
"description": "{\"a\": 0.0} is invalid",
|
||||
"data": {
|
||||
"a": 0.0
|
||||
"a": 0
|
||||
},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_8_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "const with {\"a\": true} does not match {\"a\": 1}",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": {
|
||||
"a": true
|
||||
},
|
||||
"$id": "const_9_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@@ -285,221 +352,280 @@
|
||||
"data": {
|
||||
"a": true
|
||||
},
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_9_0"
|
||||
},
|
||||
{
|
||||
"description": "{\"a\": 1} is invalid",
|
||||
"data": {
|
||||
"a": 1
|
||||
},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_9_0"
|
||||
},
|
||||
{
|
||||
"description": "{\"a\": 1.0} is invalid",
|
||||
"data": {
|
||||
"a": 1.0
|
||||
"a": 1
|
||||
},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_9_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "const with 0 does not match other zero-like types",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"const": 0
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": 0,
|
||||
"$id": "const_10_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "false is invalid",
|
||||
"data": false,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_10_0"
|
||||
},
|
||||
{
|
||||
"description": "integer zero is valid",
|
||||
"data": 0,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_10_0"
|
||||
},
|
||||
{
|
||||
"description": "float zero is valid",
|
||||
"data": 0.0,
|
||||
"valid": true
|
||||
"data": 0,
|
||||
"valid": true,
|
||||
"schema_id": "const_10_0"
|
||||
},
|
||||
{
|
||||
"description": "empty object is invalid",
|
||||
"data": {},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_10_0"
|
||||
},
|
||||
{
|
||||
"description": "empty array is invalid",
|
||||
"data": [],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_10_0"
|
||||
},
|
||||
{
|
||||
"description": "empty string is invalid",
|
||||
"data": "",
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_10_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "const with 1 does not match true",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"const": 1
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": 1,
|
||||
"$id": "const_11_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "true is invalid",
|
||||
"data": true,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_11_0"
|
||||
},
|
||||
{
|
||||
"description": "integer one is valid",
|
||||
"data": 1,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_11_0"
|
||||
},
|
||||
{
|
||||
"description": "float one is valid",
|
||||
"data": 1.0,
|
||||
"valid": true
|
||||
"data": 1,
|
||||
"valid": true,
|
||||
"schema_id": "const_11_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "const with -2.0 matches integer and float types",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"const": -2.0
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": -2,
|
||||
"$id": "const_12_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "integer -2 is valid",
|
||||
"data": -2,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_12_0"
|
||||
},
|
||||
{
|
||||
"description": "integer 2 is invalid",
|
||||
"data": 2,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_12_0"
|
||||
},
|
||||
{
|
||||
"description": "float -2.0 is valid",
|
||||
"data": -2.0,
|
||||
"valid": true
|
||||
"data": -2,
|
||||
"valid": true,
|
||||
"schema_id": "const_12_0"
|
||||
},
|
||||
{
|
||||
"description": "float 2.0 is invalid",
|
||||
"data": 2.0,
|
||||
"valid": false
|
||||
"data": 2,
|
||||
"valid": false,
|
||||
"schema_id": "const_12_0"
|
||||
},
|
||||
{
|
||||
"description": "float -2.00001 is invalid",
|
||||
"data": -2.00001,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_12_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "float and integers are equal up to 64-bit representation limits",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"const": 9007199254740992
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": 9007199254740992,
|
||||
"$id": "const_13_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "integer is valid",
|
||||
"data": 9007199254740992,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_13_0"
|
||||
},
|
||||
{
|
||||
"description": "integer minus one is invalid",
|
||||
"data": 9007199254740991,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_13_0"
|
||||
},
|
||||
{
|
||||
"description": "float is valid",
|
||||
"data": 9007199254740992.0,
|
||||
"valid": true
|
||||
"data": 9007199254740992,
|
||||
"valid": true,
|
||||
"schema_id": "const_13_0"
|
||||
},
|
||||
{
|
||||
"description": "float minus one is invalid",
|
||||
"data": 9007199254740991.0,
|
||||
"valid": false
|
||||
"data": 9007199254740991,
|
||||
"valid": false,
|
||||
"schema_id": "const_13_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "nul characters in strings",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"const": "hello\u0000there"
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": "hello\u0000there",
|
||||
"$id": "const_14_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "match string with nul",
|
||||
"data": "hello\u0000there",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_14_0"
|
||||
},
|
||||
{
|
||||
"description": "do not match string lacking nul",
|
||||
"data": "hellothere",
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_14_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "characters with the same visual representation but different codepoint",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": "μ",
|
||||
"$comment": "U+03BC"
|
||||
"$comment": "U+03BC",
|
||||
"$id": "const_15_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "character uses the same codepoint",
|
||||
"data": "μ",
|
||||
"comment": "U+03BC",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_15_0"
|
||||
},
|
||||
{
|
||||
"description": "character looks the same but uses a different codepoint",
|
||||
"data": "µ",
|
||||
"comment": "U+00B5",
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_15_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "characters with the same visual representation, but different number of codepoints",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": "ä",
|
||||
"$comment": "U+00E4"
|
||||
"$comment": "U+00E4",
|
||||
"$id": "const_16_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "character uses the same codepoint",
|
||||
"data": "ä",
|
||||
"comment": "U+00E4",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_16_0"
|
||||
},
|
||||
{
|
||||
"description": "character looks the same but uses combining marks",
|
||||
"data": "ä",
|
||||
"comment": "a, U+0308",
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_16_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "extensible: true allows extra properties in const object match",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"const": {
|
||||
"a": 1
|
||||
},
|
||||
"extensible": true
|
||||
"extensible": true,
|
||||
"$id": "const_17_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@@ -508,14 +634,16 @@
|
||||
"a": 1,
|
||||
"b": 2
|
||||
},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "const_17_0"
|
||||
},
|
||||
{
|
||||
"description": "extra property match in const (this is effectively impossible if data has extra props not in const, it implicitly fails const check unless we assume const check ignored extra props? No, const check is strict. So this test is just to show strictness passes.)",
|
||||
"data": {
|
||||
"a": 1
|
||||
},
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "const_17_0"
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
154
tests/fixtures/contains.json
vendored
@@ -1,12 +1,16 @@
|
||||
[
|
||||
{
|
||||
"description": "contains keyword validation",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contains": {
|
||||
"minimum": 5
|
||||
},
|
||||
"items": true
|
||||
"items": true,
|
||||
"$id": "contains_0_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@@ -16,7 +20,8 @@
|
||||
4,
|
||||
5
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "contains_0_0"
|
||||
},
|
||||
{
|
||||
"description": "array with item matching schema (6) is valid (items: true)",
|
||||
@@ -25,7 +30,8 @@
|
||||
4,
|
||||
6
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "contains_0_0"
|
||||
},
|
||||
{
|
||||
"description": "array with two items matching schema (5, 6) is valid (items: true)",
|
||||
@@ -35,7 +41,8 @@
|
||||
5,
|
||||
6
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "contains_0_0"
|
||||
},
|
||||
{
|
||||
"description": "array without items matching schema is invalid",
|
||||
@@ -44,28 +51,35 @@
|
||||
3,
|
||||
4
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "contains_0_0"
|
||||
},
|
||||
{
|
||||
"description": "empty array is invalid",
|
||||
"data": [],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "contains_0_0"
|
||||
},
|
||||
{
|
||||
"description": "not array is valid",
|
||||
"data": {},
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "contains_0_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "contains keyword with const keyword",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contains": {
|
||||
"const": 5
|
||||
},
|
||||
"items": true
|
||||
"items": true,
|
||||
"$id": "contains_1_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@@ -75,7 +89,8 @@
|
||||
4,
|
||||
5
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "contains_1_0"
|
||||
},
|
||||
{
|
||||
"description": "array with two items 5 is valid (items: true)",
|
||||
@@ -85,7 +100,8 @@
|
||||
5,
|
||||
5
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "contains_1_0"
|
||||
},
|
||||
{
|
||||
"description": "array without item 5 is invalid",
|
||||
@@ -95,15 +111,20 @@
|
||||
3,
|
||||
4
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "contains_1_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "contains keyword with boolean schema true",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"contains": true
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contains": true,
|
||||
"$id": "contains_2_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@@ -111,20 +132,26 @@
|
||||
"data": [
|
||||
"foo"
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "contains_2_0"
|
||||
},
|
||||
{
|
||||
"description": "empty array is invalid",
|
||||
"data": [],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "contains_2_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "contains keyword with boolean schema false",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"contains": false
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contains": false,
|
||||
"$id": "contains_3_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@@ -132,30 +159,37 @@
|
||||
"data": [
|
||||
"foo"
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "contains_3_0"
|
||||
},
|
||||
{
|
||||
"description": "empty array is invalid",
|
||||
"data": [],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "contains_3_0"
|
||||
},
|
||||
{
|
||||
"description": "non-arrays are valid",
|
||||
"data": "contains does not apply to strings",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "contains_3_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "items + contains",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"items": {
|
||||
"multipleOf": 2
|
||||
},
|
||||
"contains": {
|
||||
"multipleOf": 3
|
||||
},
|
||||
"$id": "contains_4_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@@ -165,7 +199,8 @@
|
||||
4,
|
||||
8
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "contains_4_0"
|
||||
},
|
||||
{
|
||||
"description": "does not match items, matches contains",
|
||||
@@ -174,7 +209,8 @@
|
||||
6,
|
||||
9
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "contains_4_0"
|
||||
},
|
||||
{
|
||||
"description": "matches both items and contains",
|
||||
@@ -182,7 +218,8 @@
|
||||
6,
|
||||
12
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "contains_4_0"
|
||||
},
|
||||
{
|
||||
"description": "matches neither items nor contains",
|
||||
@@ -190,18 +227,23 @@
|
||||
1,
|
||||
5
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "contains_4_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "contains with false if subschema",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contains": {
|
||||
"if": false,
|
||||
"else": true
|
||||
},
|
||||
"$id": "contains_5_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@@ -209,22 +251,28 @@
|
||||
"data": [
|
||||
"foo"
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "contains_5_0"
|
||||
},
|
||||
{
|
||||
"description": "empty array is invalid",
|
||||
"data": [],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "contains_5_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "contains with null instance elements",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contains": {
|
||||
"type": "null"
|
||||
},
|
||||
"$id": "contains_6_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@@ -232,18 +280,23 @@
|
||||
"data": [
|
||||
null
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "contains_6_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "extensible: true allows non-matching items in contains",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contains": {
|
||||
"const": 1
|
||||
},
|
||||
"extensible": true
|
||||
"extensible": true,
|
||||
"$id": "contains_7_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@@ -252,17 +305,22 @@
|
||||
1,
|
||||
2
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "contains_7_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "strict by default: non-matching items in contains are invalid",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contains": {
|
||||
"const": 1
|
||||
},
|
||||
"$id": "contains_8_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@@ -271,7 +329,8 @@
|
||||
1,
|
||||
2
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "contains_8_0"
|
||||
},
|
||||
{
|
||||
"description": "only matching items is valid",
|
||||
@@ -279,7 +338,8 @@
|
||||
1,
|
||||
1
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "contains_8_0"
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
92
tests/fixtures/content.json
vendored
@@ -1,86 +1,109 @@
|
||||
[
|
||||
{
|
||||
"description": "validation of string-encoded content based on media type",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"contentMediaType": "application/json"
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contentMediaType": "application/json",
|
||||
"$id": "content_0_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "a valid JSON document",
|
||||
"data": "{\"foo\": \"bar\"}",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_0_0"
|
||||
},
|
||||
{
|
||||
"description": "an invalid JSON document; validates true",
|
||||
"data": "{:}",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_0_0"
|
||||
},
|
||||
{
|
||||
"description": "ignores non-strings",
|
||||
"data": 100,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_0_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "validation of binary string-encoding",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"contentEncoding": "base64"
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contentEncoding": "base64",
|
||||
"$id": "content_1_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "a valid base64 string",
|
||||
"data": "eyJmb28iOiAiYmFyIn0K",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_1_0"
|
||||
},
|
||||
{
|
||||
"description": "an invalid base64 string (% is not a valid character); validates true",
|
||||
"data": "eyJmb28iOi%iYmFyIn0K",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_1_0"
|
||||
},
|
||||
{
|
||||
"description": "ignores non-strings",
|
||||
"data": 100,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_1_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "validation of binary-encoded media type documents",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contentMediaType": "application/json",
|
||||
"contentEncoding": "base64"
|
||||
"contentEncoding": "base64",
|
||||
"$id": "content_2_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "a valid base64-encoded JSON document",
|
||||
"data": "eyJmb28iOiAiYmFyIn0K",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_2_0"
|
||||
},
|
||||
{
|
||||
"description": "a validly-encoded invalid JSON document; validates true",
|
||||
"data": "ezp9Cg==",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_2_0"
|
||||
},
|
||||
{
|
||||
"description": "an invalid base64 string that is valid JSON; validates true",
|
||||
"data": "{}",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_2_0"
|
||||
},
|
||||
{
|
||||
"description": "ignores non-strings",
|
||||
"data": 100,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_2_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "validation of binary-encoded media type documents with schema",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contentMediaType": "application/json",
|
||||
"contentEncoding": "base64",
|
||||
"contentSchema": {
|
||||
@@ -96,48 +119,59 @@
|
||||
"type": "integer"
|
||||
}
|
||||
}
|
||||
},
|
||||
"$id": "content_3_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "a valid base64-encoded JSON document",
|
||||
"data": "eyJmb28iOiAiYmFyIn0K",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_3_0"
|
||||
},
|
||||
{
|
||||
"description": "another valid base64-encoded JSON document",
|
||||
"data": "eyJib28iOiAyMCwgImZvbyI6ICJiYXoifQ==",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_3_0"
|
||||
},
|
||||
{
|
||||
"description": "an invalid base64-encoded JSON document; validates true",
|
||||
"data": "eyJib28iOiAyMH0=",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_3_0"
|
||||
},
|
||||
{
|
||||
"description": "an empty object as a base64-encoded JSON document; validates true",
|
||||
"data": "e30=",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_3_0"
|
||||
},
|
||||
{
|
||||
"description": "an empty array as a base64-encoded JSON document",
|
||||
"data": "W10=",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_3_0"
|
||||
},
|
||||
{
|
||||
"description": "a validly-encoded invalid JSON document; validates true",
|
||||
"data": "ezp9Cg==",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_3_0"
|
||||
},
|
||||
{
|
||||
"description": "an invalid base64 string that is valid JSON; validates true",
|
||||
"data": "{}",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_3_0"
|
||||
},
|
||||
{
|
||||
"description": "ignores non-strings",
|
||||
"data": 100,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "content_3_0"
|
||||
}
|
||||
]
|
||||
}
|
||||
|
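As an aside on the fixture above: the `data` strings are base64-encoded JSON documents, and in draft 2020-12 the `contentMediaType`/`contentEncoding`/`contentSchema` keywords are annotations only, which is why every case expects `"valid": true` even for malformed payloads. A minimal sketch (plain Python, not part of the extension) decoding two of the payloads:

```python
import base64
import json

# First test case's data: decodes to a JSON document with "foo".
payload = base64.b64decode("eyJmb28iOiAiYmFyIn0K")
doc = json.loads(payload)
assert doc == {"foo": "bar"}

# A payload can decode cleanly yet not satisfy the contentSchema:
# this one decodes to an object without "foo", but content* keywords
# are annotations, so the fixture still expects valid: true.
assert json.loads(base64.b64decode("eyJib28iOiAyMH0=")) == {"boo": 20}
```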
619 tests/fixtures/dependencies.json vendored Normal file
@@ -0,0 +1,619 @@
[
{
"description": "single dependency (required)",
"database": {
"schemas": [
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "schema1",
"dependencies": {
"bar": [
"foo"
]
},
"extensible": true
}
]
},
"tests": [
{
"description": "neither",
"data": {},
"valid": true,
"schema_id": "schema1"
},
{
"description": "nondependant",
"data": {
"foo": 1
},
"valid": true,
"schema_id": "schema1"
},
{
"description": "with dependency",
"data": {
"foo": 1,
"bar": 2
},
"valid": true,
"schema_id": "schema1"
},
{
"description": "missing dependency",
"data": {
"bar": 2
},
"valid": false,
"schema_id": "schema1"
},
{
"description": "ignores arrays",
"data": [
"bar"
],
"valid": true,
"schema_id": "schema1"
},
{
"description": "ignores strings",
"data": "foobar",
"valid": true,
"schema_id": "schema1"
},
{
"description": "ignores other non-objects",
"data": 12,
"valid": true,
"schema_id": "schema1"
}
]
},
{
"description": "empty dependents",
"database": {
"schemas": [
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "schema2",
"dependencies": {
"bar": []
},
"extensible": true
}
]
},
"tests": [
{
"description": "empty object",
"data": {},
"valid": true,
"schema_id": "schema2"
},
{
"description": "object with one property",
"data": {
"bar": 2
},
"valid": true,
"schema_id": "schema2"
},
{
"description": "non-object is valid",
"data": 1,
"valid": true,
"schema_id": "schema2"
}
]
},
{
"description": "multiple dependents required",
"database": {
"schemas": [
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "schema3",
"dependencies": {
"quux": [
"foo",
"bar"
]
},
"extensible": true
}
]
},
"tests": [
{
"description": "neither",
"data": {},
"valid": true,
"schema_id": "schema3"
},
{
"description": "nondependants",
"data": {
"foo": 1,
"bar": 2
},
"valid": true,
"schema_id": "schema3"
},
{
"description": "with dependencies",
"data": {
"foo": 1,
"bar": 2,
"quux": 3
},
"valid": true,
"schema_id": "schema3"
},
{
"description": "missing dependency",
"data": {
"foo": 1,
"quux": 2
},
"valid": false,
"schema_id": "schema3"
},
{
"description": "missing other dependency",
"data": {
"bar": 1,
"quux": 2
},
"valid": false,
"schema_id": "schema3"
},
{
"description": "missing both dependencies",
"data": {
"quux": 1
},
"valid": false,
"schema_id": "schema3"
}
]
},
{
"description": "dependencies with escaped characters",
"database": {
"schemas": [
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "schema4",
"dependencies": {
"foo\nbar": [
"foo\rbar"
],
"foo\"bar": [
"foo'bar"
]
},
"extensible": true
}
]
},
"tests": [
{
"description": "CRLF",
"data": {
"foo\nbar": 1,
"foo\rbar": 2
},
"valid": true,
"schema_id": "schema4"
},
{
"description": "quoted quotes",
"data": {
"foo'bar": 1,
"foo\"bar": 2
},
"valid": true,
"schema_id": "schema4"
},
{
"description": "CRLF missing dependent",
"data": {
"foo\nbar": 1,
"foo": 2
},
"valid": false,
"schema_id": "schema4"
},
{
"description": "quoted quotes missing dependent",
"data": {
"foo\"bar": 2
},
"valid": false,
"schema_id": "schema4"
}
]
},
{
"description": "extensible: true allows extra properties in dependentRequired",
"database": {
"schemas": [
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "schema5",
"dependencies": {
"bar": [
"foo"
]
},
"extensible": true
}
]
},
"tests": [
{
"description": "extra property is valid",
"data": {
"foo": 1,
"bar": 2,
"baz": 3
},
"valid": true,
"schema_id": "schema5"
}
]
},
{
"description": "single dependency (schemas, STRICT)",
"database": {
"schemas": [
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "schema_schema1",
"properties": {
"foo": true,
"bar": true
},
"dependencies": {
"bar": {
"properties": {
"foo": {
"type": "integer"
},
"bar": {
"type": "integer"
}
}
}
}
}
]
},
"tests": [
{
"description": "valid",
"data": {
"foo": 1,
"bar": 2
},
"valid": true,
"schema_id": "schema_schema1"
},
{
"description": "no dependency",
"data": {
"foo": "quux"
},
"valid": true,
"schema_id": "schema_schema1"
},
{
"description": "wrong type",
"data": {
"foo": "quux",
"bar": 2
},
"valid": false,
"schema_id": "schema_schema1"
},
{
"description": "wrong type other",
"data": {
"foo": 2,
"bar": "quux"
},
"valid": false,
"schema_id": "schema_schema1"
},
{
"description": "wrong type both",
"data": {
"foo": "quux",
"bar": "quux"
},
"valid": false,
"schema_id": "schema_schema1"
},
{
"description": "ignores arrays (invalid in strict mode)",
"data": [
"bar"
],
"valid": false,
"expect_errors": [
{
"code": "STRICT_ITEM_VIOLATION"
}
],
"schema_id": "schema_schema1"
},
{
"description": "ignores strings",
"data": "foobar",
"valid": true,
"schema_id": "schema_schema1"
},
{
"description": "ignores other non-objects",
"data": 12,
"valid": true,
"schema_id": "schema_schema1"
}
]
},
{
"description": "single dependency (schemas, EXTENSIBLE)",
"database": {
"schemas": [
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "schema_schema2",
"properties": {
"foo": true,
"bar": true
},
"dependencies": {
"bar": {
"properties": {
"foo": {
"type": "integer"
},
"bar": {
"type": "integer"
}
}
}
},
"extensible": true
}
]
},
"tests": [
{
"description": "ignores arrays (valid in extensible mode)",
"data": [
"bar"
],
"valid": true,
"schema_id": "schema_schema2"
}
]
},
{
"description": "boolean subschemas",
"database": {
"schemas": [
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "schema_schema3",
"properties": {
"foo": true,
"bar": true
},
"dependencies": {
"foo": true,
"bar": false
}
}
]
},
"tests": [
{
"description": "object with property having schema true is valid",
"data": {
"foo": 1
},
"valid": true,
"schema_id": "schema_schema3"
},
{
"description": "object with property having schema false is invalid",
"data": {
"bar": 2
},
"valid": false,
"schema_id": "schema_schema3"
},
{
"description": "object with both properties is invalid",
"data": {
"foo": 1,
"bar": 2
},
"valid": false,
"schema_id": "schema_schema3"
},
{
"description": "empty object is valid",
"data": {},
"valid": true,
"schema_id": "schema_schema3"
}
]
},
{
"description": "dependencies with escaped characters",
"database": {
"schemas": [
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "schema_schema4",
"properties": {
"foo\tbar": true,
"foo'bar": true,
"a": true,
"b": true,
"c": true
},
"dependencies": {
"foo\tbar": {
"minProperties": 4,
"extensible": true
},
"foo'bar": {
"required": [
"foo\"bar"
]
}
}
}
]
},
"tests": [
{
"description": "quoted tab",
"data": {
"foo\tbar": 1,
"a": 2,
"b": 3,
"c": 4
},
"valid": true,
"schema_id": "schema_schema4"
},
{
"description": "quoted quote",
"data": {
"foo'bar": {
"foo\"bar": 1
}
},
"valid": false,
"schema_id": "schema_schema4"
},
{
"description": "quoted tab invalid under dependent schema",
"data": {
"foo\tbar": 1,
"a": 2
},
"valid": false,
"schema_id": "schema_schema4"
},
{
"description": "quoted quote invalid under dependent schema",
"data": {
"foo'bar": 1
},
"valid": false,
"schema_id": "schema_schema4"
}
]
},
{
"description": "dependent subschema incompatible with root (STRICT)",
"database": {
"schemas": [
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "schema_schema5",
"properties": {
"foo": {},
"baz": true
},
"dependencies": {
"foo": {
"properties": {
"bar": {}
}
}
}
}
]
},
"tests": [
{
"description": "matches root",
"data": {
"foo": 1
},
"valid": false,
"schema_id": "schema_schema5"
},
{
"description": "matches dependency (invalid in strict mode - bar not allowed if foo missing)",
"data": {
"bar": 1
},
"valid": false,
"expect_errors": [
{
"code": "STRICT_PROPERTY_VIOLATION"
}
],
"schema_id": "schema_schema5"
},
{
"description": "matches both",
"data": {
"foo": 1,
"bar": 2
},
"valid": false,
"schema_id": "schema_schema5"
},
{
"description": "no dependency",
"data": {
"baz": 1
},
"valid": true,
"schema_id": "schema_schema5"
}
]
},
{
"description": "dependent subschema incompatible with root (EXTENSIBLE)",
"database": {
"schemas": [
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "schema_schema6",
"properties": {
"foo": {},
"baz": true
},
"dependencies": {
"foo": {
"properties": {
"bar": {}
},
"additionalProperties": false
}
},
"extensible": true
}
]
},
"tests": [
{
"description": "matches dependency (valid in extensible mode)",
"data": {
"bar": 1
},
"valid": true,
"schema_id": "schema_schema6"
}
]
}
]
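A brief sketch of how a harness might consume the fixture file above: each group loads its `database.schemas`, then runs every case in `tests` against the schema named by `schema_id`. The checker below is a hypothetical illustration covering only the required-style (array-valued) `dependencies` form; the extension's real semantics, including strict/extensible modes and schema-form dependencies, live in the Rust sources.

```python
# Hypothetical checker for array-valued "dependencies" only:
# if the trigger property is present, every listed property must also be.
def check_required_dependencies(schema, data):
    if not isinstance(data, dict):
        return True  # the keyword ignores non-objects
    for trigger, required in schema.get("dependencies", {}).items():
        if not isinstance(required, list):
            continue  # schema-form dependencies are out of scope here
        if trigger in data and any(dep not in data for dep in required):
            return False
    return True

# One fixture group in the same shape as the file above.
group = {
    "database": {"schemas": [
        {"$id": "schema1", "dependencies": {"bar": ["foo"]}, "extensible": True}
    ]},
    "tests": [
        {"description": "neither", "data": {}, "valid": True, "schema_id": "schema1"},
        {"description": "missing dependency", "data": {"bar": 2}, "valid": False, "schema_id": "schema1"},
        {"description": "ignores other non-objects", "data": 12, "valid": True, "schema_id": "schema1"},
    ],
}
schemas = {s["$id"]: s for s in group["database"]["schemas"]}
for case in group["tests"]:
    got = check_required_dependencies(schemas[case["schema_id"]], case["data"])
    assert got == case["valid"], case["description"]
```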
220 tests/fixtures/dependentRequired.json vendored
@@ -1,220 +0,0 @@
[
{
"description": "single dependency",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"dependentRequired": {
"bar": [
"foo"
]
},
"extensible": true
},
"tests": [
{
"description": "neither",
"data": {},
"valid": true
},
{
"description": "nondependant",
"data": {
"foo": 1
},
"valid": true
},
{
"description": "with dependency",
"data": {
"foo": 1,
"bar": 2
},
"valid": true
},
{
"description": "missing dependency",
"data": {
"bar": 2
},
"valid": false
},
{
"description": "ignores arrays",
"data": [
"bar"
],
"valid": true
},
{
"description": "ignores strings",
"data": "foobar",
"valid": true
},
{
"description": "ignores other non-objects",
"data": 12,
"valid": true
}
]
},
{
"description": "empty dependents",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"dependentRequired": {
"bar": []
},
"extensible": true
},
"tests": [
{
"description": "empty object",
"data": {},
"valid": true
},
{
"description": "object with one property",
"data": {
"bar": 2
},
"valid": true
},
{
"description": "non-object is valid",
"data": 1,
"valid": true
}
]
},
{
"description": "multiple dependents required",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"dependentRequired": {
"quux": [
"foo",
"bar"
]
},
"extensible": true
},
"tests": [
{
"description": "neither",
"data": {},
"valid": true
},
{
"description": "nondependants",
"data": {
"foo": 1,
"bar": 2
},
"valid": true
},
{
"description": "with dependencies",
"data": {
"foo": 1,
"bar": 2,
"quux": 3
},
"valid": true
},
{
"description": "missing dependency",
"data": {
"foo": 1,
"quux": 2
},
"valid": false
},
{
"description": "missing other dependency",
"data": {
"bar": 1,
"quux": 2
},
"valid": false
},
{
"description": "missing both dependencies",
"data": {
"quux": 1
},
"valid": false
}
]
},
{
"description": "dependencies with escaped characters",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"dependentRequired": {
"foo\nbar": [
"foo\rbar"
],
"foo\"bar": [
"foo'bar"
]
},
"extensible": true
},
"tests": [
{
"description": "CRLF",
"data": {
"foo\nbar": 1,
"foo\rbar": 2
},
"valid": true
},
{
"description": "quoted quotes",
"data": {
"foo'bar": 1,
"foo\"bar": 2
},
"valid": true
},
{
"description": "CRLF missing dependent",
"data": {
"foo\nbar": 1,
"foo": 2
},
"valid": false
},
{
"description": "quoted quotes missing dependent",
"data": {
"foo\"bar": 2
},
"valid": false
}
]
},
{
"description": "extensible: true allows extra properties in dependentRequired",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"dependentRequired": {
"bar": [
"foo"
]
},
"extensible": true
},
"tests": [
{
"description": "extra property is valid",
"data": {
"foo": 1,
"bar": 2,
"baz": 3
},
"valid": true
}
]
}
]
303 tests/fixtures/dependentSchemas.json vendored
@@ -1,303 +0,0 @@
[
{
"description": "single dependency (STRICT)",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"properties": {
"foo": true,
"bar": true
},
"dependentSchemas": {
"bar": {
"properties": {
"foo": {
"type": "integer"
},
"bar": {
"type": "integer"
}
}
}
}
},
"tests": [
{
"description": "valid",
"data": {
"foo": 1,
"bar": 2
},
"valid": true
},
{
"description": "no dependency",
"data": {
"foo": "quux"
},
"valid": true
},
{
"description": "wrong type",
"data": {
"foo": "quux",
"bar": 2
},
"valid": false
},
{
"description": "wrong type other",
"data": {
"foo": 2,
"bar": "quux"
},
"valid": false
},
{
"description": "wrong type both",
"data": {
"foo": "quux",
"bar": "quux"
},
"valid": false
},
{
"description": "ignores arrays (invalid in strict mode)",
"data": [
"bar"
],
"valid": false,
"expect_errors": [
{
"code": "STRICT_ITEM_VIOLATION"
}
]
},
{
|
||||
"description": "ignores strings (invalid in strict mode - wait, strings are scalars, strict only checks obj/arr)",
|
||||
"data": "foobar",
|
||||
"valid": true
|
||||
},
{
"description": "ignores other non-objects",
"data": 12,
"valid": true
}
]
},
{
"description": "single dependency (EXTENSIBLE)",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"properties": {
"foo": true,
"bar": true
},
"dependentSchemas": {
"bar": {
"properties": {
"foo": {
"type": "integer"
},
"bar": {
"type": "integer"
}
}
}
},
"extensible": true
},
"tests": [
{
"description": "ignores arrays (valid in extensible mode)",
"data": [
"bar"
],
"valid": true
}
]
},
{
"description": "boolean subschemas",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"properties": {
"foo": true,
"bar": true
},
"dependentSchemas": {
"foo": true,
"bar": false
}
},
"tests": [
{
"description": "object with property having schema true is valid",
"data": {
"foo": 1
},
"valid": true
},
{
"description": "object with property having schema false is invalid",
"data": {
"bar": 2
},
"valid": false
},
{
"description": "object with both properties is invalid",
"data": {
"foo": 1,
"bar": 2
},
"valid": false
},
{
"description": "empty object is valid",
"data": {},
"valid": true
}
]
},
{
"description": "dependencies with escaped characters",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"properties": {
"foo\tbar": true,
"foo'bar": true,
"a": true,
"b": true,
"c": true
},
"dependentSchemas": {
"foo\tbar": {
"minProperties": 4,
"extensible": true
},
"foo'bar": {
"required": [
"foo\"bar"
]
}
}
},
"tests": [
{
"description": "quoted tab",
"data": {
"foo\tbar": 1,
"a": 2,
"b": 3,
"c": 4
},
"valid": true
},
{
"description": "quoted quote",
"data": {
"foo'bar": {
"foo\"bar": 1
}
},
"valid": false
},
{
"description": "quoted tab invalid under dependent schema",
"data": {
"foo\tbar": 1,
"a": 2
},
"valid": false
},
{
"description": "quoted quote invalid under dependent schema",
"data": {
"foo'bar": 1
},
"valid": false
}
]
},
{
"description": "dependent subschema incompatible with root (STRICT)",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"properties": {
"foo": {},
"baz": true
},
"dependentSchemas": {
"foo": {
"properties": {
"bar": {}
}
}
}
},
"tests": [
{
"description": "matches root",
"data": {
"foo": 1
},
"valid": false
},
{
"description": "matches dependency (invalid in strict mode - bar not allowed if foo missing)",
"data": {
"bar": 1
},
"valid": false,
"expect_errors": [
{
"code": "STRICT_PROPERTY_VIOLATION"
}
]
},
{
"description": "matches both",
"data": {
"foo": 1,
"bar": 2
},
"valid": false
},
{
"description": "no dependency",
"data": {
"baz": 1
},
"valid": true
}
]
},
{
"description": "dependent subschema incompatible with root (EXTENSIBLE)",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"properties": {
"foo": {},
"baz": true
},
"dependentSchemas": {
"foo": {
"properties": {
"bar": {}
},
"additionalProperties": false
}
},
"extensible": true
},
"tests": [
{
"description": "matches dependency (valid in extensible mode)",
"data": {
"bar": 1
},
"valid": true
}
]
}
]
1111 tests/fixtures/dynamicRef.json vendored
(File diff suppressed because it is too large)
38 tests/fixtures/emptyString.json vendored
@@ -1,8 +1,9 @@
[
{
"description": "empty string is valid for all types (except const)",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"properties": {
"obj": {
"type": "object"
@@ -35,7 +36,10 @@
"con_empty": {
"const": ""
}
},
"$id": "emptyString_0_0"
}
]
},
"tests": [
{
@@ -43,56 +47,64 @@
"data": {
"obj": ""
},
"valid": true
"valid": true,
"schema_id": "emptyString_0_0"
},
{
"description": "empty string valid for array",
"data": {
"arr": ""
},
"valid": true
"valid": true,
"schema_id": "emptyString_0_0"
},
{
"description": "empty string valid for string",
"data": {
"str": ""
},
"valid": true
"valid": true,
"schema_id": "emptyString_0_0"
},
{
"description": "empty string valid for integer",
"data": {
"int": ""
},
"valid": true
"valid": true,
"schema_id": "emptyString_0_0"
},
{
"description": "empty string valid for number",
"data": {
"num": ""
},
"valid": true
"valid": true,
"schema_id": "emptyString_0_0"
},
{
"description": "empty string valid for boolean",
"data": {
"bool": ""
},
"valid": true
"valid": true,
"schema_id": "emptyString_0_0"
},
{
"description": "empty string valid for null",
"data": {
"nul": ""
},
"valid": true
"valid": true,
"schema_id": "emptyString_0_0"
},
{
"description": "empty string valid for format",
"data": {
"fmt": ""
},
"valid": true
"valid": true,
"schema_id": "emptyString_0_0"
},
{
"description": "empty string INVALID for const (unless const is empty string)",
@@ -105,14 +117,16 @@
"code": "CONST_VIOLATED",
"path": "/con"
}
]
],
"schema_id": "emptyString_0_0"
},
{
"description": "empty string VALID for const if const IS empty string",
"data": {
"con_empty": ""
},
"valid": true
"valid": true,
"schema_id": "emptyString_0_0"
}
]
}
279 tests/fixtures/enum.json vendored
@@ -1,31 +1,38 @@
|
||||
[
|
||||
{
|
||||
"description": "simple enum validation",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
1,
|
||||
2,
|
||||
3
|
||||
],
|
||||
"$id": "enum_0_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "one of the enum is valid",
|
||||
"data": 1,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_0_0"
|
||||
},
|
||||
{
|
||||
"description": "something else is invalid",
|
||||
"data": 4,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_0_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "heterogeneous enum validation",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
6,
|
||||
"foo",
|
||||
@ -37,32 +44,39 @@
|
||||
],
|
||||
"properties": {
|
||||
"foo": {}
|
||||
},
|
||||
"$id": "enum_1_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "one of the enum is valid",
|
||||
"data": [],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_1_0"
|
||||
},
|
||||
{
|
||||
"description": "something else is invalid",
|
||||
"data": null,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_1_0"
|
||||
},
|
||||
{
|
||||
"description": "objects are deep compared",
|
||||
"data": {
|
||||
"foo": false
|
||||
},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_1_0"
|
||||
},
|
||||
{
|
||||
"description": "valid object matches",
|
||||
"data": {
|
||||
"foo": 12
|
||||
},
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_1_0"
|
||||
},
|
||||
{
|
||||
"description": "extra properties in object is invalid",
|
||||
@ -70,41 +84,50 @@
|
||||
"foo": 12,
|
||||
"boo": 42
|
||||
},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_1_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "heterogeneous enum-with-null validation",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
6,
|
||||
null
|
||||
],
|
||||
"$id": "enum_2_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "null is valid",
|
||||
"data": null,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_2_0"
|
||||
},
|
||||
{
|
||||
"description": "number is valid",
|
||||
"data": 6,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_2_0"
|
||||
},
|
||||
{
|
||||
"description": "something else is invalid",
|
||||
"data": "test",
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_2_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "enums in properties",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"foo": {
|
||||
@ -120,6 +143,9 @@
|
||||
},
|
||||
"required": [
|
||||
"bar"
|
||||
],
|
||||
"$id": "enum_3_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
@ -129,7 +155,8 @@
|
||||
"foo": "foo",
|
||||
"bar": "bar"
|
||||
},
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_3_0"
|
||||
},
|
||||
{
|
||||
"description": "wrong foo value",
|
||||
@ -137,7 +164,8 @@
|
||||
"foo": "foot",
|
||||
"bar": "bar"
|
||||
},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_3_0"
|
||||
},
|
||||
{
|
||||
"description": "wrong bar value",
|
||||
@ -145,90 +173,112 @@
|
||||
"foo": "foo",
|
||||
"bar": "bart"
|
||||
},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_3_0"
|
||||
},
|
||||
{
|
||||
"description": "missing optional property is valid",
|
||||
"data": {
|
||||
"bar": "bar"
|
||||
},
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_3_0"
|
||||
},
|
||||
{
|
||||
"description": "missing required property is invalid",
|
||||
"data": {
|
||||
"foo": "foo"
|
||||
},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_3_0"
|
||||
},
|
||||
{
|
||||
"description": "missing all properties is invalid",
|
||||
"data": {},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_3_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "enum with escaped characters",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
"foo\nbar",
|
||||
"foo\rbar"
|
||||
],
|
||||
"$id": "enum_4_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "member 1 is valid",
|
||||
"data": "foo\nbar",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_4_0"
|
||||
},
|
||||
{
|
||||
"description": "member 2 is valid",
|
||||
"data": "foo\rbar",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_4_0"
|
||||
},
|
||||
{
|
||||
"description": "another string is invalid",
|
||||
"data": "abc",
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_4_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "enum with false does not match 0",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
false
|
||||
],
|
||||
"$id": "enum_5_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "false is valid",
|
||||
"data": false,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_5_0"
|
||||
},
|
||||
{
|
||||
"description": "integer zero is invalid",
|
||||
"data": 0,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_5_0"
|
||||
},
|
||||
{
|
||||
"description": "float zero is invalid",
|
||||
"data": 0.0,
|
||||
"valid": false
|
||||
"data": 0,
|
||||
"valid": false,
|
||||
"schema_id": "enum_5_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "enum with [false] does not match [0]",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
[
|
||||
false
|
||||
]
|
||||
],
|
||||
"$id": "enum_6_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
@ -237,58 +287,72 @@
|
||||
"data": [
|
||||
false
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_6_0"
|
||||
},
|
||||
{
|
||||
"description": "[0] is invalid",
|
||||
"data": [
|
||||
0
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_6_0"
|
||||
},
|
||||
{
|
||||
"description": "[0.0] is invalid",
|
||||
"data": [
|
||||
0.0
|
||||
0
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_6_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "enum with true does not match 1",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
true
|
||||
],
|
||||
"$id": "enum_7_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "true is valid",
|
||||
"data": true,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_7_0"
|
||||
},
|
||||
{
|
||||
"description": "integer one is invalid",
|
||||
"data": 1,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_7_0"
|
||||
},
|
||||
{
|
||||
"description": "float one is invalid",
|
||||
"data": 1.0,
|
||||
"valid": false
|
||||
"data": 1,
|
||||
"valid": false,
|
||||
"schema_id": "enum_7_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "enum with [true] does not match [1]",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
[
|
||||
true
|
||||
]
|
||||
],
|
||||
"$id": "enum_8_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
@ -297,58 +361,72 @@
|
||||
"data": [
|
||||
true
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_8_0"
|
||||
},
|
||||
{
|
||||
"description": "[1] is invalid",
|
||||
"data": [
|
||||
1
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_8_0"
|
||||
},
|
||||
{
|
||||
"description": "[1.0] is invalid",
|
||||
"data": [
|
||||
1.0
|
||||
1
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_8_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "enum with 0 does not match false",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
0
|
||||
],
|
||||
"$id": "enum_9_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "false is invalid",
|
||||
"data": false,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_9_0"
|
||||
},
|
||||
{
|
||||
"description": "integer zero is valid",
|
||||
"data": 0,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_9_0"
|
||||
},
|
||||
{
|
||||
"description": "float zero is valid",
|
||||
"data": 0.0,
|
||||
"valid": true
|
||||
"data": 0,
|
||||
"valid": true,
|
||||
"schema_id": "enum_9_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "enum with [0] does not match [false]",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
[
|
||||
0
|
||||
]
|
||||
],
|
||||
"$id": "enum_10_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
@ -357,58 +435,72 @@
|
||||
"data": [
|
||||
false
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_10_0"
|
||||
},
|
||||
{
|
||||
"description": "[0] is valid",
|
||||
"data": [
|
||||
0
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_10_0"
|
||||
},
|
||||
{
|
||||
"description": "[0.0] is valid",
|
||||
"data": [
|
||||
0.0
|
||||
0
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_10_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "enum with 1 does not match true",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
1
|
||||
],
|
||||
"$id": "enum_11_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "true is invalid",
|
||||
"data": true,
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_11_0"
|
||||
},
|
||||
{
|
||||
"description": "integer one is valid",
|
||||
"data": 1,
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_11_0"
|
||||
},
|
||||
{
|
||||
"description": "float one is valid",
|
||||
"data": 1.0,
|
||||
"valid": true
|
||||
"data": 1,
|
||||
"valid": true,
|
||||
"schema_id": "enum_11_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "enum with [1] does not match [true]",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
[
|
||||
1
|
||||
]
|
||||
],
|
||||
"$id": "enum_12_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
@ -417,55 +509,68 @@
|
||||
"data": [
|
||||
true
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_12_0"
|
||||
},
|
||||
{
|
||||
"description": "[1] is valid",
|
||||
"data": [
|
||||
1
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_12_0"
|
||||
},
|
||||
{
|
||||
"description": "[1.0] is valid",
|
||||
"data": [
|
||||
1.0
|
||||
1
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_12_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "nul characters in strings",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
"hello\u0000there"
|
||||
],
|
||||
"$id": "enum_13_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "match string with nul",
|
||||
"data": "hello\u0000there",
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_13_0"
|
||||
},
|
||||
{
|
||||
"description": "do not match string lacking nul",
|
||||
"data": "hellothere",
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_13_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "extensible: true allows extra properties in enum object match",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"enum": [
|
||||
{
|
||||
"foo": 1
|
||||
}
|
||||
],
|
||||
"extensible": true
|
||||
"extensible": true,
|
||||
"$id": "enum_14_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@ -474,14 +579,16 @@
|
||||
"foo": 1,
|
||||
"bar": 2
|
||||
},
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "enum_14_0"
|
||||
},
|
||||
{
|
||||
"description": "extra property ignored during strict check, enum match succeeds",
|
||||
"data": {
|
||||
"foo": 1
|
||||
},
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "enum_14_0"
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
24 tests/fixtures/exclusiveMaximum.json vendored
@@ -1,30 +1,38 @@
 [
   {
     "description": "exclusiveMaximum validation",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
-      "exclusiveMaximum": 3.0
+      "database": {
+        "schemas": [
+          {
+            "exclusiveMaximum": 3,
+            "$id": "exclusiveMaximum_0_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "below the exclusiveMaximum is valid",
         "data": 2.2,
-        "valid": true
+        "valid": true,
+        "schema_id": "exclusiveMaximum_0_0"
       },
       {
         "description": "boundary point is invalid",
-        "data": 3.0,
-        "valid": false
+        "data": 3,
+        "valid": false,
+        "schema_id": "exclusiveMaximum_0_0"
       },
       {
         "description": "above the exclusiveMaximum is invalid",
         "data": 3.5,
-        "valid": false
+        "valid": false,
+        "schema_id": "exclusiveMaximum_0_0"
       },
       {
         "description": "ignores non-numbers",
         "data": "x",
-        "valid": true
+        "valid": true,
+        "schema_id": "exclusiveMaximum_0_0"
       }
     ]
   }
22 tests/fixtures/exclusiveMinimum.json vendored
@@ -1,30 +1,38 @@
 [
   {
     "description": "exclusiveMinimum validation",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
-      "exclusiveMinimum": 1.1
+      "database": {
+        "schemas": [
+          {
+            "exclusiveMinimum": 1.1,
+            "$id": "exclusiveMinimum_0_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "above the exclusiveMinimum is valid",
         "data": 1.2,
-        "valid": true
+        "valid": true,
+        "schema_id": "exclusiveMinimum_0_0"
       },
       {
         "description": "boundary point is invalid",
         "data": 1.1,
-        "valid": false
+        "valid": false,
+        "schema_id": "exclusiveMinimum_0_0"
       },
       {
         "description": "below the exclusiveMinimum is invalid",
         "data": 0.6,
-        "valid": false
+        "valid": false,
+        "schema_id": "exclusiveMinimum_0_0"
       },
       {
         "description": "ignores non-numbers",
         "data": "x",
-        "valid": true
+        "valid": true,
+        "schema_id": "exclusiveMinimum_0_0"
       }
     ]
   }
199 tests/fixtures/families.json vendored Normal file
@@ -0,0 +1,199 @@
[
  {
    "description": "Entity families via pure $ref graph",
    "database": {
      "types": [
        {
          "name": "entity",
          "variations": [
            "entity",
            "organization",
            "person"
          ],
          "schemas": [
            {
              "$id": "entity",
              "type": "object",
              "properties": {
                "id": {
                  "type": "string"
                },
                "type": {
                  "type": "string"
                }
              }
            },
            {
              "$id": "light.entity",
              "$ref": "entity"
            }
          ]
        },
        {
          "name": "organization",
          "variations": [
            "organization",
            "person"
          ],
          "schemas": [
            {
              "$id": "organization",
              "$ref": "entity",
              "properties": {
                "name": {
                  "type": "string"
                }
              }
            }
          ]
        },
        {
          "name": "person",
          "variations": [
            "person"
          ],
          "schemas": [
            {
              "$id": "person",
              "$ref": "organization",
              "properties": {
                "first_name": {
                  "type": "string"
                }
              }
            },
            {
              "$id": "light.person",
              "$ref": "light.entity"
            }
          ]
        }
      ],
      "puncs": [
        {
          "name": "get_entities",
          "schemas": [
            {
              "$id": "get_entities.response",
              "$family": "entity"
            }
          ]
        },
        {
          "name": "get_light_entities",
          "schemas": [
            {
              "$id": "get_light_entities.response",
              "$family": "light.entity"
            }
          ]
        }
      ]
    },
    "tests": [
      {
        "description": "Family matches base entity",
        "schema_id": "get_entities.response",
        "data": {
          "id": "1",
          "type": "entity"
        },
        "valid": true
      },
      {
        "description": "Family matches descendant person",
        "schema_id": "get_entities.response",
        "data": {
          "id": "2",
          "type": "person",
          "name": "ACME",
          "first_name": "John"
        },
        "valid": true
      },
      {
        "description": "Graph family matches light.entity",
        "schema_id": "get_light_entities.response",
        "data": {
          "id": "3",
          "type": "entity"
        },
        "valid": true
      },
      {
        "description": "Graph family matches light.person (because it $refs light.entity)",
        "schema_id": "get_light_entities.response",
        "data": {
          "id": "4",
          "type": "person"
        },
        "valid": true
      },
      {
        "description": "Graph family excludes organization (missing light. schema that $refs light.entity)",
        "schema_id": "get_light_entities.response",
        "data": {
          "id": "5",
          "type": "organization",
          "name": "ACME"
        },
        "valid": false,
        "expect_errors": [
          {
            "code": "FAMILY_MISMATCH",
            "path": ""
          }
        ]
      }
    ]
  },
  {
    "description": "Ad-hoc non-entity families (using normal json-schema object structures)",
    "database": {
      "puncs": [
        {
          "name": "get_widgets",
          "schemas": [
            {
              "$id": "widget",
              "type": "object",
              "properties": {
                "id": {
                  "type": "string"
                },
                "widget_type": {
                  "type": "string"
                }
              }
            },
            {
              "$id": "special_widget",
              "$ref": "widget",
              "properties": {
                "special_feature": {
                  "type": "string"
                }
              }
            },
            {
              "$id": "get_widgets.response",
              "$family": "widget"
            }
          ]
        }
      ]
    },
    "tests": [
      {
        "description": "Ad-hoc family matches strictly by shape (no magic variations for base schemas)",
        "schema_id": "get_widgets.response",
        "data": {
          "id": "1",
          "widget_type": "special",
          "special_feature": "yes"
        },
        "valid": true
      }
    ]
  }
]
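The families.json fixture above exercises `$family` resolution over the `$ref` graph: a schema belongs to a family if it is the family's root schema or (transitively) `$refs` a member of it. As a rough, hypothetical sketch of that membership rule, not the extension's actual Rust implementation, the graph walk can be expressed as:

```python
# Hypothetical sketch of $family membership over a $ref graph.
# A schema is in family `root` if following its $ref chain reaches `root`.
# Schema names mirror the families.json fixture above.

def family_members(schemas, root):
    """Return the set of $ids whose $ref chain reaches `root`."""
    refs = {s["$id"]: s.get("$ref") for s in schemas}
    members = set()
    for sid in refs:
        cur, seen = sid, set()
        while cur is not None and cur not in seen:
            if cur == root:
                members.add(sid)
                break
            seen.add(cur)          # guard against $ref cycles
            cur = refs.get(cur)    # follow the $ref edge
    return members

schemas = [
    {"$id": "entity"},
    {"$id": "light.entity", "$ref": "entity"},
    {"$id": "organization", "$ref": "entity"},
    {"$id": "person", "$ref": "organization"},
    {"$id": "light.person", "$ref": "light.entity"},
]

print(sorted(family_members(schemas, "light.entity")))
# → ['light.entity', 'light.person']
```

Under this reading, `light.person` is in the `light.entity` family because it `$refs` `light.entity`, while `organization` is excluded because its chain (`organization` → `entity`) never passes through `light.entity` — matching the fixture's `FAMILY_MISMATCH` expectation.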
1926 tests/fixtures/format.json vendored
File diff suppressed because it is too large
215 tests/fixtures/if-then-else.json vendored
@@ -1,129 +1,162 @@
 [
   {
     "description": "ignore if without then or else",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "if": {
               "const": 0
             },
+            "$id": "if-then-else_0_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "valid when valid against lone if",
         "data": 0,
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_0_0"
       },
       {
         "description": "valid when invalid against lone if",
         "data": "hello",
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_0_0"
       }
     ]
   },
   {
     "description": "ignore then without if",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "then": {
               "const": 0
             },
+            "$id": "if-then-else_1_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "valid when valid against lone then",
         "data": 0,
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_1_0"
       },
       {
         "description": "valid when invalid against lone then",
         "data": "hello",
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_1_0"
       }
     ]
   },
   {
     "description": "ignore else without if",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "else": {
               "const": 0
             },
+            "$id": "if-then-else_2_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "valid when valid against lone else",
         "data": 0,
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_2_0"
       },
       {
         "description": "valid when invalid against lone else",
         "data": "hello",
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_2_0"
       }
     ]
   },
   {
     "description": "if and then without else",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "if": {
               "exclusiveMaximum": 0
             },
             "then": {
               "minimum": -10
             },
+            "$id": "if-then-else_3_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "valid through then",
         "data": -1,
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_3_0"
       },
       {
         "description": "invalid through then",
         "data": -100,
-        "valid": false
+        "valid": false,
+        "schema_id": "if-then-else_3_0"
       },
       {
         "description": "valid when if test fails",
         "data": 3,
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_3_0"
       }
     ]
   },
   {
     "description": "if and else without then",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "if": {
               "exclusiveMaximum": 0
             },
             "else": {
               "multipleOf": 2
             },
+            "$id": "if-then-else_4_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "valid when if test passes",
         "data": -1,
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_4_0"
       },
       {
         "description": "valid through else",
         "data": 4,
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_4_0"
       },
       {
         "description": "invalid through else",
         "data": 3,
-        "valid": false
+        "valid": false,
+        "schema_id": "if-then-else_4_0"
       }
     ]
   },
   {
     "description": "validate against correct branch, then vs else",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "if": {
               "exclusiveMaximum": 0
             },
@@ -132,35 +165,43 @@
             },
             "else": {
               "multipleOf": 2
             },
+            "$id": "if-then-else_5_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "valid through then",
         "data": -1,
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_5_0"
       },
       {
         "description": "invalid through then",
         "data": -100,
-        "valid": false
+        "valid": false,
+        "schema_id": "if-then-else_5_0"
       },
       {
         "description": "valid through else",
         "data": 4,
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_5_0"
       },
       {
         "description": "invalid through else",
         "data": 3,
-        "valid": false
+        "valid": false,
+        "schema_id": "if-then-else_5_0"
       }
     ]
   },
   {
     "description": "non-interference across combined schemas",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "allOf": [
               {
                 "if": {
@@ -177,75 +218,93 @@
                 "multipleOf": 2
               }
             }
           ],
+          "$id": "if-then-else_6_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "valid, but would have been invalid through then",
         "data": -100,
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_6_0"
       },
       {
         "description": "valid, but would have been invalid through else",
         "data": 3,
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_6_0"
       }
     ]
   },
   {
     "description": "if with boolean schema true",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "if": true,
             "then": {
               "const": "then"
             },
             "else": {
               "const": "else"
             },
+            "$id": "if-then-else_7_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "boolean schema true in if always chooses the then path (valid)",
         "data": "then",
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_7_0"
       },
       {
         "description": "boolean schema true in if always chooses the then path (invalid)",
         "data": "else",
-        "valid": false
+        "valid": false,
+        "schema_id": "if-then-else_7_0"
       }
     ]
   },
   {
     "description": "if with boolean schema false",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "if": false,
             "then": {
               "const": "then"
             },
             "else": {
               "const": "else"
             },
+            "$id": "if-then-else_8_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "boolean schema false in if always chooses the else path (invalid)",
         "data": "then",
-        "valid": false
+        "valid": false,
+        "schema_id": "if-then-else_8_0"
       },
       {
         "description": "boolean schema false in if always chooses the else path (valid)",
         "data": "else",
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_8_0"
       }
     ]
   },
   {
     "description": "if appears at the end when serialized (keyword processing sequence)",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "then": {
               "const": "yes"
             },
@@ -254,77 +313,99 @@
             },
             "if": {
               "maxLength": 4
             },
+            "$id": "if-then-else_9_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "yes redirects to then and passes",
         "data": "yes",
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_9_0"
       },
       {
         "description": "other redirects to else and passes",
         "data": "other",
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_9_0"
       },
       {
         "description": "no redirects to then and fails",
         "data": "no",
-        "valid": false
+        "valid": false,
+        "schema_id": "if-then-else_9_0"
       },
       {
         "description": "invalid redirects to else and fails",
         "data": "invalid",
-        "valid": false
+        "valid": false,
+        "schema_id": "if-then-else_9_0"
       }
     ]
   },
   {
     "description": "then: false fails when condition matches",
     "schema": {
+      "database": {
+        "schemas": [
+          {
             "if": {
               "const": 1
             },
-            "then": false
+            "then": false,
+            "$id": "if-then-else_10_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "matches if → then=false → invalid",
         "data": 1,
-        "valid": false
+        "valid": false,
+        "schema_id": "if-then-else_10_0"
       },
       {
         "description": "does not match if → then ignored → valid",
         "data": 2,
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_10_0"
       }
     ]
   },
   {
     "description": "else: false fails when condition does not match",
     "schema": {
+      "database": {
+        "schemas": [
+          {
             "if": {
               "const": 1
             },
-            "else": false
+            "else": false,
+            "$id": "if-then-else_11_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "matches if → else ignored → valid",
         "data": 1,
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_11_0"
       },
       {
         "description": "does not match if → else executes → invalid",
         "data": 2,
-        "valid": false
+        "valid": false,
+        "schema_id": "if-then-else_11_0"
       }
     ]
   },
   {
     "description": "extensible: true allows extra properties in if-then-else",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "if": {
               "properties": {
                 "foo": {
@@ -345,7 +426,10 @@
                 "bar"
               ]
             },
-            "extensible": true
+            "extensible": true,
+            "$id": "if-then-else_12_0"
+          }
+        ]
     },
     "tests": [
       {
@@ -355,14 +439,16 @@
           "bar": 2,
           "extra": "prop"
         },
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_12_0"
       }
     ]
   },
   {
     "description": "strict by default with if-then properties",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "if": {
               "properties": {
                 "foo": {
@@ -379,7 +465,10 @@
                 "const": 2
               }
             }
           },
+          "$id": "if-then-else_13_0"
+          }
+        ]
     },
     "tests": [
       {
@@ -388,7 +477,8 @@
           "foo": 1,
           "bar": 2
         },
-        "valid": true
+        "valid": true,
+        "schema_id": "if-then-else_13_0"
       },
       {
         "description": "fails on extra property z explicitly",
@@ -397,7 +487,8 @@
           "bar": 2,
           "z": 3
         },
-        "valid": false
+        "valid": false,
+        "schema_id": "if-then-else_13_0"
       }
     ]
   }
303 tests/fixtures/items.json vendored
@@ -1,11 +1,15 @@
 [
   {
     "description": "a schema given for items",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "items": {
               "type": "integer"
             },
+            "$id": "items_0_0"
+          }
+        ]
     },
     "tests": [
       {
@@ -15,7 +19,8 @@
           2,
           3
         ],
-        "valid": true
+        "valid": true,
+        "schema_id": "items_0_0"
       },
       {
         "description": "wrong type of items",
@@ -23,14 +28,16 @@
           1,
           "x"
         ],
-        "valid": false
+        "valid": false,
+        "schema_id": "items_0_0"
       },
       {
         "description": "non-arrays are invalid",
         "data": {
           "foo": "bar"
         },
-        "valid": false
+        "valid": false,
+        "schema_id": "items_0_0"
       },
       {
         "description": "JavaScript pseudo-arrays are invalid",
@@ -38,15 +45,20 @@
           "0": "invalid",
           "length": 1
         },
-        "valid": false
+        "valid": false,
+        "schema_id": "items_0_0"
       }
     ]
   },
   {
     "description": "items with boolean schema (true)",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
-      "items": true
+      "database": {
+        "schemas": [
+          {
+            "items": true,
+            "$id": "items_1_0"
+          }
+        ]
     },
     "tests": [
       {
@@ -56,20 +68,26 @@
           "foo",
           true
         ],
-        "valid": true
+        "valid": true,
+        "schema_id": "items_1_0"
       },
       {
         "description": "empty array is valid",
         "data": [],
-        "valid": true
+        "valid": true,
+        "schema_id": "items_1_0"
       }
     ]
   },
   {
     "description": "items with boolean schema (false)",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
-      "items": false
+      "database": {
+        "schemas": [
+          {
+            "items": false,
+            "$id": "items_2_0"
+          }
+        ]
     },
     "tests": [
       {
@@ -79,51 +97,57 @@
           "foo",
           true
         ],
-        "valid": false
+        "valid": false,
+        "schema_id": "items_2_0"
       },
       {
         "description": "empty array is valid",
         "data": [],
-        "valid": true
+        "valid": true,
+        "schema_id": "items_2_0"
       }
     ]
   },
   {
     "description": "items and subitems",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
-      "$defs": {
-        "item": {
+      "database": {
+        "schemas": [
+          {
             "type": "array",
             "items": false,
             "prefixItems": [
               {
-                "$ref": "#/$defs/sub-item"
+                "$ref": "item"
               },
               {
-                "$ref": "#/$defs/sub-item"
+                "$ref": "item"
               },
+              {
+                "$ref": "item"
+              }
             ],
+            "$id": "items_3_0"
           },
+          {
+            "$id": "item",
+            "type": "array",
+            "items": false,
+            "prefixItems": [
+              {
+                "$ref": "sub-item"
+              },
+              {
+                "$ref": "sub-item"
+              }
+            ]
+          },
-        "sub-item": {
+          {
+            "$id": "sub-item",
             "type": "object",
             "required": [
               "foo"
             ]
           }
-      },
-      "type": "array",
-      "items": false,
-      "prefixItems": [
-        {
-          "$ref": "#/$defs/item"
-        },
-        {
-          "$ref": "#/$defs/item"
-        },
-        {
-          "$ref": "#/$defs/item"
-        }
-      ]
     },
     "tests": [
       {
@@ -155,7 +179,8 @@
           }
         ]
       ],
-      "valid": false
+      "valid": false,
+      "schema_id": "items_3_0"
       },
       {
         "description": "too many items",
@@ -193,7 +218,8 @@
           }
         ]
       ],
-      "valid": false
+      "valid": false,
+      "schema_id": "items_3_0"
       },
       {
         "description": "too many sub-items",
@@ -226,7 +252,8 @@
           }
         ]
       ],
-      "valid": false
+      "valid": false,
+      "schema_id": "items_3_0"
       },
       {
         "description": "wrong item",
@@ -251,7 +278,8 @@
           }
         ]
       ],
-      "valid": false
+      "valid": false,
+      "schema_id": "items_3_0"
       },
       {
         "description": "wrong sub-item",
@@ -279,7 +307,8 @@
           }
         ]
       ],
-      "valid": false
+      "valid": false,
+      "schema_id": "items_3_0"
       },
       {
         "description": "fewer items is invalid",
@@ -295,14 +324,16 @@
           }
         ]
       ],
-      "valid": false
+      "valid": false,
+      "schema_id": "items_3_0"
       }
     ]
   },
   {
     "description": "nested items",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "type": "array",
             "items": {
               "type": "array",
@@ -315,7 +346,10 @@
               }
             }
           }
         },
+        "$id": "items_4_0"
+      }
+    ]
     },
     "tests": [
       {
@@ -350,7 +384,8 @@
             ]
           ]
         ],
-        "valid": true
+        "valid": true,
+        "schema_id": "items_4_0"
       },
       {
         "description": "nested array with invalid type",
@@ -384,7 +419,8 @@
             ]
           ]
         ],
-        "valid": false
+        "valid": false,
+        "schema_id": "items_4_0"
       },
       {
         "description": "not deep enough",
@@ -412,33 +448,40 @@
           ]
         ]
       ],
-      "valid": false
+      "valid": false,
+      "schema_id": "items_4_0"
       }
     ]
   },
   {
     "description": "prefixItems with no additional items allowed",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "prefixItems": [
               {},
               {},
               {}
             ],
-            "items": false
+            "items": false,
+            "$id": "items_5_0"
+          }
+        ]
     },
     "tests": [
       {
         "description": "empty array",
         "data": [],
-        "valid": true
+        "valid": true,
+        "schema_id": "items_5_0"
       },
       {
         "description": "fewer number of items present (1)",
         "data": [
           1
         ],
-        "valid": true
+        "valid": true,
+        "schema_id": "items_5_0"
       },
       {
         "description": "fewer number of items present (2)",
@@ -446,7 +489,8 @@
           1,
           2
         ],
-        "valid": true
+        "valid": true,
+        "schema_id": "items_5_0"
       },
       {
         "description": "equal number of items present",
@@ -455,7 +499,8 @@
           2,
           3
         ],
-        "valid": true
+        "valid": true,
+        "schema_id": "items_5_0"
       },
       {
         "description": "additional items are not permitted",
@@ -465,14 +510,16 @@
           3,
           4
         ],
-        "valid": false
+        "valid": false,
+        "schema_id": "items_5_0"
       }
     ]
   },
   {
     "description": "items does not look in applicators, valid case",
     "schema": {
       "$schema": "https://json-schema.org/draft/2020-12/schema",
+      "database": {
+        "schemas": [
+          {
             "allOf": [
               {
                 "prefixItems": [
@@ -484,7 +531,10 @@
             ],
"items": {
|
||||
"minimum": 5
|
||||
},
|
||||
"$id": "items_6_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@ -493,7 +543,8 @@
|
||||
3,
|
||||
5
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "items_6_0"
|
||||
},
|
||||
{
|
||||
"description": "prefixItems in allOf does not constrain items, valid case",
|
||||
@ -501,14 +552,16 @@
|
||||
5,
|
||||
5
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "items_6_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "prefixItems validation adjusts the starting index for items",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"prefixItems": [
|
||||
{
|
||||
"type": "string"
|
||||
@ -516,7 +569,10 @@
|
||||
],
|
||||
"items": {
|
||||
"type": "integer"
|
||||
},
|
||||
"$id": "items_7_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@ -526,7 +582,8 @@
|
||||
2,
|
||||
3
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "items_7_0"
|
||||
},
|
||||
{
|
||||
"description": "wrong type of second item",
|
||||
@ -534,18 +591,23 @@
|
||||
"x",
|
||||
"y"
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "items_7_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "items with heterogeneous array",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"prefixItems": [
|
||||
{}
|
||||
],
|
||||
"items": false
|
||||
"items": false,
|
||||
"$id": "items_8_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@ -555,24 +617,30 @@
|
||||
"bar",
|
||||
37
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "items_8_0"
|
||||
},
|
||||
{
|
||||
"description": "valid instance",
|
||||
"data": [
|
||||
null
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "items_8_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "items with null instance elements",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"items": {
|
||||
"type": "null"
|
||||
},
|
||||
"$id": "items_9_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@ -580,16 +648,21 @@
|
||||
"data": [
|
||||
null
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "items_9_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "extensible: true allows extra items (when items is false)",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"items": false,
|
||||
"extensible": true
|
||||
"extensible": true,
|
||||
"$id": "items_10_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@ -597,18 +670,23 @@
|
||||
"data": [
|
||||
1
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "items_10_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "extensible: true allows extra properties for items",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"items": {
|
||||
"minimum": 5
|
||||
},
|
||||
"extensible": true
|
||||
"extensible": true,
|
||||
"$id": "items_11_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@ -617,29 +695,36 @@
|
||||
5,
|
||||
6
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "items_11_0"
|
||||
},
|
||||
{
|
||||
"description": "invalid item (less than min) is invalid even with extensible: true",
|
||||
"data": [
|
||||
4
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "items_11_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "array: simple extensible array",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"type": "array",
|
||||
"extensible": true
|
||||
"extensible": true,
|
||||
"$id": "items_12_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "empty array is valid",
|
||||
"data": [],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "items_12_0"
|
||||
},
|
||||
{
|
||||
"description": "array with items is valid (extensible)",
|
||||
@ -647,46 +732,58 @@
|
||||
1,
|
||||
"foo"
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "items_12_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "array: strict array",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"type": "array",
|
||||
"extensible": false
|
||||
"extensible": false,
|
||||
"$id": "items_13_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "empty array is valid",
|
||||
"data": [],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "items_13_0"
|
||||
},
|
||||
{
|
||||
"description": "array with items is invalid (strict)",
|
||||
"data": [
|
||||
1
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "items_13_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "array: items extensible",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"type": "array",
|
||||
"items": {
|
||||
"extensible": true
|
||||
},
|
||||
"$id": "items_14_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "empty array is valid",
|
||||
"data": [],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "items_14_0"
|
||||
},
|
||||
{
|
||||
"description": "array with items is valid (items explicitly allowed to be anything extensible)",
|
||||
@ -695,19 +792,24 @@
|
||||
"foo",
|
||||
{}
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "items_14_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "array: items strict",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object",
|
||||
"extensible": false
|
||||
},
|
||||
"$id": "items_15_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@ -715,14 +817,16 @@
|
||||
"data": [
|
||||
{}
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "items_15_0"
|
||||
},
|
||||
{
|
||||
"description": "array with strict object items is valid",
|
||||
"data": [
|
||||
{}
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "items_15_0"
|
||||
},
|
||||
{
|
||||
"description": "array with invalid strict object items (extra property)",
|
||||
@ -731,7 +835,8 @@
|
||||
"extra": 1
|
||||
}
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "items_15_0"
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
91 tests/fixtures/maxContains.json vendored
@ -1,10 +1,14 @@
[
{
"description": "maxContains without contains is ignored",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"maxContains": 1,
"extensible": true
"extensible": true,
"$id": "maxContains_0_0"
}
]
},
"tests": [
{
@ -12,7 +16,8 @@
"data": [
1
],
"valid": true
"valid": true,
"schema_id": "maxContains_0_0"
},
{
"description": "two items still valid against lone maxContains",
@ -20,32 +25,39 @@
1,
2
],
"valid": true
"valid": true,
"schema_id": "maxContains_0_0"
}
]
},
{
"description": "maxContains with contains",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"contains": {
"const": 1
},
"maxContains": 1,
"extensible": true
"extensible": true,
"$id": "maxContains_1_0"
}
]
},
"tests": [
{
"description": "empty data",
"data": [],
"valid": false
"valid": false,
"schema_id": "maxContains_1_0"
},
{
"description": "all elements match, valid maxContains",
"data": [
1
],
"valid": true
"valid": true,
"schema_id": "maxContains_1_0"
},
{
"description": "all elements match, invalid maxContains",
@ -53,7 +65,8 @@
1,
1
],
"valid": false
"valid": false,
"schema_id": "maxContains_1_0"
},
{
"description": "some elements match, valid maxContains",
@ -61,7 +74,8 @@
1,
2
],
"valid": true
"valid": true,
"schema_id": "maxContains_1_0"
},
{
"description": "some elements match, invalid maxContains",
@ -70,19 +84,24 @@
2,
1
],
"valid": false
"valid": false,
"schema_id": "maxContains_1_0"
}
]
},
{
"description": "maxContains with contains, value with a decimal",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"contains": {
"const": 1
},
"maxContains": 1.0,
"extensible": true
"maxContains": 1,
"extensible": true,
"$id": "maxContains_2_0"
}
]
},
"tests": [
{
@ -90,7 +109,8 @@
"data": [
1
],
"valid": true
"valid": true,
"schema_id": "maxContains_2_0"
},
{
"description": "too many elements match, invalid maxContains",
@ -98,26 +118,32 @@
1,
1
],
"valid": false
"valid": false,
"schema_id": "maxContains_2_0"
}
]
},
{
"description": "minContains < maxContains",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"contains": {
"const": 1
},
"minContains": 1,
"maxContains": 3,
"extensible": true
"extensible": true,
"$id": "maxContains_3_0"
}
]
},
"tests": [
{
"description": "actual < minContains < maxContains",
"data": [],
"valid": false
"valid": false,
"schema_id": "maxContains_3_0"
},
{
"description": "minContains < actual < maxContains",
@ -125,7 +151,8 @@
1,
1
],
"valid": true
"valid": true,
"schema_id": "maxContains_3_0"
},
{
"description": "minContains < maxContains < actual",
@ -135,19 +162,24 @@
1,
1
],
"valid": false
"valid": false,
"schema_id": "maxContains_3_0"
}
]
},
{
"description": "extensible: true allows non-matching items in maxContains",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"contains": {
"const": 1
},
"maxContains": 1,
"extensible": true
"extensible": true,
"$id": "maxContains_4_0"
}
]
},
"tests": [
{
@ -156,7 +188,8 @@
1,
2
],
"valid": true
"valid": true,
"schema_id": "maxContains_4_0"
}
]
}
53 tests/fixtures/maxItems.json vendored
@ -1,10 +1,14 @@
[
{
"description": "maxItems validation",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"maxItems": 2,
"extensible": true
"extensible": true,
"$id": "maxItems_0_0"
}
]
},
"tests": [
{
@ -12,7 +16,8 @@
"data": [
1
],
"valid": true
"valid": true,
"schema_id": "maxItems_0_0"
},
{
"description": "exact length is valid",
@ -20,7 +25,8 @@
1,
2
],
"valid": true
"valid": true,
"schema_id": "maxItems_0_0"
},
{
"description": "too long is invalid",
@ -29,21 +35,27 @@
2,
3
],
"valid": false
"valid": false,
"schema_id": "maxItems_0_0"
},
{
"description": "ignores non-arrays",
"data": "foobar",
"valid": true
"valid": true,
"schema_id": "maxItems_0_0"
}
]
},
{
"description": "maxItems validation with a decimal",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"maxItems": 2.0,
"extensible": true
"database": {
"schemas": [
{
"maxItems": 2,
"extensible": true,
"$id": "maxItems_1_0"
}
]
},
"tests": [
{
@ -51,7 +63,8 @@
"data": [
1
],
"valid": true
"valid": true,
"schema_id": "maxItems_1_0"
},
{
"description": "too long is invalid",
@ -60,16 +73,21 @@
2,
3
],
"valid": false
"valid": false,
"schema_id": "maxItems_1_0"
}
]
},
{
"description": "extensible: true allows extra items in maxItems (but counted)",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"maxItems": 2,
"extensible": true
"extensible": true,
"$id": "maxItems_2_0"
}
]
},
"tests": [
{
@ -79,7 +97,8 @@
2,
3
],
"valid": false
"valid": false,
"schema_id": "maxItems_2_0"
}
]
}
43 tests/fixtures/maxLength.json vendored
@ -1,54 +1,69 @@
[
{
"description": "maxLength validation",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"maxLength": 2
"database": {
"schemas": [
{
"maxLength": 2,
"$id": "maxLength_0_0"
}
]
},
"tests": [
{
"description": "shorter is valid",
"data": "f",
"valid": true
"valid": true,
"schema_id": "maxLength_0_0"
},
{
"description": "exact length is valid",
"data": "fo",
"valid": true
"valid": true,
"schema_id": "maxLength_0_0"
},
{
"description": "too long is invalid",
"data": "foo",
"valid": false
"valid": false,
"schema_id": "maxLength_0_0"
},
{
"description": "ignores non-strings",
"data": 100,
"valid": true
"valid": true,
"schema_id": "maxLength_0_0"
},
{
"description": "two graphemes is long enough",
"data": "\uD83D\uDCA9\uD83D\uDCA9",
"valid": true
"data": "💩💩",
"valid": true,
"schema_id": "maxLength_0_0"
}
]
},
{
"description": "maxLength validation with a decimal",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"maxLength": 2.0
"database": {
"schemas": [
{
"maxLength": 2,
"$id": "maxLength_1_0"
}
]
},
"tests": [
{
"description": "shorter is valid",
"data": "f",
"valid": true
"valid": true,
"schema_id": "maxLength_1_0"
},
{
"description": "too long is invalid",
"data": "foo",
"valid": false
"valid": false,
"schema_id": "maxLength_1_0"
}
]
}
78 tests/fixtures/maxProperties.json vendored
@ -1,10 +1,14 @@
[
{
"description": "maxProperties validation",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"maxProperties": 2,
"extensible": true
"extensible": true,
"$id": "maxProperties_0_0"
}
]
},
"tests": [
{
@ -12,7 +16,8 @@
"data": {
"foo": 1
},
"valid": true
"valid": true,
"schema_id": "maxProperties_0_0"
},
{
"description": "exact length is valid",
@ -20,7 +25,8 @@
"foo": 1,
"bar": 2
},
"valid": true
"valid": true,
"schema_id": "maxProperties_0_0"
},
{
"description": "too long is invalid",
@ -29,7 +35,8 @@
"bar": 2,
"baz": 3
},
"valid": false
"valid": false,
"schema_id": "maxProperties_0_0"
},
{
"description": "ignores arrays",
@ -38,26 +45,33 @@
2,
3
],
"valid": true
"valid": true,
"schema_id": "maxProperties_0_0"
},
{
"description": "ignores strings",
"data": "foobar",
"valid": true
"valid": true,
"schema_id": "maxProperties_0_0"
},
{
"description": "ignores other non-objects",
"data": 12,
"valid": true
"valid": true,
"schema_id": "maxProperties_0_0"
}
]
},
{
"description": "maxProperties validation with a decimal",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"maxProperties": 2.0,
"extensible": true
"database": {
"schemas": [
{
"maxProperties": 2,
"extensible": true,
"$id": "maxProperties_1_0"
}
]
},
"tests": [
{
@ -65,7 +79,8 @@
"data": {
"foo": 1
},
"valid": true
"valid": true,
"schema_id": "maxProperties_1_0"
},
{
"description": "too long is invalid",
@ -74,38 +89,49 @@
"bar": 2,
"baz": 3
},
"valid": false
"valid": false,
"schema_id": "maxProperties_1_0"
}
]
},
{
"description": "maxProperties = 0 means the object is empty",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"maxProperties": 0,
"extensible": true
"extensible": true,
"$id": "maxProperties_2_0"
}
]
},
"tests": [
{
"description": "no properties is valid",
"data": {},
"valid": true
"valid": true,
"schema_id": "maxProperties_2_0"
},
{
"description": "one property is invalid",
"data": {
"foo": 1
},
"valid": false
"valid": false,
"schema_id": "maxProperties_2_0"
}
]
},
{
"description": "extensible: true allows extra properties in maxProperties (though maxProperties still counts them!)",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"maxProperties": 2,
"extensible": true
"extensible": true,
"$id": "maxProperties_3_0"
}
]
},
"tests": [
{
@ -115,14 +141,16 @@
"bar": 2,
"baz": 3
},
"valid": false
"valid": false,
"schema_id": "maxProperties_3_0"
},
{
"description": "extra property is valid if below maxProperties",
"data": {
"foo": 1
},
"valid": true
"valid": true,
"schema_id": "maxProperties_3_0"
}
]
}
48 tests/fixtures/maximum.json vendored
@ -1,59 +1,75 @@
[
{
"description": "maximum validation",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"maximum": 3.0
"database": {
"schemas": [
{
"maximum": 3,
"$id": "maximum_0_0"
}
]
},
"tests": [
{
"description": "below the maximum is valid",
"data": 2.6,
"valid": true
"valid": true,
"schema_id": "maximum_0_0"
},
{
"description": "boundary point is valid",
"data": 3.0,
"valid": true
"data": 3,
"valid": true,
"schema_id": "maximum_0_0"
},
{
"description": "above the maximum is invalid",
"data": 3.5,
"valid": false
"valid": false,
"schema_id": "maximum_0_0"
},
{
"description": "ignores non-numbers",
"data": "x",
"valid": true
"valid": true,
"schema_id": "maximum_0_0"
}
]
},
{
"description": "maximum validation with unsigned integer",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"maximum": 300
"database": {
"schemas": [
{
"maximum": 300,
"$id": "maximum_1_0"
}
]
},
"tests": [
{
"description": "below the maximum is invalid",
"data": 299.97,
"valid": true
"valid": true,
"schema_id": "maximum_1_0"
},
{
"description": "boundary point integer is valid",
"data": 300,
"valid": true
"valid": true,
"schema_id": "maximum_1_0"
},
{
"description": "boundary point float is valid",
"data": 300.00,
"valid": true
"data": 300,
"valid": true,
"schema_id": "maximum_1_0"
},
{
"description": "above the maximum is invalid",
"data": 300.5,
"valid": false
"valid": false,
"schema_id": "maximum_1_0"
}
]
}
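Every hunk in this diff applies the same fixture migration: the bare schema object that used to sit directly under `"schema"` is wrapped in a `database.schemas` array, each schema gains a `"$id"`, and each test case gains a matching `"schema_id"` that names the schema it validates against (decimal-valued keywords like `2.0` are also normalized to integers along the way). A minimal sketch of the resulting fixture shape, reconstructed from the diff itself (how the runner resolves `schema_id` against `$id` is an assumption, not shown in this excerpt):

```json
{
  "description": "maximum validation",
  "schema": {
    "$schema": "https://json-schema.org/draft/2020-12/schema",
    "database": {
      "schemas": [
        { "maximum": 3, "$id": "maximum_0_0" }
      ]
    }
  },
  "tests": [
    {
      "description": "below the maximum is valid",
      "data": 2.6,
      "valid": true,
      "schema_id": "maximum_0_0"
    }
  ]
}
```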
87 tests/fixtures/merge.json vendored
@ -1,23 +1,26 @@
[
{
"description": "merging: properties accumulate",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$defs": {
"base": {
"database": {
"schemas": [
{
"$id": "base_0",
"properties": {
"base_prop": {
"type": "string"
}
}
}
},
"$ref": "#/$defs/base",
{
"$ref": "base_0",
"properties": {
"child_prop": {
"type": "string"
}
},
"$id": "merge_0_0"
}
]
},
"tests": [
{
@ -26,7 +29,8 @@
"base_prop": "a",
"child_prop": "b"
},
"valid": true
"valid": true,
"schema_id": "merge_0_0"
},
{
"description": "invalid when base property has wrong type",
@ -40,16 +44,17 @@
"code": "TYPE_MISMATCH",
"path": "/base_prop"
}
]
],
"schema_id": "merge_0_0"
}
]
},
{
"description": "merging: required fields accumulate",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$defs": {
"base": {
"database": {
"schemas": [
{
"$id": "base_1",
"properties": {
"a": {
"type": "string"
@ -58,9 +63,9 @@
"required": [
"a"
]
}
},
"$ref": "#/$defs/base",
{
"$ref": "base_1",
"properties": {
"b": {
"type": "string"
@ -68,6 +73,9 @@
},
"required": [
"b"
],
"$id": "merge_1_0"
}
]
},
"tests": [
@ -77,7 +85,8 @@
"a": "ok",
"b": "ok"
},
"valid": true
"valid": true,
"schema_id": "merge_1_0"
},
{
"description": "invalid when base required missing",
@ -90,7 +99,8 @@
"code": "REQUIRED_FIELD_MISSING",
"path": "/a"
}
]
],
"schema_id": "merge_1_0"
},
{
"description": "invalid when child required missing",
@ -103,16 +113,17 @@
"code": "REQUIRED_FIELD_MISSING",
"path": "/b"
}
]
],
"schema_id": "merge_1_0"
}
]
},
{
"description": "merging: dependencies accumulate",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$defs": {
"base": {
"database": {
"schemas": [
{
"$id": "base_2",
"properties": {
"trigger": {
"type": "string"
@ -126,9 +137,9 @@
"base_dep"
]
}
}
},
"$ref": "#/$defs/base",
{
"$ref": "base_2",
"properties": {
"child_dep": {
"type": "string"
@ -138,7 +149,10 @@
"trigger": [
"child_dep"
]
},
"$id": "merge_2_0"
}
]
},
"tests": [
{
@ -148,7 +162,8 @@
"base_dep": "ok",
"child_dep": "ok"
},
"valid": true
"valid": true,
"schema_id": "merge_2_0"
},
{
"description": "invalid missing base dep",
@ -162,7 +177,8 @@
"code": "DEPENDENCY_FAILED",
"path": "/base_dep"
}
]
],
"schema_id": "merge_2_0"
},
{
"description": "invalid missing child dep",
@ -176,16 +192,17 @@
"code": "DEPENDENCY_FAILED",
"path": "/child_dep"
}
]
],
"schema_id": "merge_2_0"
}
]
},
{
"description": "merging: form and display do NOT merge",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$defs": {
"base": {
"database": {
"schemas": [
{
"$id": "base_3",
"properties": {
"a": {
"type": "string"
@ -198,9 +215,9 @@
"a",
"b"
]
}
},
"$ref": "#/$defs/base",
{
"$ref": "base_3",
"properties": {
"c": {
"type": "string"
@ -208,6 +225,9 @@
},
"form": [
"c"
],
"$id": "merge_3_0"
}
]
},
"tests": [
@ -219,7 +239,8 @@
"c": "ok"
},
"valid": true,
"comment": "Verifies validator handles the unmerged metadata correctly (ignores it or handles replacement)"
"comment": "Verifies validator handles the unmerged metadata correctly (ignores it or handles replacement)",
"schema_id": "merge_3_0"
}
]
}
179 tests/fixtures/minContains.json vendored
@ -1,10 +1,14 @@
[
{
"description": "minContains without contains is ignored",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"minContains": 1,
"extensible": true
"extensible": true,
"$id": "minContains_0_0"
}
]
},
"tests": [
{
@ -12,44 +16,53 @@
"data": [
1
],
"valid": true
"valid": true,
"schema_id": "minContains_0_0"
},
{
"description": "zero items still valid against lone minContains",
"data": [],
"valid": true
"valid": true,
"schema_id": "minContains_0_0"
}
]
},
{
"description": "minContains=1 with contains",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"contains": {
"const": 1
},
"minContains": 1,
"extensible": true
"extensible": true,
"$id": "minContains_1_0"
}
]
},
"tests": [
{
"description": "empty data",
"data": [],
"valid": false
"valid": false,
"schema_id": "minContains_1_0"
},
{
"description": "no elements match",
"data": [
2
],
"valid": false
"valid": false,
"schema_id": "minContains_1_0"
},
{
"description": "single element matches, valid minContains",
"data": [
1
],
"valid": true
"valid": true,
"schema_id": "minContains_1_0"
},
{
"description": "some elements match, valid minContains",
@ -57,7 +70,8 @@
1,
2
],
"valid": true
"valid": true,
"schema_id": "minContains_1_0"
},
{
"description": "all elements match, valid minContains",
@ -65,32 +79,39 @@
1,
1
],
"valid": true
"valid": true,
"schema_id": "minContains_1_0"
}
]
},
{
"description": "minContains=2 with contains",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"contains": {
"const": 1
},
"minContains": 2,
"extensible": true
"extensible": true,
"$id": "minContains_2_0"
}
]
},
"tests": [
{
"description": "empty data",
"data": [],
"valid": false
"valid": false,
"schema_id": "minContains_2_0"
},
{
"description": "all elements match, invalid minContains",
"data": [
1
],
"valid": false
"valid": false,
"schema_id": "minContains_2_0"
},
{
"description": "some elements match, invalid minContains",
@ -98,7 +119,8 @@
1,
2
],
"valid": false
"valid": false,
"schema_id": "minContains_2_0"
},
{
"description": "all elements match, valid minContains (exactly as needed)",
@ -106,7 +128,8 @@
1,
1
],
"valid": true
"valid": true,
"schema_id": "minContains_2_0"
},
{
"description": "all elements match, valid minContains (more than needed)",
@ -115,7 +138,8 @@
1,
1
],
"valid": true
"valid": true,
"schema_id": "minContains_2_0"
},
{
"description": "some elements match, valid minContains",
@ -124,19 +148,24 @@
2,
1
],
"valid": true
"valid": true,
"schema_id": "minContains_2_0"
}
]
},
{
"description": "minContains=2 with contains with a decimal value",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"contains": {
"const": 1
},
"minContains": 2.0,
"extensible": true
"minContains": 2,
"extensible": true,
"$id": "minContains_3_0"
}
]
},
"tests": [
{
@ -144,7 +173,8 @@
"data": [
1
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "minContains_3_0"
|
||||
},
|
||||
{
|
||||
"description": "both elements match, valid minContains",
|
||||
@ -152,33 +182,40 @@
|
||||
1,
|
||||
1
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "minContains_3_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "maxContains = minContains",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contains": {
|
||||
"const": 1
|
||||
},
|
||||
"maxContains": 2,
|
||||
"minContains": 2,
|
||||
"extensible": true
|
||||
"extensible": true,
|
||||
"$id": "minContains_4_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "empty data",
|
||||
"data": [],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "minContains_4_0"
|
||||
},
|
||||
{
|
||||
"description": "all elements match, invalid minContains",
|
||||
"data": [
|
||||
1
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "minContains_4_0"
|
||||
},
|
||||
{
|
||||
"description": "all elements match, invalid maxContains",
|
||||
@ -187,7 +224,8 @@
|
||||
1,
|
||||
1
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "minContains_4_0"
|
||||
},
|
||||
{
|
||||
"description": "all elements match, valid maxContains and minContains",
|
||||
@ -195,33 +233,40 @@
|
||||
1,
|
||||
1
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "minContains_4_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "maxContains < minContains",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contains": {
|
||||
"const": 1
|
||||
},
|
||||
"maxContains": 1,
|
||||
"minContains": 3,
|
||||
"extensible": true
|
||||
"extensible": true,
|
||||
"$id": "minContains_5_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "empty data",
|
||||
"data": [],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "minContains_5_0"
|
||||
},
|
||||
{
|
||||
"description": "invalid minContains",
|
||||
"data": [
|
||||
1
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "minContains_5_0"
|
||||
},
|
||||
{
|
||||
"description": "invalid maxContains",
|
||||
@ -230,7 +275,8 @@
|
||||
1,
|
||||
1
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "minContains_5_0"
|
||||
},
|
||||
{
|
||||
"description": "invalid maxContains and minContains",
|
||||
@ -238,58 +284,71 @@
|
||||
1,
|
||||
1
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "minContains_5_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "minContains = 0",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contains": {
|
||||
"const": 1
|
||||
},
|
||||
"minContains": 0,
|
||||
"extensible": true
|
||||
"extensible": true,
|
||||
"$id": "minContains_6_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "empty data",
|
||||
"data": [],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "minContains_6_0"
|
||||
},
|
||||
{
|
||||
"description": "minContains = 0 makes contains always pass",
|
||||
"data": [
|
||||
2
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "minContains_6_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "minContains = 0 with maxContains",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contains": {
|
||||
"const": 1
|
||||
},
|
||||
"minContains": 0,
|
||||
"maxContains": 1,
|
||||
"extensible": true
|
||||
"extensible": true,
|
||||
"$id": "minContains_7_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
"description": "empty data",
|
||||
"data": [],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "minContains_7_0"
|
||||
},
|
||||
{
|
||||
"description": "not more than maxContains",
|
||||
"data": [
|
||||
1
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "minContains_7_0"
|
||||
},
|
||||
{
|
||||
"description": "too many",
|
||||
@ -297,19 +356,24 @@
|
||||
1,
|
||||
1
|
||||
],
|
||||
"valid": false
|
||||
"valid": false,
|
||||
"schema_id": "minContains_7_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "extensible: true allows non-matching items in minContains",
|
||||
"schema": {
|
||||
"$schema": "https://json-schema.org/draft/2020-12/schema",
|
||||
"database": {
|
||||
"schemas": [
|
||||
{
|
||||
"contains": {
|
||||
"const": 1
|
||||
},
|
||||
"minContains": 1,
|
||||
"extensible": true
|
||||
"extensible": true,
|
||||
"$id": "minContains_8_0"
|
||||
}
|
||||
]
|
||||
},
|
||||
"tests": [
|
||||
{
|
||||
@ -318,7 +382,8 @@
|
||||
1,
|
||||
2
|
||||
],
|
||||
"valid": true
|
||||
"valid": true,
|
||||
"schema_id": "minContains_8_0"
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
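The minContains/maxContains cases above all reduce to one rule: count the array items that satisfy the "contains" subschema, then compare that count against the bounds. The sketch below illustrates those JSON Schema 2020-12 semantics; the function names are illustrative only and are not part of the JSPG extension's API.

```python
def contains_count(data, matches):
    """Number of items in `data` satisfying the `contains` subschema,
    reduced here to a plain predicate for illustration."""
    return sum(1 for item in data if matches(item))

def check_contains(data, matches, min_contains=1, max_contains=None):
    if not isinstance(data, list):
        return True  # contains only constrains arrays
    n = contains_count(data, matches)
    if n < min_contains:
        return False
    if max_contains is not None and n > max_contains:
        return False
    return True

const_1 = lambda v: v == 1

# minContains = 0 makes contains always pass on the lower bound
assert check_contains([2], const_1, min_contains=0)
# maxContains < minContains can never be satisfied
assert not check_contains([1, 1], const_1, min_contains=3, max_contains=1)
# exactly as many matches as needed
assert check_contains([1, 1], const_1, min_contains=2, max_contains=2)
```

Note that an empty array fails any positive minContains but passes minContains = 0, which is exactly the split between the "minContains_1_0" and "minContains_6_0" groups above.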
53 tests/fixtures/minItems.json (vendored)

Same restructuring: schemas move under "database" → "schemas", each gaining an "$id" ("minItems_0_0" through "minItems_2_0"); every test gains a matching "schema_id", and the decimal keyword value "minItems": 1.0 is normalized to the integer 1. Representative hunk:

@@ -1,10 +1,14 @@
-          "minItems": 1,
-          "extensible": true
+          "minItems": 1,
+          "extensible": true,
+          "$id": "minItems_0_0"

Groups covered: "minItems validation" (longer, exact, too-short, and non-array data), "minItems validation with a decimal", and "extensible: true allows extra items in minItems".
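The minItems cases reduce to a length check that only applies to arrays; the "ignores non-arrays" test passes because a string instance is simply out of scope for the keyword. A minimal sketch of that behavior, with illustrative names:

```python
def check_min_items(data, min_items):
    if not isinstance(data, list):
        return True  # "ignores non-arrays": keyword out of scope
    return len(data) >= min_items

assert check_min_items([1, 2], 1)   # longer is valid
assert check_min_items([1], 1)      # exact length is valid
assert not check_min_items([], 1)   # too short is invalid
assert check_min_items("", 1)       # non-array passes untouched
```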
43 tests/fixtures/minLength.json (vendored)

Schemas move under "database" → "schemas" with "$id" values "minLength_0_0" and "minLength_1_0"; every test gains a matching "schema_id", "minLength": 2.0 is normalized to the integer 2, and the escaped surrogate pair "\uD83D\uDCA9" is rewritten as the literal "💩". Representative hunk:

@@ -1,54 +1,69 @@
         "description": "one grapheme is not long enough",
-        "data": "\uD83D\uDCA9",
-        "valid": false
+        "data": "💩",
+        "valid": false,
+        "schema_id": "minLength_0_0"

Groups covered: "minLength validation" (longer, exact, too-short, non-string, and single-grapheme data) and "minLength validation with a decimal".
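The "one grapheme is not long enough" case is the interesting one: minLength counts Unicode code points, so "💩" (two UTF-16 code units but a single code point) has length 1 and fails minLength 2. Python's len() on str counts code points, which makes the sketch direct; names are illustrative only:

```python
def check_min_length(data, min_length):
    if not isinstance(data, str):
        return True  # "ignores non-strings"
    return len(data) >= min_length  # len() counts code points

assert check_min_length("foo", 2)       # longer is valid
assert check_min_length("fo", 2)        # exact length is valid
assert not check_min_length("f", 2)     # too short is invalid
assert not check_min_length("💩", 2)    # one code point, not two
assert check_min_length(1, 2)           # non-string passes untouched
```

An implementation that measured UTF-8 bytes or UTF-16 code units would wrongly accept "💩" here, which is presumably why the fixture pins this case down.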
59 tests/fixtures/minProperties.json (vendored)

Schemas move under "database" → "schemas" with "$id" values "minProperties_0_0" through "minProperties_2_0"; every test gains a matching "schema_id", and "minProperties": 1.0 is normalized to the integer 1. Representative hunk:

@@ -1,10 +1,14 @@
-          "minProperties": 1,
-          "extensible": true
+          "minProperties": 1,
+          "extensible": true,
+          "$id": "minProperties_0_0"

Groups covered: "minProperties validation" (longer, exact, too-short, and non-object data — arrays, strings, and numbers are all ignored), "minProperties validation with a decimal", and "extensible: true allows extra properties in minProperties".
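minProperties mirrors minItems but for objects: only object instances are constrained, which is why the fixture has separate "ignores arrays", "ignores strings", and "ignores other non-objects" cases. An illustrative sketch:

```python
def check_min_properties(data, min_properties):
    if not isinstance(data, dict):
        return True  # arrays, strings, and numbers are out of scope
    return len(data) >= min_properties

assert check_min_properties({"foo": 1, "bar": 2}, 1)  # longer is valid
assert not check_min_properties({}, 1)                # too short is invalid
assert check_min_properties([], 1)                    # ignores arrays
assert check_min_properties("", 1)                    # ignores strings
assert check_min_properties(12, 1)                    # ignores other non-objects
```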
55 tests/fixtures/minimum.json (vendored)

Schemas move under "database" → "schemas" with "$id" values "minimum_0_0" and "minimum_1_0"; every test gains a matching "schema_id", and in the "boundary point with float is valid" case the test data -2.0 is normalized to -2. Representative hunk:

@@ -1,74 +1,93 @@
         "description": "boundary point with float is valid",
-        "data": -2.0,
-        "valid": true
+        "data": -2,
+        "valid": true,
+        "schema_id": "minimum_1_0"

Groups covered: "minimum validation" (minimum 1.1: above, boundary, below, and non-number data) and "minimum validation with signed integer" (minimum -2: negative/positive above, integer and float boundary points, float and int below, non-number data).
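The minimum cases are a plain numeric comparison, with two details the fixture makes explicit: the boundary is inclusive (-2 and -2.0 both pass against minimum -2, since JSON does not distinguish them as numbers), and non-numbers are ignored. One Python wrinkle worth noting in a sketch: bool is a subclass of int, so it must be excluded explicitly to match JSON's type model. Names are illustrative only:

```python
def check_minimum(data, minimum):
    if isinstance(data, bool) or not isinstance(data, (int, float)):
        return True  # "ignores non-numbers"; bool is not a JSON number
    return data >= minimum

assert check_minimum(2.6, 1.1)        # above the minimum
assert check_minimum(1.1, 1.1)        # boundary point is valid
assert not check_minimum(0.6, 1.1)    # below the minimum
assert check_minimum(-2.0, -2)        # float boundary equals int minimum
assert not check_minimum(-2.0001, -2)
assert check_minimum("x", -2)         # non-number passes untouched
```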
67 tests/fixtures/multipleOf.json (vendored)

Schemas move under "database" → "schemas" with "$id" values "multipleOf_0_0" through "multipleOf_3_0"; every test gains a matching "schema_id". Representative hunk:

@@ -1,83 +1,108 @@
-          "multipleOf": 2
+          "multipleOf": 2,
+          "$id": "multipleOf_0_0"

Groups covered: "by int" (multipleOf 2: pass, fail, and non-number data), "by number" (multipleOf 1.5: zero is a multiple of anything, 4.5 passes, 35 fails), "by small number" (multipleOf 0.0001: 0.0075 passes, 0.00751 fails), and "small multiple of large integer" (type integer with multipleOf 1e-8: any integer passes).
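The "by small number" group is the classic multipleOf trap: in binary floating point, a naive `0.0075 % 0.0001 == 0` check does not behave reliably, because neither operand is exactly representable. One common way to pass these fixtures is exact rational arithmetic over the decimal representations; the sketch below uses Python's fractions module for illustration and is not a claim about how JSPG implements the keyword:

```python
from fractions import Fraction

def is_multiple_of(data, factor):
    if isinstance(data, bool) or not isinstance(data, (int, float)):
        return True  # only numbers are constrained
    # Parse the decimal/scientific text form so 0.0001 means exactly 1/10000,
    # then check that the quotient is a whole number.
    q = Fraction(str(data)) / Fraction(str(factor))
    return q.denominator == 1

assert is_multiple_of(0, 1.5)             # zero is a multiple of anything
assert is_multiple_of(4.5, 1.5)
assert not is_multiple_of(35, 1.5)
assert is_multiple_of(0.0075, 0.0001)     # exact despite float inputs
assert not is_multiple_of(0.00751, 0.0001)
assert is_multiple_of(12391239123, 1e-8)  # any integer is a multiple of 1e-8
```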
237 tests/fixtures/not.json (vendored)

Schemas move under "database" → "schemas" with "$id" values "not_0_0" through "not_11_0"; every test gains a matching "schema_id". Representative hunk:

@@ -1,58 +1,72 @@
           "not": {
             "type": "integer"
           },
+          "$id": "not_0_0"

Groups covered: "not" (type integer: "foo" allowed, 1 disallowed), "not multiple types" (integer and boolean both rejected), "not more complex schema" (negated object with properties, extensible: true), "forbidden property" ("foo": {"not": {}} rejects any object carrying "foo", empty object valid), "forbid everything with empty schema" and "forbid everything with boolean schema true" (every instance — numbers, strings, booleans, null, objects, arrays, and their empty forms — is invalid), "allow everything with boolean schema false" (the same instances are all valid, extensible: true), "double negation" ("not": {"not": {}} accepts anything), "extensible: true allows extra properties in not" versus "extensible: false (default) forbids extra properties in not", and "property next to not" in both extensible variants (with extensible: false, an undeclared "foo" next to the declared "bar" makes the instance invalid; declared properties alone remain valid).
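The core of every group above is one inversion: an instance is valid against "not" exactly when it fails the subschema. Boolean schemas make the edge cases explicit — true (like the empty schema {}) accepts everything, so "not": true forbids everything, while "not": false allows everything, and "not": {"not": {}} cancels out. A minimal sketch with illustrative names:

```python
def check_not(data, subschema_validates):
    """Valid iff the instance FAILS the wrapped subschema."""
    return not subschema_validates(data)

# JSON's integer type excludes booleans, unlike Python's bool-is-int.
is_integer = lambda v: isinstance(v, int) and not isinstance(v, bool)
always = lambda v: True    # boolean schema true / empty schema {}
never = lambda v: False    # boolean schema false

assert check_not("foo", is_integer)   # allowed: not an integer
assert not check_not(1, is_integer)   # disallowed: is an integer
assert not check_not({}, always)      # "not": {} forbids everything
assert check_not([], never)           # "not": false allows everything
assert check_not("foo", lambda v: check_not(v, always))  # double negation
```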
740
tests/fixtures/oneOf.json
vendored
740
tests/fixtures/oneOf.json
vendored
@ -1,8 +1,9 @@
[
{
"description": "oneOf",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"oneOf": [
{
"type": "integer"
@ -10,35 +11,43 @@
{
"minimum": 2
}
],
"$id": "oneOf_0_0"
}
]
},
"tests": [
{
"description": "first oneOf valid",
"data": 1,
"valid": true
"valid": true,
"schema_id": "oneOf_0_0"
},
{
"description": "second oneOf valid",
"data": 2.5,
"valid": true
"valid": true,
"schema_id": "oneOf_0_0"
},
{
"description": "both oneOf valid",
"data": 3,
"valid": false
"valid": false,
"schema_id": "oneOf_0_0"
},
{
"description": "neither oneOf valid",
"data": 1.5,
"valid": false
"valid": false,
"schema_id": "oneOf_0_0"
}
]
},
{
"description": "oneOf with base schema",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"type": "string",
"oneOf": [
{
@ -47,403 +56,129 @@
{
"maxLength": 4
}
],
"$id": "oneOf_1_0"
}
]
},
"tests": [
{
"description": "mismatch base schema",
"data": 3,
"valid": false
"valid": false,
"schema_id": "oneOf_1_0"
},
{
"description": "one oneOf valid",
"data": "foobar",
"valid": true
"valid": true,
"schema_id": "oneOf_1_0"
},
{
"description": "both oneOf valid",
"data": "foo",
"valid": false
"valid": false,
"schema_id": "oneOf_1_0"
}
]
},
{
"description": "oneOf with boolean schemas, all true",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"oneOf": [
true,
true,
true
],
"$id": "oneOf_2_0"
}
]
},
"tests": [
{
"description": "any value is invalid",
"data": "foo",
"valid": false
"valid": false,
"schema_id": "oneOf_2_0"
}
]
},
{
"description": "oneOf with boolean schemas, one true",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"oneOf": [
true,
false,
false
],
"$id": "oneOf_3_0"
}
]
},
"tests": [
{
"description": "any value is valid",
"data": "foo",
"valid": true
"valid": true,
"schema_id": "oneOf_3_0"
}
]
},
{
"description": "oneOf with boolean schemas, more than one true",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"oneOf": [
true,
true,
false
],
"$id": "oneOf_4_0"
}
]
},
"tests": [
{
"description": "any value is invalid",
"data": "foo",
"valid": false
"valid": false,
"schema_id": "oneOf_4_0"
}
]
},
{
"description": "oneOf with boolean schemas, all false",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"oneOf": [
false,
false,
false
],
"$id": "oneOf_5_0"
}
]
},
"tests": [
{
"description": "any value is invalid",
"data": "foo",
"valid": false
"valid": false,
"schema_id": "oneOf_5_0"
}
]
},
{
"description": "oneOf complex types",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"oneOf": [
"database": {
"schemas": [
{
"properties": {
"bar": {
"type": "integer"
}
},
"required": [
"bar"
]
},
{
"properties": {
"foo": {
"type": "string"
}
},
"required": [
"foo"
]
}
]
},
"tests": [
{
"description": "first oneOf valid (complex)",
"data": {
"bar": 2
},
"valid": true
},
{
"description": "second oneOf valid (complex)",
"data": {
"foo": "baz"
},
"valid": true
},
{
"description": "both oneOf valid (complex)",
"data": {
"foo": "baz",
"bar": 2
},
"valid": false
},
{
"description": "neither oneOf valid (complex)",
"data": {
"foo": 2,
"bar": "quux"
},
"valid": false
}
]
},
{
"description": "oneOf with empty schema",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"oneOf": [
{
"type": "number"
},
{}
]
},
"tests": [
{
"description": "one valid - valid",
"data": "foo",
"valid": true
},
{
"description": "both valid - invalid",
"data": 123,
"valid": false
}
]
},
{
"description": "oneOf with required",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"type": "object",
"properties": {
"foo": true,
"bar": true,
"baz": true
},
"oneOf": [
{
"required": [
"foo",
"bar"
]
},
{
"required": [
"foo",
"baz"
]
}
]
},
"tests": [
{
"description": "both invalid - invalid",
"data": {
"bar": 2
},
"valid": false
},
{
"description": "first valid - valid",
"data": {
"foo": 1,
"bar": 2
},
"valid": true
},
{
"description": "second valid - valid",
"data": {
"foo": 1,
"baz": 3
},
"valid": true
},
{
"description": "both valid - invalid",
"data": {
"foo": 1,
"bar": 2,
"baz": 3
},
"valid": false
},
{
"description": "extra property invalid (strict)",
"data": {
"foo": 1,
"bar": 2,
"extra": 3
},
"valid": false
}
]
},
{
"description": "oneOf with required (extensible)",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"type": "object",
"extensible": true,
"oneOf": [
{
"required": [
"foo",
"bar"
]
},
{
"required": [
"foo",
"baz"
]
}
]
},
"tests": [
{
"description": "both invalid - invalid",
"data": {
"bar": 2
},
"valid": false
},
{
"description": "first valid - valid",
"data": {
"foo": 1,
"bar": 2
},
"valid": true
},
{
"description": "second valid - valid",
"data": {
"foo": 1,
"baz": 3
},
"valid": true
},
{
"description": "both valid - invalid",
"data": {
"foo": 1,
"bar": 2,
"baz": 3
},
"valid": false
},
{
"description": "extra properties are valid (extensible)",
"data": {
"foo": 1,
"bar": 2,
"extra": "value"
},
"valid": true
}
]
},
{
"description": "oneOf with missing optional property",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"oneOf": [
{
"properties": {
"bar": true,
"baz": true
},
"required": [
"bar"
]
},
{
"properties": {
"foo": true
},
"required": [
"foo"
]
}
]
},
"tests": [
{
"description": "first oneOf valid",
"data": {
"bar": 8
},
"valid": true
},
{
"description": "second oneOf valid",
"data": {
"foo": "foo"
},
"valid": true
},
{
"description": "both oneOf valid",
"data": {
"foo": "foo",
"bar": 8
},
"valid": false
},
{
"description": "neither oneOf valid",
"data": {
"baz": "quux"
},
"valid": false
}
]
},
{
"description": "nested oneOf, to check validation semantics",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"oneOf": [
{
"oneOf": [
{
"type": "null"
}
]
}
]
},
"tests": [
{
"description": "null is valid",
"data": null,
"valid": true
},
{
"description": "anything non-null is invalid",
"data": 123,
"valid": false
}
]
},
{
"description": "extensible: true allows extra properties in oneOf",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"oneOf": [
{
"properties": {
@ -466,7 +201,357 @@
]
}
],
"extensible": true
"$id": "oneOf_6_0"
}
]
},
"tests": [
{
"description": "first oneOf valid (complex)",
"data": {
"bar": 2
},
"valid": true,
"schema_id": "oneOf_6_0"
},
{
"description": "second oneOf valid (complex)",
"data": {
"foo": "baz"
},
"valid": true,
"schema_id": "oneOf_6_0"
},
{
"description": "both oneOf valid (complex)",
"data": {
"foo": "baz",
"bar": 2
},
"valid": false,
"schema_id": "oneOf_6_0"
},
{
"description": "neither oneOf valid (complex)",
"data": {
"foo": 2,
"bar": "quux"
},
"valid": false,
"schema_id": "oneOf_6_0"
}
]
},
{
"description": "oneOf with empty schema",
"database": {
"schemas": [
{
"oneOf": [
{
"type": "number"
},
{}
],
"$id": "oneOf_7_0"
}
]
},
"tests": [
{
"description": "one valid - valid",
"data": "foo",
"valid": true,
"schema_id": "oneOf_7_0"
},
{
"description": "both valid - invalid",
"data": 123,
"valid": false,
"schema_id": "oneOf_7_0"
}
]
},
{
"description": "oneOf with required",
"database": {
"schemas": [
{
"type": "object",
"properties": {
"foo": true,
"bar": true,
"baz": true
},
"oneOf": [
{
"required": [
"foo",
"bar"
]
},
{
"required": [
"foo",
"baz"
]
}
],
"$id": "oneOf_8_0"
}
]
},
"tests": [
{
"description": "both invalid - invalid",
"data": {
"bar": 2
},
"valid": false,
"schema_id": "oneOf_8_0"
},
{
"description": "first valid - valid",
"data": {
"foo": 1,
"bar": 2
},
"valid": true,
"schema_id": "oneOf_8_0"
},
{
"description": "second valid - valid",
"data": {
"foo": 1,
"baz": 3
},
"valid": true,
"schema_id": "oneOf_8_0"
},
{
"description": "both valid - invalid",
"data": {
"foo": 1,
"bar": 2,
"baz": 3
},
"valid": false,
"schema_id": "oneOf_8_0"
},
{
"description": "extra property invalid (strict)",
"data": {
"foo": 1,
"bar": 2,
"extra": 3
},
"valid": false,
"schema_id": "oneOf_8_0"
}
]
},
{
"description": "oneOf with required (extensible)",
"database": {
"schemas": [
{
"type": "object",
"extensible": true,
"oneOf": [
{
"required": [
"foo",
"bar"
]
},
{
"required": [
"foo",
"baz"
]
}
],
"$id": "oneOf_9_0"
}
]
},
"tests": [
{
"description": "both invalid - invalid",
"data": {
"bar": 2
},
"valid": false,
"schema_id": "oneOf_9_0"
},
{
"description": "first valid - valid",
"data": {
"foo": 1,
"bar": 2
},
"valid": true,
"schema_id": "oneOf_9_0"
},
{
"description": "second valid - valid",
"data": {
"foo": 1,
"baz": 3
},
"valid": true,
"schema_id": "oneOf_9_0"
},
{
"description": "both valid - invalid",
"data": {
"foo": 1,
"bar": 2,
"baz": 3
},
"valid": false,
"schema_id": "oneOf_9_0"
},
{
"description": "extra properties are valid (extensible)",
"data": {
"foo": 1,
"bar": 2,
"extra": "value"
},
"valid": true,
"schema_id": "oneOf_9_0"
}
]
},
{
"description": "oneOf with missing optional property",
"database": {
"schemas": [
{
"oneOf": [
{
"properties": {
"bar": true,
"baz": true
},
"required": [
"bar"
]
},
{
"properties": {
"foo": true
},
"required": [
"foo"
]
}
],
"$id": "oneOf_10_0"
}
]
},
"tests": [
{
"description": "first oneOf valid",
"data": {
"bar": 8
},
"valid": true,
"schema_id": "oneOf_10_0"
},
{
"description": "second oneOf valid",
"data": {
"foo": "foo"
},
"valid": true,
"schema_id": "oneOf_10_0"
},
{
"description": "both oneOf valid",
"data": {
"foo": "foo",
"bar": 8
},
"valid": false,
"schema_id": "oneOf_10_0"
},
{
"description": "neither oneOf valid",
"data": {
"baz": "quux"
},
"valid": false,
"schema_id": "oneOf_10_0"
}
]
},
{
"description": "nested oneOf, to check validation semantics",
"database": {
"schemas": [
{
"oneOf": [
{
"oneOf": [
{
"type": "null"
}
]
}
],
"$id": "oneOf_11_0"
}
]
},
"tests": [
{
"description": "null is valid",
"data": null,
"valid": true,
"schema_id": "oneOf_11_0"
},
{
"description": "anything non-null is invalid",
"data": 123,
"valid": false,
"schema_id": "oneOf_11_0"
}
]
},
{
"description": "extensible: true allows extra properties in oneOf",
"database": {
"schemas": [
{
"oneOf": [
{
"properties": {
"bar": {
"type": "integer"
}
},
"required": [
"bar"
]
},
{
"properties": {
"foo": {
"type": "string"
}
},
"required": [
"foo"
]
}
],
"extensible": true,
"$id": "oneOf_12_0"
}
]
},
"tests": [
{
@ -475,7 +560,8 @@
"bar": 2,
"extra": "prop"
},
"valid": true
"valid": true,
"schema_id": "oneOf_12_0"
}
]
}
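The oneOf cases above all exercise the same rule: an instance is valid only when exactly one branch matches. A toy illustration of that counting semantics (plain Python as a stand-in, not the extension's Rust validator) using the two branches from the "oneOf_0_0" case, `{"type": "integer"}` and `{"minimum": 2}`:

```python
# Illustrative sketch of oneOf counting semantics; the branch checks below are
# hand-written for the "oneOf_0_0" fixture, not a general schema evaluator.

def matches_integer(value):
    # JSON Schema "integer"; bool is excluded because Python bool subclasses int.
    return isinstance(value, int) and not isinstance(value, bool)

def matches_minimum_2(value):
    # JSON Schema "minimum": 2 applies only to numeric instances.
    return isinstance(value, (int, float)) and not isinstance(value, bool) and value >= 2

def one_of_valid(value):
    # Exactly one branch must match -- zero or two matches both fail.
    return matches_integer(value) + matches_minimum_2(value) == 1

assert one_of_valid(1)        # first oneOf valid
assert one_of_valid(2.5)      # second oneOf valid
assert not one_of_valid(3)    # both oneOf valid -> invalid
assert not one_of_valid(1.5)  # neither oneOf valid -> invalid
```

The four assertions mirror the four tests in the "oneOf" fixture exactly.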
49 tests/fixtures/pattern.json vendored
@ -1,64 +1,81 @@
[
{
"description": "pattern validation",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"pattern": "^a*$"
"database": {
"schemas": [
{
"pattern": "^a*$",
"$id": "pattern_0_0"
}
]
},
"tests": [
{
"description": "a matching pattern is valid",
"data": "aaa",
"valid": true
"valid": true,
"schema_id": "pattern_0_0"
},
{
"description": "a non-matching pattern is invalid",
"data": "abc",
"valid": false
"valid": false,
"schema_id": "pattern_0_0"
},
{
"description": "ignores booleans",
"data": true,
"valid": true
"valid": true,
"schema_id": "pattern_0_0"
},
{
"description": "ignores integers",
"data": 123,
"valid": true
"valid": true,
"schema_id": "pattern_0_0"
},
{
"description": "ignores floats",
"data": 1.0,
"valid": true
"data": 1,
"valid": true,
"schema_id": "pattern_0_0"
},
{
"description": "ignores objects",
"data": {},
"valid": true
"valid": true,
"schema_id": "pattern_0_0"
},
{
"description": "ignores arrays",
"data": [],
"valid": true
"valid": true,
"schema_id": "pattern_0_0"
},
{
"description": "ignores null",
"data": null,
"valid": true
"valid": true,
"schema_id": "pattern_0_0"
}
]
},
{
"description": "pattern is not anchored",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"pattern": "a+"
"database": {
"schemas": [
{
"pattern": "a+",
"$id": "pattern_1_0"
}
]
},
"tests": [
{
"description": "matches a substring",
"data": "xxaayy",
"valid": true
"valid": true,
"schema_id": "pattern_1_0"
}
]
}
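The pattern fixtures rely on two standard JSON Schema regex behaviors: `pattern` uses an unanchored search (so "a+" matches anywhere in the string), and anchoring must be written into the pattern itself ("^a*$"). A quick sketch with Python's `re` module as a stand-in for the validator's regex engine:

```python
import re

# Unanchored search: "a+" matches a substring of "xxaayy" -> valid.
assert re.search("a+", "xxaayy") is not None

# Explicitly anchored "^a*$" matches only strings made entirely of "a".
assert re.search("^a*$", "aaa") is not None   # matching pattern -> valid
assert re.search("^a*$", "abc") is None       # non-matching pattern -> invalid
```

The "ignores booleans/integers/objects/..." tests reflect another standard rule: `pattern` constrains only string instances and passes everything else through.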
132 tests/fixtures/patternProperties.json vendored
@ -1,14 +1,18 @@
[
{
"description": "patternProperties validates properties matching a regex",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"patternProperties": {
"f.*o": {
"type": "integer"
}
},
"items": {}
"items": {},
"$id": "patternProperties_0_0"
}
]
},
"tests": [
{
@ -16,7 +20,8 @@
"data": {
"foo": 1
},
"valid": true
"valid": true,
"schema_id": "patternProperties_0_0"
},
{
"description": "multiple valid matches is valid",
@ -24,7 +29,8 @@
"foo": 1,
"foooooo": 2
},
"valid": true
"valid": true,
"schema_id": "patternProperties_0_0"
},
{
"description": "a single invalid match is invalid",
@ -32,7 +38,8 @@
"foo": "bar",
"fooooo": 2
},
"valid": false
"valid": false,
"schema_id": "patternProperties_0_0"
},
{
"description": "multiple invalid matches is invalid",
@ -40,24 +47,28 @@
"foo": "bar",
"foooooo": "baz"
},
"valid": false
"valid": false,
"schema_id": "patternProperties_0_0"
},
{
"description": "ignores arrays",
"data": [
"foo"
],
"valid": true
"valid": true,
"schema_id": "patternProperties_0_0"
},
{
"description": "ignores strings",
"data": "foo",
"valid": true
"valid": true,
"schema_id": "patternProperties_0_0"
},
{
"description": "ignores other non-objects",
"data": 12,
"valid": true
"valid": true,
"schema_id": "patternProperties_0_0"
},
{
"description": "extra property not matching pattern is INVALID (strict by default)",
@ -65,14 +76,16 @@
"foo": 1,
"extra": 2
},
"valid": false
"valid": false,
"schema_id": "patternProperties_0_0"
}
]
},
{
"description": "multiple simultaneous patternProperties are validated",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"patternProperties": {
"a*": {
"type": "integer"
@ -80,7 +93,10 @@
"aaa*": {
"maximum": 20
}
},
"$id": "patternProperties_1_0"
}
]
},
"tests": [
{
@ -88,14 +104,16 @@
"data": {
"a": 21
},
"valid": true
"valid": true,
"schema_id": "patternProperties_1_0"
},
{
"description": "a simultaneous match is valid",
"data": {
"aaaa": 18
},
"valid": true
"valid": true,
"schema_id": "patternProperties_1_0"
},
{
"description": "multiple matches is valid",
@ -103,21 +121,24 @@
"a": 21,
"aaaa": 18
},
"valid": true
"valid": true,
"schema_id": "patternProperties_1_0"
},
{
"description": "an invalid due to one is invalid",
"data": {
"a": "bar"
},
"valid": false
"valid": false,
"schema_id": "patternProperties_1_0"
},
{
"description": "an invalid due to the other is invalid",
"data": {
"aaaa": 31
},
"valid": false
"valid": false,
"schema_id": "patternProperties_1_0"
},
{
"description": "an invalid due to both is invalid",
@ -125,14 +146,16 @@
"aaa": "foo",
"aaaa": 31
},
"valid": false
"valid": false,
"schema_id": "patternProperties_1_0"
}
]
},
{
"description": "regexes are not anchored by default and are case sensitive",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"patternProperties": {
"[0-9]{2,}": {
"type": "boolean"
@ -141,7 +164,10 @@
"type": "string"
}
},
"extensible": true
"extensible": true,
"$id": "patternProperties_2_0"
}
]
},
"tests": [
{
@ -149,39 +175,47 @@
"data": {
"answer 1": "42"
},
"valid": true
"valid": true,
"schema_id": "patternProperties_2_0"
},
{
"description": "recognized members are accounted for",
"data": {
"a31b": null
},
"valid": false
"valid": false,
"schema_id": "patternProperties_2_0"
},
{
"description": "regexes are case sensitive",
"data": {
"a_x_3": 3
},
"valid": true
"valid": true,
"schema_id": "patternProperties_2_0"
},
{
"description": "regexes are case sensitive, 2",
"data": {
"a_X_3": 3
},
"valid": false
"valid": false,
"schema_id": "patternProperties_2_0"
}
]
},
{
"description": "patternProperties with boolean schemas",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"patternProperties": {
"f.*": true,
"b.*": false
},
"$id": "patternProperties_3_0"
}
]
},
"tests": [
{
@ -189,14 +223,16 @@
"data": {
"foo": 1
},
"valid": true
"valid": true,
"schema_id": "patternProperties_3_0"
},
{
"description": "object with property matching schema false is invalid",
"data": {
"bar": 2
},
"valid": false
"valid": false,
"schema_id": "patternProperties_3_0"
},
{
"description": "object with both properties is invalid",
@ -204,31 +240,38 @@
"foo": 1,
"bar": 2
},
"valid": false
"valid": false,
"schema_id": "patternProperties_3_0"
},
{
"description": "object with a property matching both true and false is invalid",
"data": {
"foobar": 1
},
"valid": false
"valid": false,
"schema_id": "patternProperties_3_0"
},
{
"description": "empty object is valid",
"data": {},
"valid": true
"valid": true,
"schema_id": "patternProperties_3_0"
}
]
},
{
"description": "patternProperties with null valued instance properties",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"patternProperties": {
"^.*bar$": {
"type": "null"
}
},
"$id": "patternProperties_4_0"
}
]
},
"tests": [
{
@ -236,20 +279,25 @@
"data": {
"foobar": null
},
"valid": true
"valid": true,
"schema_id": "patternProperties_4_0"
}
]
},
{
"description": "extensible: true allows extra properties NOT matching pattern",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"patternProperties": {
"f.*o": {
"type": "integer"
}
},
"extensible": true
"extensible": true,
"$id": "patternProperties_5_0"
}
]
},
"tests": [
{
@ -257,14 +305,16 @@
"data": {
"bar": 1
},
"valid": true
"valid": true,
"schema_id": "patternProperties_5_0"
},
{
"description": "property matching pattern MUST still be valid",
"data": {
"foo": "invalid string"
},
"valid": false
"valid": false,
"schema_id": "patternProperties_5_0"
}
]
}
80 tests/fixtures/prefixItems.json vendored
@ -1,8 +1,9 @@
[
{
"description": "a schema given for prefixItems",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"prefixItems": [
{
"type": "integer"
@ -10,6 +11,9 @@
{
"type": "string"
}
],
"$id": "prefixItems_0_0"
}
]
},
"tests": [
@ -19,7 +23,8 @@
1,
"foo"
],
"valid": true
"valid": true,
"schema_id": "prefixItems_0_0"
},
{
"description": "wrong types",
@ -27,14 +32,16 @@
"foo",
1
],
"valid": false
"valid": false,
"schema_id": "prefixItems_0_0"
},
{
"description": "incomplete array of items",
"data": [
1
],
"valid": true
"valid": true,
"schema_id": "prefixItems_0_0"
},
{
"description": "array with additional items (invalid due to strictness)",
@ -43,12 +50,14 @@
"foo",
true
],
"valid": false
"valid": false,
"schema_id": "prefixItems_0_0"
},
{
"description": "empty array",
"data": [],
"valid": true
"valid": true,
"schema_id": "prefixItems_0_0"
},
{
"description": "JavaScript pseudo-array is valid (invalid due to strict object validation)",
@ -57,17 +66,22 @@
"1": "valid",
"length": 2
},
"valid": false
"valid": false,
"schema_id": "prefixItems_0_0"
}
]
},
{
"description": "prefixItems with boolean schemas",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"prefixItems": [
true,
false
],
"$id": "prefixItems_1_0"
}
]
},
"tests": [
@ -76,7 +90,8 @@
"data": [
1
],
"valid": true
"valid": true,
"schema_id": "prefixItems_1_0"
},
{
"description": "array with two items is invalid",
@ -84,25 +99,31 @@
1,
"foo"
],
"valid": false
"valid": false,
"schema_id": "prefixItems_1_0"
},
{
"description": "empty array is valid",
"data": [],
"valid": true
"valid": true,
"schema_id": "prefixItems_1_0"
}
]
},
{
"description": "additional items are allowed by default",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"prefixItems": [
{
"type": "integer"
}
],
"extensible": true
"extensible": true,
"$id": "prefixItems_2_0"
}
]
},
"tests": [
{
@ -112,18 +133,23 @@
"foo",
false
],
"valid": true
"valid": true,
"schema_id": "prefixItems_2_0"
}
]
},
{
"description": "prefixItems with null instance elements",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"prefixItems": [
{
"type": "null"
}
],
"$id": "prefixItems_3_0"
}
]
},
"tests": [
@ -132,20 +158,25 @@
"data": [
null
],
"valid": true
"valid": true,
"schema_id": "prefixItems_3_0"
}
]
},
{
"description": "extensible: true allows extra items with prefixItems",
"schema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"database": {
"schemas": [
{
"prefixItems": [
{
"type": "integer"
}
],
"extensible": true
"extensible": true,
"$id": "prefixItems_4_0"
}
]
},
"tests": [
{
@ -154,7 +185,8 @@
1,
"foo"
],
"valid": true
"valid": true,
"schema_id": "prefixItems_4_0"
}
]
}
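Across every fixture in this diff the change is the same: each test case's schema moves under a `database.schemas` list whose entries carry an `$id`, and each test gains a `schema_id` naming the schema it targets. A small structural check (an assumed harness-side sanity pass, not the project's actual Rust test runner) that a fixture follows this shape:

```python
import json

# Inline sample in the post-diff fixture shape; real fixtures live under
# tests/fixtures/*.json in the jspg repository.
fixture = json.loads("""
[
  {
    "description": "pattern validation",
    "database": {
      "schemas": [
        {"pattern": "^a*$", "$id": "pattern_0_0"}
      ]
    },
    "tests": [
      {"description": "a matching pattern is valid",
       "data": "aaa", "valid": true, "schema_id": "pattern_0_0"}
    ]
  }
]
""")

for case in fixture:
    # Collect the schema ids declared by this case.
    declared_ids = {schema["$id"] for schema in case["database"]["schemas"]}
    for test in case["tests"]:
        # Every test must target a schema declared in the same case,
        # and must carry a boolean expected outcome.
        assert test["schema_id"] in declared_ids
        assert isinstance(test["valid"], bool)
```

A check like this catches the most common hand-editing mistake in these fixtures: a test whose `schema_id` does not match any `$id` in its own `database.schemas` block.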
Some files were not shown because too many files have changed in this diff.