Compare commits
13 Commits

| SHA1 |
|---|
| 960a99034a |
| 81388149e8 |
| b8b3f7a501 |
| bc5489b1ea |
| 7b55277116 |
| ed636b05a4 |
| 2aec2da2fd |
| ad78896f72 |
| 55b93d9957 |
| 7ec6e09ae0 |
| 9d9c6d2c06 |
| 12e952fa94 |
| 776a912098 |
.agent/workflows/jspg.md (49 lines, new file)
@@ -0,0 +1,49 @@
---
description: jspg work preparation
---

This workflow will get you up to speed on JSPG, the custom JSON-Schema-based cargo pgrx PostgreSQL validation extension. Everything you read will be in the jspg directory/project.

Read this entire workflow and commit to every section of work in a task list, so that you don't stop halfway through before reviewing all of the directories and files mentioned. Do not ask for confirmation after generating this task list; proceed through every section in your list.

Please analyze the files and directories without using cat, find, or the terminal to discover or read any of them. Analyze every file mentioned. If a directory or a /* wildcard is mentioned, analyze the directory, every single file at its root, and recursively every subdirectory and every single file within it, to capture not just critical files but the entirety of what is requested. I state again: do NOT review a cherry-picked subset of files in any folder or wildcard specified. Review 100% of all files discovered recursively!

Section 1: Documentation

- GEMINI.md at the root

Section 2: Flow file for cmd interface

- flow at the root

Section 3: Source

- src/*

Section 4: Test Fixtures

- Just review some of the *.json files in tests/fixtures/*

Section 5: Build

- build.rs

Section 6: Cargo TOML

- Cargo.toml

Section 7: Some PUNC Syntax

Now, review some punc type and enum source in the api project under api/, these files:

- punc/sql/tables.sql
- punc/sql/domains.sql
- punc/sql/indexes.sql
- punc/sql/functions/entity.sql
- punc/sql/functions/puncs.sql
- punc/sql/puncs/entity.sql
- punc/sql/puncs/persons.sql
- punc/sql/puncs/puncs.sql
- punc/sql/puncs/job.sql

Now you are ready to help me work on this extension.
GEMINI.md (13 lines changed)
@@ -9,7 +9,7 @@ It is designed to serve as the validation engine for the "Punc" architecture, wh
 1. **Draft 2020-12 Compliance**: Attempt to adhere to the official JSON Schema Draft 2020-12 specification.
 2. **Ultra-Fast Validation**: Compile schemas into an optimized in-memory representation for near-instant validation during high-throughput workloads.
 3. **Connection-Bound Caching**: Leverage the PostgreSQL session lifecycle to maintain a per-connection schema cache, eliminating the need for repetitive parsing.
-4. **Structural Inheritance**: Support object-oriented schema design via Implicit Keyword Shadowing and virtual `.family` schemas.
+4. **Structural Inheritance**: Support object-oriented schema design via Implicit Keyword Shadowing and virtual `$family` references.
 5. **Punc Integration**: Validation is aware of the "Punc" context (request/response) and can validate `cue` objects efficiently.

 ## 🔌 API Reference

@@ -27,7 +27,7 @@ Loads and compiles the entire schema registry into the session's memory, atomica

 * **Behavior**:
   * Parses all inputs into an internal schema graph.
   * Resolves all internal references (`$ref`).
-  * Generates virtual `.family` schemas for type hierarchies.
+  * Generates virtual union schemas for type hierarchies referenced via `$family`.
   * Compiles schemas into validators.
 * **Returns**: `{"response": "success"}` or an error object.

@@ -78,16 +78,17 @@ Standard JSON Schema composition (`allOf`) is additive (Intersection), meaning c

 * **Composition (`allOf`)**: When using `allOf`, standard intersection rules apply. No shadowing occurs; all constraints from all branches must pass. This is used for mixins or interfaces.

-### 2. Virtual Family Schemas (`.family`)
+### 2. Virtual Family References (`$family`)

 To support polymorphic fields (e.g., a field that accepts any "User" type), JSPG generates virtual schemas representing type hierarchies.

-* **Mechanism**: When caching types, if a type defines a `hierarchy` (e.g., `["entity", "organization", "person"]`), JSPG generates a schema like `organization.family` which is a `oneOf` containing refs to all valid descendants.
+* **Mechanism**: When caching types, if a type defines a `hierarchy` (e.g., `["entity", "organization", "person"]`), JSPG generates a virtual `oneOf` family containing refs to all valid descendants. These can be pointed to exclusively by using `{"$family": "organization"}`. Because `$family` is a macro-pointer that swaps in the virtual union, it **must** be used exclusively in its schema object; you cannot define other properties alongside it.
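The family map described above — every type registering itself under each ancestor in its `hierarchy` — can be sketched with std collections alone (type names follow the doc's example; the real registry-building code differs):

```rust
use std::collections::{BTreeMap, BTreeSet};

// For every ancestor named in a type's `hierarchy`, record the type as a
// member of that ancestor's virtual family (the set a `$family` ref expands into).
fn build_families(types: &[(&str, &[&str])]) -> BTreeMap<String, BTreeSet<String>> {
    let mut families: BTreeMap<String, BTreeSet<String>> = BTreeMap::new();
    for (name, hierarchy) in types {
        for ancestor in hierarchy.iter() {
            families
                .entry((*ancestor).to_string())
                .or_default()
                .insert((*name).to_string());
        }
    }
    families
}

fn main() {
    // Illustrative types with their hierarchies, as in the doc's example
    let types: &[(&str, &[&str])] = &[
        ("person", &["entity", "organization", "person"]),
        ("organization", &["entity", "organization"]),
    ];
    let families = build_families(types);
    // {"$family": "organization"} would expand to a oneOf over these members:
    let members: Vec<&String> = families["organization"].iter().collect();
    println!("{:?}", members); // ["organization", "person"]
}
```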
 ### 3. Strict by Default & Extensibility

 JSPG enforces a "Secure by Default" philosophy. All schemas are treated as if `unevaluatedProperties: false` (and `unevaluatedItems: false`) is set, unless explicitly overridden.

-* **Strictness**: By default, any property in the instance data that is not explicitly defined in the schema causes a validation error. This prevents clients from sending undeclared fields.
-* **Extensibility (`extensible: true`)**: To allow additional, undefined properties, you must add `"extensible": true` to the schema. This is useful for types that are designed to be open for extension.
+* **Strictness**: By default, any property or array item in the instance data that is not explicitly defined in the schema causes a validation error. This prevents clients from sending undeclared fields or extra array elements.
+* **Extensibility (`extensible: true`)**: To allow a free-for-all of additional, undefined properties or extra array items, you must add `"extensible": true` to the schema. This globally disables the strictness check for that object or array, useful for types designed to be completely open.
 * **Structured Additional Properties (`additionalProperties: {...}`)**: Instead of a boolean free-for-all, you can define `additionalProperties` as a schema object (e.g., `{"type": "string"}`). This maintains strictness (no arbitrary keys) but allows any extra keys as long as their values match the defined structure.
 * **Ref Boundaries**: Strictness is reset when crossing `$ref` boundaries. The referenced schema's strictness is determined by its own definition (strict by default unless `extensible: true`), ignoring the caller's state.
 * **Inheritance**: Strictness is inherited. A schema extending a strict parent will also be strict unless it declares itself `extensible: true`. Conversely, a schema extending a loose parent will also be loose unless it declares itself `extensible: false`.
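The inheritance rule reduces to one fallback: a schema's own `extensible` flag, when declared, overrides whatever the parent context passes down (a sketch; the function name is illustrative):

```rust
// A schema's effective extensibility: its own flag if declared, else inherited.
fn effective_extensible(own_flag: Option<bool>, inherited: bool) -> bool {
    own_flag.unwrap_or(inherited)
}

fn main() {
    // Strict parent, no declaration: stays strict
    assert_eq!(effective_extensible(None, false), false);
    // Loose parent, explicit opt-out: becomes strict again
    assert_eq!(effective_extensible(Some(false), true), false);
    // Strict parent, explicit opt-in: becomes loose
    assert_eq!(effective_extensible(Some(true), false), true);
    println!("ok");
}
```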
build.rs (61 lines changed)
@@ -3,38 +3,38 @@ use std::fs::File;
use std::io::Write;
use std::path::Path;

fn to_safe_identifier(name: &str) -> String {
    let mut safe = String::new();
    for (i, c) in name.chars().enumerate() {
        if c.is_uppercase() {
            if i > 0 {
                safe.push('_');
            }
            safe.push(c.to_ascii_lowercase());
        } else if c == '-' || c == '.' {
            safe.push('_');
        } else {
            safe.push(c);
        }
    }
    safe
}
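The extracted `to_safe_identifier` helper inserts an underscore at each uppercase boundary and flattens `-` and `.` to `_`. Repeating the function verbatim, a quick check (the filenames are hypothetical):

```rust
fn to_safe_identifier(name: &str) -> String {
    let mut safe = String::new();
    for (i, c) in name.chars().enumerate() {
        if c.is_uppercase() {
            if i > 0 {
                safe.push('_');
            }
            safe.push(c.to_ascii_lowercase());
        } else if c == '-' || c == '.' {
            safe.push('_');
        } else {
            safe.push(c);
        }
    }
    safe
}

fn main() {
    // camelCase boundary gets an underscore
    assert_eq!(to_safe_identifier("dynamicRef"), "dynamic_ref");
    // '-' and '.' are flattened to '_'
    assert_eq!(to_safe_identifier("anchor-test.json"), "anchor_test_json");
    println!("ok");
}
```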
 fn main() {
     println!("cargo:rerun-if-changed=tests/fixtures");
     println!("cargo:rerun-if-changed=Cargo.toml");

-    // File 1: src/tests.rs for #[pg_test]
-    let pg_dest_path = Path::new("src/tests.rs");
+    // File 1: src/tests/fixtures.rs for #[pg_test]
+    let pg_dest_path = Path::new("src/tests/fixtures.rs");
     let mut pg_file = File::create(&pg_dest_path).unwrap();

-    // File 2: tests/tests.rs for standard #[test] integration
-    let std_dest_path = Path::new("tests/tests.rs");
+    // File 2: tests/fixtures.rs for standard #[test] integration
+    let std_dest_path = Path::new("tests/fixtures.rs");
     let mut std_file = File::create(&std_dest_path).unwrap();

     // Write headers
     writeln!(std_file, "use jspg::util;").unwrap();

     // Helper for snake_case conversion
     // let _to_snake_case = |s: &str| -> String {
     //     s.chars().fold(String::new(), |mut acc, c| {
     //         if c.is_uppercase() {
     //             if !acc.is_empty() {
     //                 acc.push('_');
     //             }
     //             acc.push(c.to_ascii_lowercase());
     //         } else if c == '-' || c == ' ' || c == '.' || c == '/' || c == ':' {
     //             acc.push('_');
     //         } else if c.is_alphanumeric() {
     //             acc.push(c);
     //         }
     //         acc
     //     })
     // };

     // Walk tests/fixtures directly
     let fixtures_path = "tests/fixtures";
     if Path::new(fixtures_path).exists() {

@@ -51,24 +51,7 @@ fn main() {
         if let Some(arr) = val.as_array() {
             for (i, _item) in arr.iter().enumerate() {
-                // Use deterministic names: test_(unknown)_{index}
-                // We sanitize the filename to be a valid identifier
-                // Use manual snake_case logic since we don't want to add a build-dependency just yet if not needed,
-                // but `dynamicRef` -> `dynamic_ref` requires parsing.
-                // Let's implement a simple camelToSnake helper.
-                let mut safe_filename = String::new();
-                for (i, c) in file_name.chars().enumerate() {
-                    if c.is_uppercase() {
-                        if i > 0 {
-                            safe_filename.push('_');
-                        }
-                        safe_filename.push(c.to_ascii_lowercase());
-                    } else if c == '-' || c == '.' {
-                        safe_filename.push('_');
-                    } else {
-                        safe_filename.push(c);
-                    }
-                }
-
+                let safe_filename = to_safe_identifier(file_name);
                 let fn_name = format!("test_{}_{}", safe_filename, i);

                 // Write to src/tests.rs (PG Test)
flow (71 lines changed)
@@ -15,25 +15,28 @@ CARGO_DEPENDENCIES=(cargo-pgrx==0.16.1)
 GITEA_ORGANIZATION="cellular"
 GITEA_REPOSITORY="jspg"

-pgrx-prepare() {
+pgrx-up() {
   info "Initializing pgrx..."
   # Explicitly point to the postgresql@${POSTGRES_VERSION} pg_config, don't rely on 'which'
   local POSTGRES_CONFIG_PATH="/opt/homebrew/opt/postgresql@${POSTGRES_VERSION}/bin/pg_config"

   if [ ! -x "$POSTGRES_CONFIG_PATH" ]; then
-    error "pg_config not found or not executable at $POSTGRES_CONFIG_PATH."
-    warning "Ensure postgresql@${POSTGRES_VERSION} is installed correctly via Homebrew."
-    return 2
+    abort "pg_config not found or not executable at $POSTGRES_CONFIG_PATH." 2
   fi

   if cargo pgrx init --pg"$POSTGRES_VERSION"="$POSTGRES_CONFIG_PATH"; then
-    success "pgrx initialized successfully."
-  else
-    error "Failed to initialize pgrx. Check PostgreSQL development packages are installed and $POSTGRES_CONFIG_PATH is valid."
-    return 2
+    success "pgrx initialized successfully." && return 0
   fi
+
+  abort "Failed to initialize pgrx. Check PostgreSQL development packages are installed and $POSTGRES_CONFIG_PATH is valid." 2
 }

+pgrx-down() {
+  info "Taking pgrx down..."
+}

 build() {
   local version
   version=$(get-version) || return $?

@@ -51,11 +54,10 @@ build() {
   info "Creating tarball: ${tarball_path}"
   # Set COPYFILE_DISABLE=1 to prevent macOS tar from including ._ metadata files
   if COPYFILE_DISABLE=1 tar --exclude='.git*' --exclude='./target' --exclude='./package' --exclude='./flows' --exclude='./flow' -czf "${tarball_path}" .; then
-    success "Successfully created source tarball: ${tarball_path}"
-  else
-    error "Failed to create source tarball."
-    return 2
+    success "Successfully created source tarball: ${tarball_path}" && return 0
   fi
+
+  abort "Failed to create source tarball." 2
 }

 install() {
@@ -66,8 +68,7 @@ install() {

   # Run the pgrx install command
   if ! cargo pgrx install; then
-    error "cargo pgrx install command failed."
-    return 2
+    abort "cargo pgrx install command failed." 2
   fi
   success "PGRX extension v$version successfully built and installed."

@@ -76,36 +77,28 @@ install() {
   pg_sharedir=$("$POSTGRES_CONFIG_PATH" --sharedir)
   local pg_config_status=$?
   if [ $pg_config_status -ne 0 ] || [ -z "$pg_sharedir" ]; then
-    error "Failed to determine PostgreSQL shared directory using pg_config."
-    return 2
+    abort "Failed to determine PostgreSQL shared directory using pg_config." 2
   fi
   local installed_control_path="${pg_sharedir}/extension/jspg.control"

   # Modify the control file
   if [ ! -f "$installed_control_path" ]; then
-    error "Installed control file not found: '$installed_control_path'"
-    return 2
+    abort "Installed control file not found: '$installed_control_path'" 2
   fi

   info "Modifying control file for non-superuser access: ${installed_control_path}"
   # Use sed -i '' for macOS compatibility
   if sed -i '' '/^superuser = false/d' "$installed_control_path" && \
     echo 'trusted = true' >> "$installed_control_path"; then
-    success "Control file modified successfully."
-  else
-    error "Failed to modify control file: ${installed_control_path}"
-    return 2
+    success "Control file modified successfully." && return 0
   fi
+
+  abort "Failed to modify control file: ${installed_control_path}" 2
 }

-test-jspg() {
+test() {
   info "Running jspg tests..."
   cargo pgrx test "pg${POSTGRES_VERSION}" "$@" || return $?
 }

 test-validator() {
   info "Running validator tests..."
-  cargo test -p boon --features "pgrx/pg${POSTGRES_VERSION}" "$@" || return $?
+  cargo test --tests "$@" || return $?
 }

 clean() {
@@ -114,27 +107,27 @@ clean() {
 }

 jspg-usage() {
-  printf "prepare\tCheck OS, Cargo, and PGRX dependencies.\n"
-  printf "install\tBuild and install the extension locally (after prepare).\n"
-  printf "reinstall\tClean, build, and install the extension locally (after prepare).\n"
-  printf "test-jspg\t\tRun pgrx integration tests.\n"
-  printf "test-validator\t\tRun validator integration tests.\n"
-  printf "clean\t\tRemove pgrx build artifacts.\n"
+  echo "up|Check OS, Cargo, and PGRX dependencies."
+  echo "install|Build and install the extension locally (after up)."
+  echo "reinstall|Clean, build, and install the extension locally (after up)."
+  echo "test-jspg|Run pgrx integration tests."
+  echo "test-validator|Run validator integration tests."
+  echo "clean|Remove pgrx build artifacts."
 }

 jspg-flow() {
   case "$1" in
-    prepare) prepare && cargo-prepare && pgrx-prepare; return $?;;
+    up) up && rust-up && pgrx-up; return $?;;
+    down) pgrx-down && rust-down && down; return $?;;
     build) build; return $?;;
     install) install; return $?;;
     reinstall) clean && install; return $?;;
-    test-jspg) test-jspg "${@:2}"; return $?;;
     test-validator) test-validator "${@:2}"; return $?;;
+    test) test "${@:2}"; return $?;;
     clean) clean; return $?;;
-    *) return 1 ;;
+    *) return 127 ;;
   esac
 }

-register-flow "jspg-usage" "jspg-flow"
+register-flow "jspg"

 dispatch "$@"
flows (submodule)
Submodule flows updated: e154758056...a7b0f5dc4d
@@ -113,6 +113,9 @@ impl Compiler {
             Self::compile_recursive(Arc::make_mut(s));
         }
     }
+    if let Some(add_props) = &mut schema.additional_properties {
+        Self::compile_recursive(Arc::make_mut(add_props));
+    }

     // ... Recurse logic ...
     if let Some(items) = &mut schema.items {

@@ -323,6 +326,11 @@ impl Compiler {
             Self::compile_index(sub_schema, registry, current_base.clone(), sub);
         }
     }
+    if let Some(add_props) = &schema.additional_properties {
+        let mut sub = child_pointer.clone();
+        sub.push("additionalProperties".to_string());
+        Self::compile_index(add_props, registry, current_base.clone(), sub);
+    }
     if let Some(contains) = &schema.contains {
         let mut sub = child_pointer.clone();
         sub.push("contains".to_string());
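The two hunks above extend the compiler's recursion to visit the optional `additionalProperties` subschema slot alongside `items` and `contains`. A minimal, self-contained sketch of this recurse-into-optional-slots pattern (the `Schema` struct here is a stand-in, not the project's real type):

```rust
// Stand-in schema with optional subschema slots, mirroring the shape the
// compiler walks (`additional_properties`, `items`).
#[derive(Default)]
struct Schema {
    additional_properties: Option<Box<Schema>>,
    items: Option<Box<Schema>>,
    compiled: bool,
}

// Mark this node compiled, then recurse into every optional subschema slot.
fn compile_recursive(s: &mut Schema) {
    s.compiled = true;
    if let Some(add_props) = &mut s.additional_properties {
        compile_recursive(add_props);
    }
    if let Some(items) = &mut s.items {
        compile_recursive(items);
    }
}

fn main() {
    let mut root = Schema {
        additional_properties: Some(Box::new(Schema::default())),
        ..Default::default()
    };
    compile_recursive(&mut root);
    assert!(root.compiled);
    // The nested additionalProperties subschema was visited too
    assert!(root.additional_properties.as_ref().unwrap().compiled);
    println!("ok");
}
```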
src/context.rs (118 lines, new file)
@@ -0,0 +1,118 @@
use crate::error::ValidationError;
use crate::instance::ValidationInstance;
use crate::result::ValidationResult;
use crate::schema::Schema;
use crate::validator::Validator;
use std::collections::HashSet;

pub struct ValidationContext<'a, I: ValidationInstance<'a>> {
    pub validator: &'a Validator,
    pub root: &'a Schema,
    pub schema: &'a Schema,
    pub instance: I,
    pub path: String,
    pub depth: usize,
    pub scope: Vec<String>,
    pub overrides: HashSet<String>,
    pub extensible: bool,
    pub reporter: bool,
}

impl<'a, I: ValidationInstance<'a>> ValidationContext<'a, I> {
    pub fn new(
        validator: &'a Validator,
        root: &'a Schema,
        schema: &'a Schema,
        instance: I,
        scope: Vec<String>,
        overrides: HashSet<String>,
        extensible: bool,
        reporter: bool,
    ) -> Self {
        let effective_extensible = schema.extensible.unwrap_or(extensible);
        Self {
            validator,
            root,
            schema,
            instance,
            path: String::new(),
            depth: 0,
            scope,
            overrides,
            extensible: effective_extensible,
            reporter,
        }
    }

    pub fn derive(
        &self,
        schema: &'a Schema,
        instance: I,
        path: &str,
        scope: Vec<String>,
        overrides: HashSet<String>,
        extensible: bool,
        reporter: bool,
    ) -> Self {
        let effective_extensible = schema.extensible.unwrap_or(extensible);

        Self {
            validator: self.validator,
            root: self.root,
            schema,
            instance,
            path: path.to_string(),
            depth: self.depth + 1,
            scope,
            overrides,
            extensible: effective_extensible,
            reporter,
        }
    }

    pub fn derive_for_schema(&self, schema: &'a Schema, reporter: bool) -> Self {
        self.derive(
            schema,
            self.instance,
            &self.path,
            self.scope.clone(),
            HashSet::new(),
            self.extensible,
            reporter,
        )
    }

    pub fn validate(&self) -> Result<ValidationResult, ValidationError> {
        let mut effective_scope = self.scope.clone();

        if let Some(id) = &self.schema.obj.id {
            let current_base = self.scope.last().map(|s| s.as_str()).unwrap_or("");
            let mut new_base = id.clone();
            if !current_base.is_empty() {
                if let Ok(base_url) = url::Url::parse(current_base) {
                    if let Ok(joined) = base_url.join(id) {
                        new_base = joined.to_string();
                    }
                }
            }

            effective_scope.push(new_base);

            let shadow = ValidationContext {
                validator: self.validator,
                root: self.root,
                schema: self.schema,
                instance: self.instance,
                path: self.path.clone(),
                depth: self.depth,
                scope: effective_scope,
                overrides: self.overrides.clone(),
                extensible: self.extensible,
                reporter: self.reporter,
            };
            return shadow.validate_scoped();
        }

        self.validate_scoped()
    }
}
@@ -13,7 +13,7 @@ pub struct Drop {
     #[serde(skip_serializing_if = "Option::is_none")]
     pub response: Option<Value>,

-    #[serde(default)]
+    #[serde(default, skip_serializing_if = "Vec::is_empty")]
     pub errors: Vec<Error>,
 }

@@ -29,7 +29,7 @@ impl Drop {
     pub fn success() -> Self {
         Self {
             type_: "drop".to_string(),
-            response: Some(serde_json::json!({ "result": "success" })), // Or appropriate success response
+            response: Some(serde_json::json!("success")),
             errors: vec![],
         }
     }
@@ -53,8 +53,6 @@ impl Drop {

 #[derive(Debug, Serialize, Deserialize, Clone)]
 pub struct Error {
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub punc: Option<String>,
     pub code: String,
     pub message: String,
     pub details: ErrorDetails,
src/error.rs (6 lines, new file)
@@ -0,0 +1,6 @@
#[derive(Debug, Clone, serde::Serialize)]
pub struct ValidationError {
    pub code: String,
    pub message: String,
    pub path: String,
}
src/instance.rs (98 lines, new file)
@@ -0,0 +1,98 @@
use serde_json::Value;
use std::collections::HashSet;
use std::ptr::NonNull;

pub trait ValidationInstance<'a>: Copy + Clone {
    fn as_value(&self) -> &'a Value;
    fn child_at_key(&self, key: &str) -> Option<Self>;
    fn child_at_index(&self, idx: usize) -> Option<Self>;
    fn prune_object(&self, _keys: &HashSet<String>) {}
    fn prune_array(&self, _indices: &HashSet<usize>) {}
}

#[derive(Clone, Copy)]
pub struct ReadOnlyInstance<'a>(pub &'a Value);

impl<'a> ValidationInstance<'a> for ReadOnlyInstance<'a> {
    fn as_value(&self) -> &'a Value {
        self.0
    }

    fn child_at_key(&self, key: &str) -> Option<Self> {
        self.0.get(key).map(ReadOnlyInstance)
    }

    fn child_at_index(&self, idx: usize) -> Option<Self> {
        self.0.get(idx).map(ReadOnlyInstance)
    }
}

#[derive(Clone, Copy)]
pub struct MutableInstance {
    ptr: NonNull<Value>,
}

impl MutableInstance {
    pub fn new(val: &mut Value) -> Self {
        Self {
            ptr: NonNull::from(val),
        }
    }
}

impl<'a> ValidationInstance<'a> for MutableInstance {
    fn as_value(&self) -> &'a Value {
        unsafe { self.ptr.as_ref() }
    }

    fn child_at_key(&self, key: &str) -> Option<Self> {
        unsafe {
            if let Some(obj) = self.ptr.as_ref().as_object() {
                if obj.contains_key(key) {
                    let parent_mut = &mut *self.ptr.as_ptr();
                    if let Some(child_val) = parent_mut.get_mut(key) {
                        return Some(MutableInstance::new(child_val));
                    }
                }
            }
            None
        }
    }

    fn child_at_index(&self, idx: usize) -> Option<Self> {
        unsafe {
            if let Some(arr) = self.ptr.as_ref().as_array() {
                if idx < arr.len() {
                    let parent_mut = &mut *self.ptr.as_ptr();
                    if let Some(child_val) = parent_mut.get_mut(idx) {
                        return Some(MutableInstance::new(child_val));
                    }
                }
            }
            None
        }
    }

    fn prune_object(&self, keys: &HashSet<String>) {
        unsafe {
            let val_mut = &mut *self.ptr.as_ptr();
            if let Some(obj) = val_mut.as_object_mut() {
                obj.retain(|k, _| keys.contains(k));
            }
        }
    }

    fn prune_array(&self, indices: &HashSet<usize>) {
        unsafe {
            let val_mut = &mut *self.ptr.as_ptr();
            if let Some(arr) = val_mut.as_array_mut() {
                let mut i = 0;
                arr.retain(|_| {
                    let keep = indices.contains(&i);
                    i += 1;
                    keep
                });
            }
        }
    }
}
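`prune_array` keeps only the evaluated indices by pairing `Vec::retain` with a running counter, since `retain` does not expose the element's index. The same pattern in isolation (the function name is illustrative):

```rust
use std::collections::HashSet;

// Retain only the elements whose index is in `keep`, counting positions
// manually because `retain` only hands the closure the element itself.
fn prune_by_index<T>(arr: &mut Vec<T>, keep: &HashSet<usize>) {
    let mut i = 0;
    arr.retain(|_| {
        let keep_this = keep.contains(&i);
        i += 1;
        keep_this
    });
}

fn main() {
    let mut v = vec!["a", "b", "c", "d"];
    let keep: HashSet<usize> = [0, 2].into_iter().collect();
    prune_by_index(&mut v, &keep);
    assert_eq!(v, vec!["a", "c"]);
    println!("{:?}", v); // ["a", "c"]
}
```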
src/lib.rs (156 lines changed)
@@ -11,8 +11,13 @@ mod schema;
 pub mod util;
 mod validator;

-use crate::schema::Schema;
-use serde_json::{Value, json};
+pub mod context;
+pub mod error;
+pub mod instance;
+pub mod result;
+pub(crate) mod rules;
+
+use serde_json::json;
 use std::sync::{Arc, RwLock};

 lazy_static::lazy_static! {
@@ -25,80 +30,13 @@ lazy_static::lazy_static! {
 }

 #[pg_extern(strict)]
-fn cache_json_schemas(enums: JsonB, types: JsonB, puncs: JsonB) -> JsonB {
-    // 1. Build a new Registry LOCALLY (on stack)
-    let mut registry = registry::Registry::new();
-
-    // Generate Family Schemas from Types
-    {
-        let mut family_map: std::collections::HashMap<String, std::collections::HashSet<String>> =
-            std::collections::HashMap::new();
-        if let Value::Array(arr) = &types.0 {
-            for item in arr {
-                if let Some(name) = item.get("name").and_then(|v| v.as_str()) {
-                    if let Some(hierarchy) = item.get("hierarchy").and_then(|v| v.as_array()) {
-                        for ancestor in hierarchy {
-                            if let Some(anc_str) = ancestor.as_str() {
-                                family_map
-                                    .entry(anc_str.to_string())
-                                    .or_default()
-                                    .insert(name.to_string());
-                            }
-                        }
-                    }
-                }
-            }
-        }
-
-        for (family_name, members) in family_map {
-            let id = format!("{}.family", family_name);
-
-            // Object Union (for polymorphic object validation)
-            // This allows the schema to match ANY of the types in the family hierarchy
-            let object_refs: Vec<Value> = members.iter().map(|s| json!({ "$ref": s })).collect();
-
-            let schema_json = json!({
-                "$id": id,
-                "oneOf": object_refs
-            });
-
-            if let Ok(schema) = serde_json::from_value::<Schema>(schema_json) {
-                registry.add(schema);
-            }
-        }
-
-        // Helper to parse and cache a list of items
-        let mut cache_items = |items: JsonB| {
-            if let Value::Array(arr) = items.0 {
-                for item in arr {
-                    // For now, we assume the item structure matches what the generator expects
-                    // or what `json_schemas.sql` sends.
-                    // The `Schema` struct in `schema.rs` is designed to deserialize standard JSON Schema.
-                    // However, the input here is an array of objects that *contain* a `schemas` array.
-                    // We need to extract those inner schemas.
-
-                    if let Some(schemas_val) = item.get("schemas") {
-                        if let Value::Array(schemas) = schemas_val {
-                            for schema_val in schemas {
-                                // Deserialize into our robust Schema struct to ensure validity/parsing
-                                if let Ok(schema) = serde_json::from_value::<Schema>(schema_val.clone()) {
-                                    // Registry handles compilation
-                                    registry.add(schema);
-                                }
-                            }
-                        }
-                    }
-                }
-            }
-        };
-
-        cache_items(enums);
-        cache_items(types);
-        cache_items(puncs); // public/private distinction logic to come later
-    }
-
-    // 2. Wrap in Validator and Arc
-    let new_validator = validator::Validator::new(registry);
+pub fn cache_json_schemas(enums: JsonB, types: JsonB, puncs: JsonB) -> JsonB {
+    // 1 & 2. Build Registry, Families, and Wrap in Validator all in one shot
+    let new_validator = crate::validator::Validator::from_punc_definition(
+        Some(&enums.0),
+        Some(&types.0),
+        Some(&puncs.0),
+    );
     let new_arc = Arc::new(new_validator);

     // 3. ATOMIC SWAP
@@ -107,11 +45,12 @@ fn cache_json_schemas(enums: JsonB, types: JsonB, puncs: JsonB) -> JsonB {
         *lock = Some(new_arc);
     }

-    JsonB(json!({ "response": "success" }))
+    let drop = crate::drop::Drop::success();
+    JsonB(serde_json::to_value(drop).unwrap())
 }
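`cache_json_schemas` relies on the snapshot-and-swap pattern above: a refresh builds the whole validator off to the side and atomically replaces the `Option<Arc<…>>` under a write lock, while readers clone the `Arc` out and drop the guard immediately. A minimal stand-alone sketch, using a `Vec<String>` in place of the real `Validator`:

```rust
use std::sync::{Arc, RwLock};

fn main() {
    // Stand-in for GLOBAL_VALIDATOR: empty until the first cache call
    let cache: RwLock<Option<Arc<Vec<String>>>> = RwLock::new(None);

    // Writer: build the new state fully, then swap it in atomically
    {
        let mut lock = cache.write().unwrap();
        *lock = Some(Arc::new(vec!["entity".to_string()]));
    }

    // Reader: take a cheap Arc snapshot; the read guard is released
    // at the end of this statement, so validation never holds the lock
    let snapshot = cache.read().unwrap().as_ref().map(Arc::clone);

    assert!(snapshot.is_some());
    assert_eq!(snapshot.unwrap()[0], "entity");
    println!("ok");
}
```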
#[pg_extern(strict, parallel_safe)]
|
||||
fn mask_json_schema(schema_id: &str, instance: JsonB) -> JsonB {
|
||||
pub fn mask_json_schema(schema_id: &str, instance: JsonB) -> JsonB {
|
||||
// 1. Acquire Snapshot
|
||||
let validator_arc = {
|
||||
let lock = GLOBAL_VALIDATOR.read().unwrap();
|
||||
@ -135,7 +74,6 @@ fn mask_json_schema(schema_id: &str, instance: JsonB) -> JsonB {
|
||||
.errors
|
||||
.into_iter()
|
||||
.map(|e| crate::drop::Error {
|
||||
punc: None,
|
||||
code: e.code,
|
||||
message: e.message,
|
||||
details: crate::drop::ErrorDetails { path: e.path },
|
||||
@ -148,7 +86,6 @@ fn mask_json_schema(schema_id: &str, instance: JsonB) -> JsonB {
|
||||
Err(e) => {
|
||||
// Schema Not Found or other fatal error
|
||||
let error = crate::drop::Error {
|
||||
punc: None,
|
||||
code: e.code,
|
||||
message: e.message,
|
||||
details: crate::drop::ErrorDetails { path: e.path },
|
||||
@ -158,19 +95,20 @@ fn mask_json_schema(schema_id: &str, instance: JsonB) -> JsonB {
|
||||
}
|
||||
}
|
||||
} else {
|
||||
JsonB(json!({
|
||||
"punc": null,
|
||||
"errors": [{
|
||||
"code": "VALIDATOR_NOT_INITIALIZED",
|
||||
"message": "JSON Schemas have not been cached yet. Run cache_json_schemas()",
|
||||
"details": { "path": "" }
|
||||
}]
|
||||
}))
|
||||
let error = crate::drop::Error {
|
||||
code: "VALIDATOR_NOT_INITIALIZED".to_string(),
|
||||
message: "JSON Schemas have not been cached yet. Run cache_json_schemas()".to_string(),
|
||||
details: crate::drop::ErrorDetails {
|
||||
path: "".to_string(),
|
||||
},
|
||||
};
|
||||
let drop = crate::drop::Drop::with_errors(vec![error]);
|
||||
JsonB(serde_json::to_value(drop).unwrap())
|
||||
}
|
||||
}
|
||||
|
||||
#[pg_extern(strict, parallel_safe)]
|
||||
fn validate_json_schema(schema_id: &str, instance: JsonB) -> JsonB {
|
||||
pub fn validate_json_schema(schema_id: &str, instance: JsonB) -> JsonB {
    // 1. Acquire Snapshot
    let validator_arc = {
        let lock = GLOBAL_VALIDATOR.read().unwrap();
@@ -189,7 +127,6 @@ fn validate_json_schema(schema_id: &str, instance: JsonB) -> JsonB {
            .errors
            .into_iter()
            .map(|e| crate::drop::Error {
                punc: None,
                code: e.code,
                message: e.message,
                details: crate::drop::ErrorDetails { path: e.path },
@@ -201,7 +138,6 @@ fn validate_json_schema(schema_id: &str, instance: JsonB) -> JsonB {
            }
            Err(e) => {
                let error = crate::drop::Error {
                    punc: None,
                    code: e.code,
                    message: e.message,
                    details: crate::drop::ErrorDetails { path: e.path },
@@ -211,19 +147,20 @@ fn validate_json_schema(schema_id: &str, instance: JsonB) -> JsonB {
            }
        }
    } else {
        JsonB(json!({
            "punc": null,
            "errors": [{
                "code": "VALIDATOR_NOT_INITIALIZED",
                "message": "JSON Schemas have not been cached yet. Run cache_json_schemas()",
                "details": { "path": "" }
            }]
        }))
        let error = crate::drop::Error {
            code: "VALIDATOR_NOT_INITIALIZED".to_string(),
            message: "JSON Schemas have not been cached yet. Run cache_json_schemas()".to_string(),
            details: crate::drop::ErrorDetails {
                path: "".to_string(),
            },
        };
        let drop = crate::drop::Drop::with_errors(vec![error]);
        JsonB(serde_json::to_value(drop).unwrap())
    }
}

#[pg_extern(strict, parallel_safe)]
fn json_schema_cached(schema_id: &str) -> bool {
pub fn json_schema_cached(schema_id: &str) -> bool {
    if let Some(validator) = GLOBAL_VALIDATOR.read().unwrap().as_ref() {
        match validator.validate(schema_id, &serde_json::Value::Null) {
            Err(e) if e.code == "SCHEMA_NOT_FOUND" => false,
@@ -235,18 +172,23 @@ fn json_schema_cached(schema_id: &str) -> bool {
}

#[pg_extern(strict)]
fn clear_json_schemas() -> JsonB {
pub fn clear_json_schemas() -> JsonB {
    let mut lock = GLOBAL_VALIDATOR.write().unwrap();
    *lock = None;
    JsonB(json!({ "response": "success" }))
    let drop = crate::drop::Drop::success();
    JsonB(serde_json::to_value(drop).unwrap())
}

#[pg_extern(strict, parallel_safe)]
fn show_json_schemas() -> JsonB {
    if let Some(_validator) = GLOBAL_VALIDATOR.read().unwrap().as_ref() {
        JsonB(json!({ "response": "success", "status": "active" }))
pub fn show_json_schemas() -> JsonB {
    if let Some(validator) = GLOBAL_VALIDATOR.read().unwrap().as_ref() {
        let mut keys = validator.get_schema_ids();
        keys.sort();
        let drop = crate::drop::Drop::success_with_val(json!(keys));
        JsonB(serde_json::to_value(drop).unwrap())
    } else {
        JsonB(json!({ "response": "success", "status": "empty" }))
        let drop = crate::drop::Drop::success_with_val(json!([]));
        JsonB(serde_json::to_value(drop).unwrap())
    }
}

@@ -254,7 +196,7 @@ fn show_json_schemas() -> JsonB {
#[pg_schema]
mod tests {
    use pgrx::prelude::*;
    include!("tests.rs");
    include!("tests/fixtures.rs");
}

#[cfg(test)]
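The hunks above replace the ad-hoc `json!({ "response": ... })` bodies with a shared `crate::drop::Drop` envelope. A minimal std-only sketch of the assumed wire shape follows — the `{"type":"drop", ...}` layout is inferred from the expectations in `tests/lib.rs`, the `Error`/`Drop` names are simplified stand-ins for the real structs (the `punc` field is omitted), and serialization is hand-rolled here only to avoid a serde dependency:

```rust
// Sketch of the response envelope the diff converges on (assumed shape):
//   success -> {"type":"drop","response":...}
//   failure -> {"type":"drop","errors":[...]}

struct Error {
    code: String,
    message: String,
    path: String,
}

enum Drop {
    Success(String), // pre-serialized "response" value
    Errors(Vec<Error>),
}

impl Drop {
    fn to_json(&self) -> String {
        match self {
            Drop::Success(val) => format!(r#"{{"type":"drop","response":{}}}"#, val),
            Drop::Errors(errs) => {
                let items: Vec<String> = errs
                    .iter()
                    .map(|e| {
                        format!(
                            r#"{{"code":"{}","message":"{}","details":{{"path":"{}"}}}}"#,
                            e.code, e.message, e.path
                        )
                    })
                    .collect();
                format!(r#"{{"type":"drop","errors":[{}]}}"#, items.join(","))
            }
        }
    }
}

fn main() {
    let ok = Drop::Success(r#""success""#.to_string());
    assert_eq!(ok.to_json(), r#"{"type":"drop","response":"success"}"#);

    let err = Drop::Errors(vec![Error {
        code: "VALIDATOR_NOT_INITIALIZED".to_string(),
        message: "schemas not cached".to_string(),
        path: "".to_string(),
    }]);
    // Every failure path now emits the same errors array instead of a
    // one-off json! literal per function.
    assert!(err.to_json().starts_with(r#"{"type":"drop","errors":["#));
}
```

The payoff of the refactor is that `validate_json_schema`, `clear_json_schemas`, and `show_json_schemas` all funnel through one serializer, so the SQL-facing contract cannot drift per function.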
27
src/result.rs
Normal file
@@ -0,0 +1,27 @@
use crate::error::ValidationError;
use std::collections::HashSet;

#[derive(Debug, Default, Clone, serde::Serialize)]
pub struct ValidationResult {
    pub errors: Vec<ValidationError>,
    #[serde(skip)]
    pub evaluated_keys: HashSet<String>,
    #[serde(skip)]
    pub evaluated_indices: HashSet<usize>,
}

impl ValidationResult {
    pub fn new() -> Self {
        Self::default()
    }

    pub fn merge(&mut self, other: ValidationResult) {
        self.errors.extend(other.errors);
        self.evaluated_keys.extend(other.evaluated_keys);
        self.evaluated_indices.extend(other.evaluated_indices);
    }

    pub fn is_valid(&self) -> bool {
        self.errors.is_empty()
    }
}
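`ValidationResult::merge` unions the error list and the evaluated-key/index sets, presumably so that combinator keywords can see which properties and array items sibling subschemas already covered. A self-contained sketch of those merge semantics (errors simplified to strings; the real type stores `ValidationError` and skips the sets during serialization):

```rust
use std::collections::HashSet;

#[derive(Debug, Default)]
struct ValidationResult {
    errors: Vec<String>, // simplified: the real struct holds ValidationError
    evaluated_keys: HashSet<String>,
    evaluated_indices: HashSet<usize>,
}

impl ValidationResult {
    // Folding a child result into the parent: errors accumulate,
    // evaluated sets union.
    fn merge(&mut self, other: ValidationResult) {
        self.errors.extend(other.errors);
        self.evaluated_keys.extend(other.evaluated_keys);
        self.evaluated_indices.extend(other.evaluated_indices);
    }

    fn is_valid(&self) -> bool {
        self.errors.is_empty()
    }
}

fn main() {
    let mut parent = ValidationResult::default();
    parent.evaluated_keys.insert("foo".to_string());

    let mut child = ValidationResult::default();
    child.evaluated_keys.insert("bar".to_string());
    child.errors.push("TYPE_MISMATCH at /bar".to_string());

    parent.merge(child);
    assert!(!parent.is_valid()); // child's error propagates
    assert_eq!(parent.evaluated_keys.len(), 2); // key sets are unioned
}
```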
1008
src/rules.rs
Normal file
File diff suppressed because it is too large
@@ -33,6 +33,11 @@ pub struct SchemaObject {
    pub properties: Option<BTreeMap<String, Arc<Schema>>>,
    #[serde(rename = "patternProperties")]
    pub pattern_properties: Option<BTreeMap<String, Arc<Schema>>>,
    #[serde(rename = "additionalProperties")]
    pub additional_properties: Option<Arc<Schema>>,
    #[serde(rename = "$family")]
    pub family: Option<String>,

    pub required: Option<Vec<String>>,

    // dependencies can be schema dependencies or property dependencies
@@ -155,6 +155,24 @@ fn test_puncs_7() {
    crate::util::run_test_file_at_index(&path, 7).unwrap();
}

#[pg_test]
fn test_additional_properties_0() {
    let path = format!("{}/tests/fixtures/additionalProperties.json", env!("CARGO_MANIFEST_DIR"));
    crate::util::run_test_file_at_index(&path, 0).unwrap();
}

#[pg_test]
fn test_additional_properties_1() {
    let path = format!("{}/tests/fixtures/additionalProperties.json", env!("CARGO_MANIFEST_DIR"));
    crate::util::run_test_file_at_index(&path, 1).unwrap();
}

#[pg_test]
fn test_additional_properties_2() {
    let path = format!("{}/tests/fixtures/additionalProperties.json", env!("CARGO_MANIFEST_DIR"));
    crate::util::run_test_file_at_index(&path, 2).unwrap();
}

#[pg_test]
fn test_exclusive_minimum_0() {
    let path = format!("{}/tests/fixtures/exclusiveMinimum.json", env!("CARGO_MANIFEST_DIR"));
164
src/util.rs
@@ -50,74 +50,12 @@ pub fn run_test_file_at_index(path: &str, index: usize) -> Result<(), String> {
    let group = &suite[index];
    let mut failures = Vec::<String>::new();

    // Create Local Registry for this test group
    let mut registry = crate::registry::Registry::new();

    // Helper to register items with 'schemas'
    let register_schemas = |registry: &mut crate::registry::Registry, items_val: Option<&Value>| {
        if let Some(val) = items_val {
            if let Value::Array(arr) = val {
                for item in arr {
                    if let Some(schemas_val) = item.get("schemas") {
                        if let Value::Array(schemas) = schemas_val {
                            for schema_val in schemas {
                                if let Ok(schema) =
                                    serde_json::from_value::<crate::schema::Schema>(schema_val.clone())
                                {
                                    registry.add(schema);
                                }
                            }
                        }
                    }
                }
            }
        }
    };

    // 1. Register Family Schemas if 'types' is present
    if let Some(types_val) = &group.types {
        if let Value::Array(arr) = types_val {
            let mut family_map: std::collections::HashMap<String, std::collections::HashSet<String>> =
                std::collections::HashMap::new();

            for item in arr {
                if let Some(name) = item.get("name").and_then(|v| v.as_str()) {
                    if let Some(hierarchy) = item.get("hierarchy").and_then(|v| v.as_array()) {
                        for ancestor in hierarchy {
                            if let Some(anc_str) = ancestor.as_str() {
                                family_map
                                    .entry(anc_str.to_string())
                                    .or_default()
                                    .insert(name.to_string());
                            }
                        }
                    }
                }
            }

            for (family_name, members) in family_map {
                let id = format!("{}.family", family_name);
                let object_refs: Vec<Value> = members
                    .iter()
                    .map(|s| serde_json::json!({ "$ref": s }))
                    .collect();

                let schema_json = serde_json::json!({
                    "$id": id,
                    "oneOf": object_refs
                });

                if let Ok(schema) = serde_json::from_value::<crate::schema::Schema>(schema_json) {
                    registry.add(schema);
                }
            }
        }
    }

    // 2. Register items directly
    register_schemas(&mut registry, group.enums.as_ref());
    register_schemas(&mut registry, group.types.as_ref());
    register_schemas(&mut registry, group.puncs.as_ref());
    // Create Validator Instance and parse enums, types, and puncs automatically
    let mut validator = Validator::from_punc_definition(
        group.enums.as_ref(),
        group.types.as_ref(),
        group.puncs.as_ref(),
    );

    // 3. Register root 'schemas' if present (generic test support)
    // Some tests use a raw 'schema' or 'schemas' field at the group level
@@ -126,12 +64,12 @@ pub fn run_test_file_at_index(path: &str, index: usize) -> Result<(), String> {
        Ok(mut schema) => {
            let id_clone = schema.obj.id.clone();
            if id_clone.is_some() {
                registry.add(schema);
                validator.registry.add(schema);
            } else {
                // Fallback ID if none provided in schema
                let id = format!("test:{}:{}", path, index);
                schema.obj.id = Some(id);
                registry.add(schema);
                validator.registry.add(schema);
            }
        }
        Err(e) => {
@@ -143,9 +81,6 @@ pub fn run_test_file_at_index(path: &str, index: usize) -> Result<(), String> {
        }
    }

    // Create Validator Instance (Takes ownership of registry)
    let validator = Validator::new(registry);

    // 4. Run Tests
    for (_test_index, test) in group.tests.iter().enumerate() {
        let mut schema_id = test.schema_id.clone();
@@ -251,79 +186,13 @@ pub fn run_test_file(path: &str) -> Result<(), String> {

    let mut failures = Vec::<String>::new();
    for (group_index, group) in suite.into_iter().enumerate() {
        // Create Isolated Registry for this test group
        let mut registry = crate::registry::Registry::new();
        // Create Validator Instance and parse enums, types, and puncs automatically
        let mut validator = Validator::from_punc_definition(
            group.enums.as_ref(),
            group.types.as_ref(),
            group.puncs.as_ref(),
        );

        // Helper to register items with 'schemas'
        let register_schemas = |registry: &mut crate::registry::Registry, items_val: Option<Value>| {
            if let Some(val) = items_val {
                if let Value::Array(arr) = val {
                    for item in arr {
                        if let Some(schemas_val) = item.get("schemas") {
                            if let Value::Array(schemas) = schemas_val {
                                for schema_val in schemas {
                                    if let Ok(schema) =
                                        serde_json::from_value::<crate::schema::Schema>(schema_val.clone())
                                    {
                                        registry.add(schema);
                                    }
                                }
                            }
                        }
                    }
                }
            }
        };

        // 1. Register Family Schemas if 'types' is present
        if let Some(types_val) = &group.types {
            if let Value::Array(arr) = types_val {
                let mut family_map: std::collections::HashMap<String, std::collections::HashSet<String>> =
                    std::collections::HashMap::new();

                for item in arr {
                    if let Some(name) = item.get("name").and_then(|v| v.as_str()) {
                        // Default hierarchy contains self if not specified?
                        // Usually hierarchy is explicit in these tests.
                        if let Some(hierarchy) = item.get("hierarchy").and_then(|v| v.as_array()) {
                            for ancestor in hierarchy {
                                if let Some(anc_str) = ancestor.as_str() {
                                    family_map
                                        .entry(anc_str.to_string())
                                        .or_default()
                                        .insert(name.to_string());
                                }
                            }
                        }
                    }
                }

                for (family_name, members) in family_map {
                    let id = format!("{}.family", family_name);
                    let object_refs: Vec<Value> = members
                        .into_iter()
                        .map(|s| serde_json::json!({ "$ref": s }))
                        .collect();

                    let schema_json = serde_json::json!({
                        "$id": id,
                        "oneOf": object_refs
                    });

                    if let Ok(schema) = serde_json::from_value::<crate::schema::Schema>(schema_json) {
                        registry.add(schema);
                    }
                }
            }
        }

        // Register 'types', 'enums', and 'puncs' if present (JSPG style)
        register_schemas(&mut registry, group.types);
        register_schemas(&mut registry, group.enums);
        register_schemas(&mut registry, group.puncs);

        // Register main 'schema' if present (Standard style)
        // Ensure ID is a valid URI to avoid Url::parse errors in Compiler
        let unique_id = format!("test:{}:{}", path, group_index);

        // Register main 'schema' if present (Standard style)
@@ -336,12 +205,9 @@ pub fn run_test_file(path: &str) -> Result<(), String> {
            if schema.obj.id.is_none() {
                schema.obj.id = Some(unique_id.clone());
            }
            registry.add(schema);
            validator.registry.add(schema);
        }

        // Create Instance (Takes Ownership)
        let validator = Validator::new(registry);

        for test in group.tests {
            // Use explicit schema_id from test, or default to unique_id
            let schema_id = test.schema_id.as_deref().unwrap_or(&unique_id).to_string();
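Both test drivers (before this refactor folded the logic into `Validator::from_punc_definition`) synthesize an implicit `<name>.family` schema as a `oneOf` over every type that lists that name in its `hierarchy`. A std-only sketch of just that grouping step — `build_family_map` is an illustrative helper name, not an actual function in the crate, and the `Registry`/`Schema` machinery is elided:

```rust
use std::collections::{HashMap, HashSet};

// (name, hierarchy) pairs as they appear in a fixture's `types` array.
// For each ancestor in a type's hierarchy, record the type as a member
// of that ancestor's family.
fn build_family_map(types: &[(&str, &[&str])]) -> HashMap<String, HashSet<String>> {
    let mut family_map: HashMap<String, HashSet<String>> = HashMap::new();
    for (name, hierarchy) in types {
        for ancestor in *hierarchy {
            family_map
                .entry(ancestor.to_string())
                .or_default()
                .insert(name.to_string());
        }
    }
    family_map
}

fn main() {
    let types: [(&str, &[&str]); 2] = [
        ("nonprofit", &["organization"]),
        ("company", &["organization"]),
    ];
    let map = build_family_map(&types);
    // Both concrete types land under the shared ancestor; the driver would
    // then register "organization.family" as a oneOf over $refs to each.
    assert_eq!(map["organization"].len(), 2);
}
```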
1339
src/validator.rs
File diff suppressed because it is too large
@@ -156,6 +156,24 @@ fn test_puncs_7() {
    util::run_test_file_at_index(&path, 7).unwrap();
}

#[test]
fn test_additional_properties_0() {
    let path = format!("{}/tests/fixtures/additionalProperties.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 0).unwrap();
}

#[test]
fn test_additional_properties_1() {
    let path = format!("{}/tests/fixtures/additionalProperties.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 1).unwrap();
}

#[test]
fn test_additional_properties_2() {
    let path = format!("{}/tests/fixtures/additionalProperties.json", env!("CARGO_MANIFEST_DIR"));
    util::run_test_file_at_index(&path, 2).unwrap();
}

#[test]
fn test_exclusive_minimum_0() {
    let path = format!("{}/tests/fixtures/exclusiveMinimum.json", env!("CARGO_MANIFEST_DIR"));
132
tests/fixtures/additionalProperties.json
vendored
Normal file
@@ -0,0 +1,132 @@
[
    {
        "description": "additionalProperties validates properties not matched by properties",
        "schema": {
            "$schema": "https://json-schema.org/draft/2020-12/schema",
            "properties": {
                "foo": { "type": "string" },
                "bar": { "type": "number" }
            },
            "additionalProperties": { "type": "boolean" }
        },
        "tests": [
            {
                "description": "defined properties are valid",
                "data": { "foo": "value", "bar": 123 },
                "valid": true
            },
            {
                "description": "additional property matching schema is valid",
                "data": { "foo": "value", "is_active": true, "hidden": false },
                "valid": true
            },
            {
                "description": "additional property not matching schema is invalid",
                "data": { "foo": "value", "is_active": 1 },
                "valid": false
            }
        ]
    },
    {
        "description": "extensible: true with additionalProperties still validates structure",
        "schema": {
            "$schema": "https://json-schema.org/draft/2020-12/schema",
            "properties": {
                "foo": { "type": "string" }
            },
            "extensible": true,
            "additionalProperties": { "type": "integer" }
        },
        "tests": [
            {
                "description": "additional property matching schema is valid",
                "data": { "foo": "hello", "count": 5, "age": 42 },
                "valid": true
            },
            {
                "description": "additional property not matching schema is invalid despite extensible: true",
                "data": { "foo": "hello", "count": "five" },
                "valid": false
            }
        ]
    },
    {
        "description": "complex additionalProperties with object and array items",
        "schema": {
            "$schema": "https://json-schema.org/draft/2020-12/schema",
            "properties": {
                "type": { "type": "string" }
            },
            "additionalProperties": {
                "type": "array",
                "items": { "type": "string" }
            }
        },
        "tests": [
            {
                "description": "valid array of strings",
                "data": {
                    "type": "my_type",
                    "group_a": ["field1", "field2"],
                    "group_b": ["field3"]
                },
                "valid": true
            },
            {
                "description": "invalid array of integers",
                "data": { "type": "my_type", "group_a": [1, 2] },
                "valid": false
            },
            {
                "description": "invalid non-array type",
                "data": { "type": "my_type", "group_a": "field1" },
                "valid": false
            }
        ]
    }
]
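The fixtures above exercise the core `additionalProperties` rule: any instance key not matched by `properties` must validate against the `additionalProperties` subschema. A std-only sketch of that routing step, with type-checking reduced to a single kind comparison — `Kind` and `check` are illustrative names, not the extension's actual API:

```rust
use std::collections::BTreeMap;

#[derive(Clone, Copy, PartialEq, Debug)]
enum Kind { Str, Num, Bool }

// Keys named in `properties` must match their declared kind; every other
// key is routed to the `additionalProperties` kind instead.
fn check(
    properties: &BTreeMap<&str, Kind>,
    additional: Kind,
    instance: &BTreeMap<&str, Kind>,
) -> Vec<String> {
    let mut errors = Vec::new();
    for (key, kind) in instance {
        let expected = properties.get(key).copied().unwrap_or(additional);
        if *kind != expected {
            errors.push(format!("/{}: expected {:?}, got {:?}", key, expected, kind));
        }
    }
    errors
}

fn main() {
    // Mirrors the first fixture: foo -> string, bar -> number,
    // additionalProperties -> boolean.
    let props = BTreeMap::from([("foo", Kind::Str), ("bar", Kind::Num)]);

    let ok = BTreeMap::from([("foo", Kind::Str), ("is_active", Kind::Bool)]);
    assert!(check(&props, Kind::Bool, &ok).is_empty());

    let bad = BTreeMap::from([("foo", Kind::Str), ("is_active", Kind::Num)]);
    assert_eq!(check(&props, Kind::Bool, &bad).len(), 1);
}
```

The second fixture group shows the same routing is applied even when `extensible: true` is set, so "extensible" widens which keys are allowed, not which values they may hold.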
34
tests/fixtures/puncs.json
vendored
@@ -1067,7 +1067,7 @@
"schemas": [
|
||||
{
|
||||
"$id": "polymorphic_org_punc.request",
|
||||
"$ref": "organization.family"
|
||||
"$family": "organization"
|
||||
}
|
||||
]
|
||||
},
|
||||
@ -1080,6 +1080,21 @@
|
||||
"$ref": "organization"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"name": "invalid_family_punc",
|
||||
"public": false,
|
||||
"schemas": [
|
||||
{
|
||||
"$id": "invalid_family_punc.request",
|
||||
"$family": "organization",
|
||||
"properties": {
|
||||
"extra": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
}
|
||||
]
|
||||
}
|
||||
],
|
||||
"tests": [
|
||||
@ -1240,6 +1255,23 @@
|
||||
"path": "/first_name"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"description": "invalid schema due to family exclusivity violation",
|
||||
"schema_id": "invalid_family_punc.request",
|
||||
"data": {
|
||||
"id": "org-2",
|
||||
"type": "organization",
|
||||
"name": "Strict Corp",
|
||||
"extra": "value"
|
||||
},
|
||||
"valid": false,
|
||||
"expect_errors": [
|
||||
{
|
||||
"code": "INVALID_SCHEMA",
|
||||
"path": ""
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
113
tests/lib.rs
Normal file
@@ -0,0 +1,113 @@
use jspg::*;
use pgrx::JsonB;
use serde_json::json;

#[test]
fn test_library_api() {
    // 1. Initially, schemas are not cached.
    assert!(!json_schema_cached("test_schema"));

    // Expected uninitialized drop format: errors + null response
    let uninitialized_drop = validate_json_schema("test_schema", JsonB(json!({})));
    assert_eq!(
        uninitialized_drop.0,
        json!({
            "type": "drop",
            "errors": [{
                "code": "VALIDATOR_NOT_INITIALIZED",
                "message": "JSON Schemas have not been cached yet. Run cache_json_schemas()",
                "details": { "path": "" }
            }]
        })
    );

    // 2. Cache schemas
    let puncs = json!([]);
    let types = json!([{
        "schemas": [{
            "$id": "test_schema",
            "type": "object",
            "properties": {
                "name": { "type": "string" }
            },
            "required": ["name"]
        }]
    }]);
    let enums = json!([]);

    let cache_drop = cache_json_schemas(JsonB(enums), JsonB(types), JsonB(puncs));
    assert_eq!(
        cache_drop.0,
        json!({
            "type": "drop",
            "response": "success"
        })
    );

    // 3. Check schemas are cached
    assert!(json_schema_cached("test_schema"));

    let show_drop = show_json_schemas();
    assert_eq!(
        show_drop.0,
        json!({
            "type": "drop",
            "response": ["test_schema"]
        })
    );

    // 4. Validate Happy Path
    let happy_drop = validate_json_schema("test_schema", JsonB(json!({"name": "Neo"})));
    assert_eq!(
        happy_drop.0,
        json!({
            "type": "drop",
            "response": "success"
        })
    );

    // 5. Validate Unhappy Path
    let unhappy_drop = validate_json_schema("test_schema", JsonB(json!({"wrong": "data"})));
    assert_eq!(
        unhappy_drop.0,
        json!({
            "type": "drop",
            "errors": [
                {
                    "code": "REQUIRED_FIELD_MISSING",
                    "message": "Missing name",
                    "details": { "path": "/name" }
                },
                {
                    "code": "STRICT_PROPERTY_VIOLATION",
                    "message": "Unexpected property 'wrong'",
                    "details": { "path": "/wrong" }
                }
            ]
        })
    );

    // 6. Mask Happy Path
    let mask_drop = mask_json_schema(
        "test_schema",
        JsonB(json!({"name": "Neo", "extra": "data"})),
    );
    assert_eq!(
        mask_drop.0,
        json!({
            "type": "drop",
            "response": {"name": "Neo"}
        })
    );

    // 7. Clear Schemas
    let clear_drop = clear_json_schemas();
    assert_eq!(
        clear_drop.0,
        json!({
            "type": "drop",
            "response": "success"
        })
    );
    assert!(!json_schema_cached("test_schema"));
}