jspg additional properties bug squashed

This commit is contained in:
2025-09-30 19:56:34 -04:00
parent cc04f38c14
commit d6b34c99bb
26 changed files with 6340 additions and 6328 deletions

2
Cargo.lock generated
View File

@ -243,6 +243,8 @@ dependencies = [
"idna", "idna",
"once_cell", "once_cell",
"percent-encoding", "percent-encoding",
"pgrx",
"pgrx-tests",
"regex", "regex",
"regex-syntax", "regex-syntax",
"rustls", "rustls",

View File

@ -15,19 +15,11 @@ It works by:
 The version of `boon` located in the `validator/` directory has been modified to address specific requirements of the `jspg` project. The key deviations from the upstream `boon` crate are as follows:

-### 1. Correct Unevaluated Property Propagation in `$ref`
-- **Problem:** In the original `boon` implementation, if a schema validation failed inside a `$ref`, the set of properties that had been evaluated by that referenced schema was not correctly propagated back up to the parent validator. This caused the parent to incorrectly flag already-evaluated properties as "unevaluated," leading to spurious `unevaluatedProperties` errors.
-- **Solution:** The `Uneval::merge` function in `validator/src/validator.rs` was modified. The original logic, which performed an *intersection* of unevaluated properties (`retain`), was replaced with a direct *assignment*. Now, the parent validator's set of unevaluated properties is completely replaced by the final set from the child validator. This ensures that the most current state of evaluated properties is always passed up the chain, regardless of validation success or failure within the `$ref`.
-
-### 2. Runtime Strictness Control
-- **Problem:** The `jspg` project requires that certain schemas (e.g., those for public `puncs`) enforce a strict "no extra properties" policy, while others do not. This strictness needs to cascade through the entire validation hierarchy, including all `$ref` chains. A compile-time flag was unsuitable because it would incorrectly apply strictness to shared, reusable schemas.
-- **Solution:** A runtime validation option was implemented.
-  1. A `ValidationOptions { be_strict: bool }` struct was added and is passed to the core `validate` function in `validator.rs`.
-  2. The `jspg` code determines whether a validation run should be strict (based on the `punc`'s `public` flag or if we are validating a global `type`) and passes the appropriate option.
-  3. The `Validator` struct carries these options through the entire recursive validation process.
-  4. The `uneval_validate` function was modified to only enforce this strict check if `options.be_strict` is `true` **and** it is at the root of the validation scope (`self.scope.parent.is_none()`). This ensures the check only happens at the very end of a top-level validation, after all `$ref`s and sub-schemas have been processed.
-  5. When this runtime strictness check fails, it now generates a more descriptive `ADDITIONAL_PROPERTIES_NOT_ALLOWED` error, rather than a generic `FALSE_SCHEMA` error.
+### 1. Recursive Runtime Strictness Control
+- **Problem:** The `jspg` project requires that certain schemas enforce a strict "no extra properties" policy (specifically, schemas for public `puncs` and global `type`s). This strictness needs to cascade through the entire validation hierarchy, including all nested objects and `$ref` chains. A compile-time flag was unsuitable because it would incorrectly apply strictness to shared, reusable schemas.
+- **Solution:** A runtime validation option was implemented to enforce strictness recursively.
+  1. A `ValidationOptions { be_strict: bool }` struct was added. The `jspg` code in `src/lib.rs` determines whether a validation run should be strict (based on the `punc`'s `public` flag or if validating a global `type`) and passes the appropriate option to the validator.
+  2. The `be_strict` option is propagated through the entire recursive validation process. A bug was fixed in `_validate_self` (which handles `$ref`s) to ensure that the sub-validator is always initialized to track unevaluated properties when `be_strict` is enabled. Previously, tracking was only initiated if the parent was already tracking unevaluated properties, causing strictness to be dropped across certain `$ref` boundaries.
+  3. At any time, if `unevaluatedProperties` or `additionalProperties` is found in the schema, it overrides the strict (or non-strict) behavior at that level.

44
out.txt Normal file
View File

@ -0,0 +1,44 @@
running 23 tests
 Building extension with features pg_test pg17
 Running command "/opt/homebrew/bin/cargo" "build" "--lib" "--features" "pg_test pg17" "--message-format=json-render-diagnostics"
 Installing extension
 Copying control file to /opt/homebrew/share/postgresql@17/extension/jspg.control
 Copying shared library to /opt/homebrew/lib/postgresql@17/jspg.dylib
 Finished installing jspg
test tests::pg_test_cache_invalid ... ok
test tests::pg_test_validate_nested_req_deps ... ok
test tests::pg_test_validate_format_empty_string_with_ref ... ok
test tests::pg_test_validate_format_normal ... ok
test tests::pg_test_validate_format_empty_string ... ok
test tests::pg_test_validate_dependencies ... ok
test tests::pg_test_validate_dependencies_merging ... ok
test tests::pg_test_validate_additional_properties ... ok
test tests::pg_test_validate_enum_schema ... ok
test tests::pg_test_validate_errors ... ok
test tests::pg_test_validate_not_cached ... ok
test tests::pg_test_validate_oneof ... ok
test tests::pg_test_validate_punc_with_refs ... ok
test tests::pg_test_validate_property_merging ... ok
test tests::pg_test_validate_punc_local_refs ... ok
test tests::pg_test_validate_required_merging ... ok
test tests::pg_test_validate_required ... ok
test tests::pg_test_validate_simple ... ok
test tests::pg_test_validate_root_types ... ok
test tests::pg_test_validate_strict ... ok
test tests::pg_test_validate_title_override ... ok
test tests::pg_test_validate_unevaluated_properties ... ok
test tests::pg_test_validate_type_matching ... ok
test result: ok. 23 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 7.66s
running 0 tests
test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s
running 0 tests
test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s

1808
src/lib.rs

File diff suppressed because it is too large

View File

@ -432,7 +432,8 @@ pub fn property_merging_schemas() -> JsonB {
 "type": "object",
 "properties": {
 "id": { "type": "string" },
-"name": { "type": "string" }
+"name": { "type": "string" },
+"type": { "type": "string" }
 },
 "required": ["id"]
 }]
@ -744,7 +745,8 @@ pub fn title_override_schemas() -> JsonB {
 "type": "object",
 "title": "Base Title",
 "properties": {
-"name": { "type": "string" }
+"name": { "type": "string" },
+"type": { "type": "string" }
 },
 "required": ["name"]
 }]

View File

@ -169,7 +169,7 @@ fn test_validate_strict() {
 let result_basic_invalid = validate_json_schema("basic_strict_test.request", jsonb(invalid_basic.clone()));
 assert_error_count(&result_basic_invalid, 1);
-assert_has_error(&result_basic_invalid, "FALSE_SCHEMA", "/extra");
+assert_has_error(&result_basic_invalid, "ADDITIONAL_PROPERTIES_NOT_ALLOWED", "/extra");

 // Test 2: Non-strict validation - extra properties should pass
 let result_non_strict = validate_json_schema("non_strict_test.request", jsonb(invalid_basic.clone()));
@ -190,8 +190,8 @@ fn test_validate_strict() {
 let result_nested_invalid = validate_json_schema("nested_strict_test.request", jsonb(invalid_nested));
 assert_error_count(&result_nested_invalid, 2);
-assert_has_error(&result_nested_invalid, "FALSE_SCHEMA", "/user/extra");
-assert_has_error(&result_nested_invalid, "FALSE_SCHEMA", "/items/0/extra");
+assert_has_error(&result_nested_invalid, "ADDITIONAL_PROPERTIES_NOT_ALLOWED", "/user/extra");
+assert_has_error(&result_nested_invalid, "ADDITIONAL_PROPERTIES_NOT_ALLOWED", "/items/0/extra");

 // Test 4: Schema with unevaluatedProperties already set - should allow extras
 let result_already_unevaluated = validate_json_schema("already_unevaluated_test.request", jsonb(invalid_basic.clone()));
@ -218,7 +218,7 @@ fn test_validate_strict() {
 let result_conditional_invalid = validate_json_schema("conditional_strict_test.request", jsonb(invalid_conditional));
 assert_error_count(&result_conditional_invalid, 1);
-assert_has_error(&result_conditional_invalid, "FALSE_SCHEMA", "/extra");
+assert_has_error(&result_conditional_invalid, "ADDITIONAL_PROPERTIES_NOT_ALLOWED", "/extra");
 }

 #[pg_test]
@ -412,17 +412,17 @@ fn test_validate_unevaluated_properties() {
 let result = validate_json_schema("simple_unevaluated_test.request", jsonb(instance_uneval));

-// Should get 3 separate FALSE_SCHEMA errors, one for each unevaluated property
+// Should get 3 separate ADDITIONAL_PROPERTIES_NOT_ALLOWED errors, one for each unevaluated property
 assert_error_count(&result, 3);

-// Verify all errors are FALSE_SCHEMA and check paths
-assert_has_error(&result, "FALSE_SCHEMA", "/extra1");
-assert_has_error(&result, "FALSE_SCHEMA", "/extra2");
-assert_has_error(&result, "FALSE_SCHEMA", "/extra3");
+// Verify all errors are ADDITIONAL_PROPERTIES_NOT_ALLOWED and check paths
+assert_has_error(&result, "ADDITIONAL_PROPERTIES_NOT_ALLOWED", "/extra1");
+assert_has_error(&result, "ADDITIONAL_PROPERTIES_NOT_ALLOWED", "/extra2");
+assert_has_error(&result, "ADDITIONAL_PROPERTIES_NOT_ALLOWED", "/extra3");

 // Verify error messages
-let extra1_error = find_error_with_code_and_path(&result, "FALSE_SCHEMA", "/extra1");
-assert_error_message_contains(extra1_error, "This schema always fails validation");
+let extra1_error = find_error_with_code_and_path(&result, "ADDITIONAL_PROPERTIES_NOT_ALLOWED", "/extra1");
+assert_error_message_contains(extra1_error, "Property 'extra1' is not allowed");

 // Test 2: Complex schema with allOf and unevaluatedProperties (already in comprehensive setup)
@ -437,10 +437,10 @@ fn test_validate_unevaluated_properties() {
 let complex_result = validate_json_schema("conditional_unevaluated_test.request", jsonb(complex_instance));

-// Should get 2 FALSE_SCHEMA errors for unevaluated properties
+// Should get 2 ADDITIONAL_PROPERTIES_NOT_ALLOWED errors for unevaluated properties
 assert_error_count(&complex_result, 2);
-assert_has_error(&complex_result, "FALSE_SCHEMA", "/nickname");
-assert_has_error(&complex_result, "FALSE_SCHEMA", "/title");
+assert_has_error(&complex_result, "ADDITIONAL_PROPERTIES_NOT_ALLOWED", "/nickname");
+assert_has_error(&complex_result, "ADDITIONAL_PROPERTIES_NOT_ALLOWED", "/title");

 // Test 3: Valid instance with all properties evaluated
 let valid_instance = json!({
@ -643,8 +643,8 @@ fn test_validate_punc_with_refs() {
 let result_public_root = validate_json_schema("public_ref_test.request", jsonb(public_root_extra));
 assert_error_count(&result_public_root, 2);
-assert_has_error(&result_public_root, "FALSE_SCHEMA", "/extra_field");
-assert_has_error(&result_public_root, "FALSE_SCHEMA", "/another_extra");
+assert_has_error(&result_public_root, "ADDITIONAL_PROPERTIES_NOT_ALLOWED", "/extra_field");
+assert_has_error(&result_public_root, "ADDITIONAL_PROPERTIES_NOT_ALLOWED", "/another_extra");

 // Test 2: Private punc allows extra properties at root level
 let private_root_extra = json!({
@ -678,24 +678,6 @@ fn test_validate_punc_with_refs() {
 let result_private_valid = validate_json_schema("private_ref_test.request", jsonb(valid_data_with_address));
 assert_success(&result_private_valid);

-// Test 4: Extra properties in nested address should fail for BOTH puncs (types are always strict)
-let address_with_extra = json!({
-    "type": "person",
-    "id": "550e8400-e29b-41d4-a716-446655440000",
-    "name": "John Doe",
-    "first_name": "John",
-    "last_name": "Doe",
-    "address": {
-        "street": "123 Main St",
-        "city": "Boston",
-        "country": "USA" // Should fail - extra property in address
-    }
-});
-let result_private_address = validate_json_schema("private_ref_test.request", jsonb(address_with_extra));
-assert_error_count(&result_private_address, 1);
-assert_has_error(&result_private_address, "FALSE_SCHEMA", "/address/country");
 }

 #[pg_test]

View File

@ -12,6 +12,7 @@ categories = ["web-programming"]
 exclude = [ "tests", ".github", ".gitmodules" ]

 [dependencies]
+pgrx = "0.15.0"
 serde = "1"
 serde_json = "1"
 regex = "1.10.3"
@ -26,6 +27,7 @@ ahash = "0.8.3"
 appendlist = "1.4"

 [dev-dependencies]
+pgrx-tests = "0.15.0"
 serde = { version = "1.0", features = ["derive"] }
 serde_yaml = "0.9"
 ureq = "2.12"

File diff suppressed because it is too large

View File

@ -10,28 +10,28 @@ use serde_json::Value;
/// Defines Decoder for `contentEncoding`.
#[derive(Clone, Copy)]
pub struct Decoder {
    /// Name of the encoding
    pub name: &'static str,

    /// Decodes given string to bytes
    #[allow(clippy::type_complexity)]
    pub func: fn(s: &str) -> Result<Vec<u8>, Box<dyn Error>>,
}

pub(crate) static DECODERS: Lazy<HashMap<&'static str, Decoder>> = Lazy::new(|| {
    let mut m = HashMap::<&'static str, Decoder>::new();
    m.insert(
        "base64",
        Decoder {
            name: "base64",
            func: decode_base64,
        },
    );
    m
});

fn decode_base64(s: &str) -> Result<Vec<u8>, Box<dyn Error>> {
    Ok(base64::engine::general_purpose::STANDARD.decode(s)?)
}

// mediatypes --

@ -39,44 +39,44 @@ fn decode_base64(s: &str) -> Result<Vec<u8>, Box<dyn Error>> {
/// Defines Mediatype for `contentMediaType`.
#[derive(Clone, Copy)]
pub struct MediaType {
    /// Name of this media-type as defined in RFC 2046.
    /// Example: `application/json`
    pub name: &'static str,

    /// whether this media type can be deserialized to json. If so it can
    /// be validated by `contentSchema` keyword.
    pub json_compatible: bool,

    /**
    Check whether `bytes` conforms to this media-type.
    Should return `Ok(Some(Value))` if `deserialize` is `true`, otherwise it can return `Ok(None)`.
    Ideally you could deserialize to `serde::de::IgnoredAny` if `deserialize` is `false` to gain
    some performance.
    `deserialize` is always `false` if `json_compatible` is `false`.
    */
    #[allow(clippy::type_complexity)]
    pub func: fn(bytes: &[u8], deserialize: bool) -> Result<Option<Value>, Box<dyn Error>>,
}

pub(crate) static MEDIA_TYPES: Lazy<HashMap<&'static str, MediaType>> = Lazy::new(|| {
    let mut m = HashMap::<&'static str, MediaType>::new();
    m.insert(
        "application/json",
        MediaType {
            name: "application/json",
            json_compatible: true,
            func: check_json,
        },
    );
    m
});

fn check_json(bytes: &[u8], deserialize: bool) -> Result<Option<Value>, Box<dyn Error>> {
    if deserialize {
        return Ok(Some(serde_json::from_slice(bytes)?));
    }
    serde_json::from_slice::<IgnoredAny>(bytes)?;
    Ok(None)
}

File diff suppressed because it is too large

View File

@ -6,192 +6,192 @@ use regex_syntax::ast::{self, *};
// convert ecma regex to rust regex if possible
// see https://262.ecma-international.org/11.0/#sec-regexp-regular-expression-objects
pub(crate) fn convert(pattern: &str) -> Result<Cow<'_, str>, Box<dyn std::error::Error>> {
    let mut pattern = Cow::Borrowed(pattern);
    let mut ast = loop {
        match Parser::new().parse(pattern.as_ref()) {
            Ok(ast) => break ast,
            Err(e) => {
                if let Some(s) = fix_error(&e) {
                    pattern = Cow::Owned(s);
                } else {
                    Err(e)?;
                }
            }
        }
    };
    loop {
        let translator = Translator {
            pat: pattern.as_ref(),
            out: None,
        };
        if let Some(updated_pattern) = ast::visit(&ast, translator)? {
            match Parser::new().parse(&updated_pattern) {
                Ok(updated_ast) => {
                    pattern = Cow::Owned(updated_pattern);
                    ast = updated_ast;
                }
                Err(e) => {
                    debug_assert!(
                        false,
                        "ecma::translate changed {:?} to {:?}: {e}",
                        pattern, updated_pattern
                    );
                    break;
                }
            }
        } else {
            break;
        }
    }
    Ok(pattern)
}

fn fix_error(e: &Error) -> Option<String> {
    if let ErrorKind::EscapeUnrecognized = e.kind() {
        let (start, end) = (e.span().start.offset, e.span().end.offset);
        let s = &e.pattern()[start..end];
        if let r"\c" = s {
            // handle \c{control_letter}
            if let Some(control_letter) = e.pattern()[end..].chars().next() {
                if control_letter.is_ascii_alphabetic() {
                    return Some(format!(
                        "{}{}{}",
                        &e.pattern()[..start],
                        ((control_letter as u8) % 32) as char,
                        &e.pattern()[end + 1..],
                    ));
                }
            }
        }
    }
    None
}

/**
handles following translations:
- \d should ascii digits only. so replace with [0-9]
- \D should match everything but ascii digits. so replace with [^0-9]
- \w should match ascii letters only. so replace with [a-zA-Z0-9_]
- \W should match everything but ascii letters. so replace with [^a-zA-Z0-9_]
- \s and \S differences
- \a is not an ECMA 262 control escape
*/
struct Translator<'a> {
    pat: &'a str,
    out: Option<String>,
}

impl Translator<'_> {
    fn replace(&mut self, span: &Span, with: &str) {
        let (start, end) = (span.start.offset, span.end.offset);
        self.out = Some(format!("{}{with}{}", &self.pat[..start], &self.pat[end..]));
    }

    fn replace_class_class(&mut self, perl: &ClassPerl) {
        match perl.kind {
            ClassPerlKind::Digit => {
                self.replace(&perl.span, if perl.negated { "[^0-9]" } else { "[0-9]" });
            }
            ClassPerlKind::Word => {
                let with = &if perl.negated {
                    "[^A-Za-z0-9_]"
                } else {
                    "[A-Za-z0-9_]"
                };
                self.replace(&perl.span, with);
            }
            ClassPerlKind::Space => {
                let with = &if perl.negated {
                    "[^ \t\n\r\u{000b}\u{000c}\u{00a0}\u{feff}\u{2003}\u{2029}]"
                } else {
                    "[ \t\n\r\u{000b}\u{000c}\u{00a0}\u{feff}\u{2003}\u{2029}]"
                };
                self.replace(&perl.span, with);
            }
        }
    }
}

impl Visitor for Translator<'_> {
    type Output = Option<String>;
    type Err = &'static str;

    fn finish(self) -> Result<Self::Output, Self::Err> {
        Ok(self.out)
    }

    fn visit_class_set_item_pre(&mut self, ast: &ast::ClassSetItem) -> Result<(), Self::Err> {
        if let ClassSetItem::Perl(perl) = ast {
            self.replace_class_class(perl);
        }
        Ok(())
    }

    fn visit_post(&mut self, ast: &Ast) -> Result<(), Self::Err> {
        if self.out.is_some() {
            return Ok(());
        }
        match ast {
            Ast::ClassPerl(perl) => {
                self.replace_class_class(perl);
            }
            Ast::Literal(ref literal) => {
                if let Literal {
                    kind: LiteralKind::Special(SpecialLiteralKind::Bell),
                    ..
                } = literal.as_ref()
                {
                    return Err("\\a is not an ECMA 262 control escape");
                }
            }
            _ => (),
        }
        Ok(())
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_ecma_compat_valid() {
        // println!("{:#?}", Parser::new().parse(r#"a\a"#));
        let tests = [
            (r"ab\cAcde\cBfg", "ab\u{1}cde\u{2}fg"), // \c{control_letter}
            (r"\\comment", r"\\comment"),            // there is no \c
            (r"ab\def", r#"ab[0-9]ef"#),             // \d
            (r"ab[a-z\d]ef", r#"ab[a-z[0-9]]ef"#),   // \d inside classSet
            (r"ab\Def", r#"ab[^0-9]ef"#),            // \D
            (r"ab[a-z\D]ef", r#"ab[a-z[^0-9]]ef"#),  // \D inside classSet
        ];
        for (input, want) in tests {
            match convert(input) {
                Ok(got) => {
                    if got.as_ref() != want {
                        panic!("convert({input:?}): got: {got:?}, want: {want:?}");
                    }
                }
                Err(e) => {
                    panic!("convert({input:?}) failed: {e}");
                }
            }
        }
    }

    #[test]
    fn test_ecma_compat_invalid() {
        // println!("{:#?}", Parser::new().parse(r#"a\a"#));
        let tests = [
            r"\c\n",     // \c{invalid_char}
            r"abc\adef", // \a is not valid
        ];
        for input in tests {
            if convert(input).is_ok() {
                panic!("convert({input:?}) must fail");
            }
        }
    }
}
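The `\d` → `[0-9]` rewrite in the translator above exists because Rust's regex character classes are Unicode-aware by default, while ECMA 262 defines `\d` as the ASCII digits only. The standard library's char predicates illustrate the exact discrepancy being avoided (illustration only; no regex crate involved):

```rust
fn main() {
    // U+0663 ARABIC-INDIC DIGIT THREE: a Unicode digit, but not an ASCII one.
    let ch = '\u{0663}';
    assert!(ch.is_numeric());      // a Unicode-aware \d would match it
    assert!(!ch.is_ascii_digit()); // ECMA's \d, i.e. [0-9], must not
}
```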

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@ -1,7 +1,7 @@
use std::{
    cell::RefCell,
    collections::{HashMap, HashSet},
    error::Error,
};

#[cfg(not(target_arch = "wasm32"))]
@ -13,16 +13,16 @@ use serde_json::Value;
use url::Url;

use crate::{
    compiler::CompileError,
    draft::{latest, Draft},
    util::split,
    UrlPtr,
};

/// A trait for loading json from given `url`
pub trait UrlLoader {
    /// Loads json from given absolute `url`.
    fn load(&self, url: &str) -> Result<Value, Box<dyn Error>>;
}

// --

@ -32,212 +32,212 @@ pub struct FileLoader;
#[cfg(not(target_arch = "wasm32"))]
impl UrlLoader for FileLoader {
    fn load(&self, url: &str) -> Result<Value, Box<dyn Error>> {
        let url = Url::parse(url)?;
        let path = url.to_file_path().map_err(|_| "invalid file path")?;
        let file = File::open(path)?;
        Ok(serde_json::from_reader(file)?)
    }
}

// --

#[derive(Default)]
pub struct SchemeUrlLoader {
    loaders: HashMap<&'static str, Box<dyn UrlLoader>>,
}

impl SchemeUrlLoader {
    pub fn new() -> Self {
        Self::default()
    }

    /// Registers [`UrlLoader`] for given url `scheme`
    pub fn register(&mut self, scheme: &'static str, url_loader: Box<dyn UrlLoader>) {
        self.loaders.insert(scheme, url_loader);
    }
}

impl UrlLoader for SchemeUrlLoader {
    fn load(&self, url: &str) -> Result<Value, Box<dyn Error>> {
        let url = Url::parse(url)?;
        let Some(loader) = self.loaders.get(url.scheme()) else {
            return Err(CompileError::UnsupportedUrlScheme {
                url: url.as_str().to_owned(),
            }
            .into());
        };
        loader.load(url.as_str())
    }
}

// --

pub(crate) struct DefaultUrlLoader {
    doc_map: RefCell<HashMap<Url, usize>>,
    doc_list: AppendList<Value>,
    loader: Box<dyn UrlLoader>,
}

impl DefaultUrlLoader {
    #[cfg_attr(target_arch = "wasm32", allow(unused_mut))]
    pub fn new() -> Self {
        let mut loader = SchemeUrlLoader::new();
        #[cfg(not(target_arch = "wasm32"))]
        loader.register("file", Box::new(FileLoader));
        Self {
            doc_map: Default::default(),
            doc_list: AppendList::new(),
            loader: Box::new(loader),
        }
    }

    pub fn get_doc(&self, url: &Url) -> Option<&Value> {
        self.doc_map
            .borrow()
            .get(url)
            .and_then(|i| self.doc_list.get(*i))
    }

    pub fn add_doc(&self, url: Url, json: Value) {
        if self.get_doc(&url).is_some() {
            return;
        }
        self.doc_list.push(json);
        self.doc_map
            .borrow_mut()
            .insert(url, self.doc_list.len() - 1);
    }

    pub fn use_loader(&mut self, loader: Box<dyn UrlLoader>) {
        self.loader = loader;
    }

    pub(crate) fn load(&self, url: &Url) -> Result<&Value, CompileError> {
        if let Some(doc) = self.get_doc(url) {
            return Ok(doc);
        }

        // check in STD_METAFILES
        let doc = if let Some(content) = load_std_meta(url.as_str()) {
            serde_json::from_str::<Value>(content).map_err(|e| CompileError::LoadUrlError {
                url: url.to_string(),
                src: e.into(),
            })?
        } else {
            self.loader
                .load(url.as_str())
                .map_err(|src| CompileError::LoadUrlError {
                    url: url.as_str().to_owned(),
                    src,
                })?
        };
        self.add_doc(url.clone(), doc);
        self.get_doc(url)
            .ok_or(CompileError::Bug("doc must exist".into()))
    }

    pub(crate) fn get_draft(
        &self,
        up: &UrlPtr,
        doc: &Value,
        default_draft: &'static Draft,
        mut cycle: HashSet<Url>,
    ) -> Result<&'static Draft, CompileError> {
        let Value::Object(obj) = &doc else {
            return Ok(default_draft);
        };
        let Some(Value::String(sch)) = obj.get("$schema") else {
            return Ok(default_draft);
        };
        if let Some(draft) = Draft::from_url(sch) {
            return Ok(draft);
        }
        let (sch, _) = split(sch);
        let sch = Url::parse(sch).map_err(|e| CompileError::InvalidMetaSchemaUrl {
            url: up.to_string(),
            src: e.into(),
        })?;
        if up.ptr.is_empty() && sch == up.url {
            return Err(CompileError::UnsupportedDraft { url: sch.into() });
        }
        if !cycle.insert(sch.clone()) {
            return Err(CompileError::MetaSchemaCycle { url: sch.into() });
        }
        let doc = self.load(&sch)?;
        let up = UrlPtr {
            url: sch,
            ptr: "".into(),
        };
        self.get_draft(&up, doc, default_draft, cycle)
    }

    pub(crate) fn get_meta_vocabs(
        &self,
        doc: &Value,
        draft: &'static Draft,
    ) -> Result<Option<Vec<String>>, CompileError> {
        let Value::Object(obj) = &doc else {
            return Ok(None);
        };
        let Some(Value::String(sch)) = obj.get("$schema") else {
            return Ok(None);
        };
        if Draft::from_url(sch).is_some() {
            return Ok(None);
src,
})?
};
self.add_doc(url.clone(), doc);
self.get_doc(url)
.ok_or(CompileError::Bug("doc must exist".into()))
}
pub(crate) fn get_draft(
&self,
up: &UrlPtr,
doc: &Value,
default_draft: &'static Draft,
mut cycle: HashSet<Url>,
) -> Result<&'static Draft, CompileError> {
let Value::Object(obj) = &doc else {
return Ok(default_draft);
};
let Some(Value::String(sch)) = obj.get("$schema") else {
return Ok(default_draft);
};
if let Some(draft) = Draft::from_url(sch) {
return Ok(draft);
}
let (sch, _) = split(sch);
let sch = Url::parse(sch).map_err(|e| CompileError::InvalidMetaSchemaUrl {
url: up.to_string(),
src: e.into(),
})?;
if up.ptr.is_empty() && sch == up.url {
return Err(CompileError::UnsupportedDraft { url: sch.into() });
}
if !cycle.insert(sch.clone()) {
return Err(CompileError::MetaSchemaCycle { url: sch.into() });
}
let doc = self.load(&sch)?;
let up = UrlPtr {
url: sch,
ptr: "".into(),
};
self.get_draft(&up, doc, default_draft, cycle)
}
pub(crate) fn get_meta_vocabs(
&self,
doc: &Value,
draft: &'static Draft,
) -> Result<Option<Vec<String>>, CompileError> {
let Value::Object(obj) = &doc else {
return Ok(None);
};
let Some(Value::String(sch)) = obj.get("$schema") else {
return Ok(None);
};
if Draft::from_url(sch).is_some() {
return Ok(None);
}
let (sch, _) = split(sch);
let sch = Url::parse(sch).map_err(|e| CompileError::ParseUrlError {
url: sch.to_string(),
src: e.into(),
})?;
let doc = self.load(&sch)?;
draft.get_vocabs(&sch, doc)
} }
let (sch, _) = split(sch);
let sch = Url::parse(sch).map_err(|e| CompileError::ParseUrlError {
url: sch.to_string(),
src: e.into(),
})?;
let doc = self.load(&sch)?;
draft.get_vocabs(&sch, doc)
}
} }
pub(crate) static STD_METAFILES: Lazy<HashMap<String, &str>> = Lazy::new(|| { pub(crate) static STD_METAFILES: Lazy<HashMap<String, &str>> = Lazy::new(|| {
let mut files = HashMap::new(); let mut files = HashMap::new();
macro_rules! add { macro_rules! add {
($path:expr) => { ($path:expr) => {
files.insert( files.insert(
$path["metaschemas/".len()..].to_owned(), $path["metaschemas/".len()..].to_owned(),
include_str!($path), include_str!($path),
); );
}; };
} }
add!("metaschemas/draft-04/schema"); add!("metaschemas/draft-04/schema");
add!("metaschemas/draft-06/schema"); add!("metaschemas/draft-06/schema");
add!("metaschemas/draft-07/schema"); add!("metaschemas/draft-07/schema");
add!("metaschemas/draft/2019-09/schema"); add!("metaschemas/draft/2019-09/schema");
add!("metaschemas/draft/2019-09/meta/core"); add!("metaschemas/draft/2019-09/meta/core");
add!("metaschemas/draft/2019-09/meta/applicator"); add!("metaschemas/draft/2019-09/meta/applicator");
add!("metaschemas/draft/2019-09/meta/validation"); add!("metaschemas/draft/2019-09/meta/validation");
add!("metaschemas/draft/2019-09/meta/meta-data"); add!("metaschemas/draft/2019-09/meta/meta-data");
add!("metaschemas/draft/2019-09/meta/format"); add!("metaschemas/draft/2019-09/meta/format");
add!("metaschemas/draft/2019-09/meta/content"); add!("metaschemas/draft/2019-09/meta/content");
add!("metaschemas/draft/2020-12/schema"); add!("metaschemas/draft/2020-12/schema");
add!("metaschemas/draft/2020-12/meta/core"); add!("metaschemas/draft/2020-12/meta/core");
add!("metaschemas/draft/2020-12/meta/applicator"); add!("metaschemas/draft/2020-12/meta/applicator");
add!("metaschemas/draft/2020-12/meta/unevaluated"); add!("metaschemas/draft/2020-12/meta/unevaluated");
add!("metaschemas/draft/2020-12/meta/validation"); add!("metaschemas/draft/2020-12/meta/validation");
add!("metaschemas/draft/2020-12/meta/meta-data"); add!("metaschemas/draft/2020-12/meta/meta-data");
add!("metaschemas/draft/2020-12/meta/content"); add!("metaschemas/draft/2020-12/meta/content");
add!("metaschemas/draft/2020-12/meta/format-annotation"); add!("metaschemas/draft/2020-12/meta/format-annotation");
add!("metaschemas/draft/2020-12/meta/format-assertion"); add!("metaschemas/draft/2020-12/meta/format-assertion");
files files
}); });
fn load_std_meta(url: &str) -> Option<&'static str> { fn load_std_meta(url: &str) -> Option<&'static str> {
let meta = url let meta = url
.strip_prefix("http://json-schema.org/") .strip_prefix("http://json-schema.org/")
.or_else(|| url.strip_prefix("https://json-schema.org/")); .or_else(|| url.strip_prefix("https://json-schema.org/"));
if let Some(meta) = meta { if let Some(meta) = meta {
if meta == "schema" { if meta == "schema" {
return load_std_meta(latest().url); return load_std_meta(latest().url);
}
return STD_METAFILES.get(meta).cloned();
} }
None return STD_METAFILES.get(meta).cloned();
}
None
} }
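The bundled-metaschema lookup in `load_std_meta` above can be sketched standalone. The tiny `std_files` table and the hardcoded 2020-12 URL below are stand-ins for the crate's `STD_METAFILES` map and `latest().url`:

```rust
use std::collections::HashMap;

// Stand-in for the bundled STD_METAFILES map (assumption: only 2020-12 here).
fn std_files() -> HashMap<&'static str, &'static str> {
    let mut files = HashMap::new();
    files.insert("draft/2020-12/schema", "{}");
    files
}

// Mirror of load_std_meta: both http and https json-schema.org prefixes map
// to the same bundled key, and the bare "schema" path is an alias that
// resolves to the latest supported draft's URL.
fn load_std_meta(url: &str) -> Option<&'static str> {
    let meta = url
        .strip_prefix("http://json-schema.org/")
        .or_else(|| url.strip_prefix("https://json-schema.org/"))?;
    if meta == "schema" {
        // alias for the latest supported draft (assumed 2020-12 in this sketch)
        return load_std_meta("https://json-schema.org/draft/2020-12/schema");
    }
    std_files().get(meta).copied()
}

fn main() {
    assert_eq!(
        load_std_meta("https://json-schema.org/draft/2020-12/schema"),
        Some("{}")
    );
    assert_eq!(load_std_meta("http://json-schema.org/schema"), Some("{}"));
    assert_eq!(load_std_meta("https://example.com/schema"), None);
    println!("ok");
}
```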

File diff suppressed because it is too large

@@ -6,123 +6,123 @@ use serde_json::Value;
use url::Url;

pub(crate) struct Root {
    pub(crate) draft: &'static Draft,
    pub(crate) resources: HashMap<JsonPointer, Resource>, // ptr => _
    pub(crate) url: Url,
    pub(crate) meta_vocabs: Option<Vec<String>>,
}

impl Root {
    pub(crate) fn has_vocab(&self, name: &str) -> bool {
        if self.draft.version < 2019 || name == "core" {
            return true;
        }
        if let Some(vocabs) = &self.meta_vocabs {
            return vocabs.iter().any(|s| s == name);
        }
        self.draft.default_vocabs.contains(&name)
    }

    fn resolve_fragment_in(&self, frag: &Fragment, res: &Resource) -> Result<UrlPtr, CompileError> {
        let ptr = match frag {
            Fragment::Anchor(anchor) => {
                let Some(ptr) = res.anchors.get(anchor) else {
                    return Err(CompileError::AnchorNotFound {
                        url: self.url.to_string(),
                        reference: UrlFrag::format(&res.id, frag.as_str()),
                    });
                };
                ptr.clone()
            }
            Fragment::JsonPointer(ptr) => res.ptr.concat(ptr),
        };
        Ok(UrlPtr {
            url: self.url.clone(),
            ptr,
        })
    }

    pub(crate) fn resolve_fragment(&self, frag: &Fragment) -> Result<UrlPtr, CompileError> {
        let res = self.resources.get("").ok_or(CompileError::Bug(
            format!("no root resource found for {}", self.url).into(),
        ))?;
        self.resolve_fragment_in(frag, res)
    }

    // resolves `UrlFrag` to `UrlPtr` from root.
    // returns `None` if it is external.
    pub(crate) fn resolve(&self, uf: &UrlFrag) -> Result<Option<UrlPtr>, CompileError> {
        let res = {
            if uf.url == self.url {
                self.resources.get("").ok_or(CompileError::Bug(
                    format!("no root resource found for {}", self.url).into(),
                ))?
            } else {
                // look for resource with id==uf.url
                let Some(res) = self.resources.values().find(|res| res.id == uf.url) else {
                    return Ok(None); // external url
                };
                res
            }
        };
        self.resolve_fragment_in(&uf.frag, res).map(Some)
    }

    pub(crate) fn resource(&self, ptr: &JsonPointer) -> &Resource {
        let mut ptr = ptr.as_str();
        loop {
            if let Some(res) = self.resources.get(ptr) {
                return res;
            }
            let Some((prefix, _)) = ptr.rsplit_once('/') else {
                break;
            };
            ptr = prefix;
        }
        self.resources.get("").expect("root resource should exist")
    }

    pub(crate) fn base_url(&self, ptr: &JsonPointer) -> &Url {
        &self.resource(ptr).id
    }

    pub(crate) fn add_subschema(
        &mut self,
        doc: &Value,
        ptr: &JsonPointer,
    ) -> Result<(), CompileError> {
        let v = ptr.lookup(doc, &self.url)?;
        let base_url = self.base_url(ptr).clone();
        self.draft
            .collect_resources(v, &base_url, ptr.clone(), &self.url, &mut self.resources)?;
        // collect anchors
        if !self.resources.contains_key(ptr) {
            let res = self.resource(ptr);
            if let Some(res) = self.resources.get_mut(&res.ptr.clone()) {
                self.draft.collect_anchors(v, ptr, res, &self.url)?;
            }
        }
        Ok(())
    }
}

#[derive(Debug)]
pub(crate) struct Resource {
    pub(crate) ptr: JsonPointer, // from root
    pub(crate) id: Url,
    pub(crate) anchors: HashMap<Anchor, JsonPointer>, // anchor => ptr
    pub(crate) dynamic_anchors: HashSet<Anchor>,
}

impl Resource {
    pub(crate) fn new(ptr: JsonPointer, id: Url) -> Self {
        Self {
            ptr,
            id,
            anchors: HashMap::new(),
            dynamic_anchors: HashSet::new(),
        }
    }
}
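`Root::resource` above finds the nearest enclosing schema resource by walking up JSON-pointer prefixes ("/a/b/c" → "/a/b" → "/a" → "") until one is registered, falling back to the root resource at "". A minimal sketch of that walk, using a plain slice of pairs as a stand-in for `Root::resources`:

```rust
// Walk up the pointer's prefixes until a registered resource matches.
// A linear scan over (pointer, resource-id) pairs stands in for the
// HashMap lookup in the real code.
fn nearest_resource<'a>(resources: &[(&'a str, &'a str)], mut ptr: &str) -> &'a str {
    loop {
        if let Some(&(_, res)) = resources.iter().find(|(p, _)| *p == ptr) {
            return res;
        }
        let Some((prefix, _)) = ptr.rsplit_once('/') else {
            break;
        };
        ptr = prefix;
    }
    // fall back to the root resource registered under ""
    nearest_resource_root(resources)
}

fn nearest_resource_root<'a>(resources: &[(&'a str, &'a str)]) -> &'a str {
    resources
        .iter()
        .find(|(p, _)| p.is_empty())
        .map(|&(_, res)| res)
        .expect("root resource should exist")
}

fn main() {
    let resources = [("", "root"), ("/definitions/a", "a")];
    assert_eq!(nearest_resource(&resources, "/definitions/a/properties/x"), "a");
    assert_eq!(nearest_resource(&resources, "/definitions/b"), "root");
    println!("ok");
}
```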


@@ -8,100 +8,100 @@ use url::Url;

// --

pub(crate) struct Roots {
    pub(crate) default_draft: &'static Draft,
    map: HashMap<Url, Root>,
    pub(crate) loader: DefaultUrlLoader,
}

impl Roots {
    fn new() -> Self {
        Self {
            default_draft: latest(),
            map: Default::default(),
            loader: DefaultUrlLoader::new(),
        }
    }
}

impl Default for Roots {
    fn default() -> Self {
        Self::new()
    }
}

impl Roots {
    pub(crate) fn get(&self, url: &Url) -> Option<&Root> {
        self.map.get(url)
    }

    pub(crate) fn resolve_fragment(&mut self, uf: UrlFrag) -> Result<UrlPtr, CompileError> {
        self.or_load(uf.url.clone())?;
        let Some(root) = self.map.get(&uf.url) else {
            return Err(CompileError::Bug("or_load didn't add".into()));
        };
        root.resolve_fragment(&uf.frag)
    }

    pub(crate) fn ensure_subschema(&mut self, up: &UrlPtr) -> Result<(), CompileError> {
        self.or_load(up.url.clone())?;
        let Some(root) = self.map.get_mut(&up.url) else {
            return Err(CompileError::Bug("or_load didn't add".into()));
        };
        if !root.draft.is_subschema(up.ptr.as_str()) {
            let doc = self.loader.load(&root.url)?;
            let v = up.ptr.lookup(doc, &up.url)?;
            root.draft.validate(up, v)?;
            root.add_subschema(doc, &up.ptr)?;
        }
        Ok(())
    }

    pub(crate) fn or_load(&mut self, url: Url) -> Result<(), CompileError> {
        debug_assert!(url.fragment().is_none(), "trying to add root with fragment");
        if self.map.contains_key(&url) {
            return Ok(());
        }
        let doc = self.loader.load(&url)?;
        let r = self.create_root(url.clone(), doc)?;
        self.map.insert(url, r);
        Ok(())
    }

    pub(crate) fn create_root(&self, url: Url, doc: &Value) -> Result<Root, CompileError> {
        let draft = {
            let up = UrlPtr {
                url: url.clone(),
                ptr: "".into(),
            };
            self.loader
                .get_draft(&up, doc, self.default_draft, HashSet::new())?
        };
        let vocabs = self.loader.get_meta_vocabs(doc, draft)?;
        let resources = {
            let mut m = HashMap::default();
            draft.collect_resources(doc, &url, "".into(), &url, &mut m)?;
            m
        };
        if !matches!(url.host_str(), Some("json-schema.org")) {
            draft.validate(
                &UrlPtr {
                    url: url.clone(),
                    ptr: "".into(),
                },
                doc,
            )?;
        }
        Ok(Root {
            draft,
            resources,
            url: url.clone(),
            meta_vocabs: vocabs,
        })
    }

    pub(crate) fn insert(&mut self, roots: &mut HashMap<Url, Root>) {
        self.map.extend(roots.drain());
    }
}


@@ -1,8 +1,8 @@
use std::{
    borrow::{Borrow, Cow},
    fmt::Display,
    hash::{Hash, Hasher},
    str::FromStr,
};

use ahash::{AHashMap, AHasher};
@@ -19,112 +19,112 @@ pub(crate) struct JsonPointer(pub(crate) String);

impl JsonPointer {
    pub(crate) fn escape(token: &str) -> Cow<'_, str> {
        const SPECIAL: [char; 2] = ['~', '/'];
        if token.contains(SPECIAL) {
            token.replace('~', "~0").replace('/', "~1").into()
        } else {
            token.into()
        }
    }

    pub(crate) fn unescape(mut tok: &str) -> Result<Cow<'_, str>, ()> {
        let Some(mut tilde) = tok.find('~') else {
            return Ok(Cow::Borrowed(tok));
        };
        let mut s = String::with_capacity(tok.len());
        loop {
            s.push_str(&tok[..tilde]);
            tok = &tok[tilde + 1..];
            match tok.chars().next() {
                Some('1') => s.push('/'),
                Some('0') => s.push('~'),
                _ => return Err(()),
            }
            tok = &tok[1..];
            let Some(i) = tok.find('~') else {
                s.push_str(tok);
                break;
            };
            tilde = i;
        }
        Ok(Cow::Owned(s))
    }

    pub(crate) fn lookup<'a>(
        &self,
        mut v: &'a Value,
        v_url: &Url,
    ) -> Result<&'a Value, CompileError> {
        for tok in self.0.split('/').skip(1) {
            let Ok(tok) = Self::unescape(tok) else {
                let loc = UrlFrag::format(v_url, self.as_str());
                return Err(CompileError::InvalidJsonPointer(loc));
            };
            match v {
                Value::Object(obj) => {
                    if let Some(pvalue) = obj.get(tok.as_ref()) {
                        v = pvalue;
                        continue;
                    }
                }
                Value::Array(arr) => {
                    if let Ok(i) = usize::from_str(tok.as_ref()) {
                        if let Some(item) = arr.get(i) {
                            v = item;
                            continue;
                        }
                    };
                }
                _ => {}
            }
            let loc = UrlFrag::format(v_url, self.as_str());
            return Err(CompileError::JsonPointerNotFound(loc));
        }
        Ok(v)
    }

    pub(crate) fn as_str(&self) -> &str {
        &self.0
    }

    pub(crate) fn is_empty(&self) -> bool {
        self.0.is_empty()
    }

    pub(crate) fn concat(&self, next: &Self) -> Self {
        JsonPointer(format!("{}{}", self.0, next.0))
    }

    pub(crate) fn append(&self, tok: &str) -> Self {
        Self(format!("{}/{}", self, Self::escape(tok)))
    }

    pub(crate) fn append2(&self, tok1: &str, tok2: &str) -> Self {
        Self(format!(
            "{}/{}/{}",
            self,
            Self::escape(tok1),
            Self::escape(tok2)
        ))
    }
}

impl Display for JsonPointer {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        self.0.fmt(f)
    }
}

impl Borrow<str> for JsonPointer {
    fn borrow(&self) -> &str {
        &self.0
    }
}

impl From<&str> for JsonPointer {
    fn from(value: &str) -> Self {
        Self(value.into())
    }
}

// --
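The `escape`/`unescape` pair above implements RFC 6901 token encoding (`~` → `~0`, `/` → `~1`). A minimal dependency-free sketch, returning owned `String`s instead of `Cow` for simplicity:

```rust
// '~' must be escaped first so the '~' in "~1" is not double-escaped.
fn escape(token: &str) -> String {
    token.replace('~', "~0").replace('/', "~1")
}

// Reverse the escaping; a '~' followed by anything other than '0' or '1'
// (including end of input) is invalid per RFC 6901.
fn unescape(token: &str) -> Option<String> {
    let mut out = String::with_capacity(token.len());
    let mut chars = token.chars();
    while let Some(c) = chars.next() {
        if c == '~' {
            match chars.next() {
                Some('0') => out.push('~'),
                Some('1') => out.push('/'),
                _ => return None,
            }
        } else {
            out.push(c);
        }
    }
    Some(out)
}

fn main() {
    assert_eq!(escape("a/b~c"), "a~1b~0c");
    assert_eq!(unescape("a~1b~0c").as_deref(), Some("a/b~c"));
    assert_eq!(unescape("bad~2"), None);
    println!("ok");
}
```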
@ -133,297 +133,297 @@ impl From<&str> for JsonPointer {
pub(crate) struct Anchor(pub(crate) String); pub(crate) struct Anchor(pub(crate) String);
impl Display for Anchor { impl Display for Anchor {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
self.0.fmt(f) self.0.fmt(f)
} }
} }
impl Borrow<str> for Anchor { impl Borrow<str> for Anchor {
fn borrow(&self) -> &str { fn borrow(&self) -> &str {
&self.0 &self.0
} }
} }
impl From<&str> for Anchor { impl From<&str> for Anchor {
fn from(value: &str) -> Self { fn from(value: &str) -> Self {
Self(value.into()) Self(value.into())
} }
} }
// -- // --
#[derive(Debug, Clone, Eq, PartialEq)] #[derive(Debug, Clone, Eq, PartialEq)]
pub(crate) enum Fragment { pub(crate) enum Fragment {
Anchor(Anchor), Anchor(Anchor),
JsonPointer(JsonPointer), JsonPointer(JsonPointer),
} }
impl Fragment { impl Fragment {
pub(crate) fn split(s: &str) -> Result<(&str, Fragment), CompileError> { pub(crate) fn split(s: &str) -> Result<(&str, Fragment), CompileError> {
let (u, frag) = split(s); let (u, frag) = split(s);
let frag = percent_decode_str(frag) let frag = percent_decode_str(frag)
.decode_utf8() .decode_utf8()
.map_err(|src| CompileError::ParseUrlError { .map_err(|src| CompileError::ParseUrlError {
url: s.to_string(), url: s.to_string(),
src: src.into(), src: src.into(),
})? })?
.to_string(); .to_string();
let frag = if frag.is_empty() || frag.starts_with('/') { let frag = if frag.is_empty() || frag.starts_with('/') {
Fragment::JsonPointer(JsonPointer(frag)) Fragment::JsonPointer(JsonPointer(frag))
} else { } else {
Fragment::Anchor(Anchor(frag)) Fragment::Anchor(Anchor(frag))
}; };
Ok((u, frag)) Ok((u, frag))
} }
pub(crate) fn encode(frag: &str) -> String { pub(crate) fn encode(frag: &str) -> String {
// https://url.spec.whatwg.org/#fragment-percent-encode-set // https://url.spec.whatwg.org/#fragment-percent-encode-set
const FRAGMENT: &AsciiSet = &CONTROLS const FRAGMENT: &AsciiSet = &CONTROLS
.add(b'%') .add(b'%')
.add(b' ') .add(b' ')
.add(b'"') .add(b'"')
.add(b'<') .add(b'<')
.add(b'>') .add(b'>')
.add(b'`'); .add(b'`');
percent_encoding::utf8_percent_encode(frag, FRAGMENT).to_string() percent_encoding::utf8_percent_encode(frag, FRAGMENT).to_string()
} }
pub(crate) fn as_str(&self) -> &str { pub(crate) fn as_str(&self) -> &str {
match self { match self {
Fragment::Anchor(s) => &s.0, Fragment::Anchor(s) => &s.0,
Fragment::JsonPointer(s) => &s.0, Fragment::JsonPointer(s) => &s.0,
}
} }
}
} }
// -- // --
#[derive(Clone)] #[derive(Clone)]
pub(crate) struct UrlFrag { pub(crate) struct UrlFrag {
pub(crate) url: Url, pub(crate) url: Url,
pub(crate) frag: Fragment, pub(crate) frag: Fragment,
} }
impl UrlFrag { impl UrlFrag {
pub(crate) fn absolute(input: &str) -> Result<UrlFrag, CompileError> { pub(crate) fn absolute(input: &str) -> Result<UrlFrag, CompileError> {
let (u, frag) = Fragment::split(input)?; let (u, frag) = Fragment::split(input)?;
// note: windows drive letter is treated as url scheme by url parser // note: windows drive letter is treated as url scheme by url parser
#[cfg(not(target_arch = "wasm32"))] #[cfg(not(target_arch = "wasm32"))]
if std::env::consts::OS == "windows" && starts_with_windows_drive(u) { if std::env::consts::OS == "windows" && starts_with_windows_drive(u) {
let url = Url::from_file_path(u) let url = Url::from_file_path(u)
.map_err(|_| CompileError::Bug(format!("failed to convert {u} into url").into()))?; .map_err(|_| CompileError::Bug(format!("failed to convert {u} into url").into()))?;
return Ok(UrlFrag { url, frag }); return Ok(UrlFrag { url, frag });
}
match Url::parse(u) {
Ok(url) => Ok(UrlFrag { url, frag }),
#[cfg(not(target_arch = "wasm32"))]
Err(url::ParseError::RelativeUrlWithoutBase) => {
let p = std::path::absolute(u).map_err(|e| CompileError::ParseUrlError {
url: u.to_owned(),
src: e.into(),
})?;
let url = Url::from_file_path(p).map_err(|_| {
CompileError::Bug(format!("failed to convert {u} into url").into())
})?;
Ok(UrlFrag { url, frag })
}
Err(e) => Err(CompileError::ParseUrlError {
url: u.to_owned(),
src: e.into(),
}),
}
} }
pub(crate) fn join(url: &Url, input: &str) -> Result<UrlFrag, CompileError> { match Url::parse(u) {
let (input, frag) = Fragment::split(input)?; Ok(url) => Ok(UrlFrag { url, frag }),
if input.is_empty() { #[cfg(not(target_arch = "wasm32"))]
return Ok(UrlFrag { Err(url::ParseError::RelativeUrlWithoutBase) => {
url: url.clone(), let p = std::path::absolute(u).map_err(|e| CompileError::ParseUrlError {
frag, url: u.to_owned(),
}); src: e.into(),
} })?;
let url = url.join(input).map_err(|e| CompileError::ParseUrlError { let url = Url::from_file_path(p).map_err(|_| {
url: input.to_string(), CompileError::Bug(format!("failed to convert {u} into url").into())
src: e.into(),
})?; })?;
Ok(UrlFrag { url, frag }) Ok(UrlFrag { url, frag })
}
Err(e) => Err(CompileError::ParseUrlError {
url: u.to_owned(),
src: e.into(),
}),
} }
}
pub(crate) fn format(url: &Url, frag: &str) -> String { pub(crate) fn join(url: &Url, input: &str) -> Result<UrlFrag, CompileError> {
if frag.is_empty() { let (input, frag) = Fragment::split(input)?;
url.to_string() if input.is_empty() {
} else { return Ok(UrlFrag {
format!("{}#{}", url, Fragment::encode(frag)) url: url.clone(),
} frag,
});
} }
let url = url.join(input).map_err(|e| CompileError::ParseUrlError {
url: input.to_string(),
src: e.into(),
})?;
Ok(UrlFrag { url, frag })
}
pub(crate) fn format(url: &Url, frag: &str) -> String {
if frag.is_empty() {
url.to_string()
} else {
format!("{}#{}", url, Fragment::encode(frag))
}
}
} }
impl Display for UrlFrag { impl Display for UrlFrag {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}#{}", self.url, Fragment::encode(self.frag.as_str())) write!(f, "{}#{}", self.url, Fragment::encode(self.frag.as_str()))
} }
} }
// -- // --
#[derive(Debug, Clone, Eq, PartialEq, Hash)] #[derive(Debug, Clone, Eq, PartialEq, Hash)]
pub(crate) struct UrlPtr { pub(crate) struct UrlPtr {
pub(crate) url: Url, pub(crate) url: Url,
pub(crate) ptr: JsonPointer, pub(crate) ptr: JsonPointer,
} }
impl UrlPtr { impl UrlPtr {
pub(crate) fn lookup<'a>(&self, doc: &'a Value) -> Result<&'a Value, CompileError> { pub(crate) fn lookup<'a>(&self, doc: &'a Value) -> Result<&'a Value, CompileError> {
self.ptr.lookup(doc, &self.url) self.ptr.lookup(doc, &self.url)
} }
pub(crate) fn format(&self, tok: &str) -> String { pub(crate) fn format(&self, tok: &str) -> String {
format!( format!(
"{}#{}/{}", "{}#{}/{}",
self.url, self.url,
Fragment::encode(self.ptr.as_str()), Fragment::encode(self.ptr.as_str()),
Fragment::encode(JsonPointer::escape(tok).as_ref()), Fragment::encode(JsonPointer::escape(tok).as_ref()),
) )
} }
} }
impl Display for UrlPtr { impl Display for UrlPtr {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}#{}", self.url, Fragment::encode(self.ptr.as_str())) write!(f, "{}#{}", self.url, Fragment::encode(self.ptr.as_str()))
} }
} }
// -- // --
pub(crate) fn is_integer(v: &Value) -> bool { pub(crate) fn is_integer(v: &Value) -> bool {
match v { match v {
Value::Number(n) => { Value::Number(n) => {
n.is_i64() || n.is_u64() || n.as_f64().filter(|n| n.fract() == 0.0).is_some() n.is_i64() || n.is_u64() || n.as_f64().filter(|n| n.fract() == 0.0).is_some()
}
_ => false,
} }
_ => false,
}
} }
#[cfg(not(target_arch = "wasm32"))] #[cfg(not(target_arch = "wasm32"))]
fn starts_with_windows_drive(p: &str) -> bool { fn starts_with_windows_drive(p: &str) -> bool {
p.chars().next().filter(char::is_ascii_uppercase).is_some() && p[1..].starts_with(":\\") p.chars().next().filter(char::is_ascii_uppercase).is_some() && p[1..].starts_with(":\\")
} }
/// returns single-quoted string /// returns single-quoted string
pub(crate) fn quote<T>(s: &T) -> String pub(crate) fn quote<T>(s: &T) -> String
where where
T: AsRef<str> + std::fmt::Debug + ?Sized, T: AsRef<str> + std::fmt::Debug + ?Sized,
{ {
let s = format!("{s:?}").replace(r#"\""#, "\"").replace('\'', r"\'"); let s = format!("{s:?}").replace(r#"\""#, "\"").replace('\'', r"\'");
format!("'{}'", &s[1..s.len() - 1]) format!("'{}'", &s[1..s.len() - 1])
} }
pub(crate) fn join_iter<T>(iterable: T, sep: &str) -> String pub(crate) fn join_iter<T>(iterable: T, sep: &str) -> String
where where
T: IntoIterator, T: IntoIterator,
T::Item: Display, T::Item: Display,
{ {
iterable iterable
.into_iter() .into_iter()
.map(|e| e.to_string()) .map(|e| e.to_string())
.collect::<Vec<_>>() .collect::<Vec<_>>()
.join(sep) .join(sep)
} }
pub(crate) fn escape(token: &str) -> Cow<'_, str> { pub(crate) fn escape(token: &str) -> Cow<'_, str> {
JsonPointer::escape(token) JsonPointer::escape(token)
} }
pub(crate) fn split(url: &str) -> (&str, &str) { pub(crate) fn split(url: &str) -> (&str, &str) {
if let Some(i) = url.find('#') { if let Some(i) = url.find('#') {
(&url[..i], &url[i + 1..]) (&url[..i], &url[i + 1..])
} else { } else {
(url, "") (url, "")
} }
} }
/// serde_json treats 0 and 0.0 as unequal, so we cannot simply use v1 == v2.
pub(crate) fn equals(v1: &Value, v2: &Value) -> bool {
    match (v1, v2) {
        (Value::Null, Value::Null) => true,
        (Value::Bool(b1), Value::Bool(b2)) => b1 == b2,
        (Value::Number(n1), Value::Number(n2)) => {
            if let (Some(n1), Some(n2)) = (n1.as_u64(), n2.as_u64()) {
                return n1 == n2;
            }
            if let (Some(n1), Some(n2)) = (n1.as_i64(), n2.as_i64()) {
                return n1 == n2;
            }
            if let (Some(n1), Some(n2)) = (n1.as_f64(), n2.as_f64()) {
                return n1 == n2;
            }
            false
        }
        (Value::String(s1), Value::String(s2)) => s1 == s2,
        (Value::Array(arr1), Value::Array(arr2)) => {
            if arr1.len() != arr2.len() {
                return false;
            }
            arr1.iter().zip(arr2).all(|(e1, e2)| equals(e1, e2))
        }
        (Value::Object(obj1), Value::Object(obj2)) => {
            if obj1.len() != obj2.len() {
                return false;
            }
            for (k1, v1) in obj1 {
                if let Some(v2) = obj2.get(k1) {
                    if !equals(v1, v2) {
                        return false;
                    }
                } else {
                    return false;
                }
            }
            true
        }
        _ => false,
    }
}
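The three-stage numeric comparison in `equals` (u64 first, then i64, then f64) can be illustrated in isolation. Below is a minimal sketch with a hypothetical `Num` enum standing in for `serde_json::Number`; it is not the crate's code, but it shows why the integer comparisons come first: large integers compare exactly, while mixed int/float pairs fall through to the f64 stage.

```rust
// Hypothetical miniature of serde_json::Number for illustration only.
#[derive(Clone, Copy)]
enum Num {
    U(u64),
    I(i64),
    F(f64),
}

impl Num {
    fn as_u64(self) -> Option<u64> {
        match self {
            Num::U(n) => Some(n),
            Num::I(n) => u64::try_from(n).ok(),
            Num::F(_) => None,
        }
    }
    fn as_i64(self) -> Option<i64> {
        match self {
            Num::U(n) => i64::try_from(n).ok(),
            Num::I(n) => Some(n),
            Num::F(_) => None,
        }
    }
    fn as_f64(self) -> Option<f64> {
        match self {
            Num::U(n) => Some(n as f64),
            Num::I(n) => Some(n as f64),
            Num::F(n) => Some(n),
        }
    }
}

// Same cascade as `equals`: exact integer comparison when both sides are
// integral, f64 comparison only as the last resort.
fn num_equals(a: Num, b: Num) -> bool {
    if let (Some(a), Some(b)) = (a.as_u64(), b.as_u64()) {
        return a == b;
    }
    if let (Some(a), Some(b)) = (a.as_i64(), b.as_i64()) {
        return a == b;
    }
    if let (Some(a), Some(b)) = (a.as_f64(), b.as_f64()) {
        return a == b;
    }
    false
}

fn main() {
    assert!(num_equals(Num::U(1), Num::F(1.0)));
    assert!(num_equals(Num::I(-1), Num::F(-1.0)));
    assert!(!num_equals(Num::U(1), Num::F(1.5)));
}
```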
pub(crate) fn duplicates(arr: &Vec<Value>) -> Option<(usize, usize)> {
    match arr.as_slice() {
        [e0, e1] => {
            if equals(e0, e1) {
                return Some((0, 1));
            }
        }
        [e0, e1, e2] => {
            if equals(e0, e1) {
                return Some((0, 1));
            } else if equals(e0, e2) {
                return Some((0, 2));
            } else if equals(e1, e2) {
                return Some((1, 2));
            }
        }
        _ => {
            let len = arr.len();
            if len <= 20 {
                for i in 0..len - 1 {
                    for j in i + 1..len {
                        if equals(&arr[i], &arr[j]) {
                            return Some((i, j));
                        }
                    }
                }
            } else {
                let mut seen = AHashMap::with_capacity(len);
                for (i, item) in arr.iter().enumerate() {
                    if let Some(j) = seen.insert(HashedValue(item), i) {
                        return Some((j, i));
                    }
                }
            }
        }
    }
    None
}
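The size-based strategy in `duplicates` (a quadratic scan below a threshold of 20 elements, a hash map above it) can be sketched in isolation. `first_duplicate` below is a hypothetical simplification over `i32` rather than `Value`, so it uses plain `==` and the standard `HashMap` instead of `equals` and `AHashMap`:

```rust
use std::collections::HashMap;

// Sketch of the size-based duplicate search: for short slices an O(n^2)
// scan avoids hashing overhead; beyond the threshold, a map lookup finds
// the first duplicate pair in O(n). Returns indices (i, j) with i < j.
fn first_duplicate(arr: &[i32]) -> Option<(usize, usize)> {
    if arr.len() <= 20 {
        // saturating_sub keeps the empty slice from underflowing.
        for i in 0..arr.len().saturating_sub(1) {
            for j in i + 1..arr.len() {
                if arr[i] == arr[j] {
                    return Some((i, j));
                }
            }
        }
        None
    } else {
        let mut seen = HashMap::with_capacity(arr.len());
        for (i, &item) in arr.iter().enumerate() {
            // insert returns the previous index if the value was seen.
            if let Some(j) = seen.insert(item, i) {
                return Some((j, i));
            }
        }
        None
    }
}

fn main() {
    assert_eq!(first_duplicate(&[1, 2, 3, 2]), Some((1, 3)));
    assert_eq!(first_duplicate(&[1, 2, 3]), None);
}
```

The hash-map path is what motivates the `HashedValue` wrapper below: `serde_json::Value` has no `Hash` impl consistent with the crate's `equals`.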
// HashedValue --
@@ -433,113 +433,113 @@ pub(crate) fn duplicates(arr: &Vec<Value>) -> Option<(usize, usize)>
pub(crate) struct HashedValue<'a>(pub(crate) &'a Value);

impl PartialEq for HashedValue<'_> {
    fn eq(&self, other: &Self) -> bool {
        equals(self.0, other.0)
    }
}

impl Eq for HashedValue<'_> {}

impl Hash for HashedValue<'_> {
    fn hash<H: Hasher>(&self, state: &mut H) {
        match self.0 {
            Value::Null => state.write_u32(3_221_225_473), // chosen randomly
            Value::Bool(ref b) => b.hash(state),
            Value::Number(ref num) => {
                if let Some(num) = num.as_f64() {
                    num.to_bits().hash(state);
                } else if let Some(num) = num.as_u64() {
                    num.hash(state);
                } else if let Some(num) = num.as_i64() {
                    num.hash(state);
                }
            }
            Value::String(ref str) => str.hash(state),
            Value::Array(ref arr) => {
                for item in arr {
                    HashedValue(item).hash(state);
                }
            }
            Value::Object(ref obj) => {
                let mut hash = 0;
                for (pname, pvalue) in obj {
                    // We have no way of building a new hasher of type `H`, so we
                    // hardcode using the default hasher of a hash map.
                    let mut hasher = AHasher::default();
                    pname.hash(&mut hasher);
                    HashedValue(pvalue).hash(&mut hasher);
                    hash ^= hasher.finish();
                }
                state.write_u64(hash);
            }
        }
    }
}
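The `Value::Object` arm above hashes each entry with its own hasher and XORs the results, so the final hash is independent of key order. A minimal standalone sketch of that idea with the standard library's `DefaultHasher` in place of `AHasher` (hypothetical helper, not the crate's code):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Order-independent hashing: hash each (key, value) pair separately and
// XOR the per-entry digests. Since XOR is commutative, two maps that
// differ only in entry order hash identically.
fn hash_entries(entries: &[(&str, &str)]) -> u64 {
    let mut acc = 0u64;
    for (k, v) in entries {
        let mut h = DefaultHasher::new();
        k.hash(&mut h);
        v.hash(&mut h);
        acc ^= h.finish();
    }
    acc
}

fn main() {
    let a = [("x", "1"), ("y", "2")];
    let b = [("y", "2"), ("x", "1")];
    assert_eq!(hash_entries(&a), hash_entries(&b));
}
```

Hashing the key and value together before XOR-ing matters: XOR-ing key and value hashes independently would make `{"a": "b"}` collide with `{"b": "a"}`.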
#[cfg(test)]
mod tests {
    use ahash::AHashMap;
    use serde_json::json;

    use super::*;

    #[test]
    fn test_quote() {
        assert_eq!(quote(r#"abc"def'ghi"#), r#"'abc"def\'ghi'"#);
    }

    #[test]
    fn test_fragment_split() {
        let tests = [
            ("#", Fragment::JsonPointer("".into())),
            ("#/a/b", Fragment::JsonPointer("/a/b".into())),
            ("#abcd", Fragment::Anchor("abcd".into())),
            ("#%61%62%63%64", Fragment::Anchor("abcd".into())),
            (
                "#%2F%61%62%63%64%2fef",
                Fragment::JsonPointer("/abcd/ef".into()),
            ), // '/' is encoded
            ("#abcd+ef", Fragment::Anchor("abcd+ef".into())), // '+' should not translate to space
        ];
        for test in tests {
            let (_, got) = Fragment::split(test.0).unwrap();
            assert_eq!(got, test.1, "Fragment::split({:?})", test.0);
        }
    }

    #[test]
    fn test_unescape() {
        let tests = [
            ("bar~0", Some("bar~")),
            ("bar~1", Some("bar/")),
            ("bar~01", Some("bar~1")),
            ("bar~", None),
            ("bar~~", None),
        ];
        for (tok, want) in tests {
            let res = JsonPointer::unescape(tok).ok();
            let got = res.as_ref().map(|c| c.as_ref());
            assert_eq!(got, want, "unescape({:?})", tok)
        }
    }

    #[test]
    fn test_equals() {
        let tests = [["1.0", "1"], ["-1.0", "-1"]];
        for [a, b] in tests {
            let a = serde_json::from_str(a).unwrap();
            let b = serde_json::from_str(b).unwrap();
            assert!(equals(&a, &b));
        }
    }

    #[test]
    fn test_hashed_value() {
        let mut seen = AHashMap::with_capacity(10);
        let (v1, v2) = (json!(2), json!(2.0));
        assert!(equals(&v1, &v2));
        assert!(seen.insert(HashedValue(&v1), 1).is_none());
        assert!(seen.insert(HashedValue(&v2), 1).is_some());
    }
}
File diff suppressed because it is too large
@@ -5,83 +5,83 @@ use serde_json::json;
#[test]
fn test_metaschema_resource() -> Result<(), Box<dyn Error>> {
    let main_schema = json!({
        "$schema": "http://tmp.com/meta.json",
        "type": "number"
    });
    let meta_schema = json!({
        "$schema": "https://json-schema.org/draft/2020-12/schema",
        "$vocabulary": {
            "https://json-schema.org/draft/2020-12/vocab/applicator": true,
            "https://json-schema.org/draft/2020-12/vocab/core": true
        },
        "allOf": [
            { "$ref": "https://json-schema.org/draft/2020-12/meta/applicator" },
            { "$ref": "https://json-schema.org/draft/2020-12/meta/core" }
        ]
    });
    let mut schemas = Schemas::new();
    let mut compiler = Compiler::new();
    compiler.add_resource("schema.json", main_schema)?;
    compiler.add_resource("http://tmp.com/meta.json", meta_schema)?;
    compiler.compile("schema.json", &mut schemas)?;
    Ok(())
}

#[test]
fn test_compile_anchor() -> Result<(), Box<dyn Error>> {
    let schema = json!({
        "$schema": "https://json-schema.org/draft/2020-12/schema",
        "$defs": {
            "x": {
                "$anchor": "a1",
                "type": "number"
            }
        }
    });
    let mut schemas = Schemas::new();
    let mut compiler = Compiler::new();
    compiler.add_resource("schema.json", schema)?;
    let sch_index1 = compiler.compile("schema.json#a1", &mut schemas)?;
    let sch_index2 = compiler.compile("schema.json#/$defs/x", &mut schemas)?;
    assert_eq!(sch_index1, sch_index2);
    Ok(())
}

#[test]
fn test_compile_nonstd() -> Result<(), Box<dyn Error>> {
    let schema = json!({
        "components": {
            "schemas": {
                "foo": {
                    "$schema": "https://json-schema.org/draft/2020-12/schema",
                    "$defs": {
                        "x": {
                            "$anchor": "a",
                            "type": "number"
                        },
                        "y": {
                            "$id": "http://temp.com/y",
                            "type": "string"
                        }
                    },
                    "oneOf": [
                        { "$ref": "#a" },
                        { "$ref": "http://temp.com/y" }
                    ]
                }
            }
        }
    });
    let mut schemas = Schemas::new();
    let mut compiler = Compiler::new();
    compiler.add_resource("schema.json", schema)?;
    compiler.compile("schema.json#/components/schemas/foo", &mut schemas)?;
    Ok(())
}
@@ -5,37 +5,37 @@ use serde_json::{Map, Value};
#[test]
fn test_debug() -> Result<(), Box<dyn Error>> {
    let test: Value = serde_json::from_reader(File::open("tests/debug.json")?)?;
    let mut schemas = Schemas::new();
    let mut compiler = Compiler::new();
    compiler.enable_format_assertions();
    compiler.enable_content_assertions();
    let remotes = Remotes(test["remotes"].as_object().unwrap().clone());
    compiler.use_loader(Box::new(remotes));
    let url = "http://debug.com/schema.json";
    compiler.add_resource(url, test["schema"].clone())?;
    let sch = compiler.compile(url, &mut schemas)?;
    let result = schemas.validate(&test["data"], sch);
    if let Err(e) = &result {
        for line in format!("{e}").lines() {
            println!(" {line}");
        }
        for line in format!("{e:#}").lines() {
            println!(" {line}");
        }
        println!("{:#}", e.detailed_output());
    }
    assert_eq!(result.is_ok(), test["valid"].as_bool().unwrap());
    Ok(())
}

struct Remotes(Map<String, Value>);

impl UrlLoader for Remotes {
    fn load(&self, url: &str) -> Result<Value, Box<dyn Error>> {
        if let Some(v) = self.0.get(url) {
            return Ok(v.clone());
        }
        Err("remote not found")?
    }
}
@@ -7,16 +7,16 @@ use url::Url;
#[test]
fn example_from_files() -> Result<(), Box<dyn Error>> {
    let schema_file = "tests/examples/schema.json";
    let instance: Value = serde_json::from_reader(File::open("tests/examples/instance.json")?)?;
    let mut schemas = Schemas::new();
    let mut compiler = Compiler::new();
    let sch_index = compiler.compile(schema_file, &mut schemas)?;
    let result = schemas.validate(&instance, sch_index);
    assert!(result.is_ok());
    Ok(())
}
/**
@@ -31,200 +31,200 @@ to local file.
*/
#[test]
fn example_from_strings() -> Result<(), Box<dyn Error>> {
    let cat_schema: Value = json!({
        "type": "object",
        "properties": {
            "speak": { "const": "meow" }
        },
        "required": ["speak"]
    });
    let pet_schema: Value = json!({
        "oneOf": [
            { "$ref": "dog.json" },
            { "$ref": "cat.json" }
        ]
    });
    let instance: Value = json!({"speak": "bow"});
    let mut schemas = Schemas::new();
    let mut compiler = Compiler::new();
    compiler.add_resource("tests/examples/pet.json", pet_schema)?;
    compiler.add_resource("tests/examples/cat.json", cat_schema)?;
    let sch_index = compiler.compile("tests/examples/pet.json", &mut schemas)?;
    let result = schemas.validate(&instance, sch_index);
    assert!(result.is_ok());
    Ok(())
}
#[test]
#[ignore]
fn example_from_https() -> Result<(), Box<dyn Error>> {
    let schema_url = "https://json-schema.org/learn/examples/geographical-location.schema.json";
    let instance: Value = json!({"latitude": 48.858093, "longitude": 2.294694});

    struct HttpUrlLoader;
    impl UrlLoader for HttpUrlLoader {
        fn load(&self, url: &str) -> Result<Value, Box<dyn Error>> {
            let reader = ureq::get(url).call()?.into_reader();
            Ok(serde_json::from_reader(reader)?)
        }
    }

    let mut schemas = Schemas::new();
    let mut compiler = Compiler::new();
    let mut loader = SchemeUrlLoader::new();
    loader.register("file", Box::new(FileLoader));
    loader.register("http", Box::new(HttpUrlLoader));
    loader.register("https", Box::new(HttpUrlLoader));
    compiler.use_loader(Box::new(loader));
    let sch_index = compiler.compile(schema_url, &mut schemas)?;
    let result = schemas.validate(&instance, sch_index);
    assert!(result.is_ok());
    Ok(())
}
#[test]
fn example_from_yaml_files() -> Result<(), Box<dyn Error>> {
    let schema_file = "tests/examples/schema.yml";
    let instance: Value = serde_yaml::from_reader(File::open("tests/examples/instance.yml")?)?;

    struct FileUrlLoader;
    impl UrlLoader for FileUrlLoader {
        fn load(&self, url: &str) -> Result<Value, Box<dyn Error>> {
            let url = Url::parse(url)?;
            let path = url.to_file_path().map_err(|_| "invalid file path")?;
            let file = File::open(&path)?;
            if path
                .extension()
                .filter(|&ext| ext == "yaml" || ext == "yml")
                .is_some()
            {
                Ok(serde_yaml::from_reader(file)?)
            } else {
                Ok(serde_json::from_reader(file)?)
            }
        }
    }

    let mut schemas = Schemas::new();
    let mut compiler = Compiler::new();
    let mut loader = SchemeUrlLoader::new();
    loader.register("file", Box::new(FileUrlLoader));
    compiler.use_loader(Box::new(loader));
    let sch_index = compiler.compile(schema_file, &mut schemas)?;
    let result = schemas.validate(&instance, sch_index);
    assert!(result.is_ok());
    Ok(())
}
#[test]
fn example_custom_format() -> Result<(), Box<dyn Error>> {
    let schema_url = "http://tmp/schema.json";
    let schema: Value = json!({"type": "string", "format": "palindrome"});
    let instance: Value = json!("step on no pets");

    fn is_palindrome(v: &Value) -> Result<(), Box<dyn Error>> {
        let Value::String(s) = v else {
            return Ok(()); // applicable only on strings
        };
        let mut chars = s.chars();
        while let (Some(c1), Some(c2)) = (chars.next(), chars.next_back()) {
            if c1 != c2 {
                Err("char mismatch")?;
            }
        }
        Ok(())
    }

    let mut schemas = Schemas::new();
    let mut compiler = Compiler::new();
    compiler.enable_format_assertions(); // in draft2020-12 format assertions are not enabled by default
    compiler.register_format(Format {
        name: "palindrome",
        func: is_palindrome,
    });
    compiler.add_resource(schema_url, schema)?;
    let sch_index = compiler.compile(schema_url, &mut schemas)?;
    let result = schemas.validate(&instance, sch_index);
    assert!(result.is_ok());
    Ok(())
}
#[test]
fn example_custom_content_encoding() -> Result<(), Box<dyn Error>> {
    let schema_url = "http://tmp/schema.json";
    let schema: Value = json!({"type": "string", "contentEncoding": "hex"});
    let instance: Value = json!("aBcdxyz");

    fn decode(b: u8) -> Result<u8, Box<dyn Error>> {
        match b {
            b'0'..=b'9' => Ok(b - b'0'),
            b'a'..=b'f' => Ok(b - b'a' + 10),
            b'A'..=b'F' => Ok(b - b'A' + 10),
            _ => Err("decode_hex: non-hex char")?,
        }
    }

    fn decode_hex(s: &str) -> Result<Vec<u8>, Box<dyn Error>> {
        if s.len() % 2 != 0 {
            Err("decode_hex: odd length")?;
        }
        let mut bytes = s.bytes();
        let mut out = Vec::with_capacity(s.len() / 2);
        // iterate over the number of output bytes, not `out.len()`, which
        // is zero for a freshly allocated Vec
        for _ in 0..s.len() / 2 {
            if let (Some(b1), Some(b2)) = (bytes.next(), bytes.next()) {
                out.push(decode(b1)? << 4 | decode(b2)?);
            } else {
                Err("decode_hex: non-ascii char")?;
            }
        }
        Ok(out)
    }

    let mut schemas = Schemas::new();
    let mut compiler = Compiler::new();
    compiler.enable_content_assertions(); // content assertions are not enabled by default
    compiler.register_content_encoding(Decoder {
        name: "hex",
        func: decode_hex,
    });
    compiler.add_resource(schema_url, schema)?;
    let sch_index = compiler.compile(schema_url, &mut schemas)?;
    let result = schemas.validate(&instance, sch_index);
    assert!(result.is_err());
    Ok(())
}
#[test]
fn example_custom_content_media_type() -> Result<(), Box<dyn Error>> {
    let schema_url = "http://tmp/schema.json";
    let schema: Value = json!({"type": "string", "contentMediaType": "application/yaml"});
    let instance: Value = json!("name:foobar");

    fn check_yaml(bytes: &[u8], deserialize: bool) -> Result<Option<Value>, Box<dyn Error>> {
        if deserialize {
            return Ok(Some(serde_yaml::from_slice(bytes)?));
        }
        serde_yaml::from_slice::<IgnoredAny>(bytes)?;
        Ok(None)
    }

    let mut schemas = Schemas::new();
    let mut compiler = Compiler::new();
    compiler.enable_content_assertions(); // content assertions are not enabled by default
    compiler.register_content_media_type(MediaType {
        name: "application/yaml",
        json_compatible: true,
        func: check_yaml,
    });
    compiler.add_resource(schema_url, schema)?;
    let sch_index = compiler.compile(schema_url, &mut schemas)?;
    let result = schemas.validate(&instance, sch_index);
    assert!(result.is_ok());
    Ok(())
}
@@ -3,42 +3,42 @@ use std::fs;
use boon::{CompileError, Compiler, Schemas};

fn test(path: &str) -> Result<(), CompileError> {
    let mut schemas = Schemas::new();
    let mut compiler = Compiler::new();
    compiler.compile(path, &mut schemas)?;
    Ok(())
}

#[test]
fn test_absolute() -> Result<(), CompileError> {
    let path = fs::canonicalize("tests/examples/schema.json").unwrap();
    test(path.to_string_lossy().as_ref())
}

#[test]
fn test_relative_slash() -> Result<(), CompileError> {
    test("tests/examples/schema.json")
}

#[test]
#[cfg(windows)]
fn test_relative_backslash() -> Result<(), CompileError> {
    test("tests\\examples\\schema.json")
}

#[test]
fn test_absolutei_space() -> Result<(), CompileError> {
    let path = fs::canonicalize("tests/examples/sample schema.json").unwrap();
    test(path.to_string_lossy().as_ref())
}

#[test]
fn test_relative_slash_space() -> Result<(), CompileError> {
    test("tests/examples/sample schema.json")
}

#[test]
#[cfg(windows)]
fn test_relative_backslash_space() -> Result<(), CompileError> {
    test("tests\\examples\\sample schema.json")
}
@@ -6,62 +6,62 @@ use serde_json::Value;
#[derive(Debug, Deserialize)]
struct Test {
    description: String,
    remotes: Option<HashMap<String, Value>>,
    schema: Value,
    errors: Option<Vec<String>>,
}

#[test]
fn test_invalid_schemas() -> Result<(), Box<dyn Error>> {
    let file = File::open("tests/invalid-schemas.json")?;
    let tests: Vec<Test> = serde_json::from_reader(file)?;
    for test in tests {
        println!("{}", test.description);
        match compile(&test) {
            Ok(_) => {
                if test.errors.is_some() {
                    Err("want compilation to fail")?
                }
            }
            Err(e) => {
                println!(" {e}");
                let error = format!("{e:?}");
                let Some(errors) = &test.errors else {
                    Err("want compilation to succeed")?
                };
                for want in errors {
                    if !error.contains(want) {
                        println!(" got {error}");
                        println!(" want {want}");
                        panic!("error mismatch");
                    }
                }
            }
        }
    }
    Ok(())
}

fn compile(test: &Test) -> Result<(), CompileError> {
    let mut schemas = Schemas::new();
    let mut compiler = Compiler::new();
    let url = "http://fake.com/schema.json";
    if let Some(remotes) = &test.remotes {
        compiler.use_loader(Box::new(Remotes(remotes.clone())));
    }
    compiler.add_resource(url, test.schema.clone())?;
    compiler.compile(url, &mut schemas)?;
    Ok(())
}

struct Remotes(HashMap<String, Value>);

impl UrlLoader for Remotes {
    fn load(&self, url: &str) -> Result<Value, Box<dyn Error>> {
        if let Some(v) = self.0.get(url) {
            return Ok(v.clone());
        }
        Err("remote not found")?
    }
}
@@ -6,117 +6,117 @@ use serde_json::Value;
#[test]
fn test_suites() -> Result<(), Box<dyn Error>> {
    if let Ok(suite) = env::var("TEST_SUITE") {
        test_suite(&suite)?;
    } else {
        test_suite("tests/JSON-Schema-Test-Suite")?;
        test_suite("tests/Extra-Suite")?;
    }
    Ok(())
}

fn test_suite(suite: &str) -> Result<(), Box<dyn Error>> {
    test_folder(suite, "draft2019-09", Draft::V2019_09)?;
    test_folder(suite, "draft2020-12", Draft::V2020_12)?;
    Ok(())
}

fn test_folder(suite: &str, folder: &str, draft: Draft) -> Result<(), Box<dyn Error>> {
    let output_schema_url = format!(
        "https://json-schema.org/draft/{}/output/schema",
        folder.strip_prefix("draft").unwrap()
    );
    let prefix = Path::new(suite).join("output-tests");
    let folder = prefix.join(folder);
    let content = folder.join("content");
    if !content.is_dir() {
        return Ok(());
    }
    let output_schema: Value =
        serde_json::from_reader(File::open(folder.join("output-schema.json"))?)?;
    for entry in content.read_dir()? {
        let entry = entry?;
        if !entry.file_type()?.is_file() {
            continue;
        };
        let entry_path = entry.path();
        println!("{}", entry_path.strip_prefix(&prefix)?.to_str().unwrap());
        let groups: Vec<Group> = serde_json::from_reader(File::open(entry_path)?)?;
        for group in groups {
            println!(" {}", group.description);
            let mut schemas = Schemas::new();
            let mut compiler = Compiler::new();
            compiler.set_default_draft(draft);
            let schema_url = "http://output-tests/schema";
            compiler.add_resource(schema_url, group.schema)?;
            let sch = compiler.compile(schema_url, &mut schemas)?;
            for test in group.tests {
                println!(" {}", test.description);
                match schemas.validate(&test.data, sch) {
                    Ok(_) => println!(" validation success"),
                    Err(e) => {
                        if let Some(sch) = test.output.basic {
                            let mut schemas = Schemas::new();
                            let mut compiler = Compiler::new();
                            compiler.set_default_draft(draft);
                            compiler.add_resource(&output_schema_url, output_schema.clone())?;
                            let schema_url = "http://output-tests/schema";
                            compiler.add_resource(schema_url, sch)?;
                            let sch = compiler.compile(schema_url, &mut schemas)?;
                            let basic: Value = serde_json::from_str(&e.basic_output().to_string())?;
let result = schemas.validate(&basic, sch); let result = schemas.validate(&basic, sch);
if let Err(e) = result { if let Err(e) = result {
println!("{basic:#}\n"); println!("{basic:#}\n");
for line in format!("{e}").lines() { for line in format!("{e}").lines() {
println!(" {line}"); println!(" {line}");
}
panic!("basic output did not match");
}
}
if let Some(sch) = test.output.detailed {
let mut schemas = Schemas::new();
let mut compiler = Compiler::new();
compiler.set_default_draft(draft);
compiler.add_resource(&output_schema_url, output_schema.clone())?;
let schema_url = "http://output-tests/schema";
compiler.add_resource(schema_url, sch)?;
let sch = compiler.compile(schema_url, &mut schemas)?;
let detailed: Value =
serde_json::from_str(&e.detailed_output().to_string())?;
let result = schemas.validate(&detailed, sch);
if let Err(e) = result {
println!("{detailed:#}\n");
for line in format!("{e}").lines() {
println!(" {line}");
}
panic!("detailed output did not match");
}
}
}
} }
panic!("basic output did not match");
}
} }
if let Some(sch) = test.output.detailed {
let mut schemas = Schemas::new();
let mut compiler = Compiler::new();
compiler.set_default_draft(draft);
compiler.add_resource(&output_schema_url, output_schema.clone())?;
let schema_url = "http://output-tests/schema";
compiler.add_resource(schema_url, sch)?;
let sch = compiler.compile(schema_url, &mut schemas)?;
let detailed: Value =
serde_json::from_str(&e.detailed_output().to_string())?;
let result = schemas.validate(&detailed, sch);
if let Err(e) = result {
println!("{detailed:#}\n");
for line in format!("{e}").lines() {
println!(" {line}");
}
panic!("detailed output did not match");
}
}
}
} }
}
} }
}
Ok(()) Ok(())
} }
#[derive(Debug, Serialize, Deserialize)] #[derive(Debug, Serialize, Deserialize)]
struct Group { struct Group {
description: String, description: String,
schema: Value, schema: Value,
tests: Vec<Test>, tests: Vec<Test>,
} }
#[derive(Debug, Serialize, Deserialize)] #[derive(Debug, Serialize, Deserialize)]
struct Test { struct Test {
description: String, description: String,
data: Value, data: Value,
output: Output, output: Output,
} }
#[derive(Debug, Serialize, Deserialize)] #[derive(Debug, Serialize, Deserialize)]
struct Output { struct Output {
basic: Option<Value>, basic: Option<Value>,
detailed: Option<Value>, detailed: Option<Value>,
} }

View File

@@ -5,116 +5,116 @@ use serde::{Deserialize, Serialize};
use serde_json::Value;

static SKIP: [&str; 2] = [
    "zeroTerminatedFloats.json", // only draft4: this behavior is changed in later drafts
    "float-overflow.json",
];

#[derive(Debug, Serialize, Deserialize)]
struct Group {
    description: String,
    schema: Value,
    tests: Vec<Test>,
}

#[derive(Debug, Serialize, Deserialize)]
struct Test {
    description: String,
    data: Value,
    valid: bool,
}

#[test]
fn test_suites() -> Result<(), Box<dyn Error>> {
    if let Ok(suite) = env::var("TEST_SUITE") {
        test_suite(&suite)?;
    } else {
        test_suite("tests/JSON-Schema-Test-Suite")?;
        test_suite("tests/Extra-Test-Suite")?;
    }
    Ok(())
}

fn test_suite(suite: &str) -> Result<(), Box<dyn Error>> {
    if !Path::new(suite).exists() {
        Err(format!("test suite {suite} does not exist"))?;
    }
    test_dir(suite, "draft4", Draft::V4)?;
    test_dir(suite, "draft6", Draft::V6)?;
    test_dir(suite, "draft7", Draft::V7)?;
    test_dir(suite, "draft2019-09", Draft::V2019_09)?;
    test_dir(suite, "draft2020-12", Draft::V2020_12)?;
    Ok(())
}

fn test_dir(suite: &str, path: &str, draft: Draft) -> Result<(), Box<dyn Error>> {
    let prefix = Path::new(suite).join("tests");
    let dir = prefix.join(path);
    if !dir.is_dir() {
        return Ok(());
    }
    for entry in dir.read_dir()? {
        let entry = entry?;
        let file_type = entry.file_type()?;
        let tmp_entry_path = entry.path();
        let entry_path = tmp_entry_path.strip_prefix(&prefix)?.to_str().unwrap();
        if file_type.is_file() {
            if !SKIP.iter().any(|n| OsStr::new(n) == entry.file_name()) {
                test_file(suite, entry_path, draft)?;
            }
        } else if file_type.is_dir() {
            test_dir(suite, entry_path, draft)?;
        }
    }
    Ok(())
}

fn test_file(suite: &str, path: &str, draft: Draft) -> Result<(), Box<dyn Error>> {
    println!("FILE: {path}");
    let path = Path::new(suite).join("tests").join(path);
    let optional = path.components().any(|comp| comp.as_os_str() == "optional");
    let file = File::open(path)?;
    let url = "http://testsuite.com/schema.json";
    let groups: Vec<Group> = serde_json::from_reader(file)?;
    for group in groups {
        println!("{}", group.description);
        let mut schemas = Schemas::default();
        let mut compiler = Compiler::default();
        compiler.set_default_draft(draft);
        if optional {
            compiler.enable_format_assertions();
            compiler.enable_content_assertions();
        }
        compiler.use_loader(Box::new(RemotesLoader(suite.to_owned())));
        compiler.add_resource(url, group.schema)?;
        let sch_index = compiler.compile(url, &mut schemas)?;
        for test in group.tests {
            println!(" {}", test.description);
            let result = schemas.validate(&test.data, sch_index);
            if let Err(e) = &result {
                for line in format!("{e}").lines() {
                    println!(" {line}");
                }
                for line in format!("{e:#}").lines() {
                    println!(" {line}");
                }
            }
            assert_eq!(result.is_ok(), test.valid);
        }
    }
    Ok(())
}

struct RemotesLoader(String);

impl UrlLoader for RemotesLoader {
    fn load(&self, url: &str) -> Result<Value, Box<dyn std::error::Error>> {
        // remotes folder --
        if let Some(path) = url.strip_prefix("http://localhost:1234/") {
            let path = Path::new(&self.0).join("remotes").join(path);
            let file = File::open(path)?;
            let json: Value = serde_json::from_reader(file)?;
            return Ok(json);
        }
        Err("no internet")?
    }
}