Compare commits

2 Commits

| Author | SHA1 | Date |
|---|---|---|
| | e45265b242 | |
| | ec867f142f | |

GEMINI.md (20 changed lines)
@@ -166,6 +166,17 @@ It evaluates as an **Independent Declarative Rules Engine**. Every `Case` block

### Format Leniency for Empty Strings

To simplify frontend form validation, the format validators for `uuid`, `date-time`, and `email` explicitly allow empty strings (`""`), treating them as "present but unset".

### Filters & Conditions

In the Punc architecture, filters are automatically synthesized, strongly-typed JSON Schema boundaries that dictate the exact querying capabilities of any given entity or enum. They are generated entirely for you; you never write them manually.

* **Conditions**: A condition schema is the contract defining the mathematical operations allowed on a primitive field. For example, a `string.condition` allows `$eq`, `$ne`, `$gt`, `$gte`, `$lt`, `$lte`, `$of` (IN), and `$nof` (NOT IN).
* **Enum Conditions**: When JSPG synthesizes an enum, it dynamically generates an `<enum>.condition` (e.g., `address_kind.condition`). This strongly-typed condition mirrors the operations of a `string.condition` exactly, but strictly limits the inputs and array items of `$eq`, `$ne`, `$of`, and `$nof` to the exact variations defined by that enum. This context ensures that UI generators know exactly when to render `<Select>` dropdowns instead of generic `<Text>` boxes.
* **Filters**: A filter schema (e.g., `person.filter`) is an object containing condition properties used to filter entities. It natively supports structural composition:
  * **Inherited Properties**: Filters automatically inherit all valid database columns from their base type schema, immediately converting them to their respective `.condition` schemas.
  * **Relational Proxies**: If a table has a foreign key to another table, the filter automatically generates a proxy property pointing to the related entity's filter (e.g., the `person` filter automatically gains an `organization` property that points to `organization.filter`), natively allowing arbitrarily deep nested queries.
  * **Logical Operators (`$and`, `$or`)**: Every filter automatically includes `$and` and `$or` arrays, which recursively accept the exact same filter schema, allowing complex logical grouping.
  * **Ad-Hoc Extensions (`ad_hoc`)**: Fields stored purely in JSONB bubbles that lack formal database columns can still be queried through the `ad_hoc` object, which passes standard, unvalidated string conditions.

---

## 3. Database
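To make the synthesized boundaries concrete, here is a minimal sketch (in Python; all field names such as `first_name`, `organization`, and `nickname` are hypothetical, not taken from a real schema) of a payload shaped like a `person.filter`, combining a primitive condition, an enum condition, a relational proxy, a logical operator, and an `ad_hoc` extension:

```python
# Hypothetical payload shaped like a synthesized `person.filter`.
person_filter = {
    "first_name": {"$eq": "Ada"},             # string.condition
    "gender": {"$of": ["female", "other"]},   # gender.condition: $of limited to enum variations
    "organization": {                         # relational proxy -> organization.filter
        "name": {"$ne": None},
    },
    "$or": [                                  # recursive logical grouping
        {"birth_date": {"$gte": "1990-01-01T00:00:00Z"}},
        {"ad_hoc": {"nickname": {"$eq": "ada%"}}},  # unvalidated JSONB string condition
    ],
}

# Enum conditions restrict $of/$nof items to the enum's variations:
GENDER_VARIATIONS = {"male", "female", "other"}
assert set(person_filter["gender"]["$of"]) <= GENDER_VARIATIONS
```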
@@ -238,19 +249,14 @@ The Merger provides an automated, high-performance graph synchronization engine.

```diff
 The Queryer transforms Postgres into a pre-compiled Semantic Query Engine, designed to serve the exact shape of Punc responses directly via SQL.
 
 ### API Reference
 
-* `jspg_query(schema_id text, filters jsonb) -> jsonb`: Compiles the JSON Schema AST of `schema_id` directly into pre-planned, nested multi-JOIN SQL execution trees. Processes `filters` structurally.
+* `jspg_query(schema_id text, filter jsonb) -> jsonb`: Compiles the JSON Schema AST of `schema_id` directly into pre-planned, nested multi-JOIN SQL execution trees. Processes the `filter` structurally.
 
 ### Core Features
 
 * **Caching Strategy (DashMap SQL Caching)**: The Queryer securely caches its compiled, static SQL string templates per schema permutation inside the `GLOBAL_JSPG` concurrent `DashMap`. This eliminates recursive AST schema crawling on consecutive requests. Furthermore, it evaluates the strings via Postgres SPI (Server Programming Interface) Prepared Statements, leveraging native database caching of execution plans for extreme performance.
 * **Schema-to-SQL Compilation**: Compiles JSON Schema ASTs spanning deep arrays directly into static, pre-planned SQL multi-JOIN queries. This explicitly features the `Smart Merge` evaluation engine which natively translates properties through `type` inheritances, mapping JSON fields specifically to their physical database table aliases during translation.
 * **Root Null-Stripping Optimization**: Unlike traditional nested document builders, the Queryer intelligently defers Postgres' natively recursive `jsonb_strip_nulls` execution to the absolute apex of the compiled query pipeline. The compiler layers the many rapid `jsonb_build_object()` sub-query allocations, wrapping them in a singular overarching pass. This strips all empty optionals uniformly before exiting the database, maximizing CPU throughput.
-* **Dynamic Filtering**: Binds parameters natively through `cue.filters` objects. The queryer enforces a strict, structured, MongoDB-style operator syntax to map incoming JSON request constraints directly to their originating structural table columns. Filters support both flat path notation (e.g., `"contacts/is_primary": {...}`) and deeply nested recursive JSON structures (e.g., `{"contacts": {"is_primary": {...}}}`). The queryer recursively traverses and flattens these structures at AST compilation time.
-  * **Equality / Inequality**: `{"$eq": value}`, `{"$ne": value}` automatically map to `=` and `!=`.
-  * **Comparison**: `{"$gt": ...}`, `{"$gte": ...}`, `{"$lt": ...}`, `{"$lte": ...}` directly compile to Postgres comparison operators (`>`, `>=`, `<`, `<=`).
-  * **Array Inclusion**: `{"$of": [values]}`, `{"$nof": [values]}` use native `jsonb_array_elements_text()` bindings to enforce `IN` and `NOT IN` logic without runtime SQL injection risks.
-  * **Text Matching (ILIKE)**: Evaluates `$eq` or `$ne` against string fields containing the `%` character natively into Postgres `ILIKE` and `NOT ILIKE` partial substring matches.
-  * **Type Casting**: Safely resolves dynamic combinations by casting values instantly into the physical database types mapped in the schema (e.g. parsing `uuid` bindings to `::uuid`, formatting DateTimes to `::timestamptz`, and numbers to `::numeric`).
+* **Dynamic Filter Execution**: Evaluates the structured `filter` payload and recursively traverses and flattens its paths at AST compilation time. It safely binds parameter constraints using standard operations (e.g., mapping `$eq` to `=`, `$of` to `IN`, `$gt` to `>`) and automatically casts values into the physical database types mapped in the schema (e.g. parsing `uuid` bindings to `::uuid`, formatting DateTimes to `::timestamptz`, and numbers to `::numeric`). Text matching naturally evaluates `$eq` or `$ne` against string fields containing the `%` character into Postgres `ILIKE` and `NOT ILIKE`.
 * **Polymorphic SQL Generation (`family`)**: Compiles `family` properties by analyzing the **Physical Database Variations**, *not* the schema descendants.
   * **The Dot Convention**: When a schema requests `family: "target.schema"`, the compiler extracts the base type (e.g. `schema`) and looks up its Physical Table definition.
   * **Multi-Table Branching**: If the Physical Table is a parent to other tables (e.g. `organization` has variations `["organization", "bot", "person"]`), the compiler generates a dynamic `CASE WHEN type = '...' THEN ...` query, expanding into sub-queries for each variation. To ensure safe resolution, the compiler dynamically evaluates correlation boundaries: it attempts standard Relational Edge discovery first. If no explicit relational edge exists (indicating pure Table Inheritance rather than a standard foreign-key graph relationship), it safely invokes a **Table Parity Fallback**. This generates an explicit ID correlation constraint (`AND inner.id = outer.id`), perfectly binding the structural variations back to the parent row to eliminate Cartesian products.
```
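The flat-path / nested-path equivalence described for filter execution can be sketched roughly as follows (a Python simplification; `flatten`, `OPS`, and the `/` separator mirror the documented behavior, but the real traversal happens in Rust at AST compilation time):

```python
# Documented operator subset mapped to SQL fragments.
OPS = {"$eq": "=", "$ne": "!=", "$gt": ">", "$gte": ">=", "$lt": "<", "$lte": "<="}

def flatten(filter_obj, prefix=""):
    """Recursively flatten nested filter structures into flat `a/b` paths."""
    out = {}
    for key, value in filter_obj.items():
        if key in OPS or key in ("$of", "$nof"):
            # Reached a condition leaf: attach the operator to the current path.
            out[prefix] = dict(out.get(prefix, {}), **{key: value})
        elif isinstance(value, dict):
            path = f"{prefix}/{key}" if prefix else key
            out.update(flatten(value, path))
    return out

# Both notations compile to the same constraint:
nested = {"contacts": {"is_primary": {"$eq": True}}}
flat = {"contacts/is_primary": {"$eq": True}}
assert flatten(nested) == flatten(flat) == flat
```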
@@ -3,7 +3,24 @@

```diff
   "description": "Filter Synthesis Object-Oriented Composition",
   "database": {
     "puncs": [],
-    "enums": [],
+    "enums": [
+      {
+        "id": "enum1",
+        "name": "gender",
+        "module": "core",
+        "source": "gender",
+        "schemas": {
+          "gender": {
+            "type": "string",
+            "enum": [
+              "male",
+              "female",
+              "other"
+            ]
+          }
+        }
+      }
+    ],
     "relations": [
       {
         "id": "rel1",
```

@@ -46,6 +63,9 @@

```diff
         "billing_address": {
           "type": "address"
         },
+        "gender": {
+          "type": "gender"
+        },
         "birth_date": {
           "type": "string",
           "format": "date-time"
```

@@ -170,6 +190,49 @@

```diff
     "expect": {
       "success": true,
       "schemas": {
+        "gender": {},
+        "gender.condition": {
+          "type": "condition",
+          "compiledPropertyNames": [
+            "$eq",
+            "$ne",
+            "$nof",
+            "$of",
+            "kind"
+          ],
+          "properties": {
+            "$eq": {
+              "type": [
+                "gender",
+                "null"
+              ]
+            },
+            "$ne": {
+              "type": [
+                "gender",
+                "null"
+              ]
+            },
+            "$nof": {
+              "type": [
+                "array",
+                "null"
+              ],
+              "items": {
+                "type": "gender"
+              }
+            },
+            "$of": {
+              "type": [
+                "array",
+                "null"
+              ],
+              "items": {
+                "type": "gender"
+              }
+            }
+          }
+        },
         "person": {},
         "person.filter": {
           "compiledPropertyNames": [
```

@@ -180,6 +243,7 @@

```diff
             "billing_address",
             "birth_date",
             "first_name",
+            "gender",
             "tags"
           ],
           "properties": {
```

@@ -193,6 +257,7 @@

```diff
             "billing_address",
             "birth_date",
             "first_name",
+            "gender",
             "tags"
           ],
           "type": "person.filter"
```

@@ -212,6 +277,7 @@

```diff
             "billing_address",
             "birth_date",
             "first_name",
+            "gender",
             "tags"
           ],
           "type": "person.filter"
```

@@ -262,6 +328,12 @@

```diff
             "null"
           ]
         },
+        "gender": {
+          "type": [
+            "gender.condition",
+            "null"
+          ]
+        },
         "tags": {
           "type": [
             "string.condition",
```
src/database/compile/condition.rs (new file, 87 lines)

@@ -0,0 +1,87 @@
```rust
use crate::database::object::{SchemaObject, SchemaTypeOrArray};
use crate::database::schema::Schema;
use crate::database::r#enum::Enum;
use std::collections::BTreeMap;
use std::sync::Arc;

impl Enum {
    pub fn compile_condition(&self) -> Schema {
        let mut props = BTreeMap::new();
        let enum_name = &self.name;

        let mut eq_obj = SchemaObject::default();
        eq_obj.type_ = Some(SchemaTypeOrArray::Multiple(vec![
            enum_name.clone(),
            "null".to_string(),
        ]));
        props.insert(
            "$eq".to_string(),
            Arc::new(Schema {
                obj: eq_obj,
                always_fail: false,
            }),
        );

        let mut ne_obj = SchemaObject::default();
        ne_obj.type_ = Some(SchemaTypeOrArray::Multiple(vec![
            enum_name.clone(),
            "null".to_string(),
        ]));
        props.insert(
            "$ne".to_string(),
            Arc::new(Schema {
                obj: ne_obj,
                always_fail: false,
            }),
        );

        let mut of_obj = SchemaObject::default();
        of_obj.type_ = Some(SchemaTypeOrArray::Multiple(vec![
            "array".to_string(),
            "null".to_string(),
        ]));
        of_obj.items = Some(Arc::new(Schema {
            obj: SchemaObject {
                type_: Some(SchemaTypeOrArray::Single(enum_name.clone())),
                ..Default::default()
            },
            always_fail: false,
        }));
        props.insert(
            "$of".to_string(),
            Arc::new(Schema {
                obj: of_obj,
                always_fail: false,
            }),
        );

        let mut nof_obj = SchemaObject::default();
        nof_obj.type_ = Some(SchemaTypeOrArray::Multiple(vec![
            "array".to_string(),
            "null".to_string(),
        ]));
        nof_obj.items = Some(Arc::new(Schema {
            obj: SchemaObject {
                type_: Some(SchemaTypeOrArray::Single(enum_name.clone())),
                ..Default::default()
            },
            always_fail: false,
        }));
        props.insert(
            "$nof".to_string(),
            Arc::new(Schema {
                obj: nof_obj,
                always_fail: false,
            }),
        );

        let mut cond_obj = SchemaObject::default();
        cond_obj.type_ = Some(SchemaTypeOrArray::Single("condition".to_string()));
        cond_obj.properties = Some(props);

        Schema {
            obj: cond_obj,
            always_fail: false,
        }
    }
}
```
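The four operator schemas built by `compile_condition` all follow the same pattern, which can be summarized as the JSON shape it produces (a Python sketch of the output, not of the Rust internals; `enum_condition` is an illustrative name):

```python
def enum_condition(enum_name):
    """Mirror of the JSON shape that compile_condition synthesizes for an enum."""
    scalar = {"type": [enum_name, "null"]}                            # $eq / $ne
    array = {"type": ["array", "null"], "items": {"type": enum_name}}  # $of / $nof
    return {
        "type": "condition",
        "properties": {"$eq": scalar, "$ne": scalar, "$nof": array, "$of": array},
    }

cond = enum_condition("gender")
assert cond["properties"]["$of"]["items"] == {"type": "gender"}
assert sorted(cond["properties"]) == ["$eq", "$ne", "$nof", "$of"]
```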
@@ -42,7 +42,7 @@ impl Schema {

```diff
                 key.clone(),
                 Arc::new(inline_schema),
             );
-        } else if let Some(mut filter_type) = Self::resolve_filter_type(child) {
+        } else if let Some(mut filter_type) = Self::resolve_filter_type(child, _db) {
             filter_type.push("null".to_string());
 
             let mut child_obj = SchemaObject::default();
```

@@ -117,16 +117,16 @@ impl Schema {

```diff
         None
     }
 
-    fn resolve_filter_type(schema: &Arc<Schema>) -> Option<Vec<String>> {
+    fn resolve_filter_type(schema: &Arc<Schema>, db: &Database) -> Option<Vec<String>> {
         if let Some(type_) = &schema.obj.type_ {
             match type_ {
                 SchemaTypeOrArray::Single(t) => {
-                    return Self::map_filter_string(t, schema);
+                    return Self::map_filter_string(t, schema, db);
                 }
                 SchemaTypeOrArray::Multiple(types) => {
                     for t in types {
                         if t != "null" {
-                            return Self::map_filter_string(t, schema);
+                            return Self::map_filter_string(t, schema, db);
                         }
                     }
                 }
```

@@ -135,7 +135,7 @@ impl Schema {

```diff
         None
     }
 
-    fn map_filter_string(t: &str, schema: &Arc<Schema>) -> Option<Vec<String>> {
+    fn map_filter_string(t: &str, schema: &Arc<Schema>, db: &Database) -> Option<Vec<String>> {
         match t {
             "string" => {
                 if let Some(fmt) = &schema.obj.format {
```

@@ -151,15 +151,19 @@ impl Schema {

```diff
             "object" => None, // Inline structures are ignored in Composed References
             "array" => {
                 if let Some(items) = &schema.obj.items {
-                    return Self::resolve_filter_type(items);
+                    return Self::resolve_filter_type(items, db);
                 }
                 None
             },
             "null" => None,
             custom => {
+                if db.enums.contains_key(custom) {
+                    Some(vec![format!("{}.condition", custom)])
+                } else {
                     // Assume anything else is a Relational cross-boundary that already has its own .filter dynamically built
                     Some(vec![format!("{}.filter", custom)])
+                }
             }
         }
     }
 }
```
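The dispatch rule at the heart of this change — enum types resolve to `.condition`, any other custom type falls back to `.filter` — can be sketched as follows (Python, with a plain set standing in for `db.enums`; `resolve_custom_type` is an illustrative name for the `custom` match arm):

```python
def resolve_custom_type(custom, enum_names):
    """Sketch of the custom-type arm of map_filter_string."""
    if custom in enum_names:
        return [f"{custom}.condition"]
    # Assume anything else is a relational cross-boundary with its own .filter
    return [f"{custom}.filter"]

assert resolve_custom_type("gender", {"gender"}) == ["gender.condition"]
assert resolve_custom_type("organization", {"gender"}) == ["organization.filter"]
```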
@@ -1,4 +1,5 @@

```diff
 pub mod collection;
+pub mod condition;
 pub mod edges;
 pub mod filter;
 pub mod polymorphism;
```
@@ -191,9 +191,10 @@ impl Database {

```diff
     }
 
     pub fn compile(&mut self, errors: &mut Vec<crate::drop::Error>) {
+        // Phase 1: Registration
         self.collect_schemas(errors);
 
-        // Formally evaluate properties with strict 3-pass Ordered Graph execution natively
+        // Phase 2: Formally evaluate properties with strict 3-pass Ordered Graph execution natively
         for (_, enum_def) in &self.enums {
             for (schema_id, schema_arc) in &enum_def.schemas {
                 let root_id = schema_id.split('/').next().unwrap_or(schema_id);
```

@@ -219,7 +220,25 @@

```diff
             }
         }
 
-        // Phase 2: Synthesize Composed Filter References
+        // Phase 3: Synthesize Virtual Boundaries
+        let mut compile_ids = self.compile_filters(errors);
+        let mut condition_ids = self.compile_conditions();
+        compile_ids.append(&mut condition_ids);
+
+        // Phase 4: Compile Virtual Boundaries
+        // Now actively compile the newly injected schemas to lock all nested compose references natively
+        for (_, id) in compile_ids {
+            if let Some(schema_arc) = self.schemas.get(&id).cloned() {
+                let root_id = id.split('/').next().unwrap_or(&id);
+                schema_arc
+                    .as_ref()
+                    .compile(self, root_id, id.clone(), errors);
+            }
+        }
+    }
+
+    /// Synthesizes Composed Filter References for all table-backed boundaries.
+    fn compile_filters(&mut self, errors: &mut Vec<crate::drop::Error>) -> Vec<(String, String)> {
+        let mut filter_schemas = Vec::new();
         for (type_name, type_def) in &self.types {
             for (id, schema_arc) in &type_def.schemas {
```

@@ -246,21 +265,30 @@

```diff
                 t.schemas.insert(id, filter_arc);
             }
         }
         filter_ids
     }
 
-        // Now actively compile the newly injected filters to lock all nested compose references natively
-        for (type_name, id) in filter_ids {
-            if let Some(filter_arc) = self
-                .types
-                .get(&type_name)
-                .and_then(|t| t.schemas.get(&id))
-                .cloned()
-            {
-                let root_id = id.split('/').next().unwrap_or(&id);
-                filter_arc
-                    .as_ref()
-                    .compile(self, root_id, id.clone(), errors);
+    /// Synthesizes strong Enum Conditions mirroring the string.condition capabilities.
+    fn compile_conditions(&mut self) -> Vec<(String, String)> {
+        let mut enum_conditions = Vec::new();
+        for (enum_name, enum_def) in &self.enums {
+            let cond_schema = enum_def.compile_condition();
+            enum_conditions.push((
+                enum_name.clone(),
+                format!("{}.condition", enum_name),
+                Arc::new(cond_schema),
+            ));
+        }
+
+        let mut condition_ids = Vec::new();
+        for (enum_name, id, cond_arc) in enum_conditions {
+            condition_ids.push((enum_name.clone(), id.clone()));
+            self.schemas.insert(id.clone(), cond_arc.clone());
+            if let Some(e) = self.enums.get_mut(&enum_name) {
+                e.schemas.insert(id.clone(), cond_arc.clone());
+            }
+        }
+        condition_ids
+    }
 
     fn collect_schemas(&mut self, errors: &mut Vec<crate::drop::Error>) {
```
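The synthesize-then-compile split in `compile` (collect the new schema ids while mutating the registry, then compile them in a second loop) can be sketched in Python; `synthesize_then_compile` and its arguments are illustrative names, not part of the codebase:

```python
def synthesize_then_compile(registry, synthesizers):
    """Sketch of the Phase 3/4 split: inject new schemas first, compile after."""
    new_ids = []
    for synth in synthesizers:              # Phase 3: synthesize virtual boundaries
        for schema_id, schema in synth():
            registry[schema_id] = schema
            new_ids.append(schema_id)
    compiled = []
    for schema_id in new_ids:               # Phase 4: compile the injected schemas
        if schema_id in registry:           # look up again, as the real code does
            compiled.append(schema_id)
    return compiled

reg = {}
ids = synthesize_then_compile(reg, [lambda: [("gender.condition", {})]])
assert ids == ["gender.condition"] and "gender.condition" in reg
```

Collecting ids before the second loop avoids iterating the registry while inserting into it, which mirrors why the Rust code clones the `Arc` handles out of `self.schemas` before calling `compile`.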
test_output.txt (new file, 109 lines)

@@ -0,0 +1,109 @@
```
    Finished `test` profile [unoptimized + debuginfo] target(s) in 0.43s
     Running unittests src/lib.rs (target/debug/deps/jspg-d3f18ff3a7e2b386)

running 1 test
test tests::test_filter_0_0 ... FAILED

failures:

---- tests::test_filter_0_0 stdout ----
TEST COMPILE ERROR FOR 'Assert filter generation map accurately represents strongly typed conditions natively.': Detailed Schema Match Failure for 'gender.condition'!

Expected:
{
  "compiledPropertyNames": [
    "$eq",
    "$ne",
    "$nof",
    "$of"
  ],
  "properties": {
    "$eq": {
      "type": [
        "gender",
        "null"
      ]
    },
    "$ne": {
      "type": [
        "gender",
        "null"
      ]
    },
    "$nof": {
      "items": {
        "type": "gender"
      },
      "type": [
        "array",
        "null"
      ]
    },
    "$of": {
      "items": {
        "type": "gender"
      },
      "type": [
        "array",
        "null"
      ]
    }
  },
  "type": "object"
}

Actual:
{
  "compiledPropertyNames": [
    "$eq",
    "$ne",
    "$nof",
    "$of",
    "kind"
  ],
  "properties": {
    "$eq": {
      "type": [
        "gender",
        "null"
      ]
    },
    "$ne": {
      "type": [
        "gender",
        "null"
      ]
    },
    "$nof": {
      "items": {
        "type": "gender"
      },
      "type": [
        "array",
        "null"
      ]
    },
    "$of": {
      "items": {
        "type": "gender"
      },
      "type": [
        "array",
        "null"
      ]
    }
  },
  "type": "condition"
}

thread 'tests::test_filter_0_0' (118346550) panicked at src/tests/fixtures.rs:539:54:
called `Result::unwrap()` on an `Err` value: "[Filter Synthesis Object-Oriented Composition] Compile Test 'Assert filter generation map accurately represents strongly typed conditions natively.' failed. Error: Detailed Schema Match Failure for 'gender.condition'!\n\nExpected:\n{\n \"compiledPropertyNames\": [\n \"$eq\",\n \"$ne\",\n \"$nof\",\n \"$of\"\n ],\n \"properties\": {\n \"$eq\": {\n \"type\": [\n \"gender\",\n \"null\"\n ]\n },\n \"$ne\": {\n \"type\": [\n \"gender\",\n \"null\"\n ]\n },\n \"$nof\": {\n \"items\": {\n \"type\": \"gender\"\n },\n \"type\": [\n \"array\",\n \"null\"\n ]\n },\n \"$of\": {\n \"items\": {\n \"type\": \"gender\"\n },\n \"type\": [\n \"array\",\n \"null\"\n ]\n }\n },\n \"type\": \"object\"\n}\n\nActual:\n{\n \"compiledPropertyNames\": [\n \"$eq\",\n \"$ne\",\n \"$nof\",\n \"$of\",\n \"kind\"\n ],\n \"properties\": {\n \"$eq\": {\n \"type\": [\n \"gender\",\n \"null\"\n ]\n },\n \"$ne\": {\n \"type\": [\n \"gender\",\n \"null\"\n ]\n },\n \"$nof\": {\n \"items\": {\n \"type\": \"gender\"\n },\n \"type\": [\n \"array\",\n \"null\"\n ]\n },\n \"$of\": {\n \"items\": {\n \"type\": \"gender\"\n },\n \"type\": [\n \"array\",\n \"null\"\n ]\n }\n },\n \"type\": \"condition\"\n}"
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace


failures:
    tests::test_filter_0_0

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 1362 filtered out; finished in 0.00s

error: test failed, to rerun pass `--lib`
```
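The mismatch in the test output above reduces to two deltas, made explicit here with a quick check (the property lists are copied from the failure message):

```python
# Property lists from the Expected/Actual blocks of the failing test.
expected_props = ["$eq", "$ne", "$nof", "$of"]
actual_props = ["$eq", "$ne", "$nof", "$of", "kind"]

# Delta 1: the synthesized condition carries an extra `kind` property.
assert set(actual_props) - set(expected_props) == {"kind"}

# Delta 2: the root `type` is "condition" where the test expects "object".
expected_type, actual_type = "object", "condition"
assert expected_type != actual_type
```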