Compare commits (10 commits): 1.0.67 ... 091007006d
| SHA1 |
|---|
| 091007006d |
| 3d66a7fc3c |
| e1314496dd |
| 70a27b430d |
| e078b8a74b |
| c2c0e62c2d |
| ebb97b3509 |
| 5d18847f32 |
| 4a33e29628 |
| d8fc286e94 |
.vscode/extensions.json (vendored, new file, 5 lines)
@@ -0,0 +1,5 @@
+{
+  "recommendations": [
+    "rust-lang.rust-analyzer"
+  ]
+}
@@ -39,10 +39,6 @@ crate-type = ["cdylib", "lib"]
 name = "pgrx_embed_jspg"
 path = "src/bin/pgrx_embed.rs"
 
-[[bin]]
-name = "ast_explore"
-path = "src/bin/ast_explore.rs"
-
 [features]
 default = ["pg18"]
 pg18 = ["pgrx/pg18", "pgrx-tests/pg18" ]
@@ -43,7 +43,7 @@ JSPG implements specific extensions to the Draft 2020-12 standard to support the
 #### A. Polymorphism & Referencing (`$ref`, `$family`, and Native Types)
 * **Native Type Discrimination (`variations`)**: Schemas defined inside a Postgres `type` are Entities. The validator securely and implicitly manages their `"type"` property. If an entity inherits from `user`, incoming JSON can safely define `{"type": "person"}` without errors, thanks to `compiled_variations` inheritance.
 * **Structural Inheritance & Viral Infection (`$ref`)**: `$ref` is used exclusively for structural inheritance, *never* for union creation. A Punc request schema that `$ref`s an Entity virally inherits all physical database polymorphism rules for that target.
-* **Shape Polymorphism (`$family`)**: Auto-expands polymorphic API lists based on an abstract Descendants Graph. If `{"$family": "widget"}` is used, JSPG evaluates the JSON against every schema that `$ref`s widget.
+* **Shape Polymorphism (`$family`)**: Auto-expands polymorphic API lists based on an abstract **Descendants Graph**. If `{"$family": "widget"}` is used, the Validator dynamically identifies *every* schema in the registry that `$ref`s `widget` (e.g., `stock.widget`, `task.widget`) and evaluates the JSON against all of them.
 * **Strict Matches & Depth Heuristic**: Polymorphic structures MUST match exactly **one** schema permutation. If multiple inherited struct permutations pass, JSPG applies the **Depth Heuristic Tie-Breaker**, selecting the candidate deepest in the inheritance tree.
 
 #### B. Dot-Notation Schema Resolution & Database Mapping
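The Depth Heuristic Tie-Breaker described in this hunk can be sketched in a few lines of plain Rust. This is an illustrative stand-in, not the JSPG implementation; the `Candidate` struct and `pick_match` function are hypothetical names, with `depth` standing for distance from the inheritance root.

```rust
// Hypothetical sketch of a depth-based tie-breaker: among all schema
// permutations that validated, keep the one deepest in the inheritance
// tree; a surviving tie is a hard error.
#[derive(Debug)]
pub struct Candidate {
    pub schema_id: &'static str,
    pub depth: usize, // distance from the inheritance root
}

pub fn pick_match(passing: &[Candidate]) -> Result<&Candidate, String> {
    // Find the maximum depth among passing candidates.
    let deepest = passing
        .iter()
        .map(|c| c.depth)
        .max()
        .ok_or_else(|| "no schema matched".to_string())?;
    let winners: Vec<&Candidate> = passing.iter().filter(|c| c.depth == deepest).collect();
    if winners.len() == 1 {
        Ok(winners[0])
    } else {
        // MUST match exactly one permutation, even after the tie-break.
        Err(format!("ambiguous: {} candidates at depth {}", winners.len(), deepest))
    }
}

fn main() {
    let passing = vec![
        Candidate { schema_id: "entity", depth: 0 },
        Candidate { schema_id: "user", depth: 1 },
        Candidate { schema_id: "person", depth: 2 },
    ];
    // The deepest candidate wins the tie-break.
    assert_eq!(pick_match(&passing).unwrap().schema_id, "person");
}
```

Note that an empty candidate list and an unresolvable tie are distinct failure modes here, mirroring the "exactly one match" rule in the bullet above.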
@@ -103,6 +103,10 @@ The Queryer transforms Postgres into a pre-compiled Semantic Query Engine via th
 * **Array Inclusion**: `{"$in": [values]}`, `{"$nin": [values]}` use native `jsonb_array_elements_text()` bindings to enforce `IN` and `NOT IN` logic without runtime SQL injection risks.
 * **Text Matching (ILIKE)**: Evaluates `$eq` or `$ne` against string fields containing the `%` character natively into Postgres `ILIKE` and `NOT ILIKE` partial substring matches.
 * **Type Casting**: Safely resolves dynamic combinations by casting values instantly into the physical database types mapped in the schema (e.g. parsing `uuid` bindings to `::uuid`, formatting DateTimes to `::timestamptz`, and numbers to `::numeric`).
+* **Polymorphic SQL Generation (`$family`)**: Compiles `$family` properties by analyzing the **Physical Database Variations**, *not* the schema descendants.
+    * **The Dot Convention**: When a schema requests `$family: "target.schema"`, the compiler extracts the base type (e.g. `schema`) and looks up its Physical Table definition.
+    * **Multi-Table Branching**: If the Physical Table is a parent to other tables (e.g. `organization` has variations `["organization", "bot", "person"]`), the compiler generates a dynamic `CASE WHEN type = '...' THEN ...` query, expanding into `JOIN`s for each variation.
+    * **Single-Table Bypass**: If the Physical Table is a leaf node with only one variation (e.g. `person` has variations `["person"]`), the compiler cleanly bypasses `CASE` generation and compiles a simple `SELECT` across the base table, as all schema extensions (e.g. `light.person`, `full.person`) are guaranteed to reside in the exact same physical row.
 
 ### The Stem Engine
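The Multi-Table Branching versus Single-Table Bypass decision added in this hunk can be illustrated with a minimal sketch. This is not the JSPG compiler; `compile_family_select` is a hypothetical function, and the emitted SQL is simplified (the real compiler also expands `JOIN`s per variation).

```rust
// Illustrative sketch of the branching rule: one physical variation compiles
// to a plain SELECT (Single-Table Bypass), while multiple variations expand
// into a CASE over the `type` discriminator column (Multi-Table Branching).
pub fn compile_family_select(table: &str, variations: &[&str]) -> String {
    if variations.len() == 1 {
        // Single-Table Bypass: every schema extension lives in the same row.
        format!("SELECT * FROM {}", table)
    } else {
        // Multi-Table Branching: one WHEN arm per physical variation.
        let arms: Vec<String> = variations
            .iter()
            .map(|v| format!("WHEN type = '{}' THEN '{}'", v, v))
            .collect();
        format!("SELECT CASE {} END FROM {}", arms.join(" "), table)
    }
}

fn main() {
    // Leaf type: bypasses CASE generation entirely.
    assert_eq!(compile_family_select("person", &["person"]), "SELECT * FROM person");
    // Parent type: branches over each physical variation.
    let sql = compile_family_select("organization", &["organization", "bot", "person"]);
    assert!(sql.contains("CASE WHEN type = 'organization'"));
}
```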
File diff suppressed because it is too large
@@ -1,17 +0,0 @@
-use sqlparser::dialect::PostgreSqlDialect;
-use sqlparser::parser::Parser;
-use std::env;
-
-fn main() {
-    let sql = "SELECT t1_obj_t1_addresses_t1_target_t2.archived, t1.id FROM person t1 JOIN address t1_obj_t1_addresses ON true";
-    let dialect = PostgreSqlDialect {};
-
-    match Parser::parse_sql(&dialect, sql) {
-        Ok(ast) => {
-            println!("{:#?}", ast);
-        }
-        Err(e) => {
-            println!("Error: {:?}", e);
-        }
-    }
-}
@@ -24,20 +24,28 @@ impl DatabaseExecutor for SpiExecutor {
         }
     }
 
-        Spi::connect(|client| {
-            match client.select(sql, Some(args_with_oid.len() as i64), &args_with_oid) {
-                Ok(tup_table) => {
-                    let mut results = Vec::new();
-                    for row in tup_table {
-                        if let Ok(Some(jsonb)) = row.get::<pgrx::JsonB>(1) {
-                            results.push(jsonb.0);
-                        }
-                    }
-                    Ok(Value::Array(results))
-                }
-                Err(e) => Err(format!("SPI Query Fetch Failure: {}", e)),
-            }
-        })
+        pgrx::PgTryBuilder::new(|| {
+            Spi::connect(|client| {
+                pgrx::notice!("JSPG_SQL: {}", sql);
+                match client.select(sql, Some(args_with_oid.len() as i64), &args_with_oid) {
+                    Ok(tup_table) => {
+                        let mut results = Vec::new();
+                        for row in tup_table {
+                            if let Ok(Some(jsonb)) = row.get::<pgrx::JsonB>(1) {
+                                results.push(jsonb.0);
+                            }
+                        }
+                        Ok(Value::Array(results))
+                    }
+                    Err(e) => Err(format!("SPI Query Fetch Failure: {}", e)),
+                }
+            })
+        })
+        .catch_others(|cause| {
+            pgrx::warning!("JSPG Caught Native Postgres Error: {:?}", cause);
+            Err(format!("{:?}", cause))
+        })
+        .execute()
     }
 
     fn execute(&self, sql: &str, args: Option<&[Value]>) -> Result<(), String> {
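The `PgTryBuilder` change above routes native Postgres errors through a catch-all handler instead of letting them unwind. As a rough std-only analogue (this is *not* pgrx, and `try_with_catch` is a hypothetical helper), the same shape can be modeled with `catch_unwind`, using a panic as a stand-in for a native error:

```rust
use std::panic::{self, AssertUnwindSafe};

// Std-only analogue of the builder pattern above: run a fallible closure,
// and convert any panic (standing in for a native Postgres error) into an
// Err through a catch-all handler.
pub fn try_with_catch<T>(
    body: impl FnOnce() -> Result<T, String>,
    catch_others: impl FnOnce(String) -> Result<T, String>,
) -> Result<T, String> {
    match panic::catch_unwind(AssertUnwindSafe(body)) {
        // The body ran to completion: pass its Result through untouched.
        Ok(result) => result,
        // The body panicked: hand the cause to the catch-all handler.
        Err(cause) => {
            let msg = cause
                .downcast_ref::<&str>()
                .map(|s| s.to_string())
                .unwrap_or_else(|| "unknown panic".to_string());
            catch_others(msg)
        }
    }
}

fn main() {
    // Normal path: the body's Result is returned as-is.
    assert_eq!(try_with_catch(|| Ok(1), |_| Err("caught".to_string())), Ok(1));
    // Error path: a panic inside the body is routed to the catch handler.
    let r = try_with_catch(
        || if true { panic!("boom") } else { Ok(0) },
        |c| Err(format!("caught: {}", c)),
    );
    assert_eq!(r, Err("caught: boom".to_string()));
}
```

The design point mirrored here is that the happy path stays a plain `Result`, while out-of-band failures are demoted to the same error channel before crossing the FFI boundary.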
@@ -52,12 +60,20 @@ impl DatabaseExecutor for SpiExecutor {
         }
     }
 
-        Spi::connect_mut(|client| {
-            match client.update(sql, Some(args_with_oid.len() as i64), &args_with_oid) {
-                Ok(_) => Ok(()),
-                Err(e) => Err(format!("SPI Execution Failure: {}", e)),
-            }
-        })
+        pgrx::PgTryBuilder::new(|| {
+            Spi::connect_mut(|client| {
+                pgrx::notice!("JSPG_SQL: {}", sql);
+                match client.update(sql, Some(args_with_oid.len() as i64), &args_with_oid) {
+                    Ok(_) => Ok(()),
+                    Err(e) => Err(format!("SPI Execution Failure: {}", e)),
+                }
+            })
+        })
+        .catch_others(|cause| {
+            pgrx::warning!("JSPG Caught Native Postgres Error: {:?}", cause);
+            Err(format!("{:?}", cause))
+        })
+        .execute()
     }
 
     fn auth_user_id(&self) -> Result<String, String> {
@@ -32,7 +32,7 @@ pub struct Database {
     pub enums: HashMap<String, Enum>,
     pub types: HashMap<String, Type>,
     pub puncs: HashMap<String, Punc>,
-    pub relations: HashMap<String, Relation>,
+    pub relations: HashMap<(String, String), Vec<Relation>>,
     pub schemas: HashMap<String, Schema>,
     // Map of Schema ID -> { Entity Type -> Target Subschema Arc }
     pub stems: HashMap<String, HashMap<String, Arc<Stem>>>,
@@ -74,11 +74,12 @@ impl Database {
         }
     }
 
+        let mut raw_relations = Vec::new();
         if let Some(arr) = val.get("relations").and_then(|v| v.as_array()) {
             for item in arr {
                 match serde_json::from_value::<Relation>(item.clone()) {
                     Ok(def) => {
-                        db.relations.insert(def.constraint.clone(), def);
+                        raw_relations.push(def);
                     }
                     Err(e) => println!("DATABASE RELATION PARSE FAILED: {:?}", e),
                 }
@@ -107,7 +108,7 @@ impl Database {
         }
     }
 
-        db.compile()?;
+        db.compile(raw_relations)?;
         Ok(db)
     }
 
@@ -138,10 +139,11 @@ impl Database {
     }
 
     /// Organizes the graph of the database, compiling regex, format functions, and caching relationships.
-    pub fn compile(&mut self) -> Result<(), crate::drop::Drop> {
+    pub fn compile(&mut self, raw_relations: Vec<Relation>) -> Result<(), crate::drop::Drop> {
         self.collect_schemas();
         self.collect_depths();
         self.collect_descendants();
+        self.collect_relations(raw_relations);
         self.compile_schemas();
         self.collect_stems()?;
 
@@ -226,6 +228,95 @@ impl Database {
         self.descendants = descendants;
     }
 
+    fn collect_relations(&mut self, raw_relations: Vec<Relation>) {
+        let mut edges: HashMap<(String, String), Vec<Relation>> = HashMap::new();
+
+        // For every relation, map it across all polymorphic inheritance permutations
+        for relation in raw_relations {
+            if let Some(_source_type_def) = self.types.get(&relation.source_type) {
+                if let Some(_dest_type_def) = self.types.get(&relation.destination_type) {
+                    let mut src_descendants = Vec::new();
+                    let mut dest_descendants = Vec::new();
+
+                    for (t_name, t_def) in &self.types {
+                        if t_def.hierarchy.contains(&relation.source_type) {
+                            src_descendants.push(t_name.clone());
+                        }
+                        if t_def.hierarchy.contains(&relation.destination_type) {
+                            dest_descendants.push(t_name.clone());
+                        }
+                    }
+
+                    for p_type in &src_descendants {
+                        for c_type in &dest_descendants {
+                            // Ignore entity <-> entity generic fallbacks, they aren't useful edges
+                            if p_type == "entity" && c_type == "entity" {
+                                continue;
+                            }
+
+                            // Forward edge
+                            edges
+                                .entry((p_type.clone(), c_type.clone()))
+                                .or_default()
+                                .push(relation.clone());
+
+                            // Reverse edge (only if types are different to avoid duplicating self-referential edges like activity parent_id)
+                            if p_type != c_type {
+                                edges
+                                    .entry((c_type.clone(), p_type.clone()))
+                                    .or_default()
+                                    .push(relation.clone());
+                            }
+                        }
+                    }
+                }
+            }
+        }
+        self.relations = edges;
+    }
+
+    pub fn get_relation(
+        &self,
+        parent_type: &str,
+        child_type: &str,
+        prop_name: &str,
+        relative_keys: Option<&Vec<String>>,
+    ) -> Option<&Relation> {
+        if let Some(relations) = self
+            .relations
+            .get(&(parent_type.to_string(), child_type.to_string()))
+        {
+            if relations.len() == 1 {
+                return Some(&relations[0]);
+            }
+
+            // Reduce ambiguity with prefix
+            for rel in relations {
+                if let Some(prefix) = &rel.prefix {
+                    if prefix == prop_name {
+                        return Some(rel);
+                    }
+                }
+            }
+
+            // Reduce ambiguity by checking if relative payload OMITS the prefix (M:M heuristic)
+            if let Some(keys) = relative_keys {
+                let mut missing_prefix_rels = Vec::new();
+                for rel in relations {
+                    if let Some(prefix) = &rel.prefix {
+                        if !keys.contains(prefix) {
+                            missing_prefix_rels.push(rel);
+                        }
+                    }
+                }
+                if missing_prefix_rels.len() == 1 {
+                    return Some(missing_prefix_rels[0]);
+                }
+            }
+        }
+        None
+    }
+
     fn collect_descendants_recursively(
         target: &str,
         direct_refs: &HashMap<String, Vec<String>>,
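The `(parent, child) -> Vec<Relation>` edge map plus prefix disambiguation added in this hunk can be demonstrated with a stripped-down sketch. The `Relation` struct below is a stand-in for the real one (only `constraint` and `prefix` are kept), and `demo` is a hypothetical driver:

```rust
use std::collections::HashMap;

// Simplified stand-in for the real Relation struct.
#[derive(Clone, Debug, PartialEq)]
pub struct Relation {
    pub constraint: String,
    pub prefix: Option<String>,
}

// Mirror of the lookup logic: a unique edge wins immediately; otherwise
// ambiguity is reduced by matching the relation prefix to the property name.
pub fn get_relation<'a>(
    edges: &'a HashMap<(String, String), Vec<Relation>>,
    parent: &str,
    child: &str,
    prop_name: &str,
) -> Option<&'a Relation> {
    let rels = edges.get(&(parent.to_string(), child.to_string()))?;
    if rels.len() == 1 {
        return Some(&rels[0]);
    }
    rels.iter().find(|r| r.prefix.as_deref() == Some(prop_name))
}

// Hypothetical driver: two person -> address edges disambiguated by prefix.
pub fn demo(prop: &str) -> Option<String> {
    let mut edges = HashMap::new();
    edges.insert(
        ("person".to_string(), "address".to_string()),
        vec![
            Relation { constraint: "fk_home".into(), prefix: Some("home".into()) },
            Relation { constraint: "fk_work".into(), prefix: Some("work".into()) },
        ],
    );
    get_relation(&edges, "person", "address", prop).map(|r| r.constraint.clone())
}

fn main() {
    // The property name "work" selects the matching prefixed relation.
    assert_eq!(demo("work"), Some("fk_work".to_string()));
    // No prefix match and more than one candidate: lookup stays ambiguous.
    assert_eq!(demo("nope"), None);
}
```

Keying edges by the `(parent, child)` type pair makes the common case an O(1) lookup, which is what the `merge_object` call sites later in this diff rely on.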
@@ -335,17 +426,14 @@ impl Database {
         if let (Some(pt), Some(prop)) = (&parent_type, &property_name) {
             let expected_col = format!("{}_id", prop);
             let mut found = false;
-            for rel in db.relations.values() {
-                if (rel.source_type == *pt && rel.destination_type == entity_type)
-                    || (rel.source_type == entity_type && rel.destination_type == *pt)
-                {
-                    if rel.source_columns.contains(&expected_col) {
-                        relation_col = Some(expected_col.clone());
-                        found = true;
-                        break;
-                    }
+            if let Some(rel) = db.get_relation(pt, &entity_type, prop, None) {
+                if rel.source_columns.contains(&expected_col) {
+                    relation_col = Some(expected_col.clone());
+                    found = true;
                 }
             }
 
             if !found {
                 relation_col = Some(expected_col);
             }
@@ -67,6 +67,10 @@ pub struct Error {
 #[derive(Debug, Serialize, Deserialize, Clone)]
 pub struct ErrorDetails {
     pub path: String,
-    // Extensions can be added here (package, cause, etc)
-    // For now, validator only provides path
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub cause: Option<String>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub context: Option<Vec<String>>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub schema: Option<String>,
 }
@@ -31,6 +31,9 @@ fn jspg_failure() -> JsonB {
         message: "JSPG extension has not been initialized via jspg_setup".to_string(),
         details: crate::drop::ErrorDetails {
             path: "".to_string(),
+            cause: None,
+            context: None,
+            schema: None,
         },
     };
     let drop = crate::drop::Drop::with_errors(vec![error]);
@@ -21,27 +21,27 @@ impl Merger {
     }
 
     pub fn merge(&self, data: Value) -> crate::drop::Drop {
-        let mut val_resolved = Value::Null;
         let mut notifications_queue = Vec::new();
 
         let result = self.merge_internal(data, &mut notifications_queue);
 
-        match result {
-            Ok(val) => {
-                val_resolved = val;
-            }
+        let val_resolved = match result {
+            Ok(val) => val,
             Err(msg) => {
                 return crate::drop::Drop::with_errors(vec![crate::drop::Error {
                     code: "MERGE_FAILED".to_string(),
                     message: msg,
                     details: crate::drop::ErrorDetails {
                         path: "".to_string(),
+                        cause: None,
+                        context: None,
+                        schema: None,
                     },
                 }]);
             }
         };
 
         // Execute the globally collected, pre-ordered notifications last!
         for notify_sql in notifications_queue {
             if let Err(e) = self.db.execute(&notify_sql, None) {
                 return crate::drop::Drop::with_errors(vec![crate::drop::Error {
@@ -49,6 +49,9 @@ impl Merger {
                     message: format!("Executor Error in pre-ordered notify: {:?}", e),
                     details: crate::drop::ErrorDetails {
                         path: "".to_string(),
+                        cause: None,
+                        context: None,
+                        schema: None,
                     },
                 }]);
             }
@@ -82,7 +85,11 @@ impl Merger {
         crate::drop::Drop::success_with_val(stripped_val)
     }
 
-    pub(crate) fn merge_internal(&self, data: Value, notifications: &mut Vec<String>) -> Result<Value, String> {
+    pub(crate) fn merge_internal(
+        &self,
+        data: Value,
+        notifications: &mut Vec<String>,
+    ) -> Result<Value, String> {
         match data {
             Value::Array(items) => self.merge_array(items, notifications),
             Value::Object(map) => self.merge_object(map, notifications),
@@ -90,7 +97,11 @@ impl Merger {
         }
     }
 
-    fn merge_array(&self, items: Vec<Value>, notifications: &mut Vec<String>) -> Result<Value, String> {
+    fn merge_array(
+        &self,
+        items: Vec<Value>,
+        notifications: &mut Vec<String>,
+    ) -> Result<Value, String> {
         let mut resolved_items = Vec::new();
         for item in items {
             let resolved = self.merge_internal(item, notifications)?;
@@ -99,7 +110,11 @@ impl Merger {
         Ok(Value::Array(resolved_items))
     }
 
-    fn merge_object(&self, obj: serde_json::Map<String, Value>, notifications: &mut Vec<String>) -> Result<Value, String> {
+    fn merge_object(
+        &self,
+        obj: serde_json::Map<String, Value>,
+        notifications: &mut Vec<String>,
+    ) -> Result<Value, String> {
         let queue_start = notifications.len();
 
         let type_name = match obj.get("type").and_then(|v| v.as_str()) {
@@ -158,7 +173,21 @@ impl Merger {
                 _ => continue,
             };
 
-            let relative_relation = self.get_entity_relation(type_def, &relative, &relation_name)?;
+            // Attempt to extract relative object type name
+            let relative_type_name = match relative.get("type").and_then(|v| v.as_str()) {
+                Some(t) => t,
+                None => continue,
+            };
+
+            let relative_keys: Vec<String> = relative.keys().cloned().collect();
+
+            // Call central Database O(1) graph logic
+            let relative_relation = self.db.get_relation(
+                &type_def.name,
+                relative_type_name,
+                &relation_name,
+                Some(&relative_keys),
+            );
 
             if let Some(relation) = relative_relation {
                 let parent_is_source = type_def.hierarchy.contains(&relation.source_type);
@@ -247,7 +276,21 @@ impl Merger {
                 _ => continue,
             };
 
-            let relative_relation = self.get_entity_relation(type_def, first_relative, &relation_name)?;
+            // Attempt to extract relative object type name
+            let relative_type_name = match first_relative.get("type").and_then(|v| v.as_str()) {
+                Some(t) => t,
+                None => continue,
+            };
+
+            let relative_keys: Vec<String> = first_relative.keys().cloned().collect();
+
+            // Call central Database O(1) graph logic
+            let relative_relation = self.db.get_relation(
+                &type_def.name,
+                relative_type_name,
+                &relation_name,
+                Some(&relative_keys),
+            );
 
             if let Some(relation) = relative_relation {
                 let mut relative_responses = Vec::new();
@@ -266,10 +309,11 @@ impl Merger {
                     &entity_fields,
                 );
 
-                let merged_relative = match self.merge_internal(Value::Object(relative_item), notifications)? {
-                    Value::Object(m) => m,
-                    _ => continue,
-                };
+                let merged_relative =
+                    match self.merge_internal(Value::Object(relative_item), notifications)? {
+                        Value::Object(m) => m,
+                        _ => continue,
+                    };
 
                 relative_responses.push(Value::Object(merged_relative));
             }
@@ -760,101 +804,7 @@ impl Merger {
         changes
     }
 
-    fn reduce_entity_relations(
-        &self,
-        mut matching_relations: Vec<crate::database::relation::Relation>,
-        relative: &serde_json::Map<String, Value>,
-        relation_name: &str,
-    ) -> Result<Option<crate::database::relation::Relation>, String> {
-        if matching_relations.is_empty() {
-            return Ok(None);
-        }
-        if matching_relations.len() == 1 {
-            return Ok(Some(matching_relations.pop().unwrap()));
-        }
-
-        let exact_match: Vec<_> = matching_relations
-            .iter()
-            .filter(|r| r.prefix.as_deref() == Some(relation_name))
-            .cloned()
-            .collect();
-        if exact_match.len() == 1 {
-            return Ok(Some(exact_match.into_iter().next().unwrap()));
-        }
-
-        matching_relations.retain(|r| {
-            if let Some(prefix) = &r.prefix {
-                !relative.contains_key(prefix)
-            } else {
-                true
-            }
-        });
-
-        if matching_relations.len() == 1 {
-            Ok(Some(matching_relations.pop().unwrap()))
-        } else {
-            let constraints: Vec<_> = matching_relations
-                .iter()
-                .map(|r| r.constraint.clone())
-                .collect();
-            Err(format!(
-                "AMBIGUOUS_TYPE_RELATIONS: Could not reduce ambiguous type relations: {}",
-                constraints.join(", ")
-            ))
-        }
-    }
-
-    fn get_entity_relation(
-        &self,
-        entity_type: &crate::database::r#type::Type,
-        relative: &serde_json::Map<String, Value>,
-        relation_name: &str,
-    ) -> Result<Option<crate::database::relation::Relation>, String> {
-        let relative_type_name = match relative.get("type").and_then(|v| v.as_str()) {
-            Some(t) => t,
-            None => return Ok(None),
-        };
-
-        let relative_type = match self.db.types.get(relative_type_name) {
-            Some(t) => t,
-            None => return Ok(None),
-        };
-
-        let mut relative_relations: Vec<crate::database::relation::Relation> = Vec::new();
-
-        for r in self.db.relations.values() {
-            if r.source_type != "entity" && r.destination_type != "entity" {
-                let condition1 = relative_type.hierarchy.contains(&r.source_type)
-                    && entity_type.hierarchy.contains(&r.destination_type);
-                let condition2 = entity_type.hierarchy.contains(&r.source_type)
-                    && relative_type.hierarchy.contains(&r.destination_type);
-
-                if condition1 || condition2 {
-                    relative_relations.push(r.clone());
-                }
-            }
-        }
-
-        let mut relative_relation =
-            self.reduce_entity_relations(relative_relations, relative, relation_name)?;
-
-        if relative_relation.is_none() {
-            let mut poly_relations: Vec<crate::database::relation::Relation> = Vec::new();
-            for r in self.db.relations.values() {
-                if r.destination_type == "entity" {
-                    let condition1 = relative_type.hierarchy.contains(&r.source_type);
-                    let condition2 = entity_type.hierarchy.contains(&r.source_type);
-
-                    if condition1 || condition2 {
-                        poly_relations.push(r.clone());
-                    }
-                }
-            }
-            relative_relation = self.reduce_entity_relations(poly_relations, relative, relation_name)?;
-        }
-
-        Ok(relative_relation)
-    }
+    // Helper Functions
 
     fn apply_entity_relation(
         source_entity: &mut serde_json::Map<String, Value>,
@@ -47,7 +47,19 @@ impl SqlCompiler {
 
         // We expect the top level to typically be an Object or Array
         let is_stem_query = stem_path.is_some();
-        let (sql, _) = self.walk_schema(target_schema, "t1", None, filter_keys, is_stem_query, 0, String::new())?;
+        let mut alias_counter: usize = 0;
+        let (sql, _) = self.walk_schema(
+            target_schema,
+            "t1",
+            None,
+            None,
+            None,
+            filter_keys,
+            is_stem_query,
+            0,
+            String::new(),
+            &mut alias_counter,
+        )?;
         Ok(sql)
     }
 
@@ -57,11 +69,14 @@ impl SqlCompiler {
         &self,
         schema: &crate::database::schema::Schema,
         parent_alias: &str,
+        parent_table_aliases: Option<&std::collections::HashMap<String, String>>,
+        parent_type_def: Option<&crate::database::r#type::Type>,
         prop_name_context: Option<&str>,
         filter_keys: &[String],
         is_stem_query: bool,
         depth: usize,
         current_path: String,
+        alias_counter: &mut usize,
     ) -> Result<(String, String), String> {
         // Determine the base schema type (could be an array, object, or literal)
         match &schema.obj.type_ {
@@ -80,23 +95,29 @@ impl SqlCompiler {
                     items,
                     type_def,
                     parent_alias,
+                    parent_table_aliases,
+                    parent_type_def,
                     prop_name_context,
                     true,
                     filter_keys,
                     is_stem_query,
                     depth,
                     next_path,
+                    alias_counter,
                 );
             }
         }
         let (item_sql, _) = self.walk_schema(
             items,
             parent_alias,
+            parent_table_aliases,
+            parent_type_def,
             prop_name_context,
             filter_keys,
             is_stem_query,
             depth + 1,
             next_path,
+            alias_counter,
         )?;
         return Ok((
             format!("(SELECT jsonb_agg({}) FROM TODO)", item_sql),
|
|||||||
schema,
|
schema,
|
||||||
type_def,
|
type_def,
|
||||||
parent_alias,
|
parent_alias,
|
||||||
|
parent_table_aliases,
|
||||||
|
parent_type_def,
|
||||||
prop_name_context,
|
prop_name_context,
|
||||||
false,
|
false,
|
||||||
filter_keys,
|
filter_keys,
|
||||||
is_stem_query,
|
is_stem_query,
|
||||||
depth,
|
depth,
|
||||||
current_path,
|
current_path,
|
||||||
|
alias_counter,
|
||||||
);
|
);
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -141,40 +165,63 @@ impl SqlCompiler {
             return self.walk_schema(
                 target_schema,
                 parent_alias,
+                parent_table_aliases,
+                parent_type_def,
                 prop_name_context,
                 filter_keys,
                 is_stem_query,
                 depth,
                 current_path,
+                alias_counter,
             );
         }
         return Err(format!("Unresolved $ref: {}", ref_id));
     }
     // Handle $family Polymorphism fallbacks for relations
     if let Some(family_target) = &schema.obj.family {
-        let mut all_targets = vec![family_target.clone()];
-        if let Some(schema_id) = &schema.obj.id {
-            if let Some(descendants) = self.db.descendants.get(schema_id) {
-                all_targets.extend(descendants.clone());
+        let base_type_name = family_target.split('.').next_back().unwrap_or(family_target).to_string();
+        if let Some(type_def) = self.db.types.get(&base_type_name) {
+            if type_def.variations.len() == 1 {
+                let mut bypass_schema = crate::database::schema::Schema::default();
+                bypass_schema.obj.r#ref = Some(family_target.clone());
+                return self.walk_schema(
+                    &std::sync::Arc::new(bypass_schema),
+                    parent_alias,
+                    parent_table_aliases,
+                    parent_type_def,
+                    prop_name_context,
+                    filter_keys,
+                    is_stem_query,
+                    depth,
+                    current_path,
+                    alias_counter,
+                );
             }
         }
 
-        let mut family_schemas = Vec::new();
-        for target in all_targets {
-            let mut ref_schema = crate::database::schema::Schema::default();
-            ref_schema.obj.r#ref = Some(target);
-            family_schemas.push(std::sync::Arc::new(ref_schema));
-        }
-
-        return self.compile_one_of(
-            &family_schemas,
-            parent_alias,
+        let mut sorted_variations: Vec<String> = type_def.variations.iter().cloned().collect();
+        sorted_variations.sort();
+
+        let mut family_schemas = Vec::new();
+        for variation in &sorted_variations {
|
let mut ref_schema = crate::database::schema::Schema::default();
|
||||||
prop_name_context,
|
ref_schema.obj.r#ref = Some(variation.clone());
|
||||||
filter_keys,
|
family_schemas.push(std::sync::Arc::new(ref_schema));
|
||||||
is_stem_query,
|
}
|
||||||
depth,
|
|
||||||
current_path,
|
return self.compile_one_of(
|
||||||
);
|
&family_schemas,
|
||||||
|
parent_alias,
|
||||||
|
parent_table_aliases,
|
||||||
|
parent_type_def,
|
||||||
|
prop_name_context,
|
||||||
|
filter_keys,
|
||||||
|
is_stem_query,
|
||||||
|
depth,
|
||||||
|
current_path,
|
||||||
|
alias_counter,
|
||||||
|
);
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
// Handle oneOf Polymorphism fallbacks for relations
|
// Handle oneOf Polymorphism fallbacks for relations
|
||||||
@ -182,11 +229,14 @@ impl SqlCompiler {
|
|||||||
return self.compile_one_of(
|
return self.compile_one_of(
|
||||||
one_of,
|
one_of,
|
||||||
parent_alias,
|
parent_alias,
|
||||||
|
parent_table_aliases,
|
||||||
|
parent_type_def,
|
||||||
prop_name_context,
|
prop_name_context,
|
||||||
filter_keys,
|
filter_keys,
|
||||||
is_stem_query,
|
is_stem_query,
|
||||||
depth,
|
depth,
|
||||||
current_path,
|
current_path,
|
||||||
|
alias_counter,
|
||||||
);
|
);
|
||||||
}
|
}
|
||||||
|
|
||||||
@ -195,10 +245,13 @@ impl SqlCompiler {
|
|||||||
return self.compile_inline_object(
|
return self.compile_inline_object(
|
||||||
props,
|
props,
|
||||||
parent_alias,
|
parent_alias,
|
||||||
|
parent_table_aliases,
|
||||||
|
parent_type_def,
|
||||||
filter_keys,
|
filter_keys,
|
||||||
is_stem_query,
|
is_stem_query,
|
||||||
depth,
|
depth,
|
||||||
current_path,
|
current_path,
|
||||||
|
alias_counter,
|
||||||
);
|
);
|
||||||
}
|
}
|
||||||
|
|
||||||
@ -241,17 +294,18 @@ impl SqlCompiler {
|
|||||||
schema: &crate::database::schema::Schema,
|
schema: &crate::database::schema::Schema,
|
||||||
type_def: &crate::database::r#type::Type,
|
type_def: &crate::database::r#type::Type,
|
||||||
parent_alias: &str,
|
parent_alias: &str,
|
||||||
|
parent_table_aliases: Option<&std::collections::HashMap<String, String>>,
|
||||||
|
parent_type_def: Option<&crate::database::r#type::Type>,
|
||||||
prop_name: Option<&str>,
|
prop_name: Option<&str>,
|
||||||
is_array: bool,
|
is_array: bool,
|
||||||
filter_keys: &[String],
|
filter_keys: &[String],
|
||||||
is_stem_query: bool,
|
is_stem_query: bool,
|
||||||
depth: usize,
|
depth: usize,
|
||||||
current_path: String,
|
current_path: String,
|
||||||
|
alias_counter: &mut usize,
|
||||||
) -> Result<(String, String), String> {
|
) -> Result<(String, String), String> {
|
||||||
let local_ctx = format!("{}_{}", parent_alias, prop_name.unwrap_or("obj"));
|
|
||||||
|
|
||||||
// 1. Build FROM clauses and table aliases
|
// 1. Build FROM clauses and table aliases
|
||||||
let (table_aliases, from_clauses) = self.build_hierarchy_from_clauses(type_def, &local_ctx);
|
let (table_aliases, from_clauses) = self.build_hierarchy_from_clauses(type_def, alias_counter);
|
||||||
|
|
||||||
// 2. Map properties and build jsonb_build_object args
|
// 2. Map properties and build jsonb_build_object args
|
||||||
let mut select_args = self.map_properties_to_aliases(
|
let mut select_args = self.map_properties_to_aliases(
|
||||||
@ -263,39 +317,79 @@ impl SqlCompiler {
|
|||||||
is_stem_query,
|
is_stem_query,
|
||||||
depth,
|
depth,
|
||||||
¤t_path,
|
¤t_path,
|
||||||
|
alias_counter,
|
||||||
)?;
|
)?;
|
||||||
|
|
||||||
// 2.5 Inject polymorphism directly into the query object
|
// 2.5 Inject polymorphism directly into the query object
|
||||||
if let Some(family_target) = &schema.obj.family {
|
if let Some(family_target) = &schema.obj.family {
|
||||||
let mut family_schemas = Vec::new();
|
let base_type_name = family_target.split('.').next_back().unwrap_or(family_target).to_string();
|
||||||
if let Some(base_type) = self.db.types.get(family_target) {
|
|
||||||
let mut sorted_targets: Vec<String> = base_type.variations.iter().cloned().collect();
|
|
||||||
// Ensure the base type is included if not listed in variations by default
|
|
||||||
if !sorted_targets.contains(family_target) {
|
|
||||||
sorted_targets.push(family_target.clone());
|
|
||||||
}
|
|
||||||
sorted_targets.sort();
|
|
||||||
|
|
||||||
for target in sorted_targets {
|
|
||||||
let mut ref_schema = crate::database::schema::Schema::default();
|
|
||||||
ref_schema.obj.r#ref = Some(target);
|
|
||||||
family_schemas.push(std::sync::Arc::new(ref_schema));
|
|
||||||
}
|
|
||||||
} else {
|
|
||||||
// Fallback for types not strictly defined in physical DB
|
|
||||||
let mut ref_schema = crate::database::schema::Schema::default();
|
|
||||||
ref_schema.obj.r#ref = Some(family_target.clone());
|
|
||||||
family_schemas.push(std::sync::Arc::new(ref_schema));
|
|
||||||
}
|
|
||||||
|
|
||||||
let base_alias = table_aliases.get(&type_def.name).cloned().unwrap_or_else(|| parent_alias.to_string());
|
if let Some(fam_type_def) = self.db.types.get(&base_type_name) {
|
||||||
select_args.push(format!("'id', {}.id", base_alias));
|
if fam_type_def.variations.len() == 1 {
|
||||||
let (case_sql, _) = self.compile_one_of(&family_schemas, &base_alias, None, filter_keys, is_stem_query, depth, current_path.clone())?;
|
let mut bypass_schema = crate::database::schema::Schema::default();
|
||||||
select_args.push(format!("'type', {}", case_sql));
|
bypass_schema.obj.r#ref = Some(family_target.clone());
|
||||||
|
|
||||||
|
let mut bypassed_args = self.map_properties_to_aliases(
|
||||||
|
&bypass_schema,
|
||||||
|
type_def,
|
||||||
|
&table_aliases,
|
||||||
|
parent_alias,
|
||||||
|
filter_keys,
|
||||||
|
is_stem_query,
|
||||||
|
depth,
|
||||||
|
¤t_path,
|
||||||
|
alias_counter,
|
||||||
|
)?;
|
||||||
|
select_args.append(&mut bypassed_args);
|
||||||
|
} else {
|
||||||
|
let mut family_schemas = Vec::new();
|
||||||
|
let mut sorted_fam_variations: Vec<String> = fam_type_def.variations.iter().cloned().collect();
|
||||||
|
sorted_fam_variations.sort();
|
||||||
|
|
||||||
|
for variation in &sorted_fam_variations {
|
||||||
|
let mut ref_schema = crate::database::schema::Schema::default();
|
||||||
|
ref_schema.obj.r#ref = Some(variation.clone());
|
||||||
|
family_schemas.push(std::sync::Arc::new(ref_schema));
|
||||||
|
}
|
||||||
|
|
||||||
|
let base_alias = table_aliases
|
||||||
|
.get(&type_def.name)
|
||||||
|
.cloned()
|
||||||
|
.unwrap_or_else(|| parent_alias.to_string());
|
||||||
|
select_args.push(format!("'id', {}.id", base_alias));
|
||||||
|
let (case_sql, _) = self.compile_one_of(
|
||||||
|
&family_schemas,
|
||||||
|
&base_alias,
|
||||||
|
Some(&table_aliases),
|
||||||
|
parent_type_def,
|
||||||
|
None,
|
||||||
|
filter_keys,
|
||||||
|
is_stem_query,
|
||||||
|
depth,
|
||||||
|
current_path.clone(),
|
||||||
|
alias_counter,
|
||||||
|
)?;
|
||||||
|
select_args.push(format!("'type', {}", case_sql));
|
||||||
|
}
|
||||||
|
}
|
||||||
} else if let Some(one_of) = &schema.obj.one_of {
|
} else if let Some(one_of) = &schema.obj.one_of {
|
||||||
let base_alias = table_aliases.get(&type_def.name).cloned().unwrap_or_else(|| parent_alias.to_string());
|
let base_alias = table_aliases
|
||||||
|
.get(&type_def.name)
|
||||||
|
.cloned()
|
||||||
|
.unwrap_or_else(|| parent_alias.to_string());
|
||||||
select_args.push(format!("'id', {}.id", base_alias));
|
select_args.push(format!("'id', {}.id", base_alias));
|
||||||
let (case_sql, _) = self.compile_one_of(one_of, &base_alias, None, filter_keys, is_stem_query, depth, current_path.clone())?;
|
let (case_sql, _) = self.compile_one_of(
|
||||||
|
one_of,
|
||||||
|
&base_alias,
|
||||||
|
Some(&table_aliases),
|
||||||
|
parent_type_def,
|
||||||
|
None,
|
||||||
|
filter_keys,
|
||||||
|
is_stem_query,
|
||||||
|
depth,
|
||||||
|
current_path.clone(),
|
||||||
|
alias_counter,
|
||||||
|
)?;
|
||||||
select_args.push(format!("'type', {}", case_sql));
|
select_args.push(format!("'type', {}", case_sql));
|
||||||
}
|
}
|
||||||
|
|
||||||
@ -311,6 +405,8 @@ impl SqlCompiler {
|
|||||||
type_def,
|
type_def,
|
||||||
&table_aliases,
|
&table_aliases,
|
||||||
parent_alias,
|
parent_alias,
|
||||||
|
parent_table_aliases,
|
||||||
|
parent_type_def,
|
||||||
prop_name,
|
prop_name,
|
||||||
filter_keys,
|
filter_keys,
|
||||||
¤t_path,
|
¤t_path,
|
||||||
@ -342,19 +438,20 @@ impl SqlCompiler {
|
|||||||
fn build_hierarchy_from_clauses(
|
fn build_hierarchy_from_clauses(
|
||||||
&self,
|
&self,
|
||||||
type_def: &crate::database::r#type::Type,
|
type_def: &crate::database::r#type::Type,
|
||||||
local_ctx: &str,
|
alias_counter: &mut usize,
|
||||||
) -> (std::collections::HashMap<String, String>, Vec<String>) {
|
) -> (std::collections::HashMap<String, String>, Vec<String>) {
|
||||||
let mut table_aliases = std::collections::HashMap::new();
|
let mut table_aliases = std::collections::HashMap::new();
|
||||||
let mut from_clauses = Vec::new();
|
let mut from_clauses = Vec::new();
|
||||||
|
|
||||||
for (i, table_name) in type_def.hierarchy.iter().enumerate() {
|
for (i, table_name) in type_def.hierarchy.iter().enumerate() {
|
||||||
let alias = format!("{}_t{}", local_ctx, i + 1);
|
*alias_counter += 1;
|
||||||
|
let alias = format!("{}_{}", table_name, alias_counter);
|
||||||
table_aliases.insert(table_name.clone(), alias.clone());
|
table_aliases.insert(table_name.clone(), alias.clone());
|
||||||
|
|
||||||
if i == 0 {
|
if i == 0 {
|
||||||
from_clauses.push(format!("agreego.{} {}", table_name, alias));
|
from_clauses.push(format!("agreego.{} {}", table_name, alias));
|
||||||
} else {
|
} else {
|
||||||
let prev_alias = format!("{}_t{}", local_ctx, i);
|
let prev_alias = format!("{}_{}", type_def.hierarchy[i - 1], *alias_counter - 1);
|
||||||
from_clauses.push(format!(
|
from_clauses.push(format!(
|
||||||
"JOIN agreego.{} {} ON {}.id = {}.id",
|
"JOIN agreego.{} {} ON {}.id = {}.id",
|
||||||
table_name, alias, alias, prev_alias
|
table_name, alias, alias, prev_alias
|
||||||
@ -374,12 +471,16 @@ impl SqlCompiler {
|
|||||||
is_stem_query: bool,
|
is_stem_query: bool,
|
||||||
depth: usize,
|
depth: usize,
|
||||||
current_path: &str,
|
current_path: &str,
|
||||||
|
alias_counter: &mut usize,
|
||||||
) -> Result<Vec<String>, String> {
|
) -> Result<Vec<String>, String> {
|
||||||
let mut select_args = Vec::new();
|
let mut select_args = Vec::new();
|
||||||
let grouped_fields = type_def.grouped_fields.as_ref().and_then(|v| v.as_object());
|
let grouped_fields = type_def.grouped_fields.as_ref().and_then(|v| v.as_object());
|
||||||
let merged_props = self.get_merged_properties(schema);
|
let merged_props = self.get_merged_properties(schema);
|
||||||
|
let mut sorted_keys: Vec<&String> = merged_props.keys().collect();
|
||||||
|
sorted_keys.sort();
|
||||||
|
|
||||||
for (prop_key, prop_schema) in &merged_props {
|
for prop_key in sorted_keys {
|
||||||
|
let prop_schema = &merged_props[prop_key];
|
||||||
let mut owner_alias = table_aliases
|
let mut owner_alias = table_aliases
|
||||||
.get("entity")
|
.get("entity")
|
||||||
.cloned()
|
.cloned()
|
||||||
@ -400,16 +501,20 @@ impl SqlCompiler {
|
|||||||
}
|
}
|
||||||
|
|
||||||
let is_object_or_array = match &prop_schema.obj.type_ {
|
let is_object_or_array = match &prop_schema.obj.type_ {
|
||||||
Some(crate::database::schema::SchemaTypeOrArray::Single(s)) => s == "object" || s == "array",
|
Some(crate::database::schema::SchemaTypeOrArray::Single(s)) => {
|
||||||
Some(crate::database::schema::SchemaTypeOrArray::Multiple(v)) => v.contains(&"object".to_string()) || v.contains(&"array".to_string()),
|
s == "object" || s == "array"
|
||||||
_ => false
|
}
|
||||||
|
Some(crate::database::schema::SchemaTypeOrArray::Multiple(v)) => {
|
||||||
|
v.contains(&"object".to_string()) || v.contains(&"array".to_string())
|
||||||
|
}
|
||||||
|
_ => false,
|
||||||
};
|
};
|
||||||
|
|
||||||
let is_primitive = prop_schema.obj.r#ref.is_none()
|
let is_primitive = prop_schema.obj.r#ref.is_none()
|
||||||
&& prop_schema.obj.items.is_none()
|
&& prop_schema.obj.items.is_none()
|
||||||
&& prop_schema.obj.properties.is_none()
|
&& prop_schema.obj.properties.is_none()
|
||||||
&& prop_schema.obj.one_of.is_none()
|
&& prop_schema.obj.one_of.is_none()
|
||||||
&& !is_object_or_array;
|
&& !is_object_or_array;
|
||||||
|
|
||||||
if is_primitive {
|
if is_primitive {
|
||||||
if let Some(ft) = type_def.field_types.as_ref().and_then(|v| v.as_object()) {
|
if let Some(ft) = type_def.field_types.as_ref().and_then(|v| v.as_object()) {
|
||||||
@ -428,11 +533,14 @@ impl SqlCompiler {
|
|||||||
let (val_sql, val_type) = self.walk_schema(
|
let (val_sql, val_type) = self.walk_schema(
|
||||||
prop_schema,
|
prop_schema,
|
||||||
&owner_alias,
|
&owner_alias,
|
||||||
|
Some(table_aliases),
|
||||||
|
Some(type_def), // Pass current type_def as parent_type_def for child properties
|
||||||
Some(prop_key),
|
Some(prop_key),
|
||||||
filter_keys,
|
filter_keys,
|
||||||
is_stem_query,
|
is_stem_query,
|
||||||
depth + 1,
|
depth + 1,
|
||||||
next_path,
|
next_path,
|
||||||
|
alias_counter,
|
||||||
)?;
|
)?;
|
||||||
|
|
||||||
if val_type != "abort" {
|
if val_type != "abort" {
|
||||||
@ -448,6 +556,8 @@ impl SqlCompiler {
|
|||||||
type_def: &crate::database::r#type::Type,
|
type_def: &crate::database::r#type::Type,
|
||||||
table_aliases: &std::collections::HashMap<String, String>,
|
table_aliases: &std::collections::HashMap<String, String>,
|
||||||
parent_alias: &str,
|
parent_alias: &str,
|
||||||
|
parent_table_aliases: Option<&std::collections::HashMap<String, String>>,
|
||||||
|
parent_type_def: Option<&crate::database::r#type::Type>,
|
||||||
prop_name: Option<&str>,
|
prop_name: Option<&str>,
|
||||||
filter_keys: &[String],
|
filter_keys: &[String],
|
||||||
current_path: &str,
|
current_path: &str,
|
||||||
@ -457,8 +567,13 @@ impl SqlCompiler {
|
|||||||
.cloned()
|
.cloned()
|
||||||
.unwrap_or_else(|| "err".to_string());
|
.unwrap_or_else(|| "err".to_string());
|
||||||
|
|
||||||
|
let entity_alias = table_aliases
|
||||||
|
.get("entity")
|
||||||
|
.cloned()
|
||||||
|
.unwrap_or_else(|| base_alias.clone());
|
||||||
|
|
||||||
let mut where_clauses = Vec::new();
|
let mut where_clauses = Vec::new();
|
||||||
where_clauses.push(format!("NOT {}.archived", base_alias));
|
where_clauses.push(format!("NOT {}.archived", entity_alias));
|
||||||
|
|
||||||
for (i, filter_key) in filter_keys.iter().enumerate() {
|
for (i, filter_key) in filter_keys.iter().enumerate() {
|
||||||
let mut parts = filter_key.split(':');
|
let mut parts = filter_key.split(':');
|
||||||
@ -486,115 +601,171 @@ impl SqlCompiler {
|
|||||||
let mut filter_alias = base_alias.clone();
|
let mut filter_alias = base_alias.clone();
|
||||||
|
|
||||||
if let Some(gf) = type_def.grouped_fields.as_ref().and_then(|v| v.as_object()) {
|
if let Some(gf) = type_def.grouped_fields.as_ref().and_then(|v| v.as_object()) {
|
||||||
for (t_name, fields_val) in gf {
|
for (t_name, fields_val) in gf {
|
||||||
if let Some(fields_arr) = fields_val.as_array() {
|
if let Some(fields_arr) = fields_val.as_array() {
|
||||||
if fields_arr.iter().any(|v| v.as_str() == Some(field_name)) {
|
if fields_arr.iter().any(|v| v.as_str() == Some(field_name)) {
|
||||||
filter_alias = table_aliases
|
filter_alias = table_aliases
|
||||||
.get(t_name)
|
.get(t_name)
|
||||||
.cloned()
|
.cloned()
|
||||||
.unwrap_or_else(|| base_alias.clone());
|
.unwrap_or_else(|| base_alias.clone());
|
||||||
break;
|
break;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
let mut is_ilike = false;
|
||||||
|
let mut cast = "";
|
||||||
|
|
||||||
|
if let Some(field_types) = type_def.field_types.as_ref().and_then(|v| v.as_object()) {
|
||||||
|
if let Some(pg_type_val) = field_types.get(field_name) {
|
||||||
|
if let Some(pg_type) = pg_type_val.as_str() {
|
||||||
|
if pg_type == "uuid" {
|
||||||
|
cast = "::uuid";
|
||||||
|
} else if pg_type == "boolean" || pg_type == "bool" {
|
||||||
|
cast = "::boolean";
|
||||||
|
} else if pg_type.contains("timestamp") || pg_type == "timestamptz" || pg_type == "date"
|
||||||
|
{
|
||||||
|
cast = "::timestamptz";
|
||||||
|
} else if pg_type == "numeric"
|
||||||
|
|| pg_type.contains("int")
|
||||||
|
|| pg_type == "real"
|
||||||
|
|| pg_type == "double precision"
|
||||||
|
{
|
||||||
|
cast = "::numeric";
|
||||||
|
} else if pg_type == "text" || pg_type.contains("char") {
|
||||||
|
let mut is_enum = false;
|
||||||
|
if let Some(props) = &schema.obj.properties {
|
||||||
|
if let Some(ps) = props.get(field_name) {
|
||||||
|
is_enum = ps.obj.enum_.is_some();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if !is_enum {
|
||||||
|
is_ilike = true;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
}
|
||||||
|
|
||||||
let mut is_ilike = false;
|
let param_index = i + 1;
|
||||||
let mut cast = "";
|
let p_val = format!("${}#>>'{{}}'", param_index);
|
||||||
|
|
||||||
if let Some(field_types) = type_def.field_types.as_ref().and_then(|v| v.as_object()) {
|
if op == "$in" || op == "$nin" {
|
||||||
if let Some(pg_type_val) = field_types.get(field_name) {
|
let sql_op = if op == "$in" { "IN" } else { "NOT IN" };
|
||||||
if let Some(pg_type) = pg_type_val.as_str() {
|
let subquery = format!(
|
||||||
if pg_type == "uuid" {
|
"(SELECT value{} FROM jsonb_array_elements_text(({})::jsonb))",
|
||||||
cast = "::uuid";
|
cast, p_val
|
||||||
} else if pg_type == "boolean" || pg_type == "bool" {
|
);
|
||||||
cast = "::boolean";
|
where_clauses.push(format!(
|
||||||
} else if pg_type.contains("timestamp")
|
"{}.{} {} {}",
|
||||||
|| pg_type == "timestamptz"
|
filter_alias, field_name, sql_op, subquery
|
||||||
|| pg_type == "date"
|
));
|
||||||
{
|
} else {
|
||||||
cast = "::timestamptz";
|
let sql_op = match op {
|
||||||
} else if pg_type == "numeric"
|
"$eq" => {
|
||||||
|| pg_type.contains("int")
|
if is_ilike {
|
||||||
|| pg_type == "real"
|
"ILIKE"
|
||||||
|| pg_type == "double precision"
|
} else {
|
||||||
{
|
"="
|
||||||
cast = "::numeric";
|
|
||||||
} else if pg_type == "text" || pg_type.contains("char") {
|
|
||||||
let mut is_enum = false;
|
|
||||||
if let Some(props) = &schema.obj.properties {
|
|
||||||
if let Some(ps) = props.get(field_name) {
|
|
||||||
is_enum = ps.obj.enum_.is_some();
|
|
||||||
}
|
|
||||||
}
|
|
||||||
if !is_enum {
|
|
||||||
is_ilike = true;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
"$ne" => {
|
||||||
|
if is_ilike {
|
||||||
|
"NOT ILIKE"
|
||||||
|
} else {
|
||||||
|
"!="
|
||||||
|
}
|
||||||
|
}
|
||||||
|
"$gt" => ">",
|
||||||
|
"$gte" => ">=",
|
||||||
|
"$lt" => "<",
|
||||||
|
"$lte" => "<=",
|
||||||
|
_ => {
|
||||||
|
if is_ilike {
|
||||||
|
"ILIKE"
|
||||||
|
} else {
|
||||||
|
"="
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
let param_index = i + 1;
|
let param_sql = if is_ilike && (op == "$eq" || op == "$ne") {
|
||||||
let p_val = format!("${}#>>'{{}}'", param_index);
|
p_val
|
||||||
|
|
||||||
if op == "$in" || op == "$nin" {
|
|
||||||
let sql_op = if op == "$in" { "IN" } else { "NOT IN" };
|
|
||||||
let subquery = format!(
|
|
||||||
"(SELECT value{} FROM jsonb_array_elements_text(({})::jsonb))",
|
|
||||||
cast, p_val
|
|
||||||
);
|
|
||||||
where_clauses.push(format!(
|
|
||||||
"{}.{} {} {}",
|
|
||||||
filter_alias, field_name, sql_op, subquery
|
|
||||||
));
|
|
||||||
} else {
|
} else {
|
||||||
let sql_op = match op {
|
format!("({}){}", p_val, cast)
|
||||||
"$eq" => {
|
};
|
||||||
if is_ilike {
|
|
||||||
"ILIKE"
|
|
||||||
} else {
|
|
||||||
"="
|
|
||||||
}
|
|
||||||
}
|
|
||||||
"$ne" => {
|
|
||||||
if is_ilike {
|
|
||||||
"NOT ILIKE"
|
|
||||||
} else {
|
|
||||||
"!="
|
|
||||||
}
|
|
||||||
}
|
|
||||||
"$gt" => ">",
|
|
||||||
"$gte" => ">=",
|
|
||||||
"$lt" => "<",
|
|
||||||
"$lte" => "<=",
|
|
||||||
_ => {
|
|
||||||
if is_ilike {
|
|
||||||
"ILIKE"
|
|
||||||
} else {
|
|
||||||
"="
|
|
||||||
}
|
|
||||||
}
|
|
||||||
};
|
|
||||||
|
|
||||||
let param_sql = if is_ilike && (op == "$eq" || op == "$ne") {
|
where_clauses.push(format!(
|
||||||
p_val
|
"{}.{} {} {}",
|
||||||
} else {
|
filter_alias, field_name, sql_op, param_sql
|
||||||
format!("({}){}", p_val, cast)
|
));
|
||||||
};
|
|
||||||
|
|
||||||
where_clauses.push(format!(
|
|
||||||
"{}.{} {} {}",
|
|
||||||
filter_alias, field_name, sql_op, param_sql
|
|
||||||
));
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
if let Some(prop) = prop_name {
|
if let Some(prop) = prop_name {
|
||||||
if prop == "target" || prop == "source" {
|
// Find what type the parent alias is actually mapping to
|
||||||
where_clauses.push(format!("{}.id = {}.{}_id", base_alias, parent_alias, prop));
|
let mut relation_alias = parent_alias.to_string();
|
||||||
} else {
|
|
||||||
where_clauses.push(format!("{}.parent_id = {}.id", base_alias, parent_alias));
|
let mut relation_resolved = false;
|
||||||
|
if let Some(parent_type) = parent_type_def {
|
||||||
|
if let Some(relation) = self
|
||||||
|
.db
|
||||||
|
.get_relation(&parent_type.name, &type_def.name, prop, None)
|
||||||
|
{
|
||||||
|
let source_col = &relation.source_columns[0];
|
||||||
|
let dest_col = &relation.destination_columns[0];
|
||||||
|
|
||||||
|
let mut possible_relation_alias = None;
|
||||||
|
if let Some(pta) = parent_table_aliases {
|
||||||
|
if let Some(a) = pta.get(&relation.source_type) {
|
||||||
|
possible_relation_alias = Some(a.clone());
|
||||||
|
} else if let Some(a) = pta.get(&relation.destination_type) {
|
||||||
|
possible_relation_alias = Some(a.clone());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if let Some(pa) = possible_relation_alias {
|
||||||
|
relation_alias = pa;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Determine directionality based on the Relation metadata
|
||||||
|
if relation.source_type == parent_type.name
|
||||||
|
|| parent_type.hierarchy.contains(&relation.source_type)
|
||||||
|
{
|
||||||
|
// Parent is the source
|
||||||
|
where_clauses.push(format!(
|
||||||
|
"{}.{} = {}.{}",
|
||||||
|
relation_alias, source_col, base_alias, dest_col
|
||||||
|
));
|
||||||
|
relation_resolved = true;
|
||||||
|
} else if relation.destination_type == parent_type.name
|
||||||
|
|| parent_type.hierarchy.contains(&relation.destination_type)
|
||||||
|
{
|
||||||
|
// Parent is the destination
|
||||||
|
where_clauses.push(format!(
|
||||||
|
"{}.{} = {}.{}",
|
||||||
|
base_alias, source_col, relation_alias, dest_col
|
||||||
|
));
|
||||||
|
relation_resolved = true;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if !relation_resolved {
|
||||||
|
// Fallback heuristics for unmapped polymorphism or abstract models
|
||||||
|
if prop == "target" || prop == "source" {
|
||||||
|
if let Some(pta) = parent_table_aliases {
|
||||||
|
if let Some(a) = pta.get("relationship") {
|
||||||
|
relation_alias = a.clone();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
where_clauses.push(format!(
|
||||||
|
"{}.id = {}.{}_id",
|
||||||
|
base_alias, relation_alias, prop
|
||||||
|
));
|
||||||
|
} else {
|
||||||
|
where_clauses.push(format!("{}.parent_id = {}.id", base_alias, relation_alias));
|
||||||
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@ -605,10 +776,13 @@ impl SqlCompiler {
|
|||||||
&self,
|
&self,
|
||||||
props: &std::collections::BTreeMap<String, std::sync::Arc<crate::database::schema::Schema>>,
|
props: &std::collections::BTreeMap<String, std::sync::Arc<crate::database::schema::Schema>>,
|
||||||
parent_alias: &str,
|
parent_alias: &str,
|
||||||
|
parent_table_aliases: Option<&std::collections::HashMap<String, String>>,
|
||||||
|
parent_type_def: Option<&crate::database::r#type::Type>,
|
||||||
filter_keys: &[String],
|
filter_keys: &[String],
|
||||||
is_stem_query: bool,
|
is_stem_query: bool,
|
||||||
depth: usize,
|
depth: usize,
|
||||||
current_path: String,
|
current_path: String,
|
||||||
|
alias_counter: &mut usize,
|
||||||
) -> Result<(String, String), String> {
|
) -> Result<(String, String), String> {
|
||||||
let mut build_args = Vec::new();
|
let mut build_args = Vec::new();
|
||||||
for (k, v) in props {
|
for (k, v) in props {
|
||||||
@ -617,15 +791,18 @@ impl SqlCompiler {
|
|||||||
} else {
|
} else {
|
||||||
format!("{}.{}", current_path, k)
|
format!("{}.{}", current_path, k)
|
||||||
};
|
};
|
||||||
|
|
||||||
let (child_sql, val_type) = self.walk_schema(
|
let (child_sql, val_type) = self.walk_schema(
|
||||||
v,
|
v,
|
||||||
parent_alias,
|
parent_alias,
|
||||||
|
parent_table_aliases,
|
||||||
|
parent_type_def,
|
||||||
Some(k),
|
Some(k),
|
||||||
filter_keys,
|
filter_keys,
|
||||||
is_stem_query,
|
is_stem_query,
|
||||||
depth + 1,
|
depth + 1,
|
||||||
next_path,
|
next_path,
|
||||||
|
alias_counter,
|
||||||
)?;
|
)?;
|
||||||
if val_type == "abort" {
|
if val_type == "abort" {
|
||||||
continue;
|
continue;
|
||||||
@ -640,11 +817,14 @@ impl SqlCompiler {
|
|||||||
&self,
|
&self,
|
||||||
schemas: &[Arc<crate::database::schema::Schema>],
|
schemas: &[Arc<crate::database::schema::Schema>],
|
||||||
parent_alias: &str,
|
parent_alias: &str,
|
||||||
|
parent_table_aliases: Option<&std::collections::HashMap<String, String>>,
|
||||||
|
parent_type_def: Option<&crate::database::r#type::Type>,
|
||||||
prop_name_context: Option<&str>,
|
prop_name_context: Option<&str>,
|
||||||
filter_keys: &[String],
|
filter_keys: &[String],
|
||||||
is_stem_query: bool,
|
is_stem_query: bool,
|
||||||
depth: usize,
|
depth: usize,
|
||||||
current_path: String,
|
current_path: String,
|
||||||
|
alias_counter: &mut usize,
|
||||||
) -> Result<(String, String), String> {
|
) -> Result<(String, String), String> {
|
||||||
let mut case_statements = Vec::new();
|
let mut case_statements = Vec::new();
|
||||||
let type_col = if let Some(prop) = prop_name_context {
|
let type_col = if let Some(prop) = prop_name_context {
|
||||||
@ -657,16 +837,19 @@ impl SqlCompiler {
|
|||||||
if let Some(ref_id) = &option_schema.obj.r#ref {
|
if let Some(ref_id) = &option_schema.obj.r#ref {
|
||||||
// Find the physical type this ref maps to
|
// Find the physical type this ref maps to
|
||||||
let base_type_name = ref_id.split('.').next_back().unwrap_or("").to_string();
|
let base_type_name = ref_id.split('.').next_back().unwrap_or("").to_string();
|
||||||
|
|
||||||
// Generate the nested SQL for this specific target type
|
// Generate the nested SQL for this specific target type
|
||||||
let (val_sql, _) = self.walk_schema(
|
let (val_sql, _) = self.walk_schema(
|
||||||
option_schema,
|
option_schema,
|
||||||
parent_alias,
|
parent_alias,
|
||||||
|
parent_table_aliases,
|
||||||
|
parent_type_def,
|
||||||
prop_name_context,
|
prop_name_context,
|
||||||
filter_keys,
|
filter_keys,
|
||||||
is_stem_query,
|
is_stem_query,
|
||||||
depth,
|
depth,
|
||||||
current_path.clone(),
|
current_path.clone(),
|
||||||
|
alias_counter,
|
||||||
)?;
|
)?;
|
||||||
|
|
||||||
case_statements.push(format!(
|
case_statements.push(format!(
|
||||||
@ -680,10 +863,9 @@ impl SqlCompiler {
|
|||||||
return Ok(("NULL".to_string(), "string".to_string()));
|
return Ok(("NULL".to_string(), "string".to_string()));
|
||||||
}
|
}
|
||||||
|
|
||||||
let sql = format!(
|
case_statements.sort();
|
||||||
"CASE {} ELSE NULL END",
|
|
||||||
case_statements.join(" ")
|
let sql = format!("CASE {} ELSE NULL END", case_statements.join(" "));
|
||||||
);
|
|
||||||
|
|
||||||
Ok((sql, "object".to_string()))
|
Ok((sql, "object".to_string()))
|
||||||
}
|
}
|
||||||
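The alias-allocation change in `build_hierarchy_from_clauses` replaces the `local_ctx`-derived aliases (`{local_ctx}_t{i}`) with globally unique aliases drawn from a shared counter, so no two hierarchy scans can collide even when the same type is walked twice. A minimal standalone sketch of the new scheme (the `agreego` schema name and the `{table}_{n}` alias format come from the diff; the free function and the sample hierarchy are hypothetical):

```rust
// Sketch of the new alias scheme: each table in a type hierarchy gets a
// globally unique alias "{table}_{n}" from a shared counter, and each
// subsequent table joins on the alias issued immediately before it.
fn build_hierarchy_from_clauses(hierarchy: &[&str], alias_counter: &mut usize) -> Vec<String> {
    let mut from_clauses = Vec::new();
    let mut prev_alias: Option<String> = None;
    for table_name in hierarchy {
        *alias_counter += 1;
        let alias = format!("{}_{}", table_name, alias_counter);
        match &prev_alias {
            // The first table in the hierarchy anchors the FROM clause.
            None => from_clauses.push(format!("agreego.{} {}", table_name, alias)),
            // Subsequent tables join on the shared primary key.
            Some(prev) => from_clauses.push(format!(
                "JOIN agreego.{} {} ON {}.id = {}.id",
                table_name, alias, alias, prev
            )),
        }
        prev_alias = Some(alias);
    }
    from_clauses
}

fn main() {
    let mut counter = 0;
    let clauses = build_hierarchy_from_clauses(&["entity", "user"], &mut counter);
    println!("{}", clauses.join(" "));
}
```

Because the counter lives outside the function, two sibling subqueries built in sequence keep strictly increasing alias numbers instead of both starting over at `_t1`.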
```diff
@@ -32,9 +32,12 @@ impl Queryer {
             Err(msg) => {
                 return crate::drop::Drop::with_errors(vec![crate::drop::Error {
                     code: "FILTER_PARSE_FAILED".to_string(),
-                    message: msg,
+                    message: msg.clone(),
                     details: crate::drop::ErrorDetails {
-                        path: schema_id.to_string(),
+                        path: "".to_string(), // filters apply to the root query
+                        cause: Some(msg),
+                        context: filters.map(|f| vec![f.to_string()]),
+                        schema: Some(schema_id.to_string()),
                     },
                 }]);
             }
@@ -104,9 +107,12 @@ impl Queryer {
             }
             Err(e) => Err(crate::drop::Drop::with_errors(vec![crate::drop::Error {
                 code: "QUERY_COMPILATION_FAILED".to_string(),
-                message: e,
+                message: e.clone(),
                 details: crate::drop::ErrorDetails {
-                    path: schema_id.to_string(),
+                    path: "".to_string(),
+                    cause: Some(e),
+                    context: None,
+                    schema: Some(schema_id.to_string()),
                 },
             }])),
         }
@@ -130,14 +136,20 @@ impl Queryer {
                 code: "QUERY_FAILED".to_string(),
                 message: format!("Expected array from generic query, got: {:?}", other),
                 details: crate::drop::ErrorDetails {
-                    path: schema_id.to_string(),
+                    path: "".to_string(),
+                    cause: Some(format!("Expected array, got {}", other)),
+                    context: Some(vec![sql.to_string()]),
+                    schema: Some(schema_id.to_string()),
                 },
             }]),
             Err(e) => crate::drop::Drop::with_errors(vec![crate::drop::Error {
                 code: "QUERY_FAILED".to_string(),
                 message: format!("SPI error in queryer: {}", e),
                 details: crate::drop::ErrorDetails {
-                    path: schema_id.to_string(),
+                    path: "".to_string(),
+                    cause: Some(format!("SPI error in queryer: {}", e)),
+                    context: Some(vec![sql.to_string()]),
+                    schema: Some(schema_id.to_string()),
                 },
             }]),
         }
```
|||||||
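The hunks above all migrate call sites to an enriched `ErrorDetails`. As a reading aid, the shape implied by the diff can be sketched as a plain struct; the field names come from the diff itself, while the derives and the example values below are illustrative assumptions, not the crate's actual definition:

```rust
/// Sketch of the enriched error details used across these hunks: `path`
/// points at the JSON location ("" for the root query), while `cause`,
/// `context`, and `schema` carry optional diagnostics.
/// Field names are from the diff; derives and values are assumptions.
#[derive(Debug, Clone, PartialEq)]
pub struct ErrorDetails {
    pub path: String,
    pub cause: Option<String>,
    pub context: Option<Vec<String>>,
    pub schema: Option<String>,
}

fn main() {
    // A query-compilation failure reported against the root path.
    let details = ErrorDetails {
        path: "".to_string(),
        cause: Some("unknown field".to_string()),
        context: None,
        schema: Some("person".to_string()),
    };
    assert_eq!(details.path, "");
    assert_eq!(details.schema.as_deref(), Some("person"));
    println!("{:?}", details);
}
```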
src/tests/mod.rs
@@ -2,7 +2,6 @@ use crate::*;
 pub mod runner;
 pub mod types;
 use serde_json::json;
-pub mod sql_validator;

 // Database module tests moved to src/database/executors/mock.rs

src/tests/runner.rs
@@ -1,19 +1,10 @@
+use crate::tests::types::Suite;
 use serde::Deserialize;
+use serde_json::Value;
 use std::collections::HashMap;
 use std::fs;
 use std::sync::{Arc, OnceLock, RwLock};

-#[derive(Debug, Deserialize)]
-pub struct TestSuite {
-    #[allow(dead_code)]
-    pub description: String,
-    pub database: serde_json::Value,
-    pub tests: Vec<TestCase>,
-}
-
-use crate::tests::types::TestCase;
-use serde_json::Value;
-
 pub fn deserialize_some<'de, D>(deserializer: D) -> Result<Option<Value>, D::Error>
 where
     D: serde::Deserializer<'de>,
@@ -23,7 +14,7 @@ where
 }

 // Type alias for easier reading
-type CompiledSuite = Arc<Vec<(TestSuite, Arc<crate::database::Database>)>>;
+type CompiledSuite = Arc<Vec<(Suite, Arc<crate::database::Database>)>>;

 // Global cache mapping filename -> Vector of (Parsed JSON suite, Compiled Database)
 static CACHE: OnceLock<RwLock<HashMap<String, CompiledSuite>>> = OnceLock::new();
@@ -46,7 +37,7 @@ fn get_cached_file(path: &str) -> CompiledSuite {
     } else {
         let content =
             fs::read_to_string(path).unwrap_or_else(|_| panic!("Failed to read file: {}", path));
-        let suites: Vec<TestSuite> = serde_json::from_str(&content)
+        let suites: Vec<Suite> = serde_json::from_str(&content)
             .unwrap_or_else(|e| panic!("Failed to parse JSON in {}: {}", path, e));

         let mut compiled_suites = Vec::new();
src/tests/sql_validator.rs (deleted file)
@@ -1,156 +0,0 @@
-use sqlparser::ast::{
-    Expr, Join, JoinConstraint, JoinOperator, Query, Select, SelectItem, SetExpr, Statement,
-    TableFactor, TableWithJoins, Ident,
-};
-use sqlparser::dialect::PostgreSqlDialect;
-use sqlparser::parser::Parser;
-use std::collections::HashSet;
-
-pub fn validate_semantic_sql(sql: &str) -> Result<(), String> {
-    let dialect = PostgreSqlDialect {};
-    let statements = match Parser::parse_sql(&dialect, sql) {
-        Ok(s) => s,
-        Err(e) => return Err(format!("SQL Syntax Error: {}\nSQL: {}", e, sql)),
-    };
-
-    for statement in statements {
-        validate_statement(&statement, sql)?;
-    }
-
-    Ok(())
-}
-
-fn validate_statement(stmt: &Statement, original_sql: &str) -> Result<(), String> {
-    match stmt {
-        Statement::Query(query) => validate_query(query, original_sql)?,
-        Statement::Insert(insert) => {
-            if let Some(query) = &insert.source {
-                validate_query(query, original_sql)?
-            }
-        }
-        Statement::Update(update) => {
-            if let Some(expr) = &update.selection {
-                validate_expr(expr, &HashSet::new(), original_sql)?;
-            }
-        }
-        Statement::Delete(delete) => {
-            if let Some(expr) = &delete.selection {
-                validate_expr(expr, &HashSet::new(), original_sql)?;
-            }
-        }
-        _ => {}
-    }
-    Ok(())
-}
-
-fn validate_query(query: &Query, original_sql: &str) -> Result<(), String> {
-    if let SetExpr::Select(select) = &*query.body {
-        validate_select(select, original_sql)?;
-    }
-    Ok(())
-}
-
-fn validate_select(select: &Select, original_sql: &str) -> Result<(), String> {
-    let mut available_aliases = HashSet::new();
-
-    // 1. Collect all declared table aliases in the FROM clause and JOINs
-    for table_with_joins in &select.from {
-        collect_aliases_from_table_factor(&table_with_joins.relation, &mut available_aliases);
-        for join in &table_with_joins.joins {
-            collect_aliases_from_table_factor(&join.relation, &mut available_aliases);
-        }
-    }
-
-    // 2. Validate all SELECT projection fields
-    for projection in &select.projection {
-        if let SelectItem::UnnamedExpr(expr) | SelectItem::ExprWithAlias { expr, .. } = projection {
-            validate_expr(expr, &available_aliases, original_sql)?;
-        }
-    }
-
-    // 3. Validate ON conditions in joins
-    for table_with_joins in &select.from {
-        for join in &table_with_joins.joins {
-            if let JoinOperator::Inner(JoinConstraint::On(expr))
-            | JoinOperator::LeftOuter(JoinConstraint::On(expr))
-            | JoinOperator::RightOuter(JoinConstraint::On(expr))
-            | JoinOperator::FullOuter(JoinConstraint::On(expr))
-            | JoinOperator::Join(JoinConstraint::On(expr)) = &join.join_operator
-            {
-                validate_expr(expr, &available_aliases, original_sql)?;
-            }
-        }
-    }
-
-    // 4. Validate WHERE conditions
-    if let Some(selection) = &select.selection {
-        validate_expr(selection, &available_aliases, original_sql)?;
-    }
-
-    Ok(())
-}
-
-fn collect_aliases_from_table_factor(tf: &TableFactor, aliases: &mut HashSet<String>) {
-    match tf {
-        TableFactor::Table { name, alias, .. } => {
-            if let Some(table_alias) = alias {
-                aliases.insert(table_alias.name.value.clone());
-            } else if let Some(last) = name.0.last() {
-                match last {
-                    sqlparser::ast::ObjectNamePart::Identifier(i) => {
-                        aliases.insert(i.value.clone());
-                    }
-                    _ => {}
-                }
-            }
-        }
-        TableFactor::Derived { alias: Some(table_alias), .. } => {
-            aliases.insert(table_alias.name.value.clone());
-        }
-        _ => {}
-    }
-}
-
-fn validate_expr(expr: &Expr, available_aliases: &HashSet<String>, sql: &str) -> Result<(), String> {
-    match expr {
-        Expr::CompoundIdentifier(idents) => {
-            if idents.len() == 2 {
-                let alias = &idents[0].value;
-                if !available_aliases.is_empty() && !available_aliases.contains(alias) {
-                    return Err(format!(
-                        "Semantic Error: Orchestrated query referenced table alias '{}' but it was not declared in the query's FROM/JOIN clauses.\nAvailable aliases: {:?}\nSQL: {}",
-                        alias, available_aliases, sql
-                    ));
-                }
-            } else if idents.len() > 2 {
-                let alias = &idents[1].value; // In form schema.table.column, 'table' is idents[1]
-                if !available_aliases.is_empty() && !available_aliases.contains(alias) {
-                    return Err(format!(
-                        "Semantic Error: Orchestrated query referenced table '{}' but it was not mapped.\nAvailable aliases: {:?}\nSQL: {}",
-                        alias, available_aliases, sql
-                    ));
-                }
-            }
-        }
-        Expr::BinaryOp { left, right, .. } => {
-            validate_expr(left, available_aliases, sql)?;
-            validate_expr(right, available_aliases, sql)?;
-        }
-        Expr::IsFalse(e) | Expr::IsNotFalse(e) | Expr::IsTrue(e) | Expr::IsNotTrue(e)
-        | Expr::IsNull(e) | Expr::IsNotNull(e) | Expr::InList { expr: e, .. }
-        | Expr::Nested(e) | Expr::UnaryOp { expr: e, .. } | Expr::Cast { expr: e, .. }
-        | Expr::Like { expr: e, .. } | Expr::ILike { expr: e, .. } | Expr::AnyOp { left: e, .. }
-        | Expr::AllOp { left: e, .. } => {
-            validate_expr(e, available_aliases, sql)?;
-        }
-        Expr::Function(func) => {
-            if let sqlparser::ast::FunctionArguments::List(args) = &func.args {
-                if let Some(sqlparser::ast::FunctionArg::Unnamed(sqlparser::ast::FunctionArgExpr::Expr(e))) = args.args.get(0) {
-                    validate_expr(e, available_aliases, sql)?;
-                }
-            }
-        }
-        _ => {}
-    }
-    Ok(())
-}
src/tests/types/case.rs
@@ -1,11 +1,11 @@
-use super::expect::ExpectBlock;
+use super::expect::Expect;
 use crate::database::Database;
 use serde::Deserialize;
 use serde_json::Value;
 use std::sync::Arc;

 #[derive(Debug, Deserialize)]
-pub struct TestCase {
+pub struct Case {
     pub description: String,

     #[serde(default = "default_action")]
@@ -30,14 +30,14 @@ pub struct TestCase {
     #[serde(default)]
     pub mocks: Option<serde_json::Value>,

-    pub expect: Option<ExpectBlock>,
+    pub expect: Option<Expect>,
 }

 fn default_action() -> String {
     "validate".to_string()
 }

-impl TestCase {
+impl Case {
     pub fn run_compile(&self, db: Arc<Database>) -> Result<(), String> {
         let expected_success = self.expect.as_ref().map(|e| e.success).unwrap_or(false);

@@ -138,6 +138,7 @@ impl TestCase {
             ))
         } else if let Some(expect) = &self.expect {
             let queries = db.executor.get_queries();
+            expect.assert_pattern(&queries)?;
             expect.assert_sql(&queries)
         } else {
             Ok(())
@@ -176,6 +177,7 @@ impl TestCase {
             ))
         } else if let Some(expect) = &self.expect {
             let queries = db.executor.get_queries();
+            expect.assert_pattern(&queries)?;
             expect.assert_sql(&queries)
         } else {
             Ok(())
src/tests/types/expect/mod.rs (new file, +22)
@@ -0,0 +1,22 @@
+pub mod pattern;
+pub mod sql;
+
+use serde::Deserialize;
+use std::collections::HashMap;
+
+#[derive(Debug, Deserialize)]
+#[serde(untagged)]
+pub enum SqlExpectation {
+    Single(String),
+    Multi(Vec<String>),
+}
+
+#[derive(Debug, Deserialize)]
+pub struct Expect {
+    pub success: bool,
+    pub result: Option<serde_json::Value>,
+    pub errors: Option<Vec<serde_json::Value>>,
+    pub stems: Option<HashMap<String, HashMap<String, serde_json::Value>>>,
+    #[serde(default)]
+    pub sql: Option<Vec<SqlExpectation>>,
+}
src/tests/types/expect/pattern.rs
@@ -1,30 +1,13 @@
+use super::Expect;
 use regex::Regex;
-use serde::Deserialize;
 use std::collections::HashMap;

-#[derive(Debug, Deserialize)]
-#[serde(untagged)]
-pub enum SqlExpectation {
-    Single(String),
-    Multi(Vec<String>),
-}
-
-#[derive(Debug, Deserialize)]
-pub struct ExpectBlock {
-    pub success: bool,
-    pub result: Option<serde_json::Value>,
-    pub errors: Option<Vec<serde_json::Value>>,
-    pub stems: Option<HashMap<String, HashMap<String, serde_json::Value>>>,
-    #[serde(default)]
-    pub sql: Option<Vec<SqlExpectation>>,
-}
-
-impl ExpectBlock {
+impl Expect {
     /// Advanced SQL execution assertion algorithm ported from `assert.go`.
     /// This compares two arrays of strings, one containing {{uuid:name}} or {{timestamp}} placeholders,
     /// and the other containing actual executed database queries. It ensures that placeholder UUIDs
     /// are consistently mapped to the same actual UUIDs across all lines, and strictly validates line-by-line sequences.
-    pub fn assert_sql(&self, actual: &[String]) -> Result<(), String> {
+    pub fn assert_pattern(&self, actual: &[String]) -> Result<(), String> {
         let patterns = match &self.sql {
             Some(s) => s,
             None => return Ok(()),
@@ -39,12 +22,6 @@ impl ExpectBlock {
             ));
         }

-        for query in actual {
-            if let Err(e) = crate::tests::sql_validator::validate_semantic_sql(query) {
-                return Err(e);
-            }
-        }
-
         let ws_re = Regex::new(r"\s+").unwrap();

         let types = HashMap::from([
@@ -82,8 +59,8 @@ impl ExpectBlock {
         let aline = clean_str(aline_raw);

         let pattern_str_raw = match pattern_expect {
-            SqlExpectation::Single(s) => s.clone(),
-            SqlExpectation::Multi(m) => m.join(" "),
+            super::SqlExpectation::Single(s) => s.clone(),
+            super::SqlExpectation::Multi(m) => m.join(" "),
         };

         let pattern_str = clean_str(&pattern_str_raw);
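The doc comment on `assert_pattern` describes the core rule: a `{{uuid:name}}` placeholder may match any concrete value, but every reuse of the same `name` must resolve to the same value. A std-only sketch of that rule (token-level matching instead of the real regex-based whitespace normalization; the function name is illustrative, not from the crate):

```rust
use std::collections::HashMap;

/// Sketch of the placeholder-consistency rule: `{{uuid:name}}` binds on
/// first use and must match the same concrete token everywhere `name`
/// reappears. The real implementation also normalizes whitespace and
/// supports {{timestamp}}; this token-level version is illustrative only.
fn matches_with_placeholders(pattern: &str, actual: &str) -> bool {
    let mut bindings: HashMap<&str, &str> = HashMap::new();
    let p: Vec<&str> = pattern.split_whitespace().collect();
    let a: Vec<&str> = actual.split_whitespace().collect();
    if p.len() != a.len() {
        return false;
    }
    for (pt, at) in p.into_iter().zip(a.into_iter()) {
        if let Some(name) = pt.strip_prefix("{{uuid:").and_then(|s| s.strip_suffix("}}")) {
            if let Some(bound) = bindings.get(name) {
                if *bound != at {
                    return false; // same name bound to two different values
                }
            } else {
                bindings.insert(name, at);
            }
        } else if pt != at {
            return false; // literal token mismatch
        }
    }
    true
}

fn main() {
    let pat = "INSERT INTO entity VALUES ( {{uuid:e}} ) ; DELETE FROM entity WHERE id = {{uuid:e}}";
    assert!(matches_with_placeholders(
        pat,
        "INSERT INTO entity VALUES ( 42 ) ; DELETE FROM entity WHERE id = 42"
    ));
    // The same placeholder resolving to two different ids must fail.
    assert!(!matches_with_placeholders(
        pat,
        "INSERT INTO entity VALUES ( 42 ) ; DELETE FROM entity WHERE id = 43"
    ));
    println!("ok");
}
```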
src/tests/types/expect/sql.rs (new file, +206)
@@ -0,0 +1,206 @@
+use super::Expect;
+use sqlparser::ast::{Expr, Query, SelectItem, Statement, TableFactor};
+use sqlparser::dialect::PostgreSqlDialect;
+use sqlparser::parser::Parser;
+use std::collections::HashSet;
+
+impl Expect {
+    pub fn assert_sql(&self, actual: &[String]) -> Result<(), String> {
+        for query in actual {
+            if let Err(e) = Self::validate_semantic_sql(query) {
+                return Err(e);
+            }
+        }
+        Ok(())
+    }
+
+    pub fn validate_semantic_sql(sql: &str) -> Result<(), String> {
+        let dialect = PostgreSqlDialect {};
+        let statements = match Parser::parse_sql(&dialect, sql) {
+            Ok(s) => s,
+            Err(e) => return Err(format!("SQL Syntax Error: {}\nSQL: {}", e, sql)),
+        };
+
+        for statement in statements {
+            Self::validate_statement(&statement, sql)?;
+        }
+
+        Ok(())
+    }
+
+    fn validate_statement(stmt: &Statement, original_sql: &str) -> Result<(), String> {
+        match stmt {
+            Statement::Query(query) => Self::validate_query(query, &HashSet::new(), original_sql)?,
+            Statement::Insert(insert) => {
+                if let Some(query) = &insert.source {
+                    Self::validate_query(query, &HashSet::new(), original_sql)?
+                }
+            }
+            Statement::Update(update) => {
+                if let Some(expr) = &update.selection {
+                    Self::validate_expr(expr, &HashSet::new(), original_sql)?;
+                }
+            }
+            Statement::Delete(delete) => {
+                if let Some(expr) = &delete.selection {
+                    Self::validate_expr(expr, &HashSet::new(), original_sql)?;
+                }
+            }
+            _ => {}
+        }
+        Ok(())
+    }
+
+    fn validate_query(
+        query: &Query,
+        available_aliases: &HashSet<String>,
+        original_sql: &str,
+    ) -> Result<(), String> {
+        if let sqlparser::ast::SetExpr::Select(select) = &*query.body {
+            Self::validate_select(&select, available_aliases, original_sql)?;
+        }
+        Ok(())
+    }
+
+    fn validate_select(
+        select: &sqlparser::ast::Select,
+        parent_aliases: &HashSet<String>,
+        original_sql: &str,
+    ) -> Result<(), String> {
+        let mut available_aliases = parent_aliases.clone();
+
+        // 1. Collect all declared table aliases in the FROM clause and JOINs
+        for table_with_joins in &select.from {
+            Self::collect_aliases_from_table_factor(&table_with_joins.relation, &mut available_aliases);
+            for join in &table_with_joins.joins {
+                Self::collect_aliases_from_table_factor(&join.relation, &mut available_aliases);
+            }
+        }
+
+        // 2. Validate all SELECT projection fields
+        for projection in &select.projection {
+            if let SelectItem::UnnamedExpr(expr) | SelectItem::ExprWithAlias { expr, .. } = projection {
+                Self::validate_expr(expr, &available_aliases, original_sql)?;
+            }
+        }
+
+        // 3. Validate ON conditions in joins
+        for table_with_joins in &select.from {
+            for join in &table_with_joins.joins {
+                if let sqlparser::ast::JoinOperator::Inner(sqlparser::ast::JoinConstraint::On(expr))
+                | sqlparser::ast::JoinOperator::LeftOuter(sqlparser::ast::JoinConstraint::On(expr))
+                | sqlparser::ast::JoinOperator::RightOuter(sqlparser::ast::JoinConstraint::On(expr))
+                | sqlparser::ast::JoinOperator::FullOuter(sqlparser::ast::JoinConstraint::On(expr))
+                | sqlparser::ast::JoinOperator::Join(sqlparser::ast::JoinConstraint::On(expr)) =
+                    &join.join_operator
+                {
+                    Self::validate_expr(expr, &available_aliases, original_sql)?;
+                }
+            }
+        }
+
+        // 4. Validate WHERE conditions
+        if let Some(selection) = &select.selection {
+            Self::validate_expr(selection, &available_aliases, original_sql)?;
+        }
+
+        Ok(())
+    }
+
+    fn collect_aliases_from_table_factor(tf: &TableFactor, aliases: &mut HashSet<String>) {
+        match tf {
+            TableFactor::Table { name, alias, .. } => {
+                if let Some(table_alias) = alias {
+                    aliases.insert(table_alias.name.value.clone());
+                } else if let Some(last) = name.0.last() {
+                    match last {
+                        sqlparser::ast::ObjectNamePart::Identifier(i) => {
+                            aliases.insert(i.value.clone());
+                        }
+                        _ => {}
+                    }
+                }
+            }
+            TableFactor::Derived {
+                subquery,
+                alias: Some(table_alias),
+                ..
+            } => {
+                aliases.insert(table_alias.name.value.clone());
+                // A derived table is technically a nested scope which is opaque outside, but for pure semantic checks
+                // its internal contents should be validated purely within its own scope (not leaking external aliases in, usually)
+                // but Postgres allows lateral correlation. We will validate its interior with an empty scope.
+                let _ = Self::validate_query(subquery, &HashSet::new(), "");
+            }
+            _ => {}
+        }
+    }
+
+    fn validate_expr(
+        expr: &Expr,
+        available_aliases: &HashSet<String>,
+        sql: &str,
+    ) -> Result<(), String> {
+        match expr {
+            Expr::CompoundIdentifier(idents) => {
+                if idents.len() == 2 {
+                    let alias = &idents[0].value;
+                    if !available_aliases.is_empty() && !available_aliases.contains(alias) {
+                        return Err(format!(
+                            "Semantic Error: Orchestrated query referenced table alias '{}' but it was not declared in the query's FROM/JOIN clauses.\nAvailable aliases: {:?}\nSQL: {}",
+                            alias, available_aliases, sql
+                        ));
+                    }
+                } else if idents.len() > 2 {
+                    let alias = &idents[1].value; // In form schema.table.column, 'table' is idents[1]
+                    if !available_aliases.is_empty() && !available_aliases.contains(alias) {
+                        return Err(format!(
+                            "Semantic Error: Orchestrated query referenced table '{}' but it was not mapped.\nAvailable aliases: {:?}\nSQL: {}",
+                            alias, available_aliases, sql
+                        ));
+                    }
+                }
+            }
+            Expr::Subquery(subquery) => Self::validate_query(subquery, available_aliases, sql)?,
+            Expr::Exists { subquery, .. } => Self::validate_query(subquery, available_aliases, sql)?,
+            Expr::InSubquery {
+                expr: e, subquery, ..
+            } => {
+                Self::validate_expr(e, available_aliases, sql)?;
+                Self::validate_query(subquery, available_aliases, sql)?;
+            }
+            Expr::BinaryOp { left, right, .. } => {
+                Self::validate_expr(left, available_aliases, sql)?;
+                Self::validate_expr(right, available_aliases, sql)?;
+            }
+            Expr::IsFalse(e)
+            | Expr::IsNotFalse(e)
+            | Expr::IsTrue(e)
+            | Expr::IsNotTrue(e)
+            | Expr::IsNull(e)
+            | Expr::IsNotNull(e)
+            | Expr::InList { expr: e, .. }
+            | Expr::Nested(e)
+            | Expr::UnaryOp { expr: e, .. }
+            | Expr::Cast { expr: e, .. }
+            | Expr::Like { expr: e, .. }
+            | Expr::ILike { expr: e, .. }
+            | Expr::AnyOp { left: e, .. }
+            | Expr::AllOp { left: e, .. } => {
+                Self::validate_expr(e, available_aliases, sql)?;
+            }
+            Expr::Function(func) => {
+                if let sqlparser::ast::FunctionArguments::List(args) = &func.args {
+                    if let Some(sqlparser::ast::FunctionArg::Unnamed(sqlparser::ast::FunctionArgExpr::Expr(
+                        e,
+                    ))) = args.args.get(0)
+                    {
+                        Self::validate_expr(e, available_aliases, sql)?;
+                    }
+                }
+            }
+            _ => {}
+        }
+        Ok(())
+    }
+}
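The semantic check this new file performs (collect every table alias declared in FROM/JOIN scopes, then reject any `alias.column` reference whose alias was never declared) can be sketched without the sqlparser dependency. This is a deliberately naive token-level version for illustration; it assumes every table is written as `schema.table alias` with an explicit alias, whereas the real validator walks the parsed AST and handles subquery scopes:

```rust
use std::collections::HashSet;

/// Naive sketch of the alias-scoping rule: gather aliases declared right
/// after FROM/JOIN, then report every `alias.column` reference whose alias
/// is undeclared. Assumes explicit `schema.table alias` form; the real
/// validator walks the sqlparser AST instead of whitespace tokens.
fn undeclared_aliases(sql: &str) -> Vec<String> {
    let tokens: Vec<&str> = sql.split_whitespace().collect();

    // `FROM schema.table alias` / `JOIN schema.table alias` declare an alias.
    let mut declared: HashSet<&str> = HashSet::new();
    for w in tokens.windows(3) {
        if w[0].eq_ignore_ascii_case("FROM") || w[0].eq_ignore_ascii_case("JOIN") {
            declared.insert(w[2]);
        }
    }

    let mut missing = Vec::new();
    for (i, t) in tokens.iter().enumerate() {
        // Skip the schema.table name that directly follows FROM/JOIN.
        let after_from = i > 0
            && (tokens[i - 1].eq_ignore_ascii_case("FROM")
                || tokens[i - 1].eq_ignore_ascii_case("JOIN"));
        if after_from {
            continue;
        }
        if let Some((alias, _col)) = t.split_once('.') {
            if !declared.contains(alias) && !missing.contains(&alias.to_string()) {
                missing.push(alias.to_string());
            }
        }
    }
    missing
}

fn main() {
    let sql = "SELECT e.id FROM agreego.entity e \
               JOIN agreego.person p ON p.id = e.id \
               WHERE ghost.archived";
    // `ghost` is referenced but never declared after FROM/JOIN.
    assert_eq!(undeclared_aliases(sql), vec!["ghost"]);
    println!("{:?}", undeclared_aliases(sql));
}
```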
src/tests/types/mod.rs
@@ -2,6 +2,6 @@ pub mod case;
 pub mod expect;
 pub mod suite;

-pub use case::TestCase;
-pub use expect::ExpectBlock;
-pub use suite::TestSuite;
+pub use case::Case;
+pub use expect::Expect;
+pub use suite::Suite;
src/tests/types/suite.rs
@@ -1,10 +1,10 @@
-use super::case::TestCase;
+use super::case::Case;
 use serde::Deserialize;

 #[derive(Debug, Deserialize)]
-pub struct TestSuite {
+pub struct Suite {
     #[allow(dead_code)]
     pub description: String,
     pub database: serde_json::Value,
-    pub tests: Vec<TestCase>,
+    pub tests: Vec<Case>,
 }
@@ -67,7 +67,12 @@ impl Validator {
                 .map(|e| crate::drop::Error {
                     code: e.code,
                     message: e.message,
-                    details: crate::drop::ErrorDetails { path: e.path },
+                    details: crate::drop::ErrorDetails {
+                        path: e.path,
+                        cause: None,
+                        context: None,
+                        schema: None,
+                    },
                 })
                 .collect();
             crate::drop::Drop::with_errors(errors)
@@ -76,7 +81,12 @@ impl Validator {
             Err(e) => crate::drop::Drop::with_errors(vec![crate::drop::Error {
                 code: e.code,
                 message: e.message,
-                details: crate::drop::ErrorDetails { path: e.path },
+                details: crate::drop::ErrorDetails {
+                    path: e.path,
+                    cause: None,
+                    context: None,
+                    schema: None,
+                },
             }]),
         }
     } else {
@@ -84,7 +94,10 @@ impl Validator {
             code: "SCHEMA_NOT_FOUND".to_string(),
             message: format!("Schema {} not found", schema_id),
             details: crate::drop::ErrorDetails {
-                path: "".to_string(),
+                path: "/".to_string(),
+                cause: None,
+                context: None,
+                schema: None,
             },
         }])
     }
t10.json (new file, +54)
@@ -0,0 +1,54 @@
+[
+  [
+    "(SELECT jsonb_build_object(",
+    " 'id', organization_1.id,",
+    " 'type', CASE",
+    " WHEN organization_1.type = 'person' THEN",
+    " ((SELECT jsonb_build_object(",
+    " 'age', person_3.age,",
+    " 'archived', entity_5.archived,",
+    " 'created_at', entity_5.created_at,",
+    " 'first_name', person_3.first_name,",
+    " 'id', entity_5.id,",
+    " 'last_name', person_3.last_name,",
+    " 'name', entity_5.name,",
+    " 'type', entity_5.type",
+    " )",
+    " FROM agreego.person person_3",
+    " JOIN agreego.organization organization_4 ON organization_4.id = person_3.id",
+    " JOIN agreego.entity entity_5 ON entity_5.id = organization_4.id",
+    " WHERE",
+    " NOT entity_5.archived))",
+    " WHEN organization_1.type = 'bot' THEN",
+    " ((SELECT jsonb_build_object(",
+    " 'archived', entity_8.archived,",
+    " 'created_at', entity_8.created_at,",
+    " 'id', entity_8.id,",
+    " 'name', entity_8.name,",
+    " 'token', bot_6.token,",
+    " 'type', entity_8.type",
+    " )",
+    " FROM agreego.bot bot_6",
+    " JOIN agreego.organization organization_7 ON organization_7.id = bot_6.id",
+    " JOIN agreego.entity entity_8 ON entity_8.id = organization_7.id",
+    " WHERE",
+    " NOT entity_8.archived))",
+    " WHEN organization_1.type = 'organization' THEN",
+    " ((SELECT jsonb_build_object(",
+    " 'archived', entity_10.archived,",
+    " 'created_at', entity_10.created_at,",
+    " 'id', entity_10.id,",
+    " 'name', entity_10.name,",
+    " 'type', entity_10.type",
+    " )",
+    " FROM agreego.organization organization_9",
+    " JOIN agreego.entity entity_10 ON entity_10.id = organization_9.id",
+    " WHERE",
+    " NOT entity_10.archived))",
+    " ELSE NULL END",
+    ")",
+    "FROM agreego.organization organization_1",
+    "JOIN agreego.entity entity_2 ON entity_2.id = organization_1.id",
+    "WHERE NOT entity_2.archived)"
+  ]
+]
t4.json (new file, +164)
@@ -0,0 +1,164 @@
+[
+  [
+    "(SELECT jsonb_build_object(",
+    " 'addresses',",
+    " (SELECT COALESCE(jsonb_agg(jsonb_build_object(",
+    " 'archived', entity_6.archived,",
+    " 'created_at', entity_6.created_at,",
+    " 'id', entity_6.id,",
+    " 'is_primary', contact_4.is_primary,",
+    " 'name', entity_6.name,",
+    " 'target',",
+    " (SELECT jsonb_build_object(",
+    " 'archived', entity_8.archived,",
+    " 'city', address_7.city,",
+    " 'created_at', entity_8.created_at,",
+    " 'id', entity_8.id,",
+    " 'name', entity_8.name,",
+    " 'type', entity_8.type",
+    " )",
+    " FROM agreego.address address_7",
+    " JOIN agreego.entity entity_8 ON entity_8.id = address_7.id",
+    " WHERE",
+    " NOT entity_8.archived",
+    " AND relationship_5.target_id = address_7.id),",
+    " 'type', entity_6.type",
+    " )), '[]'::jsonb)",
+    " FROM agreego.contact contact_4",
+    " JOIN agreego.relationship relationship_5 ON relationship_5.id = contact_4.id",
+    " JOIN agreego.entity entity_6 ON entity_6.id = relationship_5.id",
+    " WHERE",
+    " NOT entity_6.archived",
+    " AND contact_4.parent_id = entity_3.id),",
+    " 'age', person_1.age,",
+    " 'archived', entity_3.archived,",
+    " 'contacts',",
+    " (SELECT COALESCE(jsonb_agg(jsonb_build_object(",
+    " 'archived', entity_11.archived,",
+    " 'created_at', entity_11.created_at,",
+    " 'id', entity_11.id,",
+    " 'is_primary', contact_9.is_primary,",
+    " 'name', entity_11.name,",
+    " 'target', CASE",
+    " WHEN entity_11.target_type = 'address' THEN",
+    " ((SELECT jsonb_build_object(",
+    " 'archived', entity_17.archived,",
+    " 'city', address_16.city,",
+    " 'created_at', entity_17.created_at,",
+    " 'id', entity_17.id,",
+    " 'name', entity_17.name,",
+    " 'type', entity_17.type",
+    " )",
+    " FROM agreego.address address_16",
+    " JOIN agreego.entity entity_17 ON entity_17.id = address_16.id",
+    " WHERE",
+    " NOT entity_17.archived",
+    " AND relationship_10.target_id = address_16.id))",
+    " WHEN entity_11.target_type = 'email_address' THEN",
+    " ((SELECT jsonb_build_object(",
+    " 'address', email_address_14.address,",
+    " 'archived', entity_15.archived,",
+    " 'created_at', entity_15.created_at,",
+    " 'id', entity_15.id,",
+    " 'name', entity_15.name,",
+    " 'type', entity_15.type",
+    " )",
+    " FROM agreego.email_address email_address_14",
+    " JOIN agreego.entity entity_15 ON entity_15.id = email_address_14.id",
+    " WHERE",
+    " NOT entity_15.archived",
+    " AND relationship_10.target_id = email_address_14.id))",
+    " WHEN entity_11.target_type = 'phone_number' THEN",
+    " ((SELECT jsonb_build_object(",
+    " 'archived', entity_13.archived,",
+    " 'created_at', entity_13.created_at,",
+    " 'id', entity_13.id,",
+    " 'name', entity_13.name,",
+    " 'number', phone_number_12.number,",
+    " 'type', entity_13.type",
+    " )",
+    " FROM agreego.phone_number phone_number_12",
+    " JOIN agreego.entity entity_13 ON entity_13.id = phone_number_12.id",
+    " WHERE",
+    " NOT entity_13.archived",
+    " AND relationship_10.target_id = phone_number_12.id))",
+    " ELSE NULL END,",
+    " 'type', entity_11.type",
+    " )), '[]'::jsonb)",
+    " FROM agreego.contact contact_9",
+    " JOIN agreego.relationship relationship_10 ON relationship_10.id = contact_9.id",
+    " JOIN agreego.entity entity_11 ON entity_11.id = relationship_10.id",
+    " WHERE",
+    " NOT entity_11.archived",
+    " AND contact_9.parent_id = entity_3.id),",
+    " 'created_at', entity_3.created_at,",
+    " 'email_addresses',",
+    " (SELECT COALESCE(jsonb_agg(jsonb_build_object(",
+    " 'archived', entity_20.archived,",
+    " 'created_at', entity_20.created_at,",
+    " 'id', entity_20.id,",
+    " 'is_primary', contact_18.is_primary,",
+    " 'name', entity_20.name,",
+    " 'target',",
+    " (SELECT jsonb_build_object(",
+    " 'address', email_address_21.address,",
+    " 'archived', entity_22.archived,",
+    " 'created_at', entity_22.created_at,",
+    " 'id', entity_22.id,",
+    " 'name', entity_22.name,",
+    " 'type', entity_22.type",
+    " )",
+    " FROM agreego.email_address email_address_21",
+    " JOIN agreego.entity entity_22 ON entity_22.id = email_address_21.id",
+    " WHERE",
+    " NOT entity_22.archived",
|
||||||
|
" AND relationship_19.target_id = email_address_21.id),",
|
||||||
|
" 'type', entity_20.type",
|
||||||
|
" )), '[]'::jsonb)",
|
||||||
|
" FROM agreego.contact contact_18",
|
||||||
|
" JOIN agreego.relationship relationship_19 ON relationship_19.id = contact_18.id",
|
||||||
|
" JOIN agreego.entity entity_20 ON entity_20.id = relationship_19.id",
|
||||||
|
" WHERE",
|
||||||
|
" NOT entity_20.archived",
|
||||||
|
" AND contact_18.parent_id = entity_3.id),",
|
||||||
|
" 'first_name', person_1.first_name,",
|
||||||
|
" 'id', entity_3.id,",
|
||||||
|
" 'last_name', person_1.last_name,",
|
||||||
|
" 'name', entity_3.name,",
|
||||||
|
" 'phone_numbers',",
|
||||||
|
" (SELECT COALESCE(jsonb_agg(jsonb_build_object(",
|
||||||
|
" 'archived', entity_25.archived,",
|
||||||
|
" 'created_at', entity_25.created_at,",
|
||||||
|
" 'id', entity_25.id,",
|
||||||
|
" 'is_primary', contact_23.is_primary,",
|
||||||
|
" 'name', entity_25.name,",
|
||||||
|
" 'target',",
|
||||||
|
" (SELECT jsonb_build_object(",
|
||||||
|
" 'archived', entity_27.archived,",
|
||||||
|
" 'created_at', entity_27.created_at,",
|
||||||
|
" 'id', entity_27.id,",
|
||||||
|
" 'name', entity_27.name,",
|
||||||
|
" 'number', phone_number_26.number,",
|
||||||
|
" 'type', entity_27.type",
|
||||||
|
" )",
|
||||||
|
" FROM agreego.phone_number phone_number_26",
|
||||||
|
" JOIN agreego.entity entity_27 ON entity_27.id = phone_number_26.id",
|
||||||
|
" WHERE",
|
||||||
|
" NOT entity_27.archived",
|
||||||
|
" AND relationship_24.target_id = phone_number_26.id),",
|
||||||
|
" 'type', entity_25.type",
|
||||||
|
" )), '[]'::jsonb)",
|
||||||
|
" FROM agreego.contact contact_23",
|
||||||
|
" JOIN agreego.relationship relationship_24 ON relationship_24.id = contact_23.id",
|
||||||
|
" JOIN agreego.entity entity_25 ON entity_25.id = relationship_24.id",
|
||||||
|
" WHERE",
|
||||||
|
" NOT entity_25.archived",
|
||||||
|
" AND contact_23.parent_id = entity_3.id),",
|
||||||
|
" 'type', entity_3.type",
|
||||||
|
")",
|
||||||
|
"FROM agreego.person person_1",
|
||||||
|
"JOIN agreego.organization organization_2 ON organization_2.id = person_1.id",
|
||||||
|
"JOIN agreego.entity entity_3 ON entity_3.id = organization_2.id",
|
||||||
|
"WHERE NOT entity_3.archived)"
|
||||||
|
]
|
||||||
|
]