fixed ordering of all things sql
GEMINI.md (23 changed lines)
@@ -285,3 +285,26 @@ JSPG abandons the standard `cargo pgrx test` model in favor of native OS testing
3. **Modular Test Dispatcher**: The `src/tests/types/` module deserializes the abstract JSON test payloads into `Suite`, `Case`, and `Expect` data structures.
    * The `compile` action natively asserts the exact output shape of `jspg_stems`, allowing structural and relationship-mapping logic to be tested purely through JSON, without hand-writing brute-force tests in Rust.
4. **Unit Context Execution**: When `cargo test` executes, the runner iterates the JSON payloads. Because the tests run natively inside the module via `#[cfg(test)]`, the Rust compiler compiles out the `pgrx` C linkage, instantiates the `MockExecutor`, and allows pure structural evaluation of complex database logic entirely in memory, in parallel.
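A minimal, hypothetical fixture illustrates the shape the dispatcher consumes: a top-level array of suites, each holding `tests` (the `Case` entries), where a case carries a `description`, an `action`, input `data`/`mocks`, and an `expect` block. Key names beyond those are invented for illustration:

```python
import json

# Hypothetical fixture sketch; only tests/description/action/data/mocks/expect
# are taken from the runner, the rest is illustrative.
fixture = json.loads("""
[
  {
    "description": "merger suite",
    "tests": [
      {
        "description": "inserts a customer record",
        "action": "merge",
        "data": {"customer_id": "11111111-1111-1111-1111-111111111111"},
        "mocks": [{"id": "33333333-3333-3333-3333-333333333333"}],
        "expect": {"sql": []}
      }
    ]
  }
]
""")

case = fixture[0]["tests"][0]
print(case["action"])  # -> merge
```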
### SQL Expectation Formatting & Auto-Variablization

Because JSPG SQL compilation generates large, complex relational statements (often featuring dynamically generated UUIDs or timestamps), manually updating the expected SQL strings in the test fixtures is error-prone and tedious. To streamline this, JSPG includes a built-in intelligent test fixture formatter.

**When to use it:**

Whenever you modify the internal SQL generation logic (in the Queryer or Merger) and need to update the expected SQL outputs across the entire test suite.

**How to run it:**

Run the test suite sequentially with the `UPDATE_EXPECT=1` environment variable set:
```bash
UPDATE_EXPECT=1 cargo test -- --test-threads=1
```

*Note: `--test-threads=1` is a flag for the test binary, so it must appear after the `--` separator. Running sequentially is strictly required to prevent parallel tests from concurrently overwriting, and thereby corrupting, the same JSON fixture files.*
**How it works (Intelligent Variablization):**

The JSPG engine generates actual, random UUIDs in memory for records inserted during `merger` tests. To assert relational integrity without hardcoding these ephemeral random strings, the formatter builds an intelligent variable extraction map:

1. **Payload Extraction**: Before evaluating the SQL output, the test runner recursively scans the JSON of the `data` and `mocks` blocks for that specific test case. It maps any physical UUID it finds to its exact JSON path (e.g., `3333...` -> `mocks.0.id`).
2. **SQL Canonicalization**: The test runner uses `sqlparser` to format the raw engine SQL into pristine, multi-line readable structures.
3. **Variable Mapping**: It scans the formatted SQL for UUIDs using a regex. If it encounters a UUID present in the payload extraction map, it replaces it with a template tag such as `{{uuid:mocks.0.id}}` or `{{uuid:data.customer_id}}`.
4. **Generated Fallbacks**: If it encounters a brand-new random UUID that was not provided in the inputs (e.g., a freshly generated ID for an `INSERT`), it assigns a sequential tracking variable such as `{{uuid:generated_0}}`. Every subsequent appearance of that *exact* same random UUID within the SQL transaction reuses the `{{uuid:generated_0}}` tag. Timestamps are likewise replaced with `{{timestamp}}`.

This guarantees the `assert_pattern` execution engine can strictly validate that the exact same ID generated for a parent entity is correctly passed as a foreign key to its children across complex database transactions.
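Steps 1, 3, and 4 above can be sketched in a few lines of Python (function names are illustrative; the real formatter additionally skips the all-zero nil UUID and swaps timestamps for `{{timestamp}}`, omitted here for brevity):

```python
import re

UUID_RE = re.compile(r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}")

def extract_uuids(val, path="", out=None):
    """Step 1: map every UUID in the payload to its JSON path, e.g. -> 'mocks.0.id'."""
    out = {} if out is None else out
    if isinstance(val, dict):
        for k, v in val.items():
            extract_uuids(v, f"{path}.{k}" if path else k, out)
    elif isinstance(val, list):
        for i, v in enumerate(val):
            extract_uuids(v, f"{path}.{i}" if path else str(i), out)
    elif isinstance(val, str) and UUID_RE.fullmatch(val):
        out[val] = path
    return out

def canonicalize(sql, uuid_map, gen_map):
    """Steps 3-4: known UUIDs become {{uuid:<path>}}; unknown ones get a
    sequential {{uuid:generated_N}} tag that is reused on repeat appearances."""
    def sub(m):
        v = m.group(0)
        if v in uuid_map:
            return "{{uuid:%s}}" % uuid_map[v]
        idx = gen_map.setdefault(v, len(gen_map))
        return "{{uuid:generated_%d}}" % idx
    return UUID_RE.sub(sub, sql)

case = {"mocks": [{"id": "33333333-3333-3333-3333-333333333333"}]}
uuid_map = extract_uuids(case)
sql = ("INSERT INTO t VALUES ('33333333-3333-3333-3333-333333333333', "
       "'aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa', 'aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa')")
print(canonicalize(sql, uuid_map, {}))
# -> INSERT INTO t VALUES ('{{uuid:mocks.0.id}}', '{{uuid:generated_0}}', '{{uuid:generated_0}}')
```

Reusing `{{uuid:generated_0}}` for every appearance of the same fresh UUID is what lets a fixture assert that a parent's generated ID is the one passed to its children.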
fixtures/merger.json (1891 changed lines)
File diff suppressed because it is too large. Load Diff
@@ -1,128 +0,0 @@
```python
import json
import re
import subprocess
import os


def format_sql(sql_str):
    """
    Given a single-line SQL string from the test runner,
    formats it with beautiful indentation according to the
    rules seen in the jspg project fixtures.
    """

    # 1. First, let's normalize spaces around operators to make splitting easier.
    # We'll use a simple regex tokenizer.
    # The actual SQL doesn't have spaces around =, >, <, etc.
    sql_str = re.sub(r'([a-zA-Z0-9_]+)\.([a-zA-Z0-9_]+)=([a-zA-Z0-9_]+)\.([a-zA-Z0-9_]+)', r'\1.\2 = \3.\4', sql_str)
    sql_str = re.sub(r"([a-zA-Z0-9_]+)\.([a-zA-Z0-9_]+)='([a-zA-Z0-9_]+)'", r"\1.\2 = '\3'", sql_str)
    sql_str = re.sub(r"([a-zA-Z0-9_]+)\.([a-zA-Z0-9_]+)>([a-zA-Z0-9_]+)\.([a-zA-Z0-9_]+)", r"\1.\2 > \3.\4", sql_str)
    sql_str = sql_str.replace("AND ", " AND ")
    sql_str = sql_str.replace("WHERE NOT", "WHERE NOT")

    # We'll just run a basic custom state-machine formatter
    # Let's clean up tokens to preserve spaces.

    # We will build the string by adding newlines and indentation where appropriate.
    out = []
    indent = 0
    i = 0

    # A quick helper to match and consume
    def match(prefix):
        if sql_str[i:].startswith(prefix):
            return True
        return False

    in_build_object = []

    # Let's just use a simpler replacement strategy for line breaks,
    # then iterate over lines to fix indentation.

    # Pre-process for line breaks:
    s = sql_str

    # Break before certain keywords
    s = s.replace("(SELECT COALESCE", "\n(SELECT COALESCE")
    s = s.replace("FROM ", "\nFROM ")
    s = s.replace("JOIN ", "\nJOIN ")
    s = s.replace("WHERE ", "\nWHERE\n ")
    s = s.replace(" AND ", "\n AND ")

    # Break before keys in jsonb_build_object, but only if they are followed by a subquery
    # We'll do this by matching: ,'key_name',(SELECT
    s = re.sub(r",('([^']+)')\s*,\s*\(SELECT", r",\n\1,\n(SELECT", s)

    # Also break scalar keys in jsonb_build_object
    s = re.sub(r",('([^']+)')\s*,", r",\n\1, ", s)
    s = s.replace("jsonb_build_object('", "jsonb_build_object(\n'")

    # CASE statements
    s = s.replace("CASE WHEN", "CASE\nWHEN")
    s = s.replace("THEN(", "THEN\n(")
    s = s.replace("ELSE NULL END", "\nELSE NULL END")
    s = s.replace(" WHEN ", "\nWHEN ")

    lines = [l.strip() for l in s.split('\n') if l.strip()]

    # Now we do a pass to compute indentations based on parenthesis matching and keywords.
    formatted_lines = []
    current_indent = 0

    for idx, line in enumerate(lines):
        # Calculate indent delta before
        close_paren_count = 0
        while line.startswith(')'):
            close_paren_count += 1
            line = line[1:]

        if close_paren_count > 0:
            current_indent = max(0, current_indent - 2 * close_paren_count)
            # Prepend the closed parens to the line properly if there's text left,
            # or just emit them if it's just parens.
            if line:
                pass  # We handle adding them back later
            else:
                formatted_lines.append(" " * current_indent + ")" * close_paren_count)
                continue

        # Handle specific keywords
        if line.startswith("FROM ") or line.startswith("JOIN ") or line.startswith("WHERE"):
            pass  # Keep parent indent
        elif line.startswith("AND "):
            line = " " + line
        elif line.startswith("WHEN "):
            line = " " + line
        elif line.startswith("ELSE "):
            line = " " + line

        # If it's a key value pair in build_object, we indent
        if line.startswith("'") and "jsonb_build_object" not in line:
            # We add 2 extra spaces for the items inside build_object
            line = " " + line

        if line.startswith("(SELECT jsonb_build_object"):
            line = " " + line

        formatted_line = (" " * current_indent) + (")" * close_paren_count) + line

        # Calculate indent delta after
        open_paren_count = line.count('(') - line.count(')')
        current_indent += max(0, open_paren_count * 2)

        formatted_lines.append(formatted_line)

    return formatted_lines


def format_sql_regex(sql_str):
    # The actual jspg parser output might be tricky, let's use a simpler heuristic formatting
    # based exactly on the user's provided output format.
    # It requires custom tokenizing because of nested SELECTs.

    # Let's try to tokenise
    tokens = re.split(r"(\(SELECT COALESCE|\(SELECT jsonb_build_object|FROM|JOIN|WHERE|AND|CASE|WHEN|THEN|ELSE NULL END|\n|,\s*')", sql_str)

    pass

    # We will actually just run `cargo test -- --nocapture` to grab the actual SQLs
    # and do some string replacements.
    # Given the complexity, let's build a dedicated node-based formatter in python.
```
@@ -1,3 +1,4 @@
+use indexmap::IndexSet;
 use crate::database::schema::Schema;

 impl Schema {
@@ -65,10 +66,10 @@ impl Schema {
             }
         }
     } else if let Some(one_of) = &self.obj.one_of {
-        let mut type_vals = std::collections::HashSet::new();
-        let mut kind_vals = std::collections::HashSet::new();
+        let mut type_vals = IndexSet::new();
+        let mut kind_vals = IndexSet::new();
         let mut disjoint_base = true;
-        let mut structural_types = std::collections::HashSet::new();
+        let mut structural_types = IndexSet::new();

         for c in one_of {
             let mut child_id = String::new();
@@ -1,4 +1,5 @@
 use crate::database::schema::Schema;
+use indexmap::IndexMap;
 use serde::{Deserialize, Serialize};
 use std::sync::Arc;

@@ -10,5 +11,5 @@ pub struct Enum {
     pub source: String,
     pub values: Vec<String>,
     #[serde(default)]
-    pub schemas: std::collections::BTreeMap<String, Arc<Schema>>,
+    pub schemas: IndexMap<String, Arc<Schema>>,
 }
@@ -1,5 +1,6 @@
 use crate::database::page::Page;
 use crate::database::schema::Schema;
+use indexmap::IndexMap;
 use serde::{Deserialize, Serialize};
 use std::sync::Arc;

@@ -18,5 +19,5 @@ pub struct Punc {
     pub save: Option<String>,
     pub page: Option<Page>,
     #[serde(default)]
-    pub schemas: std::collections::BTreeMap<String, Arc<Schema>>,
+    pub schemas: IndexMap<String, Arc<Schema>>,
 }
@@ -1,4 +1,4 @@
-use std::collections::HashSet;
+use indexmap::{IndexMap, IndexSet};

 use crate::database::schema::Schema;
 use serde::{Deserialize, Serialize};
@@ -25,7 +25,7 @@ pub struct Type {
     #[serde(default)]
     pub hierarchy: Vec<String>,
     #[serde(default)]
-    pub variations: HashSet<String>,
+    pub variations: IndexSet<String>,
     #[serde(default)]
     pub relationship: bool,
     #[serde(default)]
@@ -39,5 +39,5 @@ pub struct Type {
     pub default_fields: Vec<String>,
     pub field_types: Option<Value>,
     #[serde(default)]
-    pub schemas: std::collections::BTreeMap<String, Arc<Schema>>,
+    pub schemas: IndexMap<String, Arc<Schema>>,
 }
src/tests/formatter.rs (393 lines, new file)
@@ -0,0 +1,393 @@
```rust
use sqlparser::ast::{
    BinaryOperator, Expr, Function, FunctionArg, Join, JoinConstraint, JoinOperator,
    Query, Select, SelectItem, SetExpr, Statement, TableWithJoins, Value
};
use sqlparser::dialect::PostgreSqlDialect;
use sqlparser::parser::Parser;

pub struct SqlFormatter {
    pub lines: Vec<String>,
    pub indent: usize,
}

impl SqlFormatter {
    pub fn new() -> Self {
        Self {
            lines: Vec::new(),
            indent: 0,
        }
    }

    pub fn format(sql: &str) -> Vec<String> {
        let dialect = PostgreSqlDialect {};
        let ast = match Parser::parse_sql(&dialect, sql) {
            Ok(ast) => ast,
            Err(e) => {
                println!("DEBUG PARSE SQL ERROR: {:?}", e);
                return vec![sql.to_string()];
            }
        };

        if ast.is_empty() {
            return vec![sql.to_string()];
        }

        let mut formatter = SqlFormatter::new();
        formatter.format_statement(&ast[0]);
        formatter.lines
    }

    fn push_str(&mut self, s: &str) {
        if self.lines.is_empty() {
            self.lines.push(format!("{}{}", " ".repeat(self.indent), s.replace("JSONB", "jsonb")));
        } else {
            let last = self.lines.last_mut().unwrap();
            last.push_str(&s.replace("JSONB", "jsonb"));
        }
    }

    fn push_line(&mut self, s: &str) {
        self.lines.push(format!("{}{}", " ".repeat(self.indent), s.replace("JSONB", "jsonb")));
    }

    fn format_statement(&mut self, stmt: &Statement) {
        match stmt {
            Statement::Query(query) => {
                self.push_line("(");
                self.format_query(query);
                self.push_str(")");
            }
            Statement::Update(_update) => {
                let sql = stmt.to_string();
                self.format_update_fallback(&sql);
            }
            _ => {
                let sql = stmt.to_string();
                if sql.starts_with("INSERT") {
                    self.format_insert_fallback(&sql);
                } else {
                    self.push_line(&sql);
                }
            }
        }
    }

    fn format_insert_fallback(&mut self, sql: &str) {
        let s = sql.to_string();
        if let Some(values_idx) = s.find(" VALUES (") {
            let prefix = &s[..values_idx];
            let suffix = &s[values_idx + 9..];

            if let Some(paren_idx) = prefix.find(" (") {
                self.push_line(&format!("{} (", &prefix[..paren_idx]));
                self.indent += 2;
                let cols = &prefix[paren_idx + 2..prefix.len() - 1];
                let cols_split: Vec<&str> = cols.split(", ").collect();
                for (i, col) in cols_split.iter().enumerate() {
                    let comma = if i < cols_split.len() - 1 { "," } else { "" };
                    let c = col.replace("\"", "");
                    self.push_line(&format!("\"{}\"{}", c, comma));
                }
                self.indent -= 2;
                self.push_line(")");
            } else {
                self.push_line(prefix);
            }

            self.push_line("VALUES (");
            self.indent += 2;

            let vals = if suffix.ends_with(")") { &suffix[..suffix.len() - 1] } else { suffix };
            let mut val_tokens = Vec::new();
            let mut curr = String::new();
            let mut in_str = false;
            for c in vals.chars() {
                if c == '\'' {
                    in_str = !in_str;
                    curr.push(c);
                } else if c == ',' && !in_str {
                    val_tokens.push(curr.trim().to_string());
                    curr = String::new();
                } else {
                    curr.push(c);
                }
            }
            if !curr.trim().is_empty() {
                val_tokens.push(curr.trim().to_string());
            }

            for (i, val) in val_tokens.iter().enumerate() {
                let comma = if i < val_tokens.len() - 1 { "," } else { "" };

                if val.starts_with("'{") && val.ends_with("}'") {
                    let inner = &val[1..val.len() - 1];
                    // Unescape single quotes from SQL strings
                    let unescaped = inner.replace("''", "'");
                    if let Ok(json) = serde_json::from_str::<serde_json::Value>(&unescaped) {
                        if let Ok(pretty) = serde_json::to_string_pretty(&json) {
                            let lines: Vec<&str> = pretty.split('\n').collect();
                            self.push_line("'{");
                            self.indent += 2;
                            for (j, line) in lines.iter().skip(1).enumerate() {
                                if j == lines.len() - 2 {
                                    self.indent -= 2;
                                    // re-escape single quotes for SQL
                                    self.push_line(&format!("{}'{}", line.replace("'", "''"), comma));
                                } else {
                                    self.push_line(&line.replace("'", "''"));
                                }
                            }
                            continue;
                        }
                    }
                }

                self.push_line(&format!("{}{}", val, comma));
            }
            self.indent -= 2;
            self.push_line(")");
        } else {
            self.push_line(&s);
        }
    }

    fn format_update_fallback(&mut self, sql: &str) {
        let s = sql.to_string();
        if let Some(set_idx) = s.find(" SET ") {
            self.push_line(&format!("{} SET", &s[..set_idx]));
            self.indent += 2;

            let after_set = &s[set_idx + 5..];
            let where_idx = after_set.find(" WHERE ");
            let assigns = if let Some(w) = where_idx { &after_set[..w] } else { after_set };
            let assigns_split: Vec<&str> = assigns.split(", ").collect();
            for (i, assign) in assigns_split.iter().enumerate() {
                let comma = if i < assigns_split.len() - 1 { "," } else { "" };
                self.push_line(&format!("{}{}", assign.replace("\"", ""), comma));
            }
            self.indent -= 2;

            if let Some(w) = where_idx {
                self.push_line("WHERE");
                self.indent += 2;
                self.push_line(&after_set[w + 7..]);
                self.indent -= 2;
            }
        } else {
            self.push_line(&s);
        }
    }

    fn format_query(&mut self, query: &Query) {
        match &*query.body {
            SetExpr::Select(select) => self.format_select(select),
            SetExpr::Query(inner_query) => {
                self.push_str("(");
                self.format_query(inner_query);
                self.push_str(")");
            }
            _ => self.push_str(&query.to_string()),
        }
    }

    fn format_select(&mut self, select: &Select) {
        self.push_str("SELECT ");
        for (i, p) in select.projection.iter().enumerate() {
            let comma = if i < select.projection.len() - 1 { ", " } else { "" };
            self.format_select_item(p);
            self.push_str(comma);
        }

        if !select.from.is_empty() {
            self.push_line("FROM ");
            for (i, table) in select.from.iter().enumerate() {
                let comma = if i < select.from.len() - 1 { ", " } else { "" };
                self.format_table_with_joins(table);
                self.push_str(comma);
            }

            if let Some(selection) = &select.selection {
                self.push_line("WHERE");
                self.indent += 2;
                self.push_line(""); // new line for where clauses
                self.format_expr(selection);
                self.indent -= 2;
            }
        }
    }

    fn format_select_item(&mut self, item: &SelectItem) {
        match item {
            SelectItem::UnnamedExpr(expr) => self.format_expr(expr),
            SelectItem::ExprWithAlias { expr, alias } => {
                self.format_expr(expr);
                self.push_str(&format!(" AS {}", alias));
            }
            _ => self.push_str(&item.to_string()),
        }
    }

    fn format_table_with_joins(&mut self, table: &TableWithJoins) {
        self.push_str(&table.relation.to_string());
        for join in &table.joins {
            self.push_line("");
            self.format_join(join);
        }
    }

    fn format_join(&mut self, join: &Join) {
        let op = match &join.join_operator {
            JoinOperator::Inner(_) => "JOIN",
            JoinOperator::LeftOuter(_) => "LEFT JOIN",
            _ => "JOIN",
        };
        self.push_str(&format!("{} {} ON ", op, join.relation));

        match &join.join_operator {
            JoinOperator::Inner(JoinConstraint::On(expr)) => self.format_expr(expr),
            JoinOperator::LeftOuter(JoinConstraint::On(expr)) => self.format_expr(expr),
            JoinOperator::Join(JoinConstraint::On(expr)) => self.format_expr(expr),
            _ => {
                println!("FALLBACK JOIN OP: {:?}", join.join_operator);
            }
        }
    }

    fn format_expr(&mut self, expr: &Expr) {
        match expr {
            Expr::Function(func) => self.format_function(func),
            Expr::BinaryOp { left, op, right } => {
                if *op == BinaryOperator::And || *op == BinaryOperator::Or {
                    self.format_expr(left);
                    self.push_line(&format!("{} ", op));
                    self.format_expr(right);
                } else {
                    self.format_expr(left);
                    self.push_str(&format!(" {} ", op));
                    self.format_expr(right);
                }
            }
            Expr::Nested(inner) => {
                self.push_str("(");
                self.format_expr(inner);
                self.push_str(")");
            }
            Expr::IsNull(inner) => {
                self.format_expr(inner);
                self.push_str(" IS NULL");
            }
            Expr::IsNotNull(inner) => {
                self.format_expr(inner);
                self.push_str(" IS NOT NULL");
            }
            Expr::Subquery(query) => {
                self.push_str("(");
                self.indent += 2;
                self.push_line("");
                self.format_query(query);
                self.indent -= 2;
                self.push_line(")");
            }
            Expr::Case { operand, conditions, else_result, .. } => {
                self.push_str("CASE");
                if let Some(op) = operand {
                    self.push_str(" ");
                    self.format_expr(op);
                }
                self.indent += 2;
                for when in conditions {
                    self.push_line("WHEN ");
                    self.format_expr(&when.condition);
                    self.push_str(" THEN ");
                    self.format_expr(&when.result);
                }
                if let Some(els) = else_result {
                    self.push_line("ELSE ");
                    self.format_expr(els);
                }
                self.indent -= 2;
                self.push_line("END");
            }
            Expr::UnaryOp { op, expr: inner } => {
                self.push_str(&format!("{} ", op));
                self.format_expr(inner);
            }

            Expr::Value(sqlparser::ast::ValueWithSpan { value: Value::SingleQuotedString(s), .. }) | Expr::Value(sqlparser::ast::ValueWithSpan { value: Value::EscapedStringLiteral(s), .. }) => {
                if s.starts_with('{') && s.ends_with('}') {
                    if let Ok(json) = serde_json::from_str::<serde_json::Value>(s) {
                        if let Ok(pretty) = serde_json::to_string_pretty(&json) {
                            let lines: Vec<&str> = pretty.split('\n').collect();
                            self.push_str("'{");
                            self.indent += 2;
                            for (j, line) in lines.iter().skip(1).enumerate() {
                                if j == lines.len() - 2 {
                                    self.indent -= 2;
                                    self.push_line(&format!("{}'", line.replace("'", "''")));
                                } else {
                                    self.push_line(&line.replace("'", "''"));
                                }
                            }
                            return;
                        }
                    }
                }
                self.push_str(&expr.to_string());
            }
            _ => {
                self.push_str(&expr.to_string());
            }
        }
    }

    fn format_function(&mut self, func: &Function) {
        let name = func.name.to_string();
        self.push_str(&format!("{}(", name));

        if let sqlparser::ast::FunctionArguments::List(list) = &func.args {
            if name == "jsonb_build_object" {
                self.indent += 2;
                self.push_line("");
                let mut i = 0;
                while i < list.args.len() {
                    let arg_key = &list.args[i];
                    let arg_val = if i + 1 < list.args.len() { Some(&list.args[i + 1]) } else { None };

                    self.format_function_arg(arg_key);
                    self.push_str(", ");
                    if let Some(val) = arg_val {
                        self.format_function_arg(val);
                    }

                    if i + 2 < list.args.len() {
                        self.push_str(",");
                        self.push_line("");
                    }
                    i += 2;
                }
                self.indent -= 2;
                self.push_line(")");
            } else {
                for (i, arg) in list.args.iter().enumerate() {
                    let comma = if i < list.args.len() - 1 { ", " } else { "" };
                    self.format_function_arg(arg);
                    self.push_str(comma);
                }
                self.push_str(")");
            }
        } else {
            self.push_str(")");
        }
    }

    fn format_function_arg(&mut self, arg: &FunctionArg) {
        match arg {
            FunctionArg::Unnamed(sqlparser::ast::FunctionArgExpr::Expr(expr)) => self.format_expr(expr),
            _ => {
                println!("FALLBACK ARG: {:?}", arg);
                self.push_str(&arg.to_string());
            }
        }
    }
}
```
@@ -1,4 +1,5 @@
 use crate::*;
+pub mod formatter;
 pub mod runner;
 pub mod types;
 use serde_json::json;
@@ -127,7 +127,7 @@ pub fn run_test_case(path: &str, suite_idx: usize, case_idx: usize) -> Result<()
         }
     }
     "merge" => {
-        let result = test.run_merge(db_unwrapped.unwrap());
+        let result = test.run_merge(db_unwrapped.unwrap(), path, suite_idx, case_idx);
         if let Err(e) = result {
             println!("TEST MERGE ERROR FOR '{}': {}", test.description, e);
             failures.push(format!(
@@ -137,7 +137,7 @@ pub fn run_test_case(path: &str, suite_idx: usize, case_idx: usize) -> Result<()
         }
     }
     "query" => {
-        let result = test.run_query(db_unwrapped.unwrap());
+        let result = test.run_query(db_unwrapped.unwrap(), path, suite_idx, case_idx);
         if let Err(e) = result {
             println!("TEST QUERY ERROR FOR '{}': {}", test.description, e);
             failures.push(format!(
@@ -160,3 +160,83 @@ pub fn run_test_case(path: &str, suite_idx: usize, case_idx: usize) -> Result<()

     Ok(())
 }
+
+pub fn extract_uuids(val: &Value, path: &str, map: &mut HashMap<String, String>) {
+    let uuid_re = regex::Regex::new(r"^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$").unwrap();
+
+    match val {
+        Value::Object(obj) => {
+            for (k, v) in obj {
+                let new_path = if path.is_empty() { k.clone() } else { format!("{}.{}", path, k) };
+                extract_uuids(v, &new_path, map);
+            }
+        }
+        Value::Array(arr) => {
+            for (i, v) in arr.iter().enumerate() {
+                let new_path = if path.is_empty() { i.to_string() } else { format!("{}.{}", path, i) };
+                extract_uuids(v, &new_path, map);
+            }
+        }
+        Value::String(s) => {
+            if s != "00000000-0000-0000-0000-000000000000" && uuid_re.is_match(s) {
+                map.insert(s.clone(), path.to_string());
+            }
+        }
+        _ => {}
+    }
+}
+
+pub fn canonicalize_with_map(s: &str, uuid_map: &HashMap<String, String>, gen_map: &mut HashMap<String, usize>) -> String {
+    let uuid_re = regex::Regex::new(r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}").unwrap();
+    let s1 = uuid_re.replace_all(s, |caps: &regex::Captures| {
+        let val = &caps[0];
+        if val == "00000000-0000-0000-0000-000000000000" {
+            val.to_string()
+        } else if let Some(path) = uuid_map.get(val) {
+            format!("{{{{uuid:{}}}}}", path)
+        } else {
+            let next_idx = gen_map.len();
+            let idx = *gen_map.entry(val.to_string()).or_insert(next_idx);
+            format!("{{{{uuid:generated_{}}}}}", idx)
+        }
+    });
+
+    let ts_re = regex::Regex::new(r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:\.\d{1,6})?(?:Z|\+\d{2}(?::\d{2})?)?").unwrap();
+    ts_re.replace_all(&s1, "{{timestamp}}").to_string()
+}
+
+pub fn update_sql_fixture(path: &str, suite_idx: usize, case_idx: usize, queries: &[String]) {
+    use crate::tests::formatter::SqlFormatter;
+    let content = fs::read_to_string(path).unwrap();
+    let mut file_data: Value = serde_json::from_str(&content).unwrap();
+
+    let mut uuid_map = HashMap::new();
+    if let Some(test_case) = file_data.get(suite_idx).and_then(|s| s.get("tests")).and_then(|t| t.get(case_idx)) {
+        if let Some(data) = test_case.get("data") {
+            extract_uuids(data, "data", &mut uuid_map);
+        }
+        if let Some(mocks) = test_case.get("mocks") {
+            extract_uuids(mocks, "mocks", &mut uuid_map);
+        }
+    }
+
+    let mut gen_map = HashMap::new();
+
+    let mut formatted_sql = Vec::new();
+    for q in queries {
+        let res = SqlFormatter::format(q);
+        let mapped_res: Vec<String> = res.into_iter().map(|l| canonicalize_with_map(&l, &uuid_map, &mut gen_map)).collect();
+        formatted_sql.push(mapped_res);
+    }
+
+    if let Some(expect) = file_data[suite_idx]["tests"][case_idx].get_mut("expect") {
+        if let Some(obj) = expect.as_object_mut() {
+            obj.remove("pattern");
+            obj.insert("sql".to_string(), serde_json::json!(formatted_sql));
+        }
+    }
+
+    // To preserve original formatting, we just use serde_json pretty output
+    let formatted_json = serde_json::to_string_pretty(&file_data).unwrap();
+    fs::write(path, formatted_json).unwrap();
+}
@@ -75,7 +75,7 @@ impl Case {
         Ok(())
     }

-    pub fn run_merge(&self, db: Arc<Database>) -> Result<(), String> {
+    pub fn run_merge(&self, db: Arc<Database>, path: &str, suite_idx: usize, case_idx: usize) -> Result<(), String> {
         if let Some(mocks) = &self.mocks {
             if let Some(arr) = mocks.as_array() {
                 db.executor.set_mocks(arr.clone());
@@ -94,7 +94,10 @@ impl Case {
         } else if result.errors.is_empty() {
             // Only assert SQL if merge succeeded
             let queries = db.executor.get_queries();
-            expect.assert_pattern(&queries).and_then(|_| expect.assert_sql(&queries))
+            if std::env::var("UPDATE_EXPECT").is_ok() {
+                crate::tests::runner::update_sql_fixture(path, suite_idx, case_idx, &queries);
+            }
+            expect.assert_sql(&queries)
         } else {
             Ok(())
         }
@@ -106,7 +109,7 @@ impl Case {
         return_val
     }

-    pub fn run_query(&self, db: Arc<Database>) -> Result<(), String> {
+    pub fn run_query(&self, db: Arc<Database>, path: &str, suite_idx: usize, case_idx: usize) -> Result<(), String> {
         if let Some(mocks) = &self.mocks {
             if let Some(arr) = mocks.as_array() {
                 db.executor.set_mocks(arr.clone());
@@ -123,7 +126,10 @@ impl Case {
             Err(format!("Query {}", e))
         } else if result.errors.is_empty() {
             let queries = db.executor.get_queries();
-            expect.assert_pattern(&queries).and_then(|_| expect.assert_sql(&queries))
+            if std::env::var("UPDATE_EXPECT").is_ok() {
+                crate::tests::runner::update_sql_fixture(path, suite_idx, case_idx, &queries);
+            }
+            expect.assert_sql(&queries)
         } else {
             Ok(())
         }