Key Takeaways
- Use testing frameworks like tSQLt or utPLSQL to write and run SQL tests
- Create test cases with sample data to verify query correctness
- Utilize in-memory databases or database snapshots for isolation and performance
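The takeaways above can be sketched concretely. A minimal example, assuming an in-memory SQLite database and invented table/column names, shows how sample data plus an isolated fixture lets you assert on a query's result:

```python
import sqlite3

# An in-memory database gives each test an isolated, fast fixture.
# Table and column names here are invented for illustration.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 100.0), (2, 50.0)])

# The query under test, verified against known sample data.
total = con.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
assert total == 150.0
```

The same pattern plugs into any test runner (pytest, unittest, etc.), since the database is created and torn down entirely within the test.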
I have since moved to SQLite in Node.js, and one of the things I just implemented is views that look at strategic points in the DB (where data is ingested) and test whether all text fields are Unicode-normalized; for this I can use the JS `String::normalize()` method, which I previously didn't even know of. Stuff like this makes me easily forget the many advanced features of PostgreSQL and forgive SQLite its quirks, because the overall experience is that much more enjoyable. And, to answer directly: I can write unit tests for SQL in the same file and using the same code that I use for all my unit tests. Yay!
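The idea above (a view over an ingestion table that flags non-normalized text) can be sketched in a few lines. This is a transposition to Python's `sqlite3` module for brevity, not the commenter's Node.js code; the table and view names are invented:

```python
import sqlite3
import unicodedata

# Register a scalar SQL function that reports whether a string is already
# in NFC form, then build a view that surfaces offending rows.
con = sqlite3.connect(":memory:")
con.create_function(
    "is_nfc", 1, lambda s: int(unicodedata.is_normalized("NFC", s))
)
con.execute("CREATE TABLE ingest (id INTEGER PRIMARY KEY, body TEXT)")
con.execute(
    "CREATE VIEW unnormalized AS "
    "SELECT id, body FROM ingest WHERE NOT is_nfc(body)"
)

con.execute("INSERT INTO ingest (body) VALUES ('caf\u00e9')")   # precomposed e-acute
con.execute("INSERT INTO ingest (body) VALUES ('cafe\u0301')")  # e + combining accent
bad = con.execute("SELECT id FROM unnormalized").fetchall()
print(bad)  # only the decomposed row should appear
```

In Node.js the equivalent would register a user-defined function that calls `String.prototype.normalize()` and compares the result to the input.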
On how this compares to alternatives: the way we think about it, there are roughly three layers. DQ monitors check whether your data is healthy after it's already in production (anomalies, freshness, etc.). dbt tests verify your models are sound during deployment (constraints, relationships, schema).
What was missing for us was testing the actual business logic before you deploy. For example: does this calculation actually return what you expect for these inputs?
This framework lets you define mock tables as dataclasses and inject them via CTEs, so it's fast, and you can validate your logic in development before it ever hits production. The same test works across warehouses too, which was useful when we were evaluating migrations. It's also not tied to dbt or any specific framework.
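Since the framework isn't named, here is a hypothetical sketch of the mechanism described: rows defined as dataclasses are rendered into a `VALUES`-based CTE that shadows the real table, so the query under test runs against mock data (demonstrated on SQLite; the `Order`/`orders` names and `mock_cte` helper are invented):

```python
import sqlite3
from dataclasses import dataclass, fields

@dataclass
class Order:
    id: int
    amount: float

def mock_cte(name: str, rows: list) -> str:
    """Render dataclass rows as a VALUES-based CTE shadowing a real table."""
    cols = [f.name for f in fields(rows[0])]
    values = ", ".join(
        "(" + ", ".join(repr(getattr(r, c)) for c in cols) + ")" for r in rows
    )
    return f"{name}({', '.join(cols)}) AS (VALUES {values})"

# The business logic under test, unchanged from production.
query_under_test = "SELECT SUM(amount) AS total FROM orders"

rows = [Order(1, 10.0), Order(2, 2.5)]
test_sql = f"WITH {mock_cte('orders', rows)} {query_under_test}"

con = sqlite3.connect(":memory:")
total = con.execute(test_sql).fetchone()[0]
assert total == 12.5
```

Because the mock is injected as a CTE rather than loaded into tables, no fixtures need to be created or cleaned up, and the same generated SQL can be pointed at different warehouses.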
Curious how others test SQL logic today. Do you write tests, or just rely on warehouse runs?