If there is one thing this year has made clear, it's that modern data and analytics technology is giving us more power, more flexibility, more options... and more responsibility.
The modern tech stack only works if you bring discipline.
Governance, strong architecture, and constant focus on business value are non-negotiable. Skip them, and you’re just trading one kind of data chaos for another.
In this issue, we go over how open formats are changing the modern tech stack, and what it will take to maximize their potential (and avoid disarray).
Plus:
Our Recommended Read: A midyear checkpoint on data priorities
We shared our 2025 data priorities back in January — not trends, but practical areas worth focusing on. From open formats to AI that drives business outcomes, these concepts are showing up in the work we’re doing with clients every day.
Now that we’re halfway through the year, this is a good checkpoint to see how you’re stacking up.
The writing’s on the wall: open data formats are going mainstream.
What that means for your tech stack: Formats like Parquet, Iceberg, and Delta Lake let different tools read the same data without duplicating or moving it, so your stack won't be built around one platform; it'll be built around many.
You might run mission-critical workloads on Databricks… but do quick analysis in DuckDB.
Your storage stays put, while compute flexes based on the job.
Tools don’t need to own your data — they just need to read it.
What that means for your efficiency: You can stop rebuilding dashboards and duplicating pipelines just to make tools play nice — because they’re all reading from the same, consistent source.
But here’s the catch: this kind of flexibility demands serious foresight. Without clear governance, cost control, and architecture discipline, you’ll trade one kind of chaos for another.
Open formats give you real flexibility — but only if your governance and architecture are built to support it.
Emerging Tech Insights
A couple of tech updates on our radar:
1. Databricks Just Entered the OLTP Chat
Databricks’ new Lakebase is bringing transactional power to the lakehouse. Think low-latency, high-frequency transactions — but with cloud-native perks.
Why Lakebase matters:
Built on the reliability and stability of PostgreSQL, with storage and compute decoupled for more flexibility
Handles lightning-fast transactions, thanks to a clever caching layer
Auto-terminates when idle and spins up in about one second — saving costs without risking downtime
It’s a bold move into OLTP territory — and a sign Databricks is betting big on being your all-in-one data platform.
2. Snowflake Goes All-In on AI-Ready Data
What’s new:
Snowflake-native semantic layer to standardize metrics across AI and analytics workflows
Workspaces with Git/dbt integration to streamline deployment and governance
Notebooks and AI app building tools baked right into the platform
Why it matters: Snowflake’s betting big on governed, ready-for-AI data — not just automation. CEO Sridhar Ramaswamy emphasized that simplicity — not complexity — is what drives value. From semantic consistency to version-controlled workflows, the focus is shifting toward making AI trustworthy and useful, not just flashy.
From the Field
A Better Way to Monitor KPIs: Statistical Process Control (SPC) Dashboards
One of our experts built a lightweight performance-tracking app — using an SPC tool called an XmR chart — to help a client spot unusual patterns in website traffic, so they could dynamically reallocate their multi-channel advertising budget.
Unlike a traditional dashboard that merely displays values and trends, SPC proactively distinguishes routine fluctuations from genuine issues and signals when action is needed.
Here’s a look at the app’s drill-down view, which gives a clean visual of the upper and lower control limits for one of their KPIs:
Why this matters: This approach supports data-driven improvement because teams avoid overcorrecting for routine variation and focus on fixing the right problems.
When to consider SPC: SPC originated in manufacturing, but its applications are spreading across many other industries. It shines wherever you want to improve a recurring process over time (logistics, clinical operations, QA, software development, customer service, data quality, etc.).
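For the curious, the core XmR logic is compact enough to sketch. This is a minimal illustration, not our client implementation; the traffic numbers are made up, and 2.66 is the standard XmR scaling constant for individuals charts:

```python
def xmr_limits(values):
    """Return (center, lower, upper) natural process limits for an XmR chart."""
    mean = sum(values) / len(values)
    # Moving ranges: absolute differences between consecutive points.
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    # 2.66 is the standard individuals-chart scaling constant.
    return mean, mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def signals(values):
    """Indices of points outside the natural process limits --
    variation unlikely to be routine noise, and worth investigating."""
    _, lo, hi = xmr_limits(values)
    return [i for i, v in enumerate(values) if v < lo or v > hi]

# Daily website sessions (hypothetical): mostly routine, one spike.
traffic = [980, 1010, 995, 1020, 990, 1005, 1700, 1000, 985]
print(signals(traffic))  # -> [6], the spike worth acting on
```

This is exactly the distinction the dashboard draws: points inside the limits are routine fluctuation to be left alone; points outside them are signals that merit action.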
Busting Data Myths
🛑 Myth: "AI is a faster way to do analytics."
✅ Reality Check: As our CEO put it: “AI is redefining what analytics even looks like.”
Yes, AI writes your SQL. Yes, it summarizes findings. But that’s just the start. With tools like Databricks AI/BI and Zenlytic — and new tech like Model Context Protocol (MCP) — AI is shifting how we interact with data altogether.
These aren’t clunky “ask your data” dashboards 2.0. When implemented correctly, you truly can have a "conversation with your data."