Is your data gameday ready? Better yet, is your stack built to win, or just built to show up?
Earlier this week, we hosted a panel — Closing the AI Readiness Gap: How AI-Leading Orgs Build Their Data Stack Differently — with leaders from Databricks, dbt, and ThoughtSpot.
Our guests got into the questions data leaders wrestle with: how to make the ROI case for AI infrastructure, how to manage risk, and how to keep metric definitions consistent. Today we cover a few key takeaways, and here's the full conversation (it's worth the listen!).
Our CTO Patrick Vinton moderated, digging into findings from our AI Data Readiness Research with the panelists. Here are three things that organizations succeeding with AI are doing differently.
1) Reframe how you talk about the cost and ROI of a good data infrastructure
Most data leaders position infrastructure as a necessary cost. Russell Christopher, Senior Director of Product Strategy at dbt, argued that's exactly why they lose the budget conversation — because, as he put it, your CFO or CEO may not think data infrastructure is "all that sexy." His advice: learn to manage up and reframe.
"It's not just the cost of doing business so that users can have access to data. You reframe it more as the control surface for AI risk, or it's a multiplier on every single AI dollar that you are going to spend."
Here’s the case to bring to your next AI budget conversation: gaps in the data platform don't just delay AI, they compound cost, risk, and technical debt every time you try to scale.
2) Get your semantic layer right before you scale AI
Anjali Kumari, VP of Product Management at ThoughtSpot, described a common pitfall: organizations pour money into cleaning and prepping data, only to treat business context as an afterthought.
The business language and the data language simply don't match: "revenue" in a board meeting means something different from "ACV" in your reporting, and nobody knows where the authoritative definition lives.
"AI without semantic context is just pattern recognition. The semantic layer is what turns it into decision intelligence."
The fix isn't another PDF of metric definitions attached as an afterthought. It's operationalizing your semantic layer so that definitions are governed, consistent, and flow across your tools.
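As a concrete illustration, here's a minimal sketch of what a governed metric can look like in a dbt-style semantic layer (the model and field names are hypothetical, not from the panel); the point is that "revenue" is defined once, in version-controlled code, and every downstream tool queries that same definition:

```yaml
# Hypothetical dbt semantic-layer sketch. Model and field names are
# illustrative; the pattern is: define the measure and metric once,
# govern it through code review, and let every tool reuse it.
semantic_models:
  - name: orders
    model: ref('fct_orders')          # assumed fact table
    defaults:
      agg_time_dimension: order_date
    entities:
      - name: order_id
        type: primary
    dimensions:
      - name: order_date
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: order_total
        agg: sum

metrics:
  - name: revenue
    label: Revenue
    description: "The single governed definition downstream tools query."
    type: simple
    type_params:
      measure: order_total
```

Because the definition lives in one governed place, a change to what counts as revenue ships through review once and propagates everywhere, instead of drifting across dashboards, decks, and AI prompts.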
If you need another nudge, our AI Data Readiness research backs this up: 88% of Leaders rate their metadata practices as mature, compared to just 6% of Laggards.
3) Treat data readiness as an operating discipline, not a project
Daniel Lacouture, Solutions Architect at Databricks, sees a familiar pattern across industries: early momentum, plenty of POCs, and then projects get stuck in what he calls "POC purgatory."
"The companies that see ROI standardize on one governed foundation, operationalize quality, not just deployment, and pick high-value, repeatable use cases tied to business KPIs. Then they scale that, instead of reinventing the wheel each time."
The teams that break through aren't using fancier models; they're building the foundation and feedback loops so AI can run like a product in production.
Watch the full session
The three takeaways above are just part of the discussion. Watch the full panel to hear how leaders from Databricks, dbt, and ThoughtSpot are thinking about infrastructure, semantics, and scaling AI without stalling.