Expectations around data and AI are rising fast. (You already know that.)
Tolerance for wasted effort is shrinking just as quickly.
2026 won't reward teams that add the most – or slickest – technology. It will reward leaders who make harder calls: where to invest, what to standardize, what to operationalize, and what to stop doing before it drains time and trust.
We asked our 7 practice leads a straightforward question: what data practices are working for clients right now, and what keeps failing?
Their answers aren't a list of trends. They're priorities shaped by what actually drives adoption: where they agreed, where they disagreed, and why that tension matters as you execute this year.
"No matter how successful modernization is, how much the execs buy in, how innovative your design is — if the organization isn't adopting what's being built, it doesn't matter how god it is."
- Travis LaMont
Project Management Practice Lead, Analytics8
Seven practice leads, seven different focus areas, and one shared conclusion:
Your biggest constraint in 2026 is not technology capability. It’s whether your organization is ready to adopt what you build.
Across every conversation, the same message surfaced: AI, analytics, and modern platforms only create value when teams trust them, use them, and change how they work because of them.
Here are 4 things our experts say you need to get right:
1) Use Governance to Make AI Explainable and Controllable
Governance is no longer about policy decks or documentation no one reads. It’s the control layer that makes AI usable at scale.
As AI moves from analysis into decision-making, leaders need to understand why systems behave the way they do and be able to explain that behavior to the business.
Christina Salmi argues the shift needs to be toward ontologies — formalizing definitions, relationships, and rules that already exist in people’s heads. When meaning is explicit and reusable, AI stops being a black box and starts becoming something the organization can stand behind.
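To make the ontology point concrete, here is a minimal sketch in Python of what formalizing definitions, relationships, and rules can look like in practice. The concepts, rules, and the explain() helper below are illustrative assumptions, not anyone's actual model:

```python
from dataclasses import dataclass, field

# A minimal, illustrative business ontology: explicit definitions,
# relationships, and rules that would otherwise live in people's heads.

@dataclass
class Concept:
    name: str
    definition: str
    relationships: dict[str, str] = field(default_factory=dict)  # relation -> other concept
    rules: list[str] = field(default_factory=list)               # plain-language constraints

ONTOLOGY = {
    "active_customer": Concept(
        name="active_customer",
        definition="A customer with at least one paid order in the last 90 days.",
        relationships={"is_a": "customer", "measured_by": "orders"},
        rules=["Exclude test accounts.", "Refunded-only orders do not count."],
    ),
    "churn_rate": Concept(
        name="churn_rate",
        definition="Share of active customers at period start who are no longer active at period end.",
        relationships={"depends_on": "active_customer"},
        rules=["Report monthly unless stated otherwise."],
    ),
}

def explain(term: str) -> str:
    """Return the agreed definition, related concepts, and rules for a term,
    so an AI answer that uses it can be traced back to shared meaning."""
    c = ONTOLOGY[term]
    related = ", ".join(f"{rel} {other}" for rel, other in c.relationships.items())
    return f"{c.name}: {c.definition} Related: {related}. Rules: {' '.join(c.rules)}"

if __name__ == "__main__":
    # Ground a model prompt (or a human question) in the explicit definition,
    # instead of whatever each team assumes "churn" means.
    print(explain("churn_rate"))
```

The value isn't the code itself; it's that the definitions and rules live in one explicit, reusable place that both people and AI systems draw from.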
Governance doesn’t slow innovation. It’s what allows you to move faster without losing control.
2) Align Data Teams to Decisions, Not Just Delivery
Too many organizations still run data teams like production lines. Ship the report. Close the ticket. Move on.
Hart Shuford sees the same pattern repeatedly: when data teams aren’t tied to real business decisions, insights stall at analysis. Reports get delivered, but no one knows what to do next.
AI raises expectations here. Analysis is easier than ever. Acting on it is still hard.
In 2026, data leaders should hold teams accountable to outcomes, not outputs. That means aligning work to specific decisions, metrics, and operational goals — not just faster delivery.
3) Turn AI Answers into Action
The problem with AI isn’t access to answers. It’s turning those answers into action.
Most organizations are stuck at the stage where AI can summarize or recommend but can’t execute. When that happens, adoption fades and ROI follows.
John Bemenderfer points to the Model Context Protocol (MCP) as a practical unlock. By standardizing how AI connects to systems and tools, teams can build execution paths once and reuse them across models. That makes iteration faster and improvement continuous.
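As an illustration of that "build once, reuse across models" idea, here is a minimal sketch of an MCP server using the official MCP Python SDK. The server name, the create_support_ticket tool, and its ticketing logic are hypothetical stand-ins, not a real integration:

```python
# Hypothetical MCP server exposing one execution path ("create a support
# ticket") that any MCP-capable client or model can discover and call.
# Assumes the official MCP Python SDK:  pip install "mcp[cli]"
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("ops-actions")

@mcp.tool()
def create_support_ticket(summary: str, priority: str = "medium") -> str:
    """Open a support ticket in a (hypothetical) ticketing system and return
    its ID, so the model can act on an insight instead of only summarizing it."""
    # A real server would call your ticketing API here, with guardrails:
    # allowed priorities, audit logging, and clear scope boundaries.
    if priority not in {"low", "medium", "high"}:
        raise ValueError(f"Unsupported priority: {priority}")
    return f"TICKET-0001 created: {summary} (priority={priority})"

if __name__ == "__main__":
    # Runs over stdio by default; MCP-aware clients discover the tool and its
    # schema automatically, so the same execution path is reusable across models.
    mcp.run()
```

Because the tool and its schema are defined once on the server side, swapping or upgrading the model doesn't mean rebuilding the integration.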
But execution raises the stakes. Start small. Define clear boundaries. Build confidence before expanding scope.
4) Strengthen Delivery Discipline as AI Speeds Everything Up
AI compresses timelines and expands what teams can build. That makes delivery discipline more important, not less.
Travis LaMont sees many analytics initiatives fail quietly. The work gets done, but behavior doesn’t change. Dashboards go unused. Products stall after launch.
The root cause is almost always the same: delivery lost sight of the original problem.
In 2026, leaders need repeatable frameworks that carry work from intake to execution to adoption. Without that structure, AI just helps teams build the wrong things faster.
Bottom line: 2026 won’t be defined by who adopts the most AI. It will be defined by who makes their investments stick.
Get All 7 Expert Perspectives
Three more perspectives inside — including where experts debated MCP vs. Agentic Frameworks and what that means for organizations at different maturity levels.