The hype says AI. The reality says data. As AI adoption accelerated across industries, organizations discovered an uncomfortable truth: the quality of AI outputs depends entirely on the quality, accessibility, and trustworthiness of the data that underpins them. There is a pattern across this year’s pieces - the shift from data as a technical problem to data as a strategic enabler. Whether it’s breaking down silos, democratizing access for non-technical users, or building the monitoring infrastructure needed to trust what AI delivers, the enterprises making real progress are those that understand a simple truth - without trusted, well-governed data, AI is just an expensive experiment.
Databricks takes on the agentic AI challenge - with automation, observability, and scale in mind
This isn’t about agent-washing. Our customers are already doing agentic AI - what they need now is a way to do it at scale, with confidence and visibility.
Why? With enterprise AI moving from experimentation to production, Databricks’ launch of Lakeflow Designer and Agent Bricks addresses the real bottleneck: not building AI, but trusting it. By baking governance, lineage tracking (which shows the history and origin of data), and Large Language Model (LLM) ‘judges’ into the infrastructure from the outset, the company is tackling the visibility gap that keeps AI pilots from becoming AI deployments. The message is clear - it’s not one agent in a lab, it’s fleets of agents in production, and that demands a different architecture entirely.
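To make the 'LLM judge' pattern concrete, here is a minimal Python sketch of the general idea - a second model grades an agent's output against explicit criteria before it is released. This is not Databricks' Agent Bricks API; `call_llm` is a hypothetical stand-in for whatever model endpoint you use.

```python
import json

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a model endpoint (a Databricks-served
    model, OpenAI, Bedrock, etc.) -- not a real API."""
    raise NotImplementedError

JUDGE_PROMPT = """You are an evaluation judge. Score the agent answer
against the criteria below. Respond with JSON only:
{{"faithful": <0-1>, "relevant": <0-1>, "reason": "<short explanation>"}}

Question: {question}
Retrieved context: {context}
Agent answer: {answer}
"""

def judge_agent_output(question: str, context: str, answer: str,
                       threshold: float = 0.7) -> dict:
    """Ask a second LLM to grade an agent's answer; gate on the scores."""
    raw = call_llm(JUDGE_PROMPT.format(
        question=question, context=context, answer=answer))
    verdict = json.loads(raw)
    # Only release answers the judge rates as both faithful and relevant.
    verdict["approved"] = (verdict["faithful"] >= threshold
                           and verdict["relevant"] >= threshold)
    return verdict
```

The value of the pattern is the audit trail: question, answer, and verdict can all be logged alongside lineage metadata, which is precisely the visibility gap described above.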
Why SAP Business Data Cloud, and why now? Burning questions edition with Irfan Khan
It’s never, ever going to be an SAP data strategy or an Oracle, or Microsoft, or pick any other vendor. It’s our customer data strategy.
Why? SAP’s Business Data Cloud announcement cut through the noise with a simple premise: if you want useful AI, you need useful data - and that means solving the ‘big, hairy semantic layer problem’ (the challenge of giving data consistent meaning across different systems). The partnership with Databricks and focus on zero-copy data sharing - where data can be accessed without duplicating it across platforms - points to a genuine attempt to meet customers where their data already lives, rather than demanding yet another replatforming exercise. For SAP shops wrestling with fragmented data, the promise of semantically enriched, harmonized data across SAP and non-SAP sources could be transformative.
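As a toy illustration of what solving the semantic layer problem involves - emphatically not SAP's implementation - consider reconciling differently named fields from SAP and non-SAP systems onto one canonical schema:

```python
# Each source system names the same business concept differently;
# the semantic layer reconciles them. (KUNNR/NETWR are real SAP field
# names; the mapping itself is an invented example.)
CANONICAL_MAP = {
    "sap":        {"KUNNR": "customer_id", "NETWR": "net_value"},
    "salesforce": {"AccountId": "customer_id", "Amount": "net_value"},
}

def harmonize(record: dict, source: str) -> dict:
    """Rename source-specific fields to the shared semantic model."""
    mapping = CANONICAL_MAP[source]
    return {mapping.get(k, k): v for k, v in record.items()}

# The same customer order, seen consistently across systems:
print(harmonize({"KUNNR": "C-100", "NETWR": 250.0}, "sap"))
print(harmonize({"AccountId": "C-100", "Amount": 250.0}, "salesforce"))
```

Zero-copy sharing then means the harmonized view is computed against data where it already lives, rather than after duplicating it into yet another store.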
Vinmar International sees efficiency gains by creating a real-time ‘aggregator of truth’ with Celonis
A lot of noise is filtered because of this layer of knowledge base, and only the true deviations are surfaced to our logistics team members. That adds to a huge level of efficiency.
Why? Vinmar’s story demonstrates what happens when process mining evolves from retrospective analysis to real-time risk management. By integrating data from SAP, transport systems, banking, and shipping providers - refreshed every 15-30 minutes - the company has created a single operational view that was previously impossible. The result? A 20% improvement in labour capacity and the ability to handle demand peaks without additional staffing. It’s a great example of data breaking down silos.
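The 'layer of knowledge base' that filters noise can be sketched in a few lines of Python (illustrative only - the rules and field names are invented): delays with a known, tolerated explanation are suppressed, and only unexplained deviations reach the logistics team.

```python
from datetime import timedelta

# Hypothetical tolerance rules: deviations with a known, benign
# explanation are filtered out rather than raised as alerts.
KNOWN_TOLERANCES = {
    "port_congestion": timedelta(hours=12),
    "customs_hold":    timedelta(hours=24),
}

def true_deviations(shipments: list[dict]) -> list[dict]:
    """Surface only shipments whose delay exceeds any known tolerance."""
    alerts = []
    for s in shipments:
        delay = s["current_eta"] - s["planned_eta"]
        tolerance = KNOWN_TOLERANCES.get(s.get("delay_reason"),
                                         timedelta(0))
        if delay > tolerance:
            alerts.append(s)  # a genuine deviation worth a human look
    return alerts
```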
"Our goal is not to displace people" - Planful CEO Grant Halloran takes a stand on the future of AI for finance leaders
We’re positioning all of these agents as assistants because our goal is not to displace people. Our goal is to make our users increasingly productive, and eliminate a lot of the burden that the AI can take off you.
Why? Amid the agentic AI gold rush - where vendors promise autonomous systems that act independently - Planful’s human-centric approach stands out. The focus on solving the ‘hidden-in-plain-sight’ problem, where surface-level data obscures what’s really happening in the detail, shows how AI can augment finance teams rather than replace them. Planful’s AI Co-Innovation Council, where customers shape product direction based on real burdens rather than vendor assumptions, offers a blueprint for responsible AI development that actually delivers productivity gains.
A common data platform for marketing, service, and analytics - Klaviyo’s CRM for B2C
People have been talking about for years this idea of blurring the lines between marketing and service... And I think a big reason for that is the real unlock is if you start to move those service interactions up the funnel.
Why? Klaviyo’s positioning as ‘the only CRM built for business-to-consumer (B2C)’ tackles the persistent problem of siloed customer data. By unifying marketing, service, and analytics on a common platform, the company is enabling something meaningful - service capabilities that drive acquisition, not just resolution. Customers don’t (and shouldn’t have to) distinguish between marketing and service. Having a unified view is the difference between keeping customers and losing them.
Salesforce adjusts Data Cloud pricing to entice customers to get their data in shape for Agentforce
If you just move your data from Salesforce into Data Cloud and don’t do anything with it, there’s been little value delivered. So why should we charge for that? We’re going to make that free.
Why? Salesforce’s pricing overhaul highlights that lack of data readiness is the biggest blocker to AI adoption. By removing charges for ingesting data from its own applications and simplifying consumption-based pricing, the company is acknowledging that customers need to get their data house in order before agentic AI can deliver value. The shift from seats to consumption also flags up a broader industry transition that enterprise buyers - accustomed to predictable licensing - will need to navigate.
How data from on-pitch sensors is helping rugby teams cross the line
We can automate the pass-to-pass data, which used to take hours, and feed that into an API, which then becomes part of the broadcast feed.
Why? Sportable’s sensor technology shows what happens when data tools become affordable. What was once prohibitively expensive technology reserved for cash-rich sports is now accessible via a portable, Software-as-a-Service (SaaS) solution at £45,000 per annum. The upshot? Women’s football leagues without major stadium deals can access the same data-driven performance tools as Premier League clubs. Data for the many, not just the few.
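A rough sketch of the pipeline in the quote, using entirely hypothetical event fields and endpoint (Sportable's real feed will differ): compute a pass metric from tagged sensor events and push it to an API the broadcaster consumes.

```python
import json
from urllib import request

def pass_speed_kmh(event: dict) -> float:
    """Distance (m) over flight time (s), converted to km/h."""
    return event["distance_m"] / event["flight_time_s"] * 3.6

def publish_pass(event: dict, feed_url: str) -> None:
    """POST a computed pass metric to a (hypothetical) broadcast feed."""
    payload = json.dumps({
        "match_id": event["match_id"],
        "passer": event["passer"],
        "speed_kmh": round(pass_speed_kmh(event), 1),
    }).encode()
    req = request.Request(feed_url, data=payload,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req)  # fire-and-forget for brevity
```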
Kent Wildlife Trust swarms with AWS tech to splat insect decline
Without insects, potentially life on earth would collapse because they are at the bottom of the food chain. They are key to pollination, decomposition, loads of core ecosystem functions require invertebrates.
Why? The Bugs Matter project shows how citizen science, supported by cloud infrastructure, can generate data for problems that genuinely matter. With flying insects declining 59% over two decades, the AWS Imagine Grant will help Kent Wildlife Trust expand globally - from Ireland to Australia - while solving real data challenges around sovereignty, storage, and real-time analysis. Sometimes the most important data isn’t commercial; it’s existential.
Qlik Connect 2025 - Catalyst Cloud’s case for humanized data
It’s your data, in their world. We can’t expect a nurse or a supplier to become a data analyst. They just need to make faster, better decisions.
Why? Catalyst Cloud’s approach challenges the assumption that data democratization means teaching everyone to use dashboards. Its Fusion Data Portal is designed to be ‘so intuitive that your mom should be able to use it’ - abstracting analytical complexity entirely for nurses, suppliers, and product managers who need insights, not interfaces. By using behavioural analytics to optimize portal layouts based on actual usage patterns, Catalyst Cloud prioritizes frictionless decision-making over raw analytical power.
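That behavioural optimization reduces, in spirit, to something like the following Python sketch (a simplification - Catalyst Cloud has not published its approach at this level of detail): rank portal widgets by how often each role actually uses them.

```python
from collections import Counter

def layout_for_role(click_events: list[dict], role: str,
                    max_widgets: int = 5) -> list[str]:
    """Order portal widgets by how often a given role actually uses them."""
    usage = Counter(e["widget"] for e in click_events
                    if e["role"] == role)
    return [widget for widget, _ in usage.most_common(max_widgets)]

# Example: nurses see their most-used widgets first, no dashboard skills needed.
events = [
    {"role": "nurse", "widget": "bed_availability"},
    {"role": "nurse", "widget": "bed_availability"},
    {"role": "nurse", "widget": "discharge_list"},
    {"role": "supplier", "widget": "stock_levels"},
]
print(layout_for_role(events, "nurse"))
# -> ['bed_availability', 'discharge_list']
```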