AI-driven businesses must rethink data handling: Confluent’s Jay Kreps

BENGALURU, India – As artificial intelligence (AI) moves from theoretical hype to real-world applications, one challenge remains: AI can’t make good decisions with outdated data. Today’s AI-powered applications must continuously process and react to live information, and businesses that rely on batch-based architectures are struggling to keep up.

Speaking at Current 2025, Confluent’s flagship event in Bengaluru, Jay Kreps, Co-founder and CEO of Confluent, emphasized that AI-driven businesses need a fundamental shift in how they handle data.

“AI projects are failing because old development methods can’t keep pace with new consumer expectations,” Kreps said in his opening address. “Applications today aren’t just passive tools; they need to understand live business context and take action automatically. That’s only possible when AI is fed real-time data, and that’s exactly what we’re enabling with Tableflow.”

From data warehouses to real-time decision-making

For years, businesses have relied on batch-based analytics, where data was collected, stored in data warehouses, and processed periodically. But in the AI era, this approach is too slow. AI applications don’t just need historical data—they need live operational data to make the right decisions in the moment.

“Traditionally, being ‘data-driven’ meant generating reports from a data warehouse and making decisions later,” Kreps explained. “But today, AI doesn’t just analyze data—it acts on it. And that shift requires real-time streaming infrastructure.”
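The batch-versus-streaming contrast Kreps describes can be sketched in a few lines. This is a hypothetical toy example (the event records and function names are illustrative, not any Confluent API): batch processing computes an answer once, after all the data has landed, while streaming keeps an up-to-date answer after every event.

```python
# Hypothetical sketch: batch computes once over the full dataset;
# streaming updates state incrementally as each event arrives.

def batch_total(events):
    """Batch: wait for the complete dataset, then compute one answer."""
    return sum(e["amount"] for e in events)

class StreamingTotal:
    """Streaming: fold each event into running state on arrival."""
    def __init__(self):
        self.total = 0

    def on_event(self, event):
        self.total += event["amount"]
        return self.total  # the answer is current after every event

events = [{"amount": 10}, {"amount": 25}, {"amount": 5}]

# Batch: a single answer, available only after all data is collected.
print(batch_total(events))  # 40

# Streaming: a fresh answer after each event.
s = StreamingTotal()
print([s.on_event(e) for e in events])  # [10, 35, 40]
```

The end result is the same number, but the streaming version has a correct answer at every point in time, which is the property AI applications making in-the-moment decisions need.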

This is why Confluent is expanding real-time capabilities within Tableflow, making it easier for enterprises to integrate real-time operational data into AI and analytics. By supporting popular open table formats, including Apache Iceberg and Delta Lake, Confluent is helping enterprises seamlessly combine historical batch data with real-time streaming data—a crucial step for powering AI-driven applications.
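The idea of combining a historical batch snapshot with a live update stream can be sketched as follows. This is a minimal illustration of the concept only; the table rows, event shapes, and functions are hypothetical and do not reflect the actual Tableflow, Iceberg, or Delta Lake APIs.

```python
# Hypothetical sketch: start from a historical snapshot (as if loaded
# from an Iceberg/Delta table), then apply live change events (as if
# consumed from a stream) to maintain one current view.

historical = {  # stand-in for rows loaded from a batch table
    "sku-1": {"price": 100, "stock": 5},
    "sku-2": {"price": 200, "stock": 0},
}

stream = [  # stand-in for change events arriving in real time
    {"sku": "sku-2", "stock": 3},
    {"sku": "sku-3", "price": 50, "stock": 12},
]

def apply_update(view, event):
    """Merge one change event into the current view, keyed by SKU."""
    row = view.setdefault(event["sku"], {})
    row.update({k: v for k, v in event.items() if k != "sku"})

# Copy each row so the historical snapshot itself is left untouched.
view = {sku: dict(row) for sku, row in historical.items()}
for event in stream:
    apply_update(view, event)

print(view["sku-2"]["stock"])  # 3 (live event overrides the stale snapshot)
print("sku-3" in view)         # True (new item arrived via the stream)
```

The snapshot answers "what was true", the stream answers "what just changed", and the merged view is what an AI application actually needs to act on.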

AI needs real-time context to work

One of the biggest challenges for AI is context. Even the most powerful AI model can’t make relevant decisions if it doesn’t understand what’s happening right now—whether it’s a customer request, a stock level change, or a security alert.

“AI isn’t magic—it’s just software that makes decisions,” Kreps said. “And if those decisions are based on outdated data, they’ll be wrong. The key is ensuring AI has fresh, real-time information, and that’s exactly what data streaming provides.”

This is particularly critical for Retrieval-Augmented Generation (RAG) applications, where AI models need real-time business context to generate accurate and useful responses.

“Think about an AI-powered customer service bot,” Kreps continued. “If it’s answering a refund request, it needs to know whether the refund has already been processed. If it’s assisting in sales, it should have the latest inventory data. Without real-time updates, AI applications become dumb tools rather than intelligent agents.”
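The refund-bot scenario maps directly onto the retrieval step of RAG: before generating an answer, the application fetches the current business state and injects it into the prompt. The sketch below is purely illustrative — the in-memory dictionary stands in for a live data store fed by a stream, and no real LLM is called.

```python
# Hypothetical sketch of RAG with live context: look up the *current*
# refund status at request time instead of relying on stale knowledge.

refund_status = {"order-42": "processed"}  # stand-in for a live store

def build_prompt(question, order_id):
    # Retrieval step: fetch fresh business context for this request.
    status = refund_status.get(order_id, "unknown")
    context = f"Refund status for {order_id}: {status}."
    return f"Context: {context}\nQuestion: {question}"

print(build_prompt("Where is my refund?", "order-42"))

# When the stream reports a change, the very next prompt reflects it.
refund_status["order-42"] = "reversed"
print(build_prompt("Where is my refund?", "order-42"))
```

Without that retrieval step, the model would answer from whatever it saw at training time, which is exactly the "dumb tool" failure mode Kreps describes.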

Beyond analysis: AI agents that act in real time

AI isn’t just about understanding data—it’s about acting on it. More businesses are deploying AI-powered agents that automate decisions and processes, whether it’s managing supply chains, detecting fraud, or optimizing pricing in real time.

Kreps highlighted an example in grocery delivery services, where AI can automate product catalog management—a historically manual task.

“If you’re running an online grocery service, you need to aggregate product data from thousands of stores, each with different naming conventions and stock levels,” Kreps explained. “AI can help by automatically categorizing products, normalizing descriptions, and ensuring accurate inventory visibility. But this only works if the AI is operating on live, real-time data.”
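The catalog example can be sketched as a normalize-and-aggregate pipeline over a stream of per-store records. In the hypothetical sketch below, a simple synonym table stands in for the AI categorization step Kreps describes; all names and data are illustrative.

```python
# Hypothetical sketch: normalize product records from many stores with
# different naming conventions, then fold them into one inventory view.
# A rule table stands in for AI-driven normalization here.

SYNONYMS = {
    "tomatoes, fresh": "fresh tomatoes",
    "tomato (fresh)": "fresh tomatoes",
}

def normalize(record):
    """Map a store-specific product name onto a canonical name."""
    name = record["name"].strip().lower()
    return {"name": SYNONYMS.get(name, name), "stock": record["stock"]}

def aggregate(events):
    """Fold a stream of per-store records into total stock per product."""
    inventory = {}
    for e in map(normalize, events):
        inventory[e["name"]] = inventory.get(e["name"], 0) + e["stock"]
    return inventory

events = [
    {"store": "A", "name": "Tomatoes, Fresh", "stock": 4},
    {"store": "B", "name": "tomato (fresh)", "stock": 7},
]
print(aggregate(events))  # {'fresh tomatoes': 11}
```

The point of Kreps' argument is the input, not the logic: if `events` is a stale batch export rather than a live feed, the inventory view is wrong no matter how good the normalization is.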

The shift from batch to streaming AI

For businesses transitioning to AI-driven operations, one concern is maintaining the ability to test and refine models. Batch-based systems allowed companies to run A/B tests, compare model outputs, and make incremental improvements. Kreps emphasized that streaming AI doesn’t eliminate this—it enhances it.

“One of the best things about batch processing was the ability to rerun and benchmark AI models,” he said. “We’re bringing that same capability to real-time AI. With streaming, you can run models in parallel, compare outputs, and fine-tune decision-making—just like you would in a traditional batch system, but now in real time.”

This ability to replay and iterate AI decisions allows businesses to continuously improve their models, ensuring AI-driven applications remain accurate and effective.
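Running models in parallel over the same stream and comparing their outputs, as Kreps describes, can be sketched like this. The "models" here are toy rule functions and the fraud-detection framing is an illustrative assumption, not a Confluent example.

```python
# Hypothetical sketch: replay one event stream through two models side
# by side and record where they disagree, mirroring batch-style A/B
# benchmarking in a streaming setting.

def model_a(event):
    """Incumbent: flag any large transaction."""
    return "fraud" if event["amount"] > 100 else "ok"

def model_b(event):
    """Candidate: flag large transactions only on a country mismatch."""
    if event["amount"] > 100 and event["country"] != event["card_country"]:
        return "fraud"
    return "ok"

def replay(events, models):
    """Feed every event to every model; collect the disagreements."""
    disagreements = []
    for e in events:
        outputs = {name: fn(e) for name, fn in models.items()}
        if len(set(outputs.values())) > 1:
            disagreements.append((e["id"], outputs))
    return disagreements

events = [
    {"id": 1, "amount": 250, "country": "IN", "card_country": "IN"},
    {"id": 2, "amount": 250, "country": "IN", "card_country": "US"},
    {"id": 3, "amount": 40,  "country": "DE", "card_country": "DE"},
]
print(replay(events, {"a": model_a, "b": model_b}))
# [(1, {'a': 'fraud', 'b': 'ok'})] — the one event where the models split
```

Because the stream retains the events, the same replay can be rerun after every model revision, which is the batch-style benchmarking Kreps says streaming preserves.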

The future of AI-powered businesses

As AI adoption accelerates, enterprises will need streaming infrastructure to keep up. The companies that succeed won’t be the ones simply analyzing historical data—they’ll be the ones powering real-time AI-driven operations.

“We’re moving beyond dashboards and reports,” Kreps concluded. “The future of business isn’t just data-driven—it’s AI-driven, in real time. And that’s what we’re building at Confluent.”

With Tableflow’s latest advancements, including Iceberg and Delta Lake support, Confluent is ensuring businesses have the data infrastructure needed to power AI applications that aren’t just reactive, but proactive—able to analyze, decide, and act at the speed of business.
