Confluent has unveiled Confluent Intelligence, a new platform designed to help businesses build and scale artificial intelligence (AI) systems that operate with real-time context. Developed on Confluent Cloud, the fully managed service enables organisations to continuously stream and process both historical and live data, providing AI applications with the context they need to make accurate and timely decisions.
The new offering aims to address one of the key barriers holding back AI adoption: the lack of context. According to MIT’s The State of AI in Business 2025 report, 95 per cent of generative AI (GenAI) projects currently deliver no measurable return, despite companies investing an estimated US$30 billion to US$40 billion. Many of these projects fail because AI models lack a full understanding of the relationships, meaning, and history of the data they process. Confluent Intelligence seeks to solve this problem by creating a continuous data flow that allows AI systems to evaluate past events, adapt to present conditions, and make decisions instantly.
Jay Kreps, Co-founder and CEO of Confluent, said, “We started Confluent to take on one of the hardest problems in data: helping information move freely across a business so companies can act in real time. That same foundation uniquely positions Confluent to close the AI context gap. Off-the-shelf models are powerful, but without the continuous flow of data, they can’t deliver decisions that are timely and uniquely valuable to a business. That’s where data streaming becomes essential.”
Powering enterprise-ready AI through real-time streaming
Confluent Intelligence combines Apache Kafka and Apache Flink into a unified, fully managed stack for building context-rich AI systems. It features built-in governance, low-latency performance, and full replayability to help organisations move quickly from pilot projects to large-scale production deployments.
Its core capabilities include a Real-Time Context Engine, a managed service that streams structured and reliable data to AI agents and applications. This allows teams to access high-quality context without having to manage Kafka or backend infrastructure, significantly speeding up development cycles. The feature is currently available in early access.
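The Context Engine's own interface is not spelled out in the announcement, so the snippet below is only a minimal sketch of the pattern it abstracts away, written with the standard confluent-kafka Python client: an application subscribes to an enriched context topic and hands each event to an AI component. The broker address, topic name, and handle_context function are illustrative assumptions rather than part of Confluent's product API.

```python
# Minimal sketch: consume an enriched context topic and hand each event to
# an AI component, using the standard confluent-kafka Python client.
# The broker address, topic name, and handle_context() are illustrative
# assumptions, not part of the Real-Time Context Engine's API.
import json

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker address
    "group.id": "ai-context-consumer",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["customer-context"])    # hypothetical enriched topic


def handle_context(event: dict) -> None:
    """Stand-in for passing fresh context to an AI agent or prompt."""
    print(f"New context for agent: {event}")


try:
    while True:
        msg = consumer.poll(1.0)            # wait up to 1 s for the next event
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        handle_context(json.loads(msg.value()))
finally:
    consumer.close()
```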
Another key component, Streaming Agents, enables developers to build, deploy, and coordinate event-driven AI agents directly within Flink. These agents can process data, make decisions, and act autonomously in real time. The feature, now in open preview, brings agentic AI into stream processing itself, allowing enterprises to automate workflows intelligently and efficiently.
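Confluent has not published the Streaming Agents API in this announcement, so the sketch below uses the open-source PyFlink Table API to illustrate only the underlying event-driven pattern: a continuously running Flink job that evaluates each incoming event and emits a decision. The table definitions, the datagen source, and the approval rule are assumptions made for the sake of a self-contained example.

```python
# Sketch of the event-driven pattern behind streaming agents, written with
# the open-source PyFlink Table API rather than Confluent's Streaming Agents
# API. The tables, the datagen source, and the approval rule are assumptions.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Synthetic stream of order events so the example is self-contained.
t_env.execute_sql("""
    CREATE TABLE orders (
        order_id BIGINT,
        amount   DOUBLE
    ) WITH (
        'connector' = 'datagen',
        'rows-per-second' = '5'
    )
""")

# Sink standing in for the agent's downstream action (here, just printing).
t_env.execute_sql("""
    CREATE TABLE actions (
        order_id BIGINT,
        amount   DOUBLE,
        decision STRING
    ) WITH ('connector' = 'print')
""")

# The "agent" logic: evaluate every incoming event and emit a decision.
t_env.execute_sql("""
    INSERT INTO actions
    SELECT order_id, amount,
           CASE WHEN amount > 1000 THEN 'review' ELSE 'approve' END
    FROM orders
""").wait()
```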
Additionally, the platform introduces built-in machine learning functions within Flink SQL. These include tools for anomaly detection, forecasting, model inference, and real-time visualisation. With these functions, teams can simplify complex data science tasks and extract actionable insights with greater speed and accuracy.
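For a sense of what these functions abstract away, the sketch below hand-rolls a simple three-sigma anomaly check in standard Flink SQL via PyFlink. It does not use Confluent's built-in ML functions, whose names and signatures are not given here; the table, source connector, and threshold are illustrative assumptions.

```python
# Hand-rolled three-sigma anomaly check in standard Flink SQL via PyFlink,
# shown only to illustrate the kind of task the built-in ML functions are
# meant to simplify; it does not call Confluent's own functions. The table,
# datagen source, and threshold are illustrative assumptions.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Synthetic sensor readings with a processing-time attribute for windowing.
t_env.execute_sql("""
    CREATE TABLE readings (
        sensor_id INT,
        reading   DOUBLE,
        ts        AS PROCTIME()
    ) WITH (
        'connector' = 'datagen',
        'rows-per-second' = '5'
    )
""")

# Flag readings more than three standard deviations from the recent mean.
t_env.execute_sql("""
    SELECT
        sensor_id,
        reading,
        ABS(reading - AVG(reading) OVER w) > 3 * STDDEV_POP(reading) OVER w
            AS is_anomaly
    FROM readings
    WINDOW w AS (
        PARTITION BY sensor_id
        ORDER BY ts
        RANGE BETWEEN INTERVAL '10' MINUTE PRECEDING AND CURRENT ROW
    )
""").print()
```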
Atilio Ranzuglia, Head of Data and AI at Palmerston North City Council, said, “Good AI needs good data. Confluent is our trusted source of truth, streaming high-quality data to our data lakes and AI platforms to train models in real time. It provides context and orchestration for our agents to automate workflows, accelerating our smart city transformation.”
Deepening collaboration to advance agentic AI
As part of the launch, Confluent announced a deeper collaboration with Anthropic, the company behind the Claude large language model (LLM). Claude will become the default LLM integrated into Confluent’s Streaming Agents. This integration will enable enterprises to build more adaptive, context-aware AI systems that can deliver real-time intelligence across a variety of use cases.
Through the combination of Anthropic’s advanced reasoning models and Confluent’s real-time data infrastructure, organisations can develop AI applications capable of advanced anomaly detection, personalised customer experiences, and other high-value functions that demand both accuracy and adaptability.
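The integration itself is not shown in the announcement, but the pattern it describes, feeding freshly streamed context into a Claude request, might look something like the sketch below, which uses Anthropic's public Python SDK. The model ID, prompt, and event payload are illustrative assumptions.

```python
# Minimal sketch of grounding a Claude call in freshly streamed context,
# using Anthropic's public Python SDK rather than Confluent's Streaming
# Agents integration. The model ID, prompt, and event payload are
# illustrative assumptions.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment


def decide(context_event: dict) -> str:
    """Ask Claude for a decision grounded in the latest streamed event."""
    response = client.messages.create(
        model="claude-3-5-sonnet-20241022",  # assumed model ID
        max_tokens=256,
        messages=[{
            "role": "user",
            "content": (
                "Given this real-time event, should it be flagged for review? "
                f"Answer 'flag' or 'ignore'.\n\n{context_event}"
            ),
        }],
    )
    return response.content[0].text
```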
Nithin Prasad, Senior Engineering Manager at GEP, added, “AI-powered procurement and supply chain use cases are at the core of what GEP does. Confluent helps make them possible by providing a data streaming platform that fuels our models with real-time streaming data and eliminates fear of data loss.”
Confluent Intelligence is now available with select features in early access and open preview, while its built-in machine learning capabilities are generally available on Confluent Cloud.