
Confluent expands Confluent Cloud for Apache Flink to boost real-time AI development

Confluent upgrades Confluent Cloud for Apache Flink with new AI tools, simplifying real-time app development and improving data processing.

Confluent has unveiled a set of new features for Confluent Cloud for Apache Flink, designed to simplify the development of real-time AI applications. The enhancements include Flink Native Inference, Flink Search, and built-in machine learning (ML) functions, offering developers a more streamlined experience in deploying and managing AI models within a single platform.

The announcement, made on 19 March in Singapore, highlights Confluent’s continued efforts to bring powerful AI capabilities to organisations without requiring deep technical expertise or fragmented tools.

Shaun Clowes, Chief Product Officer at Confluent, said, “Building real-time AI applications has been too complex for too long, requiring a maze of tools and deep expertise just to get started. With the latest advancements in Confluent Cloud for Apache Flink, we’re breaking down those barriers—bringing AI-powered streaming intelligence within reach of any team. What once required a patchwork of technologies can now be done seamlessly within our platform, with enterprise-level security and cost efficiencies baked in.”

Addressing the complexity of real-time AI

According to research from McKinsey, 92% of businesses plan to increase their AI investments in the next three years. However, the journey to building AI applications remains challenging. Many developers currently navigate an ecosystem filled with multiple tools, programming languages, and interfaces to deploy ML models and extract context from various data sources. This complexity often leads to inefficiencies, operational delays, and unreliable AI outputs—sometimes referred to as “AI hallucinations.”

To address this, Confluent’s updated Flink features are focused on three core areas: inference, search, and accessibility.

Flink Native Inference allows development teams to run open-source AI models directly within Confluent Cloud, removing the need to provision and manage separate inference infrastructure. Because inference happens on the platform where the streaming data already lives, data does not have to be sent to an external service, which adds a layer of protection and helps keep costs down.
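For a sense of what this looks like in practice, the sketch below uses Flink SQL in the style of Confluent's existing CREATE MODEL and ML_PREDICT statements; the option names, option values, and model identifier are illustrative assumptions, since the announcement does not spell out the exact syntax for Flink Native Inference.

    -- Hypothetical sketch: registering an open-source sentiment model for
    -- in-platform inference (option names and values are assumptions).
    CREATE MODEL review_sentiment
      INPUT  (review_text STRING)
      OUTPUT (label STRING)
      WITH (
        'task' = 'classification',
        'provider' = 'native',        -- assumed flag for in-platform execution
        'model' = 'distilbert-sst2'   -- assumed open-source model identifier
      );

    -- Score a stream of reviews without the data leaving Confluent Cloud.
    SELECT r.review_id, p.label
    FROM product_reviews AS r,
         LATERAL TABLE(ML_PREDICT('review_sentiment', r.review_text)) AS p;

The point of the sketch is the shape of the workflow: model registration and inference are both expressed in SQL against streaming tables, rather than in a separate model-serving stack.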

Flink Search, another new addition, gives developers a single interface to access data across multiple vector databases like MongoDB, Elasticsearch, and Pinecone. These vector searches are essential in helping large language models (LLMs) provide accurate and relevant responses. By simplifying the process of retrieving context-rich data, Flink Search eliminates the need for complex ETL (Extract, Transform, Load) pipelines and manual data handling.
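The retrieval side could be expressed in a similar way. The sketch below assumes an external vector index registered as a table and a VECTOR_SEARCH-style table function; the connector options and the function name are assumptions used for illustration, not confirmed Flink Search syntax.

    -- Hypothetical sketch: pulling context from an external vector index
    -- directly in Flink SQL (connector and function names are assumed).
    CREATE TABLE product_docs (
      doc_id    STRING,
      content   STRING,
      embedding ARRAY<FLOAT>
    ) WITH (
      'connector' = 'elasticsearch',  -- could equally be mongodb or pinecone
      'index'     = 'product-docs'
    );

    -- Fetch the three most similar documents for each incoming question,
    -- ready to be passed to an LLM as retrieval-augmented context.
    SELECT q.question_id, d.content
    FROM questions AS q,
         LATERAL TABLE(
           VECTOR_SEARCH(TABLE product_docs, 3,
                         DESCRIPTOR(embedding), q.question_embedding)) AS d;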

Making data science more accessible

The built-in ML functions in Flink SQL bring common AI use cases such as forecasting, anomaly detection, and real-time data visualisation into reach for users without advanced data science backgrounds. By embedding these features directly into SQL workflows, Confluent enables teams across industries to generate insights and improve decision-making faster.
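As a rough illustration, a built-in function of this kind might be invoked like any other SQL aggregate over a time window; the ML_DETECT_ANOMALIES name, its arguments, and the windowing shown here are assumptions for illustration rather than documented syntax.

    -- Hypothetical sketch: flagging unusual order totals per five-minute
    -- window (function name and signature are assumptions).
    SELECT
      window_start,
      window_end,
      ML_DETECT_ANOMALIES(order_total) AS anomaly_flag
    FROM TABLE(
      TUMBLE(TABLE orders, DESCRIPTOR(order_time), INTERVAL '5' MINUTES))
    GROUP BY window_start, window_end;

The appeal for non-specialists is that the anomaly logic lives inside an ordinary windowed query, with no separate model training or deployment step.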

Commenting on the platform’s impact, Steffen Hoellinger, Co-founder and CEO at Airy, said, “Confluent helps us accelerate copilot adoption for our customers, giving teams access to valuable real-time, organisational knowledge. Confluent’s data streaming platform with Flink AI Model Inference simplified our tech stack by enabling us to work directly with large language models (LLMs) and vector databases for retrieval-augmented generation (RAG) and schema intelligence, providing real-time context for smarter AI agents. As a result, our customers have achieved greater productivity and improved workflows across their enterprise operations.”

Stewart Bond, Vice President of Data Intelligence and Integration Software at IDC, added, “The ability to integrate real-time, contextualised, and trustworthy data into AI and ML models will give companies a competitive edge with AI. Organisations need to unify data processing and AI workflows for accurate predictions and LLM responses. Flink provides a single interface to orchestrate inference and vector search for RAG, and having it available in a cloud-native and fully managed implementation will make real-time analytics and AI more accessible and applicable to the future of generative AI and agentic AI.”

More features available in early access

Confluent describes Confluent Cloud for Apache Flink as the only serverless stream processing solution that combines real-time and batch processing in a single platform. With the new AI, ML, and analytics capabilities, businesses can further reduce operational overhead and simplify development. These features are available through an early access programme for current Confluent Cloud users.

Other enhancements in Confluent Cloud include Tableflow, Freight Clusters, Confluent for Visual Studio Code, and the Oracle XStream CDC Source Connector, providing teams with even more tools to manage and process real-time data.
