
Confluent expands Confluent Cloud for Apache Flink to boost real-time AI development


Confluent has unveiled a set of new features for Confluent Cloud for Apache Flink, designed to simplify the development of real-time AI applications. The enhancements include Flink Native Inference, Flink Search, and built-in machine learning (ML) functions, letting developers deploy and manage AI models within a single platform.

The announcement, made on 19 March in Singapore, highlights Confluent’s continued efforts to bring powerful AI capabilities to organisations without requiring deep technical expertise or fragmented tools.

Shaun Clowes, Chief Product Officer at Confluent, said, “Building real-time AI applications has been too complex for too long, requiring a maze of tools and deep expertise just to get started. With the latest advancements in Confluent Cloud for Apache Flink, we’re breaking down those barriers—bringing AI-powered streaming intelligence within reach of any team. What once required a patchwork of technologies can now be done seamlessly within our platform, with enterprise-level security and cost efficiencies baked in.”

Addressing the complexity of real-time AI

According to research from McKinsey, 92% of businesses plan to increase their AI investments in the next three years. However, the journey to building AI applications remains challenging. Many developers currently navigate an ecosystem filled with multiple tools, programming languages, and interfaces to deploy ML models and extract context from various data sources. This complexity often leads to inefficiencies, operational delays, and unreliable AI outputs—sometimes referred to as “AI hallucinations.”

To address this, Confluent’s updated Flink features are focused on three core areas: inference, search, and accessibility.

Flink Native Inference allows development teams to run open source AI models directly within Confluent Cloud. This reduces the need for managing additional infrastructure, making it easier to maintain security and efficiency while running ML models. Since inference takes place within the platform, it adds another layer of data protection and lowers costs.
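In Flink SQL, this style of in-platform inference is typically expressed by registering a model and invoking it from a query. The sketch below is illustrative only — the model name, columns, and `WITH` options are assumptions for the example, not Confluent's exact Native Inference configuration:

```sql
-- Register a model as a Flink SQL resource
-- (names and options here are illustrative, not exact syntax)
CREATE MODEL sentiment_model
INPUT (review STRING)
OUTPUT (label STRING)
WITH (
  'task' = 'classification'
);

-- Run inference on a stream without leaving the platform
SELECT review, label
FROM reviews,
  LATERAL TABLE(ML_PREDICT('sentiment_model', review));
```

Because the model call is just another table function in the query, no separate model-serving infrastructure has to be provisioned or secured.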

Flink Search, another new addition, gives developers a single interface to access data across multiple vector databases like MongoDB, Elasticsearch, and Pinecone. These vector searches are essential in helping large language models (LLMs) provide accurate and relevant responses. By simplifying the process of retrieving context-rich data, Flink Search eliminates the need for complex ETL (Extract, Transform, Load) pipelines and manual data handling.
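Conceptually, a federated vector lookup of this kind collapses retrieval into a single SQL statement. The table definition, connector name, and search-function signature below are hypothetical placeholders used to show the shape of the workflow, not Confluent's documented API:

```sql
-- External vector index exposed as a Flink table
-- (connector name and options are hypothetical)
CREATE TABLE docs_index (
  doc_id STRING,
  content STRING,
  embedding ARRAY<FLOAT>
) WITH (
  'connector' = 'mongodb'
);

-- Retrieve the top matches for each query embedding in one
-- statement, replacing a separate ETL pipeline and client code
SELECT q.query_id, d.content
FROM queries AS q,
  LATERAL TABLE(
    VECTOR_SEARCH(docs_index, 3, DESCRIPTOR(embedding), q.query_embedding)
  ) AS d;
```

The retrieved `content` rows can then be passed to an LLM as grounding context, which is the retrieval-augmented generation (RAG) pattern the article describes.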

Making data science more accessible

The built-in ML functions in Flink SQL bring common AI use cases such as forecasting, anomaly detection, and real-time data visualisation into reach for users without advanced data science backgrounds. By embedding these features directly into SQL workflows, Confluent enables teams across industries to generate insights and improve decision-making faster.
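As a rough sketch of what "AI in plain SQL" looks like, the queries below apply forecasting and anomaly detection over a metric stream. The function names and argument lists are illustrative assumptions, not the exact built-in functions Confluent ships:

```sql
-- Built-in forecasting over a metric stream
-- (function names and arguments are illustrative)
SELECT sensor_id,
       ML_FORECAST(reading, ts) OVER w AS predicted_reading
FROM sensor_readings
WINDOW w AS (PARTITION BY sensor_id ORDER BY ts);

-- Flag outliers in the same SQL workflow
SELECT sensor_id, reading,
       ML_DETECT_ANOMALIES(reading, ts) OVER w AS is_anomaly
FROM sensor_readings
WINDOW w AS (PARTITION BY sensor_id ORDER BY ts);
```

The point of the design is that an analyst who already writes SQL can produce forecasts or anomaly flags without training or deploying a model themselves.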

Commenting on the platform’s impact, Steffen Hoellinger, Co-founder and CEO at Airy, said, “Confluent helps us accelerate copilot adoption for our customers, giving teams access to valuable real-time, organisational knowledge. Confluent’s data streaming platform with Flink AI Model Inference simplified our tech stack by enabling us to work directly with large language models (LLMs) and vector databases for retrieval-augmented generation (RAG) and schema intelligence, providing real-time context for smarter AI agents. As a result, our customers have achieved greater productivity and improved workflows across their enterprise operations.”

Stewart Bond, Vice President of Data Intelligence and Integration Software at IDC, added, “The ability to integrate real-time, contextualised, and trustworthy data into AI and ML models will give companies a competitive edge with AI. Organisations need to unify data processing and AI workflows for accurate predictions and LLM responses. Flink provides a single interface to orchestrate inference and vector search for RAG, and having it available in a cloud-native and fully managed implementation will make real-time analytics and AI more accessible and applicable to the future of generative AI and agentic AI.”

More features available in early access

Confluent Cloud for Apache Flink remains the only serverless stream processing solution combining real-time and batch processing in a single platform. With the new AI, ML, and analytics capabilities, businesses can reduce operational overhead and simplify development processes further. These features are now available in an early access programme for current Confluent Cloud users.

Other enhancements in Confluent Cloud include Tableflow, Freight Clusters, Confluent for Visual Studio Code, and the Oracle XStream CDC Source Connector, providing teams with even more tools to manage and process real-time data.
