Sunday, 30 November 2025

NetApp launches new enterprise-grade AI data platform with NVIDIA integration

NetApp launches AFX and AI Data Engine with NVIDIA integration to simplify AI data pipelines and power enterprise AI innovation.

NetApp has introduced a new enterprise-grade data platform designed to accelerate artificial intelligence (AI) innovation and simplify data management across hybrid and multicloud environments. The launch features the NetApp AFX disaggregated storage system and the NetApp AI Data Engine, both aimed at building a secure, scalable, and high-performance foundation for modern AI workloads.

Strengthening AI infrastructure with AFX

As organisations move from experimental AI pilots to mission-critical applications, they require infrastructure capable of managing vast amounts of data efficiently. NetApp’s new AFX system addresses this by decoupling performance from capacity, enabling enterprises to scale each resource independently. Running on the new AFX 1K storage platform and powered by the company’s ONTAP operating system, the solution delivers linear performance scaling across up to 128 nodes, terabytes per second of bandwidth, and exabyte-level capacity.

AFX is certified for NVIDIA DGX SuperPOD supercomputing systems and integrates seamlessly across on-premises and cloud environments. Optional DX50 data compute nodes introduce a global metadata engine that provides a real-time catalogue of enterprise data, enhanced by NVIDIA accelerated computing.

“With the new NetApp AFX systems, customers now have a trusted, proven choice in on-premises enterprise storage built on a comprehensive data platform to rapidly propel AI innovation forward,” said Syam Nair, Chief Product Officer at NetApp. “The combination of NetApp AFX with AI Data Engine provides the enterprise resilience and performance built and proven over decades by NetApp ONTAP, now in a disaggregated storage architecture, and all still built on the most secure storage on the planet.”

Simplifying AI data management with AI Data Engine

The NetApp AI Data Engine (AIDE) is a unified service designed to simplify the AI data lifecycle from ingestion to application. It provides a global, real-time view of an organisation’s entire data estate, supports seamless connectivity to any model or tool, and automates data synchronisation to ensure information remains current and accurate. Built-in guardrails ensure security and privacy throughout the AI workflow.

AIDE integrates with the NVIDIA AI Data Platform reference design, combining NVIDIA accelerated computing and NVIDIA AI Enterprise software to support semantic search, data vectorisation, and retrieval. It also incorporates NVIDIA NIM microservices for advanced data handling. Future ecosystem enhancements will include NVIDIA RTX PRO Servers with RTX PRO 6000 Blackwell GPUs, further boosting AI performance.
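The retrieval pattern described above — vectorising data so it can be searched semantically rather than by keyword — can be illustrated with a deliberately tiny, library-free sketch. This is conceptual only, not NetApp or NVIDIA code: production systems such as AIDE use GPU-accelerated embedding models rather than the word-count vectors used here.

```python
# Toy sketch of vectorisation + retrieval (illustrative only; real AI data
# pipelines use learned embeddings, not word counts).
import math
from collections import Counter

def vectorise(text: str) -> Counter:
    """Turn text into a sparse term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    qv = vectorise(query)
    ranked = sorted(docs, key=lambda d: cosine(qv, vectorise(d)), reverse=True)
    return ranked[:k]
```

Swapping `vectorise` for a call to an embedding model is what turns this keyword-overlap toy into true semantic search; the surrounding retrieval logic stays the same.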

“NetApp AI Data Engine enables customers to seamlessly connect their entire data estate across hybrid multicloud environments to build a unified data foundation,” Nair added. “Enterprises can then dramatically accelerate their AI data pipelines by collapsing multiple data preparation and management steps into the integrated NetApp AI Data Engine.”

Andrew Sotiropoulos, Senior Vice President and General Manager of NetApp Asia Pacific, noted that customers are shifting from passive data storage to active data management. “Our customers are testing and deploying AI to improve productivity and gain a competitive advantage. One constant theme is how they are shifting from passive data storage to active data management—extracting, processing, and curating data to unlock insights,” he said.

Expanding capabilities across hybrid and multicloud environments

NetApp has also introduced new capabilities that enhance flexibility across hybrid and multicloud environments. A new Object REST API enables direct access to Azure NetApp Files data without needing to copy it into a separate object store. This allows existing NFS and SMB datasets to connect directly with Azure services such as Azure OpenAI, Azure Databricks, and Azure Synapse for model training, intelligent search, and application development.
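The idea behind object access over existing file data can be sketched in a few lines: a gateway maps file paths under a volume to S3-style object keys in place, so the data is never copied. The function names and path layout below are hypothetical illustrations, not the actual Azure NetApp Files Object REST API.

```python
# Conceptual sketch (hypothetical, not the Azure NetApp Files API): exposing
# existing NFS file paths as object keys without copying the data.
from pathlib import PurePosixPath

def to_object_key(volume_root: str, file_path: str) -> str:
    """Map a file path under a volume to an S3-style object key."""
    return str(PurePosixPath(file_path).relative_to(PurePosixPath(volume_root)))

def list_objects(volume_root: str, file_paths: list[str], prefix: str = "") -> list[str]:
    """Emulate an object-store listing over existing file paths,
    optionally filtered by a key prefix."""
    keys = sorted(to_object_key(volume_root, p) for p in file_paths)
    return [k for k in keys if k.startswith(prefix)]
```

Because the keys are derived from the paths rather than stored separately, the same dataset remains simultaneously reachable over NFS or SMB and through object-style calls — the property that lets services like Azure Databricks read it in place.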

Enhanced FlexCache capabilities in Azure NetApp Files further streamline hybrid cloud operations by unifying data estates across on-premises and cloud environments. Data from ONTAP-based storage systems can now be made visible and writeable in Azure, with only the required data transferred on demand. This enables more efficient workflows and simplifies tasks such as continuous backup, disaster recovery, and workload balancing.

“Enterprises are looking for a trusted, high-performance data foundation to turn massive volumes of information into real intelligence that powers their AI journey,” said Justin Boitano, Vice President of Enterprise AI Products at NVIDIA. “NetApp’s data platform has transformed into an AI-native storage platform by integrating NVIDIA accelerated computing and software, including leading AI models.”

Michael Leone, Practice Director and Principal Analyst at Omdia, said the new solutions reflect a deep understanding of customer needs. “Adding independent scaling of performance and capacity management to the robust data management capabilities in ONTAP will enable enterprises to confidently invest in AI projects that deliver value quickly to the business,” he said.

NetApp will showcase these new solutions at NetApp INSIGHT 2025 in Las Vegas from 14 to 16 October, where it will host sessions and demonstrations on how its technologies are driving digital transformation across industries.
