Wednesday, 21 May 2025

ASUS unveils next-generation AI POD infrastructure with NVIDIA at Computex 2025

ASUS unveils new AI POD designs with NVIDIA at Computex 2025, boosting AI scalability, efficiency, and real-time agentic AI deployment.

At Computex 2025 in Taipei, ASUS announced the launch of its new AI POD infrastructure, designed with validated reference architectures under the NVIDIA Enterprise AI Factory. The newly unveiled solutions are built to support the adoption of agentic AI systems and high-performance computing (HPC), offering flexibility for both air-cooled and liquid-cooled data centre deployments.

Available as NVIDIA-Certified Systems across Grace Blackwell, HGX, and MGX platforms, these innovations are engineered to help enterprises scale AI capabilities with enhanced performance, efficiency, and manageability.

High-density architecture for accelerated AI performance

ASUS’s AI POD design incorporates NVIDIA’s latest hardware to support scalable deployments of large AI models. The solution features NVIDIA GB200 and GB300 NVL72 racks, which support both liquid-cooled and air-cooled options.

The liquid-cooled setup enables a 576-GPU non-blocking cluster spread across eight racks, while the air-cooled configuration supports a 72-GPU rack. Each setup integrates NVIDIA Quantum InfiniBand or Spectrum-X Ethernet networking to deliver high throughput and low latency, setting a new standard for efficient AI infrastructure.
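The scale-out arithmetic here can be checked directly. A minimal sketch, assuming the figures stated above (72 GPUs per NVL72 rack, eight racks in the liquid-cooled cluster):

```shell
# Back-of-the-envelope check on the cluster sizes quoted in the article.
GPUS_PER_RACK=72                          # one NVL72 rack (air-cooled configuration)
RACKS=8                                   # liquid-cooled cluster spans eight racks
CLUSTER_GPUS=$((GPUS_PER_RACK * RACKS))   # full non-blocking cluster
echo "Non-blocking cluster size: ${CLUSTER_GPUS} GPUs"
```

This reproduces the 576-GPU figure: eight NVL72 racks of 72 GPUs each.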

These reference architectures are tailored to help enterprise IT teams manage AI and HPC workloads by providing a consistent framework and accelerating deployment timelines with lower operational risk.

Scalable rack systems for complex workloads

ASUS has also introduced MGX-compliant rack designs featuring its ESC8000 series systems. These racks include dual Intel Xeon 6 processors and the NVIDIA RTX PRO 6000 Blackwell Server Edition GPU, paired with NVIDIA’s latest ConnectX-8 SuperNIC, capable of speeds up to 800Gb/s. This configuration provides the flexibility needed for immersive workloads, large language model (LLM) processing, and complex 3D tasks.

For even more advanced needs, ASUS offers HGX-based architectures. The ASUS XA NB3I-E12 and ESC NB8-E11 systems come embedded with NVIDIA HGX B300 and B200, respectively. These provide high GPU density, robust thermal management, and support for both liquid and air cooling, making them suitable for AI fine-tuning, inference, and training workloads. The streamlined manufacturing and rack integration reduce total cost of ownership and simplify large-scale deployment.

Full-stack integration for AI Factory applications

ASUS’s infrastructure supports the growing trend of agentic AI, enabling the deployment of AI agents capable of autonomous decision-making. The systems integrate tightly with the NVIDIA AI Enterprise software platform and NVIDIA Omniverse, supporting real-time simulation and collaboration environments.

The end-to-end ecosystem includes high-speed networking and storage, such as the ASUS RS501A-E12-RS12U and VS320D series, certified by NVIDIA. These ensure seamless scalability for AI and HPC applications. Resource utilisation is further optimised with SLURM-based workload scheduling and NVIDIA UFM for fabric management in Quantum InfiniBand environments. Storage capabilities are enhanced through the WEKA Parallel File System and ASUS ProGuard SAN Storage, which provide the throughput and scalability needed for enterprise data.
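On a cluster scheduled this way, users typically submit GPU work through SLURM batch scripts. A minimal, hypothetical sketch of such a job — the partition name, GPU counts, and training script are illustrative assumptions, not details from ASUS or NVIDIA:

```shell
#!/bin/bash
# Hypothetical SLURM batch script for a multi-node GPU training job.
# All names and values below are illustrative placeholders.
#SBATCH --job-name=llm-finetune
#SBATCH --nodes=2                # number of GPU servers to allocate
#SBATCH --gpus-per-node=4        # GPUs requested on each node
#SBATCH --ntasks-per-node=4      # one task (rank) per GPU
#SBATCH --time=04:00:00          # wall-clock limit

# srun launches one copy of the training process per task across the allocation.
srun python train.py --config finetune.yaml
```

In practice the scheduler enforces the GPU and time limits declared in the `#SBATCH` directives, which is how SLURM keeps utilisation high across shared AI infrastructure.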

To support enterprise customers through the entire deployment process, ASUS provides tools such as the ASUS Control Center (Data Center Edition) and the ASUS Infrastructure Deployment Center (AIDC). These tools simplify the development, orchestration, and scaling of AI models. L11- and L12-validated systems further offer reliability and assurance for enterprise-level deployments.

ASUS continues to position itself as a leading partner in enterprise AI infrastructure, offering solutions that combine flexibility, performance, and ease of integration — helping organisations accelerate their journey into the next generation of AI.

