
ASUS unveils generative-AI POD solutions at ISC 2024

ASUS and TWSC introduce GenAI POD Solution at ISC 2024, showcasing advanced AI servers and cooling solutions for AI supercomputing and data centres.

ASUS, in partnership with its subsidiary Taiwan Web Service Corporation (TWSC), has announced its new GenAI POD Solution at ISC 2024, aimed at the growing demand for AI supercomputing. ASUS is showcasing AI servers including the NVIDIA MGX-powered ESC NM1-E1 and ESR1-511N-M1, the ESC N8A-E12 NVIDIA HGX GPU server, and the RS720QN-E11-RS24U. These servers are paired with TWSC’s exclusive resource management platform and software stacks, equipping them for a wide range of generative AI and large language model (LLM) training tasks. The integrated solutions feature advanced thermal designs, can be customised for individual enterprises, and come with robust software platforms to deliver complete data centre solutions for AI initiatives.

The ASUS ESC NM1-E1 is powered by the NVIDIA GH200 Grace Hopper Superchip, which combines 72 Arm Neoverse V2 CPU cores (built on the Armv9 architecture) with a Hopper GPU over NVIDIA NVLink-C2C, a coherent, high-bandwidth chip-to-chip interconnect. This combination delivers the performance, efficiency, and expanded memory capacity needed for AI-driven data centres, high-performance computing (HPC), data analytics, and NVIDIA Omniverse applications.
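
For a concrete sense of what such a node exposes to software, here is a minimal Python sketch that lists the GPUs and their memory using NVIDIA’s management library bindings (nvidia-ml-py). It is a generic inventory script written under the assumption that the bindings are installed on the node; it is not part of ASUS’s or TWSC’s software stack.

```python
# Minimal sketch: list the GPUs and their memory on an NVIDIA node using the
# NVIDIA Management Library bindings (pip install nvidia-ml-py). This is a
# generic inventory script, not an ASUS or TWSC tool.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):          # older bindings return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"GPU {i}: {name}, {mem.total / 2**30:.0f} GiB")
finally:
    pynvml.nvmlShutdown()
```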

The ASUS ESR1-511N-M1, also powered by the NVIDIA GH200 Grace Hopper Superchip, is designed for large-scale AI and HPC workloads, spanning deep-learning (DL) training and inference, data analytics, and high-performance computing. An enhanced thermal solution helps it sustain peak performance while contributing to a lower data centre power usage effectiveness (PUE). Its flexible 1U design, with support for up to four E1.S local drives, NVIDIA BlueField-3, and three PCI Express (PCIe) 5.0 x16 slots, ensures rapid data transfers.

ASUS NVIDIA HGX servers boost AI with eight NVIDIA H100 GPUs

The ASUS ESC N8A-E12 is a robust 7U dual-socket server equipped with dual AMD EPYC 9004 processors and eight NVIDIA H100 Tensor Core GPUs. Designed for generative AI, it features an enhanced thermal solution for optimal performance and lower PUE. This HGX server offers a unique one-GPU-to-one-NIC configuration, providing maximum throughput for compute-heavy tasks.
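
To illustrate what a one-GPU-to-one-NIC layout means in practice, the hedged Python sketch below pairs each of eight GPUs with a dedicated RDMA NIC by index so that each GPU’s network traffic stays on the adapter nearest to it on the PCIe fabric. The mlx5_* device names and the index-based mapping are hypothetical placeholders, not ASUS’s published topology; a real cluster would derive the pairing from the actual PCIe layout.

```python
# Illustrative sketch of a one-GPU-to-one-NIC layout on an eight-GPU node.
# The mlx5_* NIC names are hypothetical; real device names depend on the
# server's PCIe topology and driver stack.

GPU_COUNT = 8
NICS = [f"mlx5_{i}" for i in range(GPU_COUNT)]  # one RDMA-capable NIC per GPU

# Pair GPU i with NIC i so each GPU's network traffic uses the adapter
# sitting closest to it on the PCIe fabric, avoiding cross-switch hops.
gpu_to_nic = dict(enumerate(NICS))

for gpu, nic in gpu_to_nic.items():
    # One common pattern is to export a per-rank setting such as NCCL_IB_HCA
    # so the process driving this GPU uses its paired adapter.
    print(f"GPU {gpu} -> {nic}")
```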

The ASUS RS720QN-E11-RS24U is a high-density 2U4N server built around the NVIDIA Grace CPU Superchip, which joins two Grace processors over NVIDIA NVLink-C2C. Each 2U chassis accommodates four nodes with PCIe 5.0 support, and each node delivers dual-socket-class CPU performance, making the platform well suited to data centres, web serving, virtualisation, cloud, and hyperscale environments.

ASUS introduces efficient D2C cooling solution

ASUS’s direct-to-chip (D2C) cooling solution offers a swift, straightforward route to liquid cooling, leveraging existing infrastructure for quick deployment and a reduced PUE. The ASUS RS720QN-E11-RS24U supports manifolds and cold plates, allowing for diverse cooling configurations. These servers also support a rear-door heat exchanger that fits standard rack-server designs, so only the rear door needs to be replaced to enable liquid cooling. ASUS works with leading cooling solution providers on comprehensive offerings that aim to minimise data centre PUE, carbon emissions, and energy consumption, contributing to greener data centres.
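
Since PUE figures come up repeatedly in this announcement, a quick worked example may help: PUE is simply total facility power divided by IT equipment power, so a lower value means less overhead spent on cooling and power delivery. The wattages in the Python sketch below are placeholder numbers for illustration, not measurements from any ASUS or TWSC deployment.

```python
# Power usage effectiveness (PUE) = total facility power / IT equipment power.
# An ideal facility has a PUE of 1.0; liquid cooling typically narrows the gap.
# The wattages below are illustrative placeholders only.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

it_load_kw = 800.0                        # same IT load in both scenarios
air_cooled = pue(1_360.0, it_load_kw)     # 1.70
direct_to_chip = pue(960.0, it_load_kw)   # 1.20

print(f"Air-cooled PUE:     {air_cooled:.2f}")
print(f"Direct-to-chip PUE: {direct_to_chip:.2f}")
```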

TWSC’s generative AI POD solutions

TWSC has extensive experience deploying and maintaining large-scale AI/HPC infrastructure for NVIDIA Partner Network cloud partners, including the National Center for High-performance Computing (NCHC)’s TAIWANIA-2 (No. 10 on the Green500, November 2018) and FORERUNNER 1 (No. 92 on the Green500, November 2023) supercomputers. TWSC’s AI Foundry Service enables quick deployment of AI supercomputing and flexible model optimisation for AI 2.0 applications, letting users scale AI resources to their needs.

TWSC’s generative AI POD solutions deliver enterprise-grade AI infrastructure with swift rollouts and comprehensive end-to-end services, backed by high availability and strong cybersecurity standards, and are designed to support success stories across academic, research, and medical institutions. Comprehensive cost-management capabilities optimise power consumption and streamline operational expenses (OPEX), making TWSC technologies a compelling choice for organisations seeking a reliable and sustainable generative AI platform.
