Elon Musk has unveiled his long-term vision for xAI, outlining plans to significantly scale the company’s computing power and make its artificial intelligence models open-source. Speaking over the weekend, Musk said xAI aims to run the equivalent of 50 million Nvidia H100 graphics chips within the next five years, marking an ambitious leap in computing capacity.
"Having thought about it some more, I think the 50 million H100 equivalent number in 5 years is about right," Musk wrote on X on August 23, 2025, adding in a follow-up post: "Eventually, billions."
Nvidia’s H100 chips have become the preferred hardware for AI companies thanks to their processing power. Each H100 can move around 900 gigabytes of data per second over its NVLink interconnect, helping large language models with billions, or even trillions, of parameters train and serve responses quickly. Demand for these chips has surged, straining supply and driving prices to more than US$30,000 per unit, making them a major investment for companies pursuing large-scale AI projects.
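To see why models of that size demand whole fleets of such chips, a rough memory calculation helps. The sketch below estimates how many H100s are needed just to hold a model’s weights, assuming 2 bytes per parameter (FP16/BF16) and 80 GB of memory per H100; both figures are assumptions added here for illustration, not numbers from the article.

```python
# Rough estimate: how many H100s are needed just to hold a model's weights
# in GPU memory. Assumptions (not from the article): 2 bytes per parameter
# (FP16/BF16 precision) and 80 GB of HBM per H100.
import math

BYTES_PER_PARAM = 2    # FP16/BF16 weight storage
H100_MEMORY_GB = 80    # typical H100 memory capacity

def gpus_to_hold(params: float) -> int:
    """Minimum number of H100s whose combined memory fits the raw weights."""
    weight_gb = params * BYTES_PER_PARAM / 1e9
    return math.ceil(weight_gb / H100_MEMORY_GB)

for params in (70e9, 400e9, 2e12):   # 70B, 400B and 2T parameters
    print(f"{params / 1e9:>6.0f}B parameters -> at least {gpus_to_hold(params)} H100s")
```

And that is only storage for inference; training the same models typically multiplies the hardware requirement many times over.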
Currently, xAI operates roughly 200,000 H100 chips at its Colossus data centre in Memphis and is building Colossus II to expand its infrastructure. Rival firms are racing to boost their computing resources: reports suggest OpenAI runs a comparable fleet of around 200,000 H100s, while Meta is expected to surpass both companies with an estimated 600,000 chips once its new systems become fully operational. Meta is also investing heavily in AI, reportedly committing hundreds of billions of dollars to its projects.
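Scaling from today’s fleet to the stated target implies an enormous jump in raw compute and cost. The back-of-the-envelope sketch below uses the figures quoted above (a roughly 200,000-chip fleet, the ~US$30,000 unit price, the 50 million H100-equivalent goal) plus an assumed ~1 petaFLOP/s of dense FP16/BF16 throughput per H100; the throughput figure and the naive cost arithmetic are illustrative assumptions, not statements of xAI’s actual plans.

```python
# Back-of-the-envelope scale of the 50 million H100-equivalent target.
# Figures from the article: ~200,000 chips today, ~US$30,000 per H100.
# Assumption added here: ~1 petaFLOP/s (1e15 FLOP/s) dense FP16/BF16 per H100.

CURRENT_FLEET = 200_000
TARGET_EQUIVALENTS = 50_000_000
FLOPS_PER_H100 = 1e15          # rounded assumption, not an official figure
PRICE_PER_H100_USD = 30_000    # illustrative unit price cited above

scale_up = TARGET_EQUIVALENTS / CURRENT_FLEET
total_exaflops = TARGET_EQUIVALENTS * FLOPS_PER_H100 / 1e18
naive_cost_trillions = TARGET_EQUIVALENTS * PRICE_PER_H100_USD / 1e12

print(f"Scale-up vs. today's fleet: ~{scale_up:,.0f}x")
print(f"Aggregate throughput:       ~{total_exaflops:,.0f} exaFLOP/s")
print(f"Naive hardware cost:        ~${naive_cost_trillions:,.1f} trillion")
```

Even under these crude assumptions, the arithmetic underlines why funding and power supply loom as large as engineering in any plan of this scale.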
Competition and custom chip development
Despite Nvidia’s dominance, major AI firms, including xAI, OpenAI, Meta, and Google, are developing custom hardware to reduce reliance on third-party suppliers and improve efficiency. Details of these in-house projects remain confidential, leaving uncertainty about how much additional capacity they could provide.
While Musk’s announcement signals an intent to outpace rivals, industry observers suggest that Meta currently holds the strongest position in terms of computing power. However, hardware capacity alone will not determine leadership in the AI sector. Success will depend on developing practical applications, innovative solutions, and strategic partnerships to ensure widespread adoption and implementation.
Musk’s track record of rapidly scaling ventures suggests xAI could meet its ambitious targets, but aggressive expansion also carries risks. Securing sufficient funding will be essential for xAI to maintain momentum as competition intensifies.
Open sourcing Grok models
In addition to its hardware plans, xAI has begun open-sourcing its Grok AI models. Over the weekend, the company released Grok 2.5 on Hugging Face, allowing researchers and developers to examine the model’s underlying weights and architecture. Musk pledged that every Grok model will eventually be released in this way, describing the move as an effort to promote transparency and trust in xAI’s technology.
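For readers who want to inspect the released weights themselves, the snippet below sketches one way to fetch an open release with the huggingface_hub client. The repository identifier is an assumed placeholder for illustration; check xAI’s Hugging Face organisation page for the actual name of the Grok 2.5 release.

```python
# Minimal sketch: downloading openly released model weights from Hugging Face.
# The repo_id below is an assumed placeholder -- confirm the actual repository
# name on xAI's Hugging Face organisation page before running.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="xai-org/grok-2",       # assumed identifier, verify before use
    local_dir="./grok-weights",     # where the downloaded files are stored
)
print(f"Model files downloaded to: {local_path}")
```

Note that frontier-scale checkpoints can run to hundreds of gigabytes, so downloading and loading them requires substantial disk space and, for inference, multiple high-memory GPUs.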
The decision has prompted debate about how fully open these models will be, particularly since Grok 4 reportedly incorporates Musk’s personal views into its responses. It remains to be seen whether future open-source releases will reflect these elements or present a more neutral version.
Musk’s announcement underscores his ambition to make AI a core focus of X Corp, his wider business empire. As xAI accelerates both hardware investment and transparency initiatives, the company is positioning itself as a key competitor in the rapidly evolving AI landscape.