
Musk discloses Grok 3's training cost for the first time: up to 200,000 Nvidia GPUs

Author: LoRA | Date: 18 Feb 2025

Recently, Musk officially launched the brand-new chatbot Grok 3 during a live stream and revealed the striking cost of training the model. Grok 3 is currently available to Premium+ subscribers and has performed well across multiple benchmark areas, surpassing market competitors such as Gemini, DeepSeek and ChatGPT.

Musk said in the live stream that training Grok 3 consumed a total of 200,000 Nvidia GPUs, a remarkable figure. By comparison, Grok 2's training used only about 20,000 GPUs, so Grok 3's compute represents a qualitative leap. To support training at this scale, xAI built a new supercomputing data center called "Colossus", one of the world's most powerful AI training facilities.


With the launch of Grok 3, users can experience the power of this model family through the newly opened Grok.com website. Musk said Grok 3 has made significant improvements in reasoning, comprehension and content generation, which will further advance artificial intelligence technology.

In addition, Musk mentioned that xAI plans to expand its supercomputer cluster from the current 100,000 GPUs to 1 million GPUs, demonstrating his ambitions in the AI field. These moves will undoubtedly have a profound impact on the future technology landscape.

Overall, the launch of Grok 3 marks a major advance in AI technology, and Musk's continued investment and innovation will inject new vitality into the field.