Latest News
Musk: 230,000 GPUs, including 30,000 GB200s, are being used to train Grok in a supercomputing cluster named Colossus 1. In Colossus 2, the first batch of 550,000 GB200s and GB300s will also come online for training within a few weeks. As Jensen Huang said, xAI's speed is unmatched.