Some Grok 3 Launch Highlights
A couple of days ago, xAI launched Grok 3 on X. I put together a short overview of some highlights from the launch for myself, and I thought I would share it here as well.
Model
Grok 3 uses 10x to 15x more compute than the previous generation.
Pre-training was completed in early January, and the model has been receiving daily improvements since then.
On Chatbot Arena, an early version of Grok 3 achieved an aggregated Elo score of 1,400 across multiple categories, setting a new benchmark among competitors (a quick worked example of what an Elo gap translates to follows at the end of this section).
There’s also a Grok 3 Mini reasoning version.
Once Grok 3 is stable and mature, there are plans to open source Grok 2.
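To put the Elo number in perspective, here is a minimal worked example of the standard Elo expected-score formula used by Chatbot Arena-style leaderboards. The 1,360 comparison rating is a made-up value for illustration, not an actual leaderboard entry.

```python
# Standard Elo expected-score formula (the Bradley-Terry form used by
# Chatbot Arena-style leaderboards). The 1,360 comparison rating is a
# made-up value for illustration, not an actual leaderboard entry.
def expected_win_rate(rating_a: float, rating_b: float) -> float:
    """Probability that model A is preferred over model B in a head-to-head vote."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

print(expected_win_rate(1400, 1360))  # ~0.557: a 40-point gap means ~56% of votes
```

In other words, even a 40-point lead only translates to winning a bit over half of head-to-head comparisons, which is why small Elo gaps near the top of the leaderboard are less dramatic than they sound.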
Infrastructure
It took 122 days to get the 100,000-GPU data center named “Colossus” operational.
Colossus is described as the largest coherently connected H100 cluster of its kind.
The data center is housed in an abandoned Electrolux factory in Memphis.
GPUs are liquid cooled, requiring extensive plumbing and specialized engineering to achieve the needed density.
After the initial build, capacity was doubled in a subsequent phase, and a re-engineered data center was completed in just 92 days.
A new training cluster with 1.2 gigawatts of power (roughly 5x the current capacity) is already in development.
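To sanity-check the “roughly 5x” figure, here is a hedged back-of-envelope estimate. The per-GPU power draw is my own assumption (around 1.2 kW all-in per H100, counting host CPUs, networking, and cooling), not a number from xAI.

```python
# Back-of-envelope check on the "roughly 5x" claim. The per-GPU figure is an
# assumption (~1.2 kW all-in per H100, including host CPUs, networking, and
# cooling), not a number from xAI.
gpus = 200_000                       # cluster size after the capacity doubling
kw_per_gpu_all_in = 1.2              # assumed wall power per GPU, in kW
current_mw = gpus * kw_per_gpu_all_in / 1000
planned_mw = 1200                    # the planned 1.2 GW cluster
print(f"current estimate: {current_mw:.0f} MW")                # -> 240 MW
print(f"planned vs. current: {planned_mw / current_mw:.1f}x")  # -> 5.0x
```

Under those assumptions the 5x figure checks out; with a higher per-GPU power estimate the multiple would come out somewhat lower.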
Product
Big Brain Mode lets the model allocate additional compute (i.e., more time to think) at test time, boosting its problem-solving capabilities; a generic sketch of the test-time-compute idea follows at the end of this section.
Deep Search is a new product and the first generation of Grok agents, offering next-generation search with integrated verification across multiple sources.
Grok 3’s advanced capabilities are initially available to Premium Plus subscribers on X.
A voice-interactive version is in development. The model will both understand speech and generate audio responses directly, without an intermediate speech-to-text / text-to-speech pipeline.
They mentioned that users may notice improvements even within 24 hours, as the model is updated daily.
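xAI has not published how Big Brain Mode works internally, but the underlying idea of trading extra test-time compute for answer quality can be illustrated with a generic best-of-n sampling sketch. The `generate` and `score` functions below are hypothetical stand-ins, not a real Grok API.

```python
# Generic illustration of spending more test-time compute via best-of-n
# sampling. This is NOT how Big Brain Mode is implemented (xAI has not
# published details); `generate` and `score` are hypothetical stand-ins.
import random
from typing import Callable

def best_of_n(prompt: str,
              generate: Callable[[str], str],
              score: Callable[[str, str], float],
              n: int = 8) -> str:
    """Sample n candidate answers and keep the highest-scoring one.

    Larger n means more test-time compute and, usually, better answers.
    """
    candidates = [generate(prompt) for _ in range(n)]
    return max(candidates, key=lambda answer: score(prompt, answer))

# Toy usage with dummy stand-ins so the sketch runs on its own.
if __name__ == "__main__":
    dummy_generate = lambda prompt: f"candidate-{random.randint(0, 999)}"
    dummy_score = lambda prompt, answer: random.random()
    print(best_of_n("What is 17 * 23?", dummy_generate, dummy_score))
```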