Will an AI model use more than 1e28 FLOPS in training before 2026?
5 · Ṁ785 · Dec 31 · 24% chance

Resolution source: Epoch AI's list of notable AI models (https://epoch.ai/data/notable-ai-models). I will check this source on January 1st, 2026, to see whether it lists a model trained with more than 1e28 FLOP.

"AI models" is not limited to LLMs; any type of model listed in the resolution source counts.

As of market creation, the largest model on the list is Grok 3, at ~4.6e26 FLOP of training compute.
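For scale, a figure like Grok 3's ~4.6e26 FLOP can be sanity-checked with the standard back-of-envelope formula: total FLOP ≈ GPUs × peak FLOP/s per GPU × utilization × training seconds. The cluster size, utilization, and duration below are illustrative assumptions, not reported numbers:

```python
# Back-of-envelope training-compute estimate (all inputs are assumptions):
#   total FLOP ~= n_gpus * peak FLOP/s per GPU * MFU * seconds
n_gpus = 100_000      # hypothetical H100-class cluster
peak_flops = 1e15     # ~1e15 BF16 FLOP/s per GPU, rounded
mfu = 0.4             # assumed model FLOP utilization
days = 100            # assumed training duration

total_flop = n_gpus * peak_flops * mfu * days * 86_400
print(f"{total_flop:.2e}")  # ~3.5e26, the same order of magnitude as 4.6e26
```

Under these assumptions the result lands in the mid-1e26 range, consistent with the order of magnitude reported for Grok 3.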


Don't see how this is possible. Stargate won't be fully underway until mid-2026, and if they manage a partial training run with, say, 200k GB200s running twice as long as Grok 3's, that's ~3.68e27 FLOP. I think it's a similar story for Project Rainier and even for Google.
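The ~3.68e27 figure in the comment above can be reproduced by scaling Grok 3's compute by a per-chip speedup and a longer run. This sketch assumes the GB200 cluster matches Grok 3's GPU count and that a GB200 delivers ~4x an H100's effective training throughput; both factors are assumptions chosen to match the comment's arithmetic, not stated specs:

```python
# Reproduce the comment's estimate by scaling Grok 3's training compute.
grok3_flop = 4.6e26    # Grok 3's reported training compute
chip_speedup = 4.0     # assumed GB200 effective speedup over H100
run_multiple = 2.0     # a run lasting twice as long as Grok 3's
threshold = 1e28       # the market's resolution threshold

estimate = grok3_flop * chip_speedup * run_multiple
print(f"{estimate:.2e}")          # 3.68e+27
print(estimate >= threshold)      # False: well short of 1e28
```

Even with an 8x overall scale-up, the estimate stays roughly a factor of 3 below the 1e28 threshold, which is the crux of the comment's skepticism.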