
How much FLOP will be used to train the best language model with freely available weights on July 1, 2025?
Ṁ1237 · Jul 2
<1e24: 2%
[1e24, 3e24): 2%
[3e24, 1e25): 12%
[1e25, 3e25): 15%
[3e25, 1e26): 33%
[3e26, 1e27): 25%
[1e27, 3e27): 4%
[3e27, 1e28): 1.9%
[1e28, 3e28): 1.6%
[3e28, 1e29): 1.2%
>=1e29: 1.2%
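As a rough way to summarize a bucketed distribution like the one above, one can take a probability-weighted mean of log10(FLOP) over the buckets. The sketch below uses the probabilities listed above; the geometric midpoints and the finite edges chosen for the open-ended buckets are assumptions made here for illustration, not part of the market.

```python
import math

# (lower bound, upper bound, probability), copied from the market above.
# Open-ended buckets are given a nominal edge one decade out -- an
# arbitrary modeling choice for this sketch.
buckets = [
    (1e23, 1e24, 0.02),
    (1e24, 3e24, 0.02),
    (3e24, 1e25, 0.12),
    (1e25, 3e25, 0.15),
    (3e25, 1e26, 0.33),
    (3e26, 1e27, 0.25),
    (1e27, 3e27, 0.04),
    (3e27, 1e28, 0.019),
    (1e28, 3e28, 0.016),
    (3e28, 1e29, 0.012),
    (1e29, 1e30, 0.012),
]

# Listed probabilities need not sum to exactly 1 (rounding), so normalize.
total = sum(p for _, _, p in buckets)

# Probability-weighted mean of log10(FLOP), using each bucket's
# geometric midpoint (the average of the log endpoints).
mean_log10 = sum(
    p * (math.log10(lo) + math.log10(hi)) / 2 for lo, hi, p in buckets
) / total

print(f"total listed probability: {total:.3f}")
print(f"expected log10(FLOP): {mean_log10:.2f}")
```

With the numbers above this lands a bit below 1e26 FLOP, consistent with the bulk of the probability mass sitting in the [3e25, 1e26) bucket.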
Related questions
Will an AI model use more than 1e28 FLOPS in training before 2026? (24% chance)
At least one of the most powerful neural nets at end of 2030 will be trained using 10^26 FLOPs (97% chance)
At least one of the most powerful neural nets at end of 2026 will be trained using 10^26 FLOPs (97% chance)
Will the largest machine learning training run (in FLOP) as of the end of 2025 be in the United States? (89% chance)
At least one of the most powerful neural nets at end of 2026 will be trained using 10^27 FLOPs (82% chance)
End of pre-training era for language models: Will an LM fine-tune for more FLOPs than it is pre-trained for, before 2026? (44% chance)
Will a machine learning training run exceed 10^26 FLOP in China before 2026? (52% chance)
Will it cost less than 100k USD to train and run a language model that outperforms GPT-3 175B on all benchmarks by the end of 2024? (85% chance)
Will a lab train a >=1e26 FLOP state space model before the end of 2025? (16% chance)
Will a machine learning training run exceed 10^25 FLOP in China before 2027? (82% chance)