How much FLOP will be used to train the best language model with freely available weights on July 1, 2025?
<1e24: 3%
[1e24, 3e24): 3%
[3e24, 1e25): 7%
[1e25, 3e25): 8%
[3e25, 1e26): 38%
[3e26, 1e27): 30%
[1e27, 3e27): 5%
[3e27, 1e28): 2%
[1e28, 3e28): 1.8%
[3e28, 1e29): 1.3%
>=1e29: 1.3%
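For context on the bucket scale: training compute is commonly estimated with the heuristic C ≈ 6ND (FLOP ≈ 6 × parameters × training tokens). The sketch below uses hypothetical parameter and token counts, not figures for any particular released model, to show roughly where different model scales land among the ranges above.

```python
# Back-of-the-envelope training compute using the common heuristic
# C ≈ 6 * N * D (Kaplan et al., 2020), where N is parameter count and
# D is training tokens. The (N, D) pairs below are illustrative
# assumptions, not confirmed figures for any released model.

def train_flop(n_params: float, n_tokens: float) -> float:
    """Approximate total training FLOP as 6 * parameters * tokens."""
    return 6.0 * n_params * n_tokens

examples = [
    ("70B params, 15T tokens", 70e9, 15e12),    # ~6.3e24 FLOP
    ("400B params, 15T tokens", 400e9, 15e12),  # ~3.6e25 FLOP
    ("1T params, 30T tokens", 1e12, 30e12),     # ~1.8e26 FLOP
]

for label, n, d in examples:
    print(f"{label}: ~{train_flop(n, d):.1e} FLOP")
```

By this heuristic, a 400B-parameter model trained on 15T tokens lands at roughly 3.6e25 FLOP, inside the [3e25, 1e26) bucket that carries the most probability above.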
Related questions
End of pre-training era for language models: Will an LM fine-tune for more FLOPs than it is pre-trained for, before 2026?
42% chance
Will a flagship (>60T training bytes) open-weights LLM from Meta which doesn't use a tokenizer be released in 2025?
43% chance
Will a lab train a >=1e26 FLOP state space model before the end of 2025?
22% chance
Will a machine learning training run exceed 10^25 FLOP in China before 2025?
77% chance
Will a machine learning training run exceed 10^26 FLOP in China before 2025?
15% chance
Will it cost less than 100k USD to train and run a language model that outperforms GPT-3 175B on all benchmarks by the end of 2024?
85% chance
How big will Mistral's largest known language model be? (2024)
Will a machine learning training run exceed 10^26 FLOP in China before 2027?
53% chance
Will a machine learning training run exceed 10^27 FLOP in China before 2028?
44% chance
How many FLOPs will go into training the first ASL-3 model?