Will anyone train a TokenFormer model at scale before 2026?
25% chance
Will anyone train a TokenFormer model using at least (the equivalent of) 200,000 H100-hours before 2026?
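For a rough sense of scale, here is a minimal sketch converting the 200,000 H100-hour threshold into an approximate training-compute budget. The peak-throughput and utilization figures are assumptions for illustration, not part of the market's resolution criteria:

```python
# Illustrative conversion of 200,000 H100-hours into a FLOP budget.
# Peak throughput and MFU below are assumed values, not stated by the market.

H100_BF16_PEAK_FLOPS = 989e12   # assumed dense BF16 peak of an H100 SXM, FLOP/s
ASSUMED_MFU = 0.40              # assumed model FLOPs utilization during training
GPU_HOURS = 200_000

seconds = GPU_HOURS * 3600
total_flop = H100_BF16_PEAK_FLOPS * ASSUMED_MFU * seconds
print(f"~{total_flop:.2e} FLOP")  # ~2.8e+23 FLOP, roughly GPT-3-scale training compute
```

Under these assumptions, the threshold corresponds to on the order of 3e23 training FLOP, so a qualifying TokenFormer run would need to be a genuinely large-scale training effort rather than a research-scale reproduction.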
Related questions
Will a flagship (>60T training bytes) open-weights LLM from Meta which doesn't use a tokenizer be released in 2025?
43% chance
Will an OpenAI model have over 500k token capacity by the end of 2024?
20% chance
AI: Will someone train a $1B model by 2025?
67% chance
Will a lab train a >=1e26 FLOP state space model before the end of 2025?
22% chance
Before 2028, will there be enough inference capacity to generate 30T frontier model tokens per day?
39% chance
Will OpenAI release a tokenizer with vocab size > 150k by end of 2024?
42% chance
Will a new lab create a top-performing AI frontier model before 2028?
57% chance
Will a model costing >$30M be intentionally trained to be more mechanistically interpretable by end of 2027? (see desc)
57% chance
AI: Will someone train a $100M model by 2025?
85% chance
Will models be able to do the work of an AI researcher/engineer before 2027?
40% chance