Will we see most new language models shifting to addition-only architectures like BitNet/BitNet b1.58 in 2024?
Mini · 2 traders · Ṁ35 · Jan 1
43% chance
Related questions
Will there be an AI language model that strongly surpasses ChatGPT and other OpenAI models before the end of 2024?
4% chance
By the end of 2026, will we have transparency into any useful internal pattern within a Large Language Model whose semantics would have been unfamiliar to AI and cognitive science in 2006?
38% chance
End of pre-training era for language models: Will an LM fine-tune for more FLOPs than it is pre-trained for, before 2026?
42% chance
Will any 10 trillion+ parameter language model that follows instructions be released to the public before 2026?
51% chance
Will Transformer-based architectures still be SOTA for language modelling by 2026?
68% chance
Will any language model trained without large number arithmetic be able to generalize to large number arithmetic by 2026?
51% chance
Will Meta release an open-source language model that outperforms GPT-4 by the end of 2024?
67% chance
By 2025 will there be a competitive large language model with >50% of the total training data generated from a large language model?
75% chance
Will Inflection AI have a model that is 10x the size of the original GPT-4 at the end of Q1 2025?
26% chance
Will any open-source LLM with <20 billion parameters outperform GPT-4 on most language benchmarks by the end of 2024?
13% chance