By March 14, 2025, will there be an AI model with over 10 trillion parameters?
62% chance · closes Mar 14 · Ṁ1,523 · 42 traders

The model must have finished training.

It should be competitive with the SOTA of the time. Rough estimates, of course, but not too far behind.

I will not trade in this market.


If I take Llama and "mixture of experts" it a thousand times, then the resulting system has a lot of parameters, is state of the art (if not efficient in its use of parameters), and isn't THAT costly to run 🤔

@Mira do it
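A rough back-of-the-envelope sketch of the parameter math behind that idea. It assumes Llama-70B-sized experts and a router that activates only a couple of experts per token; the 1,000x expert count comes from the comment above, everything else is a hypothetical illustration rather than a real design.

```python
# Back-of-the-envelope: total vs. active parameters in a "Llama x 1000" MoE.
# All figures are rough assumptions for illustration, not a real architecture.

base_params = 70e9          # one Llama-70B-sized expert (assumption)
num_experts = 1_000         # "mixture of experts it a thousand times"
experts_per_token = 2       # sparse routing typically activates only a few experts

total_params = base_params * num_experts
active_params = base_params * experts_per_token

print(f"Total parameters:      {total_params:.1e}")   # ~7.0e13, i.e. ~70 trillion
print(f"Active per token:      {active_params:.1e}")  # ~1.4e11, i.e. ~140 billion

# The system crosses 10 trillion total parameters while per-token compute
# stays close to running just a couple of dense 70B-scale models.
```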

Does any open-source model resolve this to YES, or only enterprise models?

... can I train a 10-trillion-parameter MNIST classifier? :)

@BarrDetwix yes you can :) it'll be a tad bit costly though. And I think I should amend the description to say that it should be competitive with the SOTA of the time
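For a sense of "a tad bit costly", here is a minimal arithmetic sketch of the memory footprint alone, assuming fp16 weights; the optimizer-state breakdown is a rough approximation, not a precise training recipe.

```python
# Rough storage estimate for a 10-trillion-parameter model's weights alone.
params = 10e12                 # 10 trillion parameters
bytes_per_param_fp16 = 2       # fp16/bf16 weights
weight_bytes = params * bytes_per_param_fp16
print(f"Weights only: {weight_bytes / 1e12:.0f} TB")   # ~20 TB

# Adam-style mixed-precision training roughly adds fp16 gradients plus
# fp32 master weights, momentum, and variance (rough breakdown below),
# pushing the training-time state well past 100 TB.
adam_state_bytes = params * (2 + 4 + 4 + 4)
print(f"With optimizer state (rough): {adam_state_bytes / 1e12:.0f} TB")
```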

Apologies for the title change, I will compensate you if you want me to. I've changed the title from "LLM" to "AI model" because VLMs (vision-language models) and the like will soon become popular, etc.