Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026?
2026 · 53% chance

The total energy consumed to train GPT-4 can be estimated at around 50-60 million kWh.

1/10th of this energy = 5-6 million kWh

1/100th of this energy = 0.5-0.6 million kWh

See calculations below:
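The threshold arithmetic above can be sketched as a quick check (the 50-60 million kWh baseline is the estimate assumed in this question, not a confirmed figure):

```python
# Assumed baseline: GPT-4 training energy of 50-60 million kWh (question's estimate).
gpt4_energy_kwh = (50e6, 60e6)

# Compute the 1/10th and 1/100th resolution thresholds.
for frac, label in [(0.1, "1/10th"), (0.01, "1/100th")]:
    low = gpt4_energy_kwh[0] * frac
    high = gpt4_energy_kwh[1] * frac
    print(f"{label}: {low / 1e6:.1f}-{high / 1e6:.1f} million kWh")
```

Running this prints `1/10th: 5.0-6.0 million kWh` and `1/100th: 0.5-0.6 million kWh`, matching the figures above; a qualifying model would need to train on roughly 0.5-0.6 million kWh or less.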


In the title, and in lines f. and i., you mean "energy" where you have written "power".

If such a model is trained on synthetic data generated with a precursor model, does this take into account the energy used to train the precursor + run inference on it to produce the synthetic data?