Will there be an LLM (as good as GPT-4) that was trained with 1/10th the energy consumed to train GPT-4, by 2026?
2026 · 85% chance

The total energy consumed to train GPT-4 can be estimated at roughly 50-60 million kWh.

1/10th of this energy = 5-6 million kWh

1/100th of this energy = 0.5-0.6 million kWh

See calculations below:
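For reference, a minimal back-of-envelope sketch of how a figure in this range can be reached. The GPU count, per-GPU system power, and run length below are assumed values, not figures from the market description:

```python
# Back-of-envelope estimate of GPT-4 training energy. All inputs are assumptions:
# ~25,000 A100-class GPUs, ~1 kW effective system power per GPU (node overhead,
# networking, and cooling included), running for roughly 90-100 days.

def training_energy_kwh(num_gpus: int, kw_per_gpu: float, days: float) -> float:
    """Total energy in kWh for a training run at constant power draw."""
    return num_gpus * kw_per_gpu * days * 24  # kW x hours = kWh

low = training_energy_kwh(num_gpus=25_000, kw_per_gpu=1.0, days=90)    # ~54 million kWh
high = training_energy_kwh(num_gpus=25_000, kw_per_gpu=1.0, days=100)  # ~60 million kWh

print(f"Estimated GPT-4 training energy: {low/1e6:.0f}-{high/1e6:.0f} million kWh")
print(f"1/10th threshold:  {low/10/1e6:.1f}-{high/10/1e6:.1f} million kWh")
print(f"1/100th threshold: {low/100/1e6:.2f}-{high/100/1e6:.2f} million kWh")
```

Under these assumed inputs the total lands at about 54-60 million kWh, consistent with the 50-60 million kWh figure above.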

bought Ṁ500 YES

Jensen Huang (CEO of NVIDIA) said that with Blackwell GPUs, you could train GPT-4 at a power draw of only about 4 MW. Looks like we can get there even without algorithmic improvements.
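As a rough check of how that figure compares with the thresholds above, assuming the 4 MW refers to sustained power over a roughly 90-day training run (the run length is an assumption, not part of the quoted claim):

```python
# Convert a sustained power draw into total training energy and compare it
# against the market's thresholds. The 90-day run length is an assumption.
power_mw = 4.0   # claimed Blackwell-cluster power draw
run_days = 90    # assumed training duration

energy_kwh = power_mw * 1_000 * run_days * 24  # MW -> kW, days -> hours
print(f"Energy at {power_mw} MW for {run_days} days: {energy_kwh/1e6:.1f} million kWh")
# ~8.6 million kWh under these assumptions, versus the 5-6 million kWh
# (1/10th) threshold estimated above.
```

Whether this clears the 1/10th bar depends on the assumed run length and sustained draw; a shorter run or lower average power would bring it under the threshold.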

Approximately a 4x efficiency improvement from silicon alone, based on the latest GPUs being announced now (specifically the MI300X vs. the A100).
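A small sketch of what that would leave for software and algorithmic gains, taking the market's 10x target and the claimed ~4x hardware gain at face value:

```python
# If hardware alone gives ~4x better energy efficiency, the remaining factor
# needed to reach the market's 10x target must come from algorithms/software.
target_gain = 10.0    # 1/10th the energy of GPT-4's training run
hardware_gain = 4.0   # approximate MI300X vs. A100 improvement claimed above

remaining = target_gain / hardware_gain
print(f"Remaining improvement needed from algorithms/software: {remaining:.1f}x")  # 2.5x
```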