How much compute will be used to train GPT-5?
  • Fewer than 5e25 FLOP: 0.6%
  • Between 5e25 FLOP and 1.5e26 FLOP: 17%
  • Between 1.5e26 FLOP and 4.5e26 FLOP: 43%
  • Between 4.5e26 FLOP and 1.4e27 FLOP: 33%
  • Between 1.4e27 FLOP and 4e27 FLOP: 3%
  • More than 4e27 FLOP: 2%
  • This resolves as the bucket that contains the number of floating point operations (FLOP) used to train GPT-5

  • FLOP may be performed at any precision (INT8, FP16, etc.)

  • This resolves on the basis of the numbers reported by OpenAI, and, if they don't report this, on the basis of the first credible estimates reported in this database (details about the database may be found here)

  • If GPT-5 is not released by EOY 2027, this resolves ambiguously

  • If the estimate falls exactly on the boundary between two ranges, this resolves as the larger of the two options (illustrated in the sketch below)
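For readers who want the bucketing rule spelled out, here is a minimal Python sketch of how a reported or estimated FLOP figure would map onto these options, assuming the boundary rule above; the names (`resolve_bucket`, `BOUNDARIES`, `LABELS`) and the 6 × parameters × tokens example figure are illustrative, not part of the market.

```python
# Illustrative only: map a training-compute estimate (in FLOP) to this market's
# buckets, with exact boundary values resolving to the larger bucket as stated
# in the criteria above.
BOUNDARIES = [5e25, 1.5e26, 4.5e26, 1.4e27, 4e27]
LABELS = [
    "Fewer than 5e25 FLOP",
    "Between 5e25 FLOP and 1.5e26 FLOP",
    "Between 1.5e26 FLOP and 4.5e26 FLOP",
    "Between 4.5e26 FLOP and 1.4e27 FLOP",
    "Between 1.4e27 FLOP and 4e27 FLOP",
    "More than 4e27 FLOP",
]

def resolve_bucket(flop_estimate: float) -> str:
    """Return the bucket containing the estimate; boundary ties go to the larger bucket."""
    for boundary, label in zip(BOUNDARIES, LABELS):
        # Strict comparison: an estimate equal to a boundary falls through
        # to the next (larger) bucket.
        if flop_estimate < boundary:
            return label
    return LABELS[-1]

# A rough 6 * parameters * tokens style estimate of 2e26 FLOP would land here:
print(resolve_bucket(2e26))  # Between 1.5e26 FLOP and 4.5e26 FLOP
```

The strict `<` comparison is what sends an estimate that lands exactly on a boundary into the larger of the two adjacent buckets.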


@TamayBesiroglu What happens if it's between 1e26 and 2.5e26? It seems there are two options for this

  • Between 5e25 FLOP and 2.5e26 FLOP

  • Between 1e26 FLOP and 5e26 FLOP

Betting here based on the uneven log scale.

@BoltonBailey Yeah, it's tricky to get an even log scale and round numbers at the same time. I've now edited the buckets slightly so they roughly increment by factors of three.

@TamayBesiroglu I can see that. These buckets are much more even. I have sold my positions because the options no longer reflect the numbers they had when I bought them. Luckily, no one seems to have traded in the interim.
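As a quick check on the edit discussed above, the boundaries now listed in the options do sit roughly a factor of three apart; a throwaway Python check (boundary values copied from the options):

```python
# Ratios between consecutive bucket boundaries from the options above.
edges = [5e25, 1.5e26, 4.5e26, 1.4e27, 4e27]
for lo, hi in zip(edges, edges[1:]):
    print(f"{lo:.1e} -> {hi:.1e}: x{hi / lo:.2f}")
# Prints factors of about 3.00, 3.00, 3.11, 2.86
```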
