GPT-5 context window length >= 64k tokens?
Dec 31 · 91% chance

If multiple versions released, like with GPT-4, then any of them count.

GPT-5 refers to the next big model that's a successor to GPT-4


GPT-4o has a 128K context window; it's not impossible that this could shrink for the next version, but that would surprise me.

“Where we’re going we don’t need context windows”

GPT-3 => GPT-4 was a 16x context length improvement, and context length is super valuable for holding up-to-date data outside the training data (like tools, system prompts, etc.). A 2x increase for the next generation seems like a safe bet.
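The arithmetic behind this comment can be sketched as follows. The context-window sizes are public figures (GPT-3 at 2,048 tokens, the GPT-4-32k variant at 32,768, GPT-4o at 128,000); the 2x projection for the next generation is the commenter's assumption, not an announced spec.

```python
GPT3_CTX = 2_048       # GPT-3 (davinci) context window, in tokens
GPT4_CTX = 32_768      # GPT-4-32k variant
GPT4O_CTX = 128_000    # GPT-4o context window
THRESHOLD = 64_000     # this market's resolution threshold

improvement = GPT4_CTX // GPT3_CTX   # the 16x jump the comment cites
projected_gpt5 = 2 * GPT4O_CTX      # a conservative 2x step over GPT-4o

print(improvement)                   # 16
print(projected_gpt5 >= THRESHOLD)   # True: even 2x clears 64k easily
```

Note that even a model that merely matched GPT-4o's 128K window would already resolve this market YES; the 2x projection is headroom on top of that.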

it will have ∞ context length