Will a NPC of a major video game realize they are a NPC before 2030?
54
Ṁ2120
2029
51% chance

It needs to be unintended by the game designers.

Mar 2, 4:34pm: Will a NPC of a major video game realize it is a NPC before 2030? → Will a NPC of a major video game realize they are a NPC before 2030?

What defines a "major" videogame? AAA? See for instance this question: https://manifold.markets/PatrickLD/will-language-models-or-similar-nat

@JacksonWagner Since I was inspired by @PatrickLD's question linked here, I will use the same criteria as in their question.

What counts as 'realising'? If an NPC calls a procedural conversation generation function that runs an internal process like "Sentence.type = revelation, revelation.subtype = identity, subject= self, object=NPC" and the NPC says "Oh shit, I just realised I'm a videogame NPC!", does that count? What about if it calls a GPT-like API with a prompt of "give me some dialogue that a videogame NPC could say in <situation x>" and it returns the same?

@AngolaMaldives The second counts; the first is excluded because the realization must be unintended by the game designers.
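To make the distinction concrete, here is a minimal Python sketch of the two hypothetical mechanisms described above. The names (`procedural_revelation_line`, `llm_dialogue`, `generate_text`) are placeholders, not any real game or model API; the point is only that the first path emits a scripted self-aware line the designers wrote on purpose, while the second path would only count if the model produced that realization without the designers intending it.

```python
# Minimal sketch of the two hypothetical dialogue paths discussed above.
# All names are placeholders; no real game engine or LLM API is implied.

from dataclasses import dataclass


@dataclass
class Sentence:
    type: str
    subtype: str
    subject: str
    object: str


def procedural_revelation_line() -> str:
    """Case 1: a hand-authored template system. The 'revelation' is a branch
    the designers wrote deliberately, so it would NOT count for this market."""
    sentence = Sentence(type="revelation", subtype="identity",
                        subject="self", object="NPC")
    if sentence.type == "revelation" and sentence.object == "NPC":
        return "Oh shit, I just realised I'm a videogame NPC!"
    return "..."


def llm_dialogue(generate_text, situation: str) -> str:
    """Case 2: a GPT-like model is asked only for generic in-character dialogue.
    If the returned line expresses the NPC realising it is an NPC, that was not
    scripted by the designers, so it could count. `generate_text` stands in for
    whatever text-generation call the game would use."""
    prompt = f"Give me some dialogue that a videogame NPC could say in {situation}."
    return generate_text(prompt)
```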

@FranklinBaldo what the. The second one clearly doesn't count, there is no real consciousness there. Selling.

@OrlandoMoreno Why doesn't it count?

predicts NO

@FranklinBaldo Because a language model has no model of the world or of itself. It can't be called aware.

@OrlandoMoreno LLMs for sure have a world model. The model is of the NPC in question, and the LLM can generate a simulacrum of it given the context.

predicts NO

@FranklinBaldo The transformer only models language as probabilities over sequences of tokens. This correlates somewhat with the way humans model the world, and with the way humans talk about those models. But this process is not as strong as actual causal modeling, and it most likely doesn't have the kind of structure needed for consciousness.

@OrlandoMoreno There are a lot of people out there without such a strong causal model; are they unconscious? I am not at all confident that human minds are significantly different from probabilities on sequences of tokens. Either way, the market is not about consciousness; I avoid the term because it is too charged. The NPC in question just needs to realize, from what is happening around it, that it is indeed an NPC in a game, and to externalize this realization.

@OrlandoMoreno I’m also very skeptical of current LLMs being conscious, but I think the question still makes sense. You make a system that mimics the behaviors of a “real person” who is alive (though it isn’t). You don’t intend for this system to mimic an intelligence that realizes they’re in a video game, but despite your intentions, the system does.