Will Yudkowsky claim that he is more than 50% sure that AI will kill everyone no later than 1 year after the claim?
29% chance

Criteria for Resolution:

  1. Claim of Probability: Eliezer Yudkowsky must make a public claim that he is more than 50% sure that AI will kill everyone no later than 1 year after the claim is made. The claim must be made in earnest, not as a joke or sarcastically.

  2. Types of Qualifying Statements: In addition to explicit quantitative claims, statements such as "certain", "pretty sure", "almost sure", "rather yes than no", or any similar expression indicating a probability higher than 50% will qualify.

  3. If resolution criteria are not met, the market will resolve NO on January 1, 2036.

Handling Ambiguity:

- If the claim is ambiguous, efforts will be made to ask Yudkowsky explicitly whether his statement means that the probability is higher than 50%.

A related market (the same question, but with a 90% threshold instead of 50%): https://manifold.markets/IhorKendiukhov/will-yudkowsky-claim-that-he-is-mor-f2h2nq5epx

bought Ṁ50 NO

No, that would be falsifiable and we can't have that...
