If a large, good survey of AI engineers in the United States is run, what will be the average p(doom) within 10 years?
14% chance
The survey must be representative of all programmers working on AI. It cannot be biased toward those engaged in X-risk discourse, those more active on Twitter, etc.
The survey must clearly state that the question is about the chance that AI kills or enslaves >99% of people on Earth within the next 10 years, or some very similar proposition. (E.g. if it asks about 15 years instead of 10, that still counts, but 100 years does not.)
The survey must be unbiased, and not use emotionally charged language or other push polling techniques.
Resolves N/A if no such survey is performed by the end of 2024.
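For concreteness, a minimal sketch of how the resolution value would be computed from qualifying survey data. The responses below are hypothetical, and the criteria above govern which surveys count:

```python
from statistics import mean, median

# Hypothetical survey responses: each AI engineer's p(doom) as a
# probability in [0, 1]. Illustrative numbers only, not real data.
responses = [0.01, 0.02, 0.05, 0.10, 0.30]

# This market resolves to the arithmetic mean of the responses,
# expressed as a percentage. Note the mean can sit well above the
# median when a few respondents give very high estimates.
print(f"average p(doom): {mean(responses):.1%}")   # 9.6%
print(f"median p(doom):  {median(responses):.1%}") # 5.0%
```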
Related questions
What will be the average P(doom) of AI researchers in 2025?
20% chance
What will be the median p(doom) of AI researchers after AGI is reached?
Will my p(doom) be above 10% in 20 years (2043)?
31% chance
Will my p(doom) be above 10% in 10 years (2033)?
63% chance
Will Paul Christiano publicly announce a greater than 10% increase in his p(doom | AGI before 2100) within the next 5 years?
44% chance
The probability of "extremely bad outcomes e.g., human extinction" from AGI will be >5% in next survey of AI experts
73% chance
Will some U.S. software engineers be negatively affected financially due to AI by end of 2025?
65% chance
Will Eliezer Yudkowsky publicly claim to have a P(doom) of less than 50% at any point before 2040?
31% chance
What will Manifold's P(doom) be at the end of 2024?
29% chance
Will Eliezer's AI doom market have a higher P(doom) in the third quarter of 2026 than today's (2023-09-27) 21%?
33% chance