Will AI kill >20% of the human population before 2030?
2030
5% chance


@deagol Fun Games I think it's a good solution

predicts NO

FSD Tesla car/truck deaths? Plane crashes caused by a buggy AI air traffic controller? Self-flying planes? An AI-driven CRISPR gene-editing discovery of a lethal virus that escapes a lab? An AI heart or glucose monitor that fails? How will we ever know this mounting death count?

predicts YES

@deagol I'm expecting those things will all be safer than the non-AI versions they're replacing, so they won't cause any deaths on net. You can ignore consumer electronics, medical devices, and autopilots unless there's something egregiously wrong with them that results in a recall.

predicts NO

@JonathanRay Not for synthetic pathogens. Those would easily have much higher expected deaths as a result of AI assistance, which would free designers from the path-dependent walk through the search space of pathogen design.

@deagol Currently, 60-70 million people die every year, and we should stay in that range until 2030. That means ~490 million expected deaths through 2030. An additional 1.6 billion deaths (20% of the population) would be very visible in all kinds of statistics.
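As a quick sanity check of that arithmetic, here is a sketch using assumed round figures (65M deaths/year, 7.5 years remaining, an 8B peak population), not the commenter's exact numbers:

```python
# Rough sanity check of the baseline-mortality argument above.
# All figures are assumed round numbers, not official statistics.

deaths_per_year = 65e6    # midpoint of the 60-70 million range cited above
years_remaining = 7.5     # approximate time left until the end of 2030
peak_population = 8.0e9   # assumed peak world population over the period

baseline_deaths = deaths_per_year * years_remaining   # ~488M, i.e. the ~490M cited
threshold_deaths = 0.20 * peak_population             # 1.6 billion

print(f"baseline deaths through 2030: ~{baseline_deaths / 1e6:.0f}M")
print(f"20% threshold: ~{threshold_deaths / 1e9:.1f}B")
print(f"threshold vs. baseline: ~{threshold_deaths / baseline_deaths:.1f}x")
```

So the 20% threshold is roughly 3.3x the entire expected baseline death toll for the period, which is why it would be unmistakable in the statistics.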

predicts NO

@NikitaSkovoroda Then why not go sell that one? You've got 12.5k YES there. Just because a whale is propping that market up doesn't mean everything else must be priced as they say. The proper arb is to go there and kill that whale, not the minnows here.

predicts NO

@NikitaSkovoroda Also, these are far from the same criteria. This "AI kills" market specifically requires deaths caused by AI decisions. There it's just the correlation "will humans or AI be here or not" (a misleading title).

@deagol Why should I sell that one when I think both of those are priced lower than the actual probability?

I just bought both

@NikitaSkovoroda You say the price here can't be lower than the one there (that's just false, since the criteria are different, but let's put that aside), yet you've got 1k more YES there than here. You're buying more of something at a higher price, and you've talked yourself into this trap because some whale told you that's the way.

predicts NO

@NikitaSkovoroda Non-AGI AI military drones or nuclear launch systems count, so this should be a bit higher % than the same-date extinction market

@JonathanRay Wait, I'm confused, isn't that the exact opposite of what you told Isaac here?

This market requires deaths caused by AI, a much stronger claim than "who will be here, humans or AI?". If AI is going to survive us, there are so many other non-AI causes for our extinction: human-launched nukes, climate stuff, an asteroid, a super virus, etc., and all of these cases are excluded here but count there.

predicts NO

@JonathanRay can’t tell the distinction you’re making, can you please clarify? thanks.

3 months ago:

“an arms race that includes AI capabilities, one country nukes another” — Isaac

“doesn't count since humans nuked other humans, AI didn’t make them do it” — Jonathan

a month ago:

“person or government tells the AI to kill people” — Adrian

“yes killbots in WW3 count if they use AI” — Jonathan

now:

“Non-AGI AI military drones or nuclear launch systems count” — Jonathan

predicts NO

@deagol I don’t see any inconsistency there. If the nuclear launches are initiated by AI systems, it counts. If humans decide to nuke other humans to stop them from developing AI, that doesn’t count.

@JonathanRay I understood Isaac's hypothetical as humans using AI-capable weapons systems to stop opponents from developing AGI. So it's basically the same question as Adrian's (note that the AI there is not deciding to kill, only executing a person's or government's command). Finally, you mention "non-AGI AI drones"; that phrase suggests to me those still don't have agency ("non-AGI") and remain under human command and control, so again the AI did not initiate the nukes. Am I getting bogged down in a colloquial use of AI to refer to AGI in some contexts but not others? Sorry if I'm being dense.

predicts NO

@deagol If AI capabilities are part of weapons that humans decide to use against other humans, it counts. It's kind of fuzzy where you draw the line on what counts as enough AI capability; it would be more than the advanced guidance systems that already exist, but less than full AGI.

predicts NO

@JonathanRay got it, thanks.

predicts NO

I guess this is total deaths from AI over the time period divided by total lives over the time period?

predicts YES

deaths divided by peak world population in the time period
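Put concretely, that criterion reads as the sketch below; the threshold and example figures are illustrative assumptions, not anything from the market description:

```python
# A minimal sketch of the resolution criterion as Jonathan states it:
# AI-caused deaths over the period, divided by the *peak* world population
# during that period (not total lives over the period, as asked above).

def resolves_yes(ai_caused_deaths: float, peak_population: float,
                 threshold: float = 0.20) -> bool:
    """YES iff AI-caused deaths exceed the threshold share of peak population."""
    return ai_caused_deaths / peak_population > threshold

# Illustrative figures only: 1.7B AI-caused deaths against an 8.1B peak.
print(resolves_yes(1.7e9, 8.1e9))  # True: 1.7B / 8.1B is about 21%, above 20%
```

Using peak population rather than total lives over the period makes the denominator smaller, so this reading is slightly easier to satisfy than the one proposed in the question above.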

predicts NO

Does this include if a person or government tells the AI to kill people, or only if the AI decides on its own to kill people?

predicts YES

@ahalekelly Either. Government-controlled killbots in WW3 count if they use AI

How broad is this? If the US and China get into an arms race that includes AI capabilities, and one country decides to nuke the other because they're worried the other side has invented AGI, does that count?

predicts YES

@IsaacKing That doesn't count, because humans made the decision to nuke other humans without an AI manipulating them into doing it

predicts NO

@JonathanRay What if the nuclear targets are set using AI techniques to maximize casualties? Or if the nuclear weapons are developed using AI nuclear simulations?

predicts NO

@MartinRandall My phone uses AI, so including anything that was designed, developed, sold, or operated with the help of AI seems way too broad to me.