Will a misaligned AGI take over the world?
11%
chance

By misaligned I mean an AGI that isn't aligned with the ACTUAL values of most humans.

And by taking over the world I mean the news would say so.
If you have recommendations that would make this description more useful, I'm open to hearing them.

Updated twice 08/02/2023: the definition of "misaligned" first meant not being aligned with a single human, and then not being aligned with any set of humans.

Feb 8, 12:42am: Will a misaligned AI take over the world? → Will a misaligned AGI take over the world?



1y

Most humans have pretty dubious values.

Let’s say it proves to be too vulnerable to an EMP, and therefore requires humans to stand around as a “backup”. Would that count as a “No”?

predicts YES 1y

@Meta_C That's a yes, I think. Taking over the world does not require genocide.

2y
  • What if the AI is aligned to the values of some company/ideology/other subset of humanity, but not any singular human?

  • What if a human controls an AI well enough to take over the world, but this causes irreversible actions that even they regret, and the world ends up dystopic?

predicts YES 2y

@ThomasKwa What if the AI convinces the AI's creator that that is what they want even if the AI was initially "misaligned"?

predicts YES 2y

fuck, I hit ctrl+enter instead of shift+enter
For the first point: you're right, I want to count that as alignment, so I changed the description. I also deleted the "intent alignment" part because I don't actually know the consensus meaning of the term.
For the second point: if they regret it, I wouldn't consider it properly aligned.

predicts YES 2y

@patodesu Doesn't it make sense that the AI would modify the human's brain? Everyone regrets things sometimes so what is the criteria? Eg. they write an article saying they regret it or what?

predicts YES 2y

@ZZZZZZ First question: Not necessarily, but I will change the description so that will not be a problem anymore.
Second question: I consider that to actually align an AI means to give it objectives that you will not regret giving it. The thing is, I don't know how the criteria would be to know that and now that I think about it, what I really care is the chances of align it with "human values", AKA the most ambitous definition of alignment.
So sorry but I'll change the description again.

2y

What if the news uses a phrase along the lines of "AI has taken over the world", but what it really means is something like "people are regularly using ChatGPT to assist with their work or look up facts"?

2y

@tailcalled Well, that wouldn't be misaligned, I think. Just in case, I also put "AGI" in the description.

2y

What if it takes over the world but destroys the news in the process?

2y

@tailcalled Yeah, that could maybe happen. So what should I put? No clarification at all?

2y

@patodesu Dunno
