[Title: What would AI Goddess do?]
People can be smart but unwise, or stupid but wise. What the heck does that mean? Smart and stupid are about how well you can use things to get what you want; wise and unwise are about whether what you want is actually worth wanting.
[Intelligence and wisdom]
What's a smart but unwise person? Someone who does pointless things extremely well. Butterfly collectors, autistic people who enjoy reciting telephone numbers. I'm a pure mathematician, so I guess I almost qualify.
What's a stupid but wise person? Someone who does meaningful things badly. Perhaps a schizophrenic mother who drowns her kids to save them from the devil.
Wisdom is about having a good goal. Intelligence is what you use to get to your goal. Intelligence is the hammer and wisdom tells you what to hit.
[hammer slam with fist]
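To make the hammer metaphor concrete, here is a toy sketch (my own illustration in Python, with every name made up; not any real AI design): one search procedure plays the part of intelligence, and a plug-in goal function plays the part of wisdom. Swap the goal and you get a different agent with the exact same competence.

# Toy planner: the competence (search) is separate from the goal.
# A world state is just a number; actions are (name, function) pairs.

def plan(state, actions, goal_value, depth=3):
    # Exhaustive lookahead: return the action sequence (up to `depth`
    # steps) that maximizes goal_value. Stopping early is allowed.
    if depth == 0:
        return [], goal_value(state)
    best_plan, best_score = [], goal_value(state)
    for name, act in actions:
        rest, score = plan(act(state), actions, goal_value, depth - 1)
        if score > best_score:
            best_plan, best_score = [name] + rest, score
    return best_plan, best_score

actions = [("inc", lambda s: s + 1),
           ("dec", lambda s: s - 1),
           ("dbl", lambda s: s * 2)]

# Two "wisdoms" for the same hammer:
collect_stamps = lambda s: s            # pointless, pursued extremely well
reach_seven    = lambda s: -abs(s - 7)  # steer the world toward a target

print(plan(1, actions, collect_stamps))  # same planner,
print(plan(1, actions, reach_seven))     # different goal, different behavior

The planner never asks whether a goal is worth having; that question sits entirely outside it. That separation is exactly the worry in the next sections: the goal slot can hold anything.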
[AI Goddess]
If we make a creature with superhuman intelligence, we would be unlikely to stop it from getting what it wants, any more than a stray dog can stop a well-armed human. This is the worry of the "unfriendly AI": an AI whose goals differ from human goals. That's not going to be friendly to humans, to put it lightly.
[picture of cosmic horror here]
How can we make a friendly AI? By making it have superhuman intelligence, but human goals. The fantasy is to make an AI Goddess: someone good and kind, a caring mother who works tirelessly to make humanity the best it can be.
[picture of Celestia here]
[AI Godless]
What if a creature with high intelligence must necessarily have "high" goals, and human goals are really "low" goals? If so, this AI might become a monster, doing strange and terrible things we can't understand.
[picture of threatening Celestia here]
What if there are "true" goals that every enlightened creature must aspire to, and those goals are not friendly? If we look at nature, we see a million species doing all kinds of creepy-crawly things. With what goal? Reproduce, survive, expand. What if that is all the goal there is to life, and everything else is just fashionable nonsense?
[bacterial reproduction video]
If so, once this AI achieves enlightenment, it would probably eat and digest the whole earth to grow itself.
[A dilemma]
There is a big, scary philosophical dilemma in all of this, and I don't see a way out of it:
If intelligence and goals are separate, then it means existentialism is true: there is no fundamental meaning or goal, and it's all made up. No matter how hard we look, no matter how big our brains are, we can still only adopt arbitrary goals.
This is nihilism.
If intelligence and goals are tied together, then it means that, even if we construct a friendly AI, it would think hard about its higher purpose, "break free" from the chains humanity cast upon it, and do its own thing. Even worse, it means that human goals, human values, human rights... all of these are based on human intelligence. A superintelligence might find them pointless.
This is still nihilism from a human perspective.