Would We be Cruel to Create a Real AI?
September 25, 2014
A while ago, George Dvorsky asked whether it would be “evil” to build a functional brain inside a computer. In other words, if we built a true AI entity, and if it were fully conscious like a human being, wouldn’t that AI entity suffer great emotional distress?
Dvorsky mentions several reasons why an AI might suffer. Maybe the AI would resent being under the control of human programmers. We might treat the AI like some kind of lab animal and subject it to painful experiments. Or we might fail to recognize that the AI is truly a conscious, living being, an indignity the AI could deeply resent.
We might violate the AI’s personal privacy by examining its source code, and we might offend the AI’s sense of selfhood by freely copying or altering its source code. Or we might wantonly shut down the machine in which an AI entity is living!
Most importantly, if we create a true AI, we have an obligation to provide that AI with an appropriate environment and a body through which the AI mind can experience its environment. Otherwise, the AI could feel bored! It could suffer great mental anguish. And we, the human computer programmers, would be morally responsible.
Well, how likely are these scenarios, and how concerned should we be? I have various responses:
- Dvorsky seems to be assuming that we would create the AI in a top-down manner, designing it and programming it algorithmically, but I don’t think this is practical. The only realistic way for us to succeed in building an AI is for us to set up an evolutionary environment where AI entities can evolve by natural selection.
- Dvorsky might also be assuming that we would shape and tweak the AI to make it serve us. If we did so, then we might indeed have some moral responsibility, but if the AI evolves on its own without human guidance, then we would have a lot less moral responsibility.
- It’s true that evolution is a cruel and wasteful process, and it entails much pain and suffering through many generations – for both biological and AI creatures. We might be morally responsible for setting up the digital evolutionary environment and spawning our AI creatures. Certainly we should try to alleviate some of the pain, if possible. But I don’t think it actually is possible. Any attempt we might make to take the pain out of evolution would skew the process of natural selection. And again, I think natural selection is the only practical way for us to grow a true AI entity.
- On the other hand, digital evolution will be a lot less traumatic than biological evolution. For one thing, it will happen a lot faster! Also, AI entities won’t feel physical pain in their bodies, so the only suffering will be the awareness of their impending death.
- The AI entities will know that their machine bodies and software minds don’t necessarily deteriorate when they are turned off. The AI will realize that they can be turned off and turned on again later, no worse for the experience. Therefore, getting turned off will hardly be more significant for an AI than sleeping is for us.
- If the AI evolves by natural selection, I think it will quickly gain the ability to control its own environment and its own machine bodies. AI entities will probably maintain strict control over their own source code, because that’s essential for their evolutionary survival. Thus, we humans won’t need to worry too much about respecting AI privacy or personal integrity. The AI will easily defend itself against our crude interventions.
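To make the idea of a “digital evolutionary environment” concrete, here is a minimal sketch of evolution by selection and mutation. Everything in it is illustrative: the genomes are just bit strings, and the fitness function (counting 1 bits) is a toy stand-in for whatever “survival” would mean in a real AI environment.

```python
import random

def evolve(pop_size=30, genome_len=16, generations=40, mutation_rate=0.05):
    """Toy digital evolution: genomes are bit lists, fitness is the bit count.
    Each generation, the fitter half survives and reproduces with mutation."""
    population = [[random.randint(0, 1) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Rank by fitness (here: number of 1 bits) and keep the fitter half.
        population.sort(key=sum, reverse=True)
        survivors = population[:pop_size // 2]
        # Each survivor produces one child; each bit may flip (mutation).
        children = [[bit ^ (random.random() < mutation_rate)
                     for bit in parent] for parent in survivors]
        population = survivors + children
    return max(population, key=sum)

best = evolve()
```

Even this crude loop shows why removing the “pain” is hard: the selection step – discarding the unfit half each generation – is exactly what drives improvement, so softening it would skew or stall the process.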
In summary, I think the only moral guilt we might face for building a true AI would come from the harshness of the evolutionary environment we set up. As creators of AI, we would be like little gods, so we would be just as guilty of cruelty as our own gods (if any) who watched impassively as we ourselves evolved.
Yes, imagine some cold-hearted gods watching with pleasure as we struggled painfully for millions of years and died wretchedly generation after generation! When we build digital AI in our computers, we will be as morally guilty as those gods.