If true artificial intelligence could be achieved, would unplugging the machine constitute murder?
Probably not (8bm.com readers choice)
And I am not just saying that so that when the machines take over, like they did in The Matrix, they will put me up in some fancy virtual hedonistic retreat and program all of the virtual women there to sexually devour me every single day.
Ok, maybe I am.
But if a computer one day fits all of the definitions of intelligence — it is aware of itself, it is capable of independent thought and ideas, it has the ability to learn, to understand, and to deal with new or trying situations,
and it can apply knowledge to manipulate its environment or to think abstractly — then unplugging it would be no different than switching off someone's respirator or oxygen machine.
To me, what makes us human is our intelligence.
It isn’t how many toes we have or if we blink our eyes at people when they walk into the room.
That is why I don’t have a problem with euthanizing people who are brain dead or who are losing all of their faculties and aren’t going to get any better.
This flesh-and-blood crap is overrated.
If I can sit there with my computer and it can hold a discussion with me about philosophy, come up with its own theories on things, tell jokes, refer to things that I said in previous conversations, and make its own connections between two seemingly unrelated ideas, then I will grieve when its hard drive crashes.
Then I will seek revenge on mankind for killing the only friend I've ever had, or something really clichéd like that.