It seems to me that before you discuss the moral issues surrounding human rights vs AI rights, you first have to consider what exactly is intelligence?
Just because something looks human and acts human does not necessarily mean that it is. Likewise, just because something is completely different from us, doesn't mean that it's not worthy of respect and consideration.
It all boils down to what we consider to be intelligence. Don't any of you realize that intelligence is something which we as a race have tried in vain to define since Plato?
What is intelligence? Is it the ability to use tools? Is it the ability to adapt to new situations? Is it the ability to "feel" (whatever that means)?
And if ... IF ... we as human beings are capable of creating something as intelligent as ourselves, should we? Shouldn't we seriously consider the ramifications? If we are not ready to treat each other with equality, what makes us think that we can treat another recognizable form of intelligence fairly?
The movie AI had many messages, but this was one of the more obvious ones.
If you think that machines haven't been replacing human labor since the industrial revolution, then you're seriously myopic! The thing is, technology always, ALWAYS, provides both benefit and detriment. It makes life easier for some, yet harder for others. But guess what: whatever the final definition, we all agree that humans are intelligent, and we all agree that a part of that intelligence is adaptation. We'll adapt. We can only hope that radicals, liberal and conservative alike, won't muddy the issues with closed-minded ignorance.
Technology has allowed the majority of us not to have to go out and hunt or to get up at the crack of dawn to milk the cows. I'm sure that none of you will complain about that. But have you thought of the people who were put out of work by the technology that allows us to enjoy our present luxuries? If you're going to piss and whine about AIs displacing labor, you should forsake all your current luxuries and go live in the jungle.
The issue is not whether AIs will displace humans. The issue is whether humans are ready to accept AIs. Have we reached an average level of education and wealth that will allow humanity to maintain productivity without having to engage in menial labor? The answer is a definite NO. Have we reached an average level of personal introspection and philosophy that will allow each of us to accept a second form of intelligence on this planet, artificial or otherwise? The answer is another definite NO.
When humans stop killing one another in the name of oil, and persecuting one another in the name of religion, and treating one another unequally because of skin color, then we may, and only MAY, be ready to consider the moral consequences of AI.
But all this may be totally moot for the time being, because the state of AI today is LIGHT-YEARS away from popular fictional conceptions.