Artificial Intelligence


Grand_Moff_Gav
The question is this, can an Intelligence which is of artificial life be considered sentient and indeed equal with Organic Intelligence?

Should we ever develop "robots" who are say like Data from Star Trek or Andrew from The Bicentennial Man should they have access to the same human rights as mankind?

Shakyamunison
Originally posted by Grand_Moff_Gav
The question is this, can an Intelligence which is of artificial life be considered sentient and indeed equal with Organic Intelligence?

Should we ever develop "robots" who are say like Data from Star Trek or Andrew from The Bicentennial Man should they have access to the same human rights as mankind?

Yes. We are the ones who determine what sentient is and is not, so it is better to give the benefit of the doubt.

chithappens
Umm, just to ask, is cloning also considered "artificial?"

Symmetric Chaos
Originally posted by Grand_Moff_Gav
The question is this, can an Intelligence which is of artificial life be considered sentient and indeed equal with Organic Intelligence?

Sure. Why not?

Originally posted by Grand_Moff_Gav
Should we ever develop "robots" who are say like Data from Star Trek or Andrew from The Bicentennial Man should they have access to the same human rights as mankind?

If we did they should. I assume for safety's sake we'll stick with non-volitional AIs for a long time.

Shakyamunison
Originally posted by Symmetric Chaos
...If we did they should. I assume for safety's sake we'll stick with non-volitional AIs for a long time.


laughing I agree. No building Cylons any time soon.

=Tired Hiker=
I don't think a robot could ever think for itself, it could just seem to. No, they are robots. They don't deserve the same rights as humans.

Symmetric Chaos
Originally posted by =Tired Hiker=
I don't think a robot could ever think for itself, it could just seem to. No, they are robots. They don't deserve the same rights as humans.

There's no major difference between a collection of cells thinking and a collection of transistors thinking.

chithappens
Originally posted by Symmetric Chaos
There's no major difference between a collection of cells thinking and a collection of transistors thinking.

Ok, be honest, we both know that's not what the question is asking.

Bardock42
Originally posted by Grand_Moff_Gav
The question is this, can an Intelligence which is of artificial life be considered sentient and indeed equal with Organic Intelligence?

Should we ever develop "robots" who are say like Data from Star Trek or Andrew from The Bicentennial Man should they have access to the same human rights as mankind?

I'd say yes.

En Sabah Nur X
Originally posted by Grand_Moff_Gav
The question is this, can an Intelligence which is of artificial life be considered sentient and indeed equal with Organic Intelligence?

Should we ever develop "robots" who are say like Data from Star Trek or Andrew from The Bicentennial Man should they have access to the same human rights as mankind?

Humans are molecular machines: functions of matter, which is a function of energy, which is a function of information, which is a function of bits. The universe is... reality, and reality is an idea, a category made by man to explain the world.

Molecular machines born of man, our children are as human as we are; their rights should be the same.

=Tired Hiker=
Originally posted by Symmetric Chaos
There's no major difference between a collection of cells thinking and a collection of transistors thinking.

But there really is, right? A collection of transistors "thinking" is really just a bunch of stuff programmed to execute protocols; it's not really thinking at all. Human thinking is far more complex and connected to real emotions, physical feelings such as pain.

Sci Fi books and movies are great at making us believe machines can feel and think just like humans, but it's simply Sci Fi, that is all. It's not realistic and I don't think it ever will be.

=Tired Hiker=
Originally posted by Grand_Moff_Gav
Should we ever develop "robots" who are say like Data from Star Trek or Andrew from The Bicentennial Man should they have access to the same human rights as mankind?

Okay okay, SHOULD we ever develop such "robots", then yes, I agree that they should have access to the same human rights as mankind. But I don't think it will ever happen.

Bardock42
Originally posted by =Tired Hiker=
But there really is, right? A collection of transistors "thinking" is really just a bunch of stuff programmed to execute protocols; it's not really thinking at all. Human thinking is far more complex and connected to real emotions, physical feelings such as pain.

Sci Fi books and movies are great at making us believe machines can feel and think just like humans, but it's simply Sci Fi, that is all. It's not realistic and I don't think it ever will be.

Actually, that's more a philosophical question. But I think most scientists would say that humans are really just complex machines, far more complex than any robot we can build now (and probably for a long time), but really, it's not that different at all.

=Tired Hiker=
Originally posted by Bardock42
Actually, that's more a philosophical question. But I think most scientists would say that humans are really just complex machines, far more complex than any robot we can build now (and probably for a long time), but really, it's not that different at all.

It is very different. Most scientists would say that humans are complex machines, far more complex than any robot we can build now (and probably for a long time).

Bardock42
Originally posted by =Tired Hiker=
It is very different. Most scientists would say that humans are complex machines, far more complex than any robot we can build now (and probably for a long time).

But the thread starter asked, what if it wasn't that different anymore, and the sophistication of the robots approaches that of humans, what then?

WrathfulDwarf
Originally posted by Grand_Moff_Gav
The question is this, can an Intelligence which is of artificial life be considered sentient and indeed equal with Organic Intelligence?



For me, such machines should be able to reproduce. Meaning a female robot would have to give birth. If it could do that...then I would say they can be equal or even given rights.

Until then...it remains no different than my Xbox 360.

(which btw- I happen to enjoy and sometimes...even love)

Bardock42
Originally posted by WrathfulDwarf
For me, such machines should be able to reproduce. Meaning a female robot would have to give birth. If it could do that...then I would say they can be equal or even given rights.

Until then...it remains no different than my Xbox 360.

(which btw- I happen to enjoy and sometimes...even love)

That's stupid.


Is the ability to reproduce essential to all humans then too? Should we use sterile people as slaves? I am pretty sure that the morals behind human rights are not based on the ability to **** and create offspring.

WrathfulDwarf
Originally posted by Bardock42
That's stupid.


Is the ability to reproduce essential to all humans then too? Should we use sterile people as slaves? I am pretty sure that the morals behind human rights are not based on the ability to **** and create offspring.

Species should reproduce...yes...what species do Robots belong to?

Oh, we should give them rights because they can MIMIC or COPY such things as human emotions?

Yeah, why should a machine have rights?



Answer that BEFORE you throw out that bag of tricks.

Symmetric Chaos
Originally posted by =Tired Hiker=
But there really is, right? A collection of transistors "thinking" is really just a bunch of stuff programmed to execute protocols; it's not really thinking at all. Human thinking is far more complex and connected to real emotions, physical feelings such as pain.

Sci Fi books and movies are great at making us believe machines can feel and think just like humans, but it's simply Sci Fi, that is all. It's not realistic and I don't think it ever will be.

Everything you think and feel (physically and emotionally) is simply programmed into how your brain is built. It would obviously take a very long time but absolutely nothing prevents a machine from feeling emotions the way we do.

Symmetric Chaos
Originally posted by WrathfulDwarf
Oh, we should give them rights because they can MIMIC or COPY such things as human emotions?

Yes, otherwise high function autistics would have to be denied human rights.

Bardock42
Originally posted by WrathfulDwarf
Species should reproduce...yes...what species do Robots belong to?

What does it matter? You don't have to be of a species to get certain rights, and being of a species does not grant you equal rights to other species.

Originally posted by WrathfulDwarf
Oh, we should give them rights because they can MIMIC or COPY such things as human emotions?

No, we should give them rights because they have emotions and thoughts similar to humans or even higher.

Originally posted by WrathfulDwarf
Yeah, why should a machine have rights?

Because that is basically what we are. And because (in this scenario) they can suffer just like we can.

Originally posted by WrathfulDwarf
Answer that BEFORE you throw out that bag of tricks.

Okay, answered. Your post is still stupid.

Devil King
Originally posted by Grand_Moff_Gav
like Data from Star Trek

There was an episode of ST:TNG that dealt with that. I think Data ended up with rights.

dadudemon
Originally posted by Bardock42
But the thread starter asked, what if it wasn't that different anymore, and the sophistication of the robots approaches that of humans, what then?


The processing power, which is considered the primary hurdle for human-like AI, will be within reach by 2025.

Even if the AI is not as complex as human emotion, they should be every bit as intuitive in information processing as humans.

There are some who say that human-like AI will never exist, but that we will still have computers that are better at being intelligent than we are. big grin

Robtard
Originally posted by Devil King
There was an episode of ST:TNG that dealt with that. I think Data ended up with rights.

Correct; Picard argued that if Data were allowed to be dissected and denied the rights an organic sentient being has, it would lead to a race of robotic slaves.

=Tired Hiker=
Originally posted by Bardock42
But the thread starter asked, what if it wasn't that different anymore, and the sophistication of the robots approaches that of humans, what then?
Hence why I posted this a long-ass time ago.

Originally posted by =Tired Hiker=
Okay okay, SHOULD we ever develop such "robots", then yes, I agree that they should have access to the same human rights as mankind. But I don't think it will ever happen.

xmarksthespot
I thought the primary hurdle for technological development was software development.

Devil King
But to answer the question Gav, I've got to go with Bardock on this one. If AI were developed, I think they should have rights. At least the right to decide for themselves. Which would likely be the true test of AI.

Symmetric Chaos
Originally posted by xmarksthespot
I thought the primary hurdle for technological development was software development.

So did I. Technically we have computers with far more raw intellectual might than any human being.

Devil King
Originally posted by Robtard
Correct, Picard argued that if Data were allowed to be dissected and denied rights as an organic sentient being has, it would lead to a race of robotic slaves.

Right. I just couldn't remember what happened at the end. I know Data wasn't hacked up, but I couldn't recall if a final decision was reached.

=Tired Hiker=
Originally posted by dadudemon
There are some who say that human-like AI will never exist, but that we will still have computers that are better at being intelligent than we are. big grin

This may be true, but will they need to eat and sleep and seek shelter like the rest of us? Will they also receive a paycheck each week to put food on the table? Negative ghost rider.

Symmetric Chaos
Originally posted by =Tired Hiker=
This may be true, but will they need to eat and sleep and seek shelter like the rest of us? Will they also receive a paycheck each week to put food on the table? Negative ghost rider.

Not in the same way but they would still need to draw power from somewhere, pay for maintenance, spend time recharging and so on.

Devil King
Maybe they will, to buy lubricants and replacement parts. Now had you thought of that, Earl?

Bardock42
Originally posted by =Tired Hiker=
Hence why I posted this post a long ass time ago.

Yes, but that was not what I initially replied about, just what you made resurface, so I had to repeat.

Devil King
Originally posted by Symmetric Chaos
Not in the same way but they would still need to draw power from somewhere, pay for maintenance, spend time recharging and so on.

JINX! cat

Robtard
Originally posted by Symmetric Chaos
So did I. Technically we have computers with far more raw intellectual might than any human being.

Not in all facets of intelligence though. Sure they can calculate mathematical equations billions of times better than us, but can they think outside of the box, so to speak? Do they have creativity? Can they reason deductively/inductively?

Shakyamunison
Originally posted by Robtard
Not in all facets of intelligence though. Sure they can calculate mathematical equations billions of times better than us, but can they think outside of the box, so to speak? Do they have creativity? Can they reason deductively/inductively?

If you use that as a standard, then you have eliminated half this forums population. wink

Devil King
Originally posted by Robtard
Not in all facets of intelligence though. Sure they can calculate mathematical equations billions of times better than us, but can they think outside of the box, so to speak? Do they have creativity? Can they reason deductively/inductively?

I guess the point is that the computer didn't figure out the equation first. It was programmed to do math by a human that learned how to do math from another human.

xmarksthespot
Originally posted by Symmetric Chaos
So did I. Technically we have computers with far more raw intellectual might than any human being.

With upwards of a quadrillion synapses, human neural processing is a petaflop/s computation, and only the world's fastest computers can even hope to mimic it. Notwithstanding having to design algorithms that mimic neural-system functions using the least amount of computation possible, because computers just aren't fast enough yet.
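The quadrillion-synapse figure maps onto the petaflop/s claim with simple arithmetic. A hedged back-of-envelope sketch; the per-synapse update rate and ops-per-update below are assumptions, not from the thread:

```python
# Back-of-envelope check of the "quadrillion synapses ~ petaflop/s" claim.
# Assumed, not from the thread: each synapse contributes roughly one
# floating-point operation per second on average.
synapses = 1e15              # "upwards of a quadrillion"
ops_per_synapse_per_sec = 1  # hedged assumption

total_flops = synapses * ops_per_synapse_per_sec
print(f"{total_flops:.0e} FLOP/s (= {total_flops / 1e15:.0f} petaflop/s)")
```

Published estimates scatter over several orders of magnitude depending on the assumed firing rate, which is exactly why "raw might" comparisons between brains and computers are so slippery.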

Robtard
Xmarks,

You might know something of this: I remember reading that some research facility/laboratory was trying to develop an organic CPU chip based on brain cells (or something similar).

Symmetric Chaos
Originally posted by Robtard
Not in all facets of intelligence though. Sure they can calculate mathematical equations billions of times better than us, but can they think outside of the box, so to speak? Do they have creativity? Can they reason deductively/inductively?

That's a software problem, not a hardware problem, which was the issue being addressed.

Originally posted by xmarksthespot
With upwards of a quadrillion synapses, human neural processing is a petaflop/s computation, and only the world's fastest computers can even hope to mimic it.

My mistake. But still, as computing power goes up we'll likely have stuff easily within human range inside of a decade.
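The "inside of a decade" intuition is just exponential growth. A minimal sketch, assuming (not from the thread) a hundred-fold shortfall against the quadrillion-synapse petaflop estimate and a Moore's-law-style doubling every two years:

```python
import math

# How long to close a performance gap if capability doubles every 2 years?
# Both numbers below are illustrative assumptions, not from the thread.
gap_factor = 100     # assumed shortfall vs. the petaflop brain estimate
doubling_years = 2   # Moore's-law-style doubling period

doublings_needed = math.log2(gap_factor)
years_needed = doublings_needed * doubling_years
print(f"{doublings_needed:.1f} doublings -> about {years_needed:.0f} years")
```

Small changes in either assumption shift the answer by years, which is why such forecasts vary so widely.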

xmarksthespot
Originally posted by Robtard
Xmarks,

You might know something of this, I remember reading that some research facility/laboratory was trying to develop an organic CPU chip based on brain cells (or something similar). Is it this?
http://news.bbc.co.uk/1/hi/sci/tech/358822.stm
Or this?
http://www.livescience.com/health/060327_neuro_chips.html
http://www.newscientist.com/article/dn8902-chip-ramps-up-neurontocomputer-communication.html

I'm sure there are other labs working on similar though.

=Tired Hiker=
Originally posted by Bardock42
Yes, but that was not what I initially replied about, just what you made resurface, so I had to repeat.

And I was simply repeating what you wrote to show it actually supports my point of view.

=Tired Hiker=
Originally posted by Symmetric Chaos
Not in the same way but they would still need to draw power from somewhere, pay for maintenance, spend time recharging and so on.

Why would a robot pay for maintenance? Wouldn't it be up to us to pay for its maintenance? We are the ones who built it, and it is serving a purpose for us. I doubt we'd build robots to think for themselves just to benefit robot-kind.

Robtard
Originally posted by xmarksthespot
Is it this?
http://news.bbc.co.uk/1/hi/sci/tech/358822.stm
Or this?
http://www.livescience.com/health/060327_neuro_chips.html
http://www.newscientist.com/article/dn8902-chip-ramps-up-neurontocomputer-communication.html

I'm sure there are other labs working on similar though.

Close enough.

Symmetric Chaos
Originally posted by =Tired Hiker=
Why would a robot pay for maintenance?

Because it makes economic sense.

Originally posted by =Tired Hiker=
Wouldn't it be up to us to pay for its maintenance? We are the ones who built it and it is serving a purpose for us. I doubt we'd build robots to think for themselves just to benefit robot-kind.

If we don't make them pay for maintenance and such we'd be doing nothing but benefiting robots.

=Tired Hiker=
Originally posted by Symmetric Chaos
If we don't make them pay for maintenance and such we'd be doing nothing but benefiting robots.

As is, we pay for robot maintenance all the time because the robots serve a purpose for us. If we designed robots to think for themselves and they did not serve a purpose for us, then of course they should pay for their own maintenance. But if that is the case, why create these robots in the first place?

Grand_Moff_Gav
Originally posted by Devil King
There was an episode of ST:TNG that dealt with that. I think Data ended up with rights.

Indeed, he argued that Data had Self-Awareness, Intelligence and Consciousness; therefore he was as sentient as any other being.

Why does it matter if you're carbon-based or platinum-based, etc.?

Pregnancy and machine construction aren't even that different if you think about it.

Raw Materials are transformed into an end product.

dadudemon
Originally posted by Symmetric Chaos
That's a software problem not a hardware problem which as the issue being addressed.

You are correct. This is why some people say that real AI will never exist. We can create the processing power, eventually, and the "robots" will be more intelligent than we are, but we will never be able to create true AI.

I disagree, because I'm an optimist, and it just seems close-minded and egotistical to think that a ridiculously complex program cannot be created. It doesn't matter if the program is merely mimicking human intelligence (or lack thereof); we ourselves are programmed via genetics and nurture anyway.

xmarksthespot
It's a two-sided coin (on an aside: what a redundant phrase - all coins by their nature must have two sides).
It may be egotistical to think that human neural systems are too complex to be mimicked by software algorithms. But then it's equally egotistical to posit that human programmers are capable of writing software algorithms able to mimic complex neural systems - human or otherwise.

dadudemon
Originally posted by xmarksthespot
But then it's equally egotistical to posit that human programmers are capable of writing software algorithms able to mimic complex neural systems - human or otherwise.

You could say that...but I wasn't thinking about the capability of the human-mind to produce such. I was more or less thinking that it was a mere probability that would come to fruition due to the greed and perseverance of humanity (That still may seem egotistical, but in my thoughts, I am placing perseverance on the same level as say, driver ants.) That's probably why I didn't think it was egotistical to hold such a view. From my now former perspective (because you made a good point), I could only see the ego of evangelicals as it related to the divine disposition of humanity. Fairly strange of a theist to hold such a perspective, isn't it? big grin

Bardock42
Originally posted by xmarksthespot
It's a two-sided coin (on an aside: what a redundant phrase - all coins by their nature must have two sides).

I am pretty sure the common expression is "two sides of the same coin". Which makes more sense. Maybe "double-edged sword" would have been closer to your meaning.

Mindship
Originally posted by Grand_Moff_Gav
The question is this, can an Intelligence which is of artificial life be considered sentient and indeed equal with Organic Intelligence?

Depends what you mean by Intelligence (or, for that matter, life). Broadly speaking, the factors involved in Intelligence are Knowledge, Memory, Logic and Creativity (making alogical, synergistic connections). The first three we understand and are already replicable. But Creativity: afaik, that remains an enigma.

Originally posted by Grand_Moff_Gav
Should we ever develop "robots" who are say like Data from Star Trek or Andrew from The Bicentennial Man should they have access to the same human rights as mankind?

I think this involves more than Intelligence. As you implied above, this now touches upon Life, as well as Consciousness, i.e., what it means to have either. Personally, I don't think a human-mimicking Turing machine should have the same rights as a human being.

dadudemon
If any of you have about an hour...this is a nice summary of Ray Kurzweil's thoughts. I just stumbled across this. He makes my speculations look like child's play.

He was wrong on some things but ridiculously correct on others. Why aren't the Christians hailing this guy as a prophet?

http://en.wikipedia.org/wiki/Ray_Kurzweil

inimalist
Researchers create a virtual mouse brain:

http://www.theness.com/neurologicablog/?p=242

all we need to do is program the hundreds of billions of neurons and potentially thousands of connections between each of them and we will have a functioning brain, only in C++ or whatever smile

My question about this would relate to neuroplasticity. One of the craziest things about brains is their ability to reorganize themselves based on external stimuli. Hypothetically, robots might not have the ability to, at a hardware level, reorganize their basic machinery to adapt to novel situations, and might need a memory system far greater than ours with many more pre-set options available.
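Software can mimic that reorganization, at least crudely. Below is a toy sketch (everything here is illustrative, not taken from the linked mouse-brain work) of a Hebbian learning rule: connections between co-active units are strengthened, so the "wiring" adapts to external stimuli instead of staying fixed.

```python
# Toy stand-in for neuroplasticity: a Hebbian rule ("neurons that fire
# together wire together") that strengthens connections between
# co-active units, reorganizing the network in response to stimuli.
def hebbian_update(weights, pre, post, lr=0.1):
    """Return new weights after one Hebbian step.

    weights[i][j] connects pre-synaptic unit i to post-synaptic unit j.
    pre/post are activity levels (0.0 .. 1.0) of the two layers.
    """
    return [
        [w + lr * pre[i] * post[j] for j, w in enumerate(row)]
        for i, row in enumerate(weights)
    ]

# Two input units, two output units, all connections start equal.
w = [[0.5, 0.5], [0.5, 0.5]]

# Repeatedly present a stimulus in which input 0 and output 0 are co-active.
for _ in range(10):
    w = hebbian_update(w, pre=[1.0, 0.0], post=[1.0, 0.0])

print(w)  # only the connection between the co-active pair has grown
```

This only rewires in software, of course; whether hardware-level reorganization (as inimalist raises) is needed, or whether simulated plasticity suffices, is exactly the open question.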

dadudemon
Originally posted by inimalist
Researchers create a virtual mouse brain:

http://www.theness.com/neurologicablog/?p=242

all we need to do is program the hundreds of billions of neurons and potentially thousands of connections between each of them and we will have a functioning brain, only in C++ or whatever smile

This is effective as of today, 06-23-2008.



Originally posted by inimalist
My question about this would relate to neuroplasticity. One of the craziest things about brains is their ability to reorganize themselves based on external stimuli. Hypothetically, robots might not have the ability to, at a hardware level, reorganize their basic machinery to adapt to novel situations, and might need a memory system far greater than ours with many more pre-set options available.

Nanotechnology is supposed to be the remedy. Supposedly, it will reorganize things at the nano level...very similar to biological cells.

Adam_PoE
Why isn't this in the Philosophy forum?

Robtard
We should philosophize over exactly that.

WrathfulDwarf
Originally posted by Bardock42
What does it matter. You don't have to be of a species to get certain rights and being of a species does not grant you equal rights to other species.


It does matter. They're machines. Period! There is no DNA in them.

Originally posted by Bardock42

No, we should give them rights because they have emotions and thoughts similar to humans or even higher.

Let's follow your train of thought here. We civilized humans should give rights to the ancestors of those future machines, so that in the future these machines won't hold hard feelings towards us over how we treated their kin. Yeah, let's give machines rights! So thus all laptops, video game consoles, microwaves, etc....give them rights. Oh brother! Love your thinking.

Here is an idea! You want to give rights to someone? Give them to human clones...I'm sure they'll appreciate it more than some mechanical gadget.



Originally posted by Bardock42
Because that is basically what we are. And because (in this scenario) they can suffer just like we can.

It's a friggin machine! It's not organic! It's programmed to replicate human emotions...it's not living.


Originally posted by Bardock42
Okay, answered. Your post is still stupid.

Not as stupid as giving a machine rights just because it can mimic a human being.

Bardock42
Originally posted by WrathfulDwarf
It does matter. They're machines. Period! There is no DNA in them.

So? Is DNA somehow essential to your understanding of morality?

Originally posted by WrathfulDwarf
Let's follow your train of thought here. We civilized humans should give rights to the ancestors of those future machines, so that in the future these machines won't hold hard feelings towards us over how we treated their kin. Yeah, let's give machines rights! So thus all laptops, video game consoles, microwaves, etc....give them rights. Oh brother! Love your thinking.

Don't try that thinking thing, doesn't suit you. I never said we should give ancestors of anything rights, just because their descendants might qualify as equals. I said, once that is the case, those machines should have rights...

Originally posted by WrathfulDwarf
Here is an idea! You want to give rights to someone? Give them to human clones...I'm sure they'll appreciate it more than some mechanical gadget.

Did you somehow think I was against rights for human clones?


Originally posted by WrathfulDwarf
It's a friggin machine! It's not organic! It's program to replicate human emotions...it's not living.

What's the difference? If it feels pain, if it feels fear, if it has hopes and emotions, why should it not have rights like we do? We give animals some rights, and in this hypothetical situation they would be emotionally and intelligence-wise far below the machines we are talking about. Why are you so strongly opposed to non-human rights?

Originally posted by WrathfulDwarf
Not as stupid as giving a machine rights just because it can mimic a human being.

We were talking about a machine that actually has those emotions and does not just emulate them. Though, in practice, I doubt we could differentiate, and I doubt it would make a difference.

I don't think humans deserve rights because they have DNA, but because they have a mind that can feel pain, that can think and believe...

WrathfulDwarf
Originally posted by Bardock42
What's the difference? If it feels pain, if it feels fear, if it has hopes and emotions, why should it not have rights like we do? We give animals some rights, and in this hypothetical situation they would be emotionally and intelligence-wise far below the machines we are talking about. Why are you so strongly opposed to non-human rights?


That's the point. It feels nothing! It's a program. It can mimic emotions, nothing more. I oppose giving rights to something as absurdly silly as a mechanical gadget. I can see giving it something like insurance or a warranty....like my Xbox 360, or my car, or my PC. But give it rights? ha!

Originally posted by Bardock42
I don#t think humans deserve rights because it has DNA, but because it has a mind that can feel pain, that can think and believe...

And can do many other things machines can't...such as humor, passion, kindness and other stuff.

Think for a moment of the loss of jobs that these machines would cause for the future proletariat.

Can you imagine a world in which a human has to compete for a job with a machine? Put yourself in that man's shoes...how is he going to put food on his table while some mechanical automaton does the job the man rightfully should have?

That's right Bardock! This capitalist pig (i.e. me) would agree with a Communist. We would rather give a job to a man than to a machine.

WORKERS OF ALL NATIONS...UNITE!

chithappens
Originally posted by WrathfulDwarf
That's the point. It feels nothing! It's a program. It can mimic emotions, nothing more. I oppose giving rights to something as absurdly silly as a mechanical gadget. I can see giving it something like insurance or a warranty....like my Xbox 360, or my car, or my PC. But give it rights? ha!




For the sake of argument I'll ask: If machines could "mimic" emotion what would stop someone from making the argument that humans simply mimic emotion also?

"Whoever/whatever/(insert here)" created us so "the creator" would see humans the same, no?

Symmetric Chaos
Originally posted by WrathfulDwarf
That's the point. It feels nothing! It's a program. It can mimic emotions, nothing more. I oppose giving rights to something as absurdly silly as a mechanical gadget. I can see giving it something like insurance or a warranty....like my Xbox 360, or my car, or my PC. But give it rights? ha!

DNA isn't a requirement. You have no real basis for saying that a robot cannot feel.

Originally posted by WrathfulDwarf
Think for a moment of the loss of jobs that these machines would cause for the future proletariat.

Can you imagine a world in which a human has to compete for a job with a machine? Put yourself in that man's shoes...how is he going to put food on his table while some mechanical automaton does the job the man rightfully should have?

That's right Bardock! This capitalist pig (i.e. me) would agree with a Communist. We would rather give a job to a man than to a machine.

WORKERS OF ALL NATIONS...UNITE!

That's already happened to our current proletariat, actually. In fact, the machines taking their jobs quite clearly have no capacity to feel.

Bardock42
Originally posted by WrathfulDwarf
That's the point. It feels nothing! It's a program. It can mimic emotions, nothing more. I oppose giving rights to something as absurdly silly as a mechanical gadget. I can see giving it something like insurance or a warranty....like my Xbox 360, or my car, or my PC. But give it rights? ha!

I think you vastly overrate the human mind. Probably because you believe in some soul nonsense or something. Why do you doubt a Robot could be self-aware like we are?



Originally posted by WrathfulDwarf
And can do many other things machines can't...such as humor, passion, kindness and other stuff.

Can't yet. What if they can in the future? That's the question, mate. Stop arguing on the basis of "Bardock wants to give rights to my X-Box"; it's nonsense.

Originally posted by WrathfulDwarf
Think for a moment of the loss of jobs that these machines would cause for the future proletariat.

Can you imagine a world in which a human has to compete for a job with a machine? Put yourself in that man's shoes...how is he going to put food on his table while some mechanical automaton does the job the man rightfully should have?

Are you some sort of idiot? If there were such Robots, they would be used by employers much rather as slaves than as workers with rights, causing the "future proletariat" to lose jobs just as much or even more. You have to think before you type...it's less embarrassing.

Originally posted by WrathfulDwarf
That's right Bardock! This capitalist pig (i.e. me) would agree with a Communist. We would rather give a job to a man than to a machine.

WORKERS OF ALL NATIONS...UNITE!

Mate, don't flatter yourself. Of the capitalists on this site, you are a very, very small flame.

demon-lllama
Robots are great in art!

It adds so much: in the Jacksonville mall nearby where I used to live, later on when I went back, the trash can spoke after you threw something away because it was electronic. Things like that amaze me, and I can't believe I missed out on the fun. Still, it didn't have the covered toilet seats like the airport.

I have some other experiences, but are you allowed to talk about hallucinations on KMC? Because I never do, nor have I done, anything illegal like drugs.

I saw some toys on eToys that were supposed to simulate a gorilla head as though real, and I had wanted to get that. Have you been to the Rainforest Cafe? That place is way wacked beyond some things you'd think were like Disney, because every half hour or something like that it rains and the animals go totally berserk. (It gets dark and blinks lights.)

WrathfulDwarf
Originally posted by Bardock42
I think you vastly overrate the human mind. Probably because you believe in some soul nonsense or something. Why do you doubt a Robot could be self-aware like we are?

I think you vastly underrate the human mind. Probably because you feel technology will keep you happy.




Originally posted by Bardock42

Can't yet. What if they can in the future? That's the question, mate. Stop arguing on the basis "Bardock wants to give rights to my X-Box", it's nonsense.

Because you are saying nonsense! Mechanical things should have a warranty or insurance. They're things that are built, not reproduced.


Originally posted by Bardock42

Are you some sort of idiot? If there were such Robots, they would be used by employers much rather as slaves than as workers with rights, causing the "future proletariat" to lose jobs just as much or even more. You have to think before you type...it's less embarrassing.

Trying to belittle my comments using that tone makes you look utterly desperate and less convincing. Try talking about the actual point. So, are you willing to accept that robots can take away jobs from humans YES or NO?


Originally posted by Bardock42

Mate, don't flatter yourself. Of capitalists on this site you are a very, very small flame.

A capitalist is a capitalist no matter where he is from. I just happen to be more of a consumer. Thank you very much.

Bardock42
Originally posted by WrathfulDwarf
I think you vastly underrate the human mind. Probably because you feel technology will keep you happy.






Because you are saying nonsense! Mechanical things should have a warranty or insurance. They're things that are built, not reproduced.




Trying to belittle my comments using that tone makes you look utterly desperate and less convincing. Try talking about the actual point. So, are you willing to accept that robots can take away jobs from humans YES or NO?




A capitalist is a capitalist no matter where he is from. I just happen to be more of a consumer. Thank you very much.

Wow, you talk such bullshit. Not even worth a reply. You'd dodge the point anyways.

WrathfulDwarf
And you can't even answer a simple YES or NO question.

Tip the jar before you leave.

chithappens
"Technology" takes jobs from people now. What's confusing about that?

Bardock42
Originally posted by WrathfulDwarf
And you can't even answer a simple YES or NO question.

Tip the jar before you leave.

It was implied in my post, yes, robots will (and do already) "take" jobs from workers (if there is still such a class by then) regardless of having rights or not (though more jobs if they do not have rights, which ****s your stupid little..."argument" in the arse).

WrathfulDwarf
Originally posted by chithappens
"Technology" takes jobs from people now. What's confusing about that?

Thus lies the problem.

Technology should help people in their jobs, not take them away. It's going to be far worse when robots with human capacities take over. Even worse would be giving them absurd rights that they don't deserve. Those rights should go to humans.

Bardock42
Originally posted by WrathfulDwarf
Thus lies the problem.

Technology should help people in their jobs, not take them away. It's going to be far worse when robots with human capacities take over. Even worse would be giving them absurd rights that they don't deserve. Those rights should go to humans.

WRONG.

It would not get worse if they had rights. It would, if anything, get better. But Robots are there to help their owner, not the person that works for their owner before the Robot did. And please, never call yourself capitalist again, it is insulting.

WrathfulDwarf
Originally posted by Bardock42
WRONG.

It would not get worse if they had rights. It would, if anything, get better. But Robots are there to help their owner, not the person that works for their owner before the Robot did. And please, never call yourself capitalist again, it is insulting.

Get better? Says who? YOU!? Oh, how I wonder....

Now who is talking BS here? U. Again, technology is there to help the workers, not to take their jobs away. I'll bet you five bucks that if Marx was alive today he would agree with someone as capitalistic as I.

So take that!

Bardock42
Originally posted by WrathfulDwarf
Get better? Says who? YOU!? Oh, how I wonder....

Now who is talking BS here? U. Again, technology to help the workers not to take their jobs away. I'll bet you Five bucks that if Marx was alive today he would agree with someone as capitalistic as I.

So take that!

You are the one that says it would get worse. Economic reason, though, would agree with me.

That's my point, you have Marxist views. Not capitalist. Calling yourself that is inaccurate, so don't. It insults me and anyone with true capitalist views. Commie.

Either way, to me it is about whether a person or thing can think and feel. That's why I don't grant animals the same rights as humans. I would grant intelligent, creative, self-conscious machines those rights though. I am sorry, but "Workers would lose jobs, boo hoo" is not a good reason to take suffering things as slaves. In fact, it's very similar to arguments slave owners would use against negros...oh well, I guess it makes sense, you are more conservative than libertarian anyways. (Also explains why you think you are a capitalist.)

WrathfulDwarf
Originally posted by Bardock42
You are the one that says it would get worse. Economic reason though would agree with me.

That's my point, you have Marxist views. Not capitalist. Calling yourself that is inaccurate, so don't. It insults me and anyone with true capitalist views. Commie.

Either way, to me it is about whether a person or thing can think and feel. That's why I don't grant animals the same rights as humans. I would grant intelligent, creative, self-conscious machines those rights though. I am sorry, but "Workers would lose jobs, boo hoo" is not a good reason to take suffering things as slaves. In fact, it's very similar to arguments slave owners would use against negros...oh well, I guess it makes sense, you are more conservative than libertarian anyways. (Also explains why you think you are a capitalist.)

Yes, I happen to be very capitalist. I support investing in a free market. I happen to be a consumer of collectibles and support credit approval for anyone with a Social Security number (even with a history of low income). I'm a firm believer in mass production for the trade market....etc...etc...and etc...

My views are not just Marxist...they happen to be humane. I put humans first and machines dead last. Conservative, me? Haha! Oh wow! You don't know me well, Bardock. Economic reasons are worthless if humans are alienated....especially in the work force. So you can take that back where you came from. Heck! I would go as far as to say that even the Greek philosophers would agree...that humans should come first.

Bardock42
"Investing in a free market"...classic.


Sorry, this is just too pathetic to me, I am sure someone else will hand you your ass some more in due time.

WrathfulDwarf
Originally posted by Bardock42


Sorry, this is just too pathetic to me, I am sure someone else will hand you your ass some more in due time.

Oh, so this is about trying to get someone pwned and not about a discussion. You're wasting my precious time. See you later then Bardock.

=Tired Hiker=
This will be great fodder for my next "Situations" video!

Bardock42
Originally posted by WrathfulDwarf
Oh, so this is about trying to get someone pwned and not about a discussion. You're wasting my precious time. See you later then Bardock.

Very RJ of you. But I never said that. Read my post 5-6 times again, maybe, just maybe you will understand the meaning then.

inimalist
WrathfulDwarf:

If we learned there was a subterranean race of sentient beings, almost cognitively identical to us, living under the surface of Mars, would you be against one of said Martian races using another Martian race as slaves?

WrathfulDwarf
Originally posted by inimalist
WrathfulDwarf:

If we learned there was a subterranean race of sentient beings, almost cognitively identical to us, living under the surface of Mars, would you be against one of said Martian races using another Martian race as slaves?

This is different....I would say there would be a cultural clash here...I don't support slavery. All I can do here is exchange ideas on how to maintain production without the need to abuse workers' rights.

Heck...labour unions would be thrown into the argument. They might consider our ways. Who knows...?

=Tired Hiker=
Okay, so we create a bunch of robots, right? They possess physical thought and feelings like the rest of us, but they are smarter and they become more powerful than us. They end up enslaving us! We fight back and find their weakness, which happens to be sea water!! We throw all of them in the sea water and they die! But some of them are not evil, some of them mean no harm, but we don't trust them, so we put them in concentration camps. Eventually we have no use for them and we are tired of paying taxes on feeding them, so we dump a giant bucket of sea water on them as well. All is peaceful now, and decades later our children and our children's children only learn of these 'Robot Times' in our history books. We reflect and tell stories of how they raped us and treated us like dirt, but how we eventually overcame them in the end, never to create A.I. ever again.

Now, flash forward to the year 2630. A young scuba diver named Jerry Winkles finds a dead robot at the bottom of the sea. He digs it up and finds more! He digs so deep that he actually finds one robot that is still alive! He secretly brings it home and cares for it, teaches it things . . . it becomes smarter. It kills Jerry and spawns another beginning to the robot race, but this time the species is immune to sea water. We as humans die and are wiped from the face of history!

Folks, Tired Hiker here, please help me stop the creation of Artificial Intelligence so that doesn't happen. Thanks. -Hiker.

=Tired Hiker=
no expression

Symmetric Chaos
Originally posted by WrathfulDwarf
I don't support slavery.

But you do support taking sentient beings and forcing them to labor without pay or legal rights.

dadudemon
Originally posted by Bardock42
"Investing in a free market"...classic.


Sorry, this is just too pathetic to me, I am sure someone else will hand you your ass some more in due time.


I don't get it.

Why is investing in a free market pathetic? I interpret it one of two ways.

1. Because one lives in a free market, he can invest under its reach.

2. One is investing in free markets in the hopes of propagating the idea of free markets.

The two are different.

An example would be investing in a country in hopes of helping the country succeed because the nascent country represents your idea of a state(#2) or another would be investing in Wal-Mart because, incidentally, you have the right to do so(#1).

Does he mean both at the same time?


DAMNIT! I don't understand what you mean by pathetic. I'm probably wrong on both accounts.

Bardock42
Originally posted by dadudemon
I don't get it.

Why is investing in a free market, pathetic? I interpret it one of two ways.

1. Because one lives in a free market, he can invest under its reach.

2. One is investing in free markets in the hopes of propagating the idea of free markets.

The two are different.

An example would be investing in a country in hopes of helping the country succeed because the nascent country represents your idea of a state(#2) or another would be investing in Wal-Mart because, incidentally, you have the right to do so(#1).

Does he mean both at the same time?


DAMNIT! I don't understand what you mean by pathetic. I'm probably wrong on both accounts.

The argument with him was pathetic (because of him).

The "I'm a capitalist, I support investing in a free market" is just ridiculous and funny.

dadudemon
Originally posted by Bardock42
The argument with him was pathetic (because of him).

The "I'm a capitalist, I support investing in a free market" is just ridiculous and funny.

Okay..I think I get it.


You are saying that his arguing a case for himself as being capitalist by claiming to invest in a free market is pathetic only because you believe his arguments are pathetic, or rather, of the things you two were arguing about, his arguments are pathetic. Right?

Meaning, it has nothing to do with investing in free markets being a capitalist privilege. (Hence, his using that as an example.)

Bardock42
Originally posted by dadudemon
Okay..I think I get it.


You are saying that his arguing a case for himself as being capitalist by claiming to invest in a free market is pathetic only because you believe his arguments are pathetic or rather, of the things you two arguing about, his arguments are pathetic. Right?

Meaning, it has nothing to do with investing in free markets being a capitalist privilege. (Hence, his using that as an example.)
No, I never said that saying that he supports investing in a free market is pathetic.

WrathfulDwarf
Originally posted by dadudemon
Okay..I think I get it.


You are saying that his arguing a case for himself as being capitalist by claiming to invest in a free market is pathetic only because you believe his arguments are pathetic or rather, of the things you two arguing about, his arguments are pathetic. Right?

Meaning, it has nothing to do with investing in free markets being a capitalist privilege. (Hence, his using that as an example.)

Don't bother to split hairs. When Bardock has no arguments he goes ad hominem or resorts to character assassination.

That's why I told him not to waste my precious time.





Anywhoo....as much as I oppose giving a job to a robot over a human, that does not translate to "I'm against technology". I like technology very much. I think it's helpful and necessary for our world and society. I just don't think it should be used to replace us. That is all...

Bardock42
Originally posted by WrathfulDwarf
Don't bother to split hairs. When Bardock have no arguments he goes Ad hominem or character assasination.

Ironic.

Symmetric Chaos
Originally posted by WrathfulDwarf
Anywhoo....as much as I oppose giving a job to a robot over a human, that doesn't mean I'm against technology. I like technology very much. I think it's helpful and necessary for our world and society. I just don't think it should be used to replace us. That is all...

But it already does. Think of unskilled workers, typists, professional calculators (yes, it was a job), litter carriers, seamstresses and tailors: all of them have lost their jobs to technology without the remotest capacity for thought. The idea that you can accept all of that so gladly but have something against doing the same thing when the robots can appreciate what they're doing is astounding.

WrathfulDwarf
Originally posted by Symmetric Chaos
But it already does. Think of unskilled workers, typists, professional calculators (yes, it was a job), litter carriers, seamstresses and tailors: all of them have lost their jobs to technology without the remotest capacity for thought. The idea that you can accept all of that so gladly but have something against doing the same thing when the robots can appreciate what they're doing is astounding.

They can be trained for other jobs. People should still be given the right to support themselves. As a matter of fact, certain new jobs do require special training, which I'm fine with, since it gives people a chance to improve themselves. Educate them, train them, and provide them with the means to survive. I'm sure you can see that quite clearly.

Symmetric Chaos
Originally posted by WrathfulDwarf
They can be trained for other jobs. People should still be given the right to support themselves. As a matter of fact, certain new jobs do require special training, which I'm fine with, since it gives people a chance to improve themselves. Educate them, train them, and provide them with the means to survive. I'm sure you can see that quite clearly.

The exact same conditions apply to sentient robots. People who lose their jobs to them will simply be trained to do something else.

dadudemon
Originally posted by Bardock42
No, I never said that saying that he supports investing in a free market is pathetic.

Nor did I conclude that that is what you were saying. smile


Edit-


On topic-

According to Ray Kurzweil, we will eventually become one with AI. As in, the same entity.

Bardock42
Originally posted by dadudemon
Nor did I conclude that that is what you were saying. smile

Fair enough. Suppose I misunderstood "You are saying that his arguing a case for himself as being capitalist by claiming to invest in a free market is pathetic" then.

dadudemon
Originally posted by Bardock42
Fair enough. Suppose I misunderstood "You are saying that his arguing a case for himself as being capitalist by claiming to invest in a free market is pathetic" then.

Correct, because, singularly, an interpretation of that phrase alone would not lend itself to the conclusion I had drawn. The other half of that sentence is required to make sense of what I was making sense of.

Bardock42
Originally posted by dadudemon
Correct, because, singularly, an interpretation of that phrase alone would not lend itself to the conclusion I had drawn. The other half of that sentence is required to make sense of what I was making sense of.

I believe the other half doesn't change the fact that you said you think I called it pathetic that he calls himself a capitalist because he believes in investing in the free market.

dadudemon
Originally posted by Bardock42
I believe that the other half doesn't change that you said that you think that I called him calling himself capitalist, because he believes in investing in the free market, pathetic.


To me, it's quite obvious that what I said was not what you think I said. (Replace "said" with "posted".)

Even if you left the first sentence with that erroneous conclusion (confusingly enough, which was a conclusion about your conclusion), you could draw the correct context of my intentions from the following:

"Meaning, it has nothing to do with investing in free markets being a capitalist privilege. (Hence, his using that as an example.)"


If you are still covering this point simply because you don't want me or others to think that you couldn't or didn't understand my post: no worries. I didn't think that, because I could care less about that. You're still the same old smart Bardock42...regardless of whether you "don't get something". I'm not sitting on this end thinking that you're an idiot. Besides, English isn't your first language...it would be utterly dickish of me to think you're an idiot for not getting my post. However, not in your favor, you have a better grasp of English than most native English speakers do.


Remember, this all started because I didn't get what you were saying.

Bardock42
I don't have a problem admitting that I misunderstood your post, I just still don't see a different interpretation. I get what you were saying in the whole context (I think), but I don't see how it changes what seems to be said in that first sentence. Could you explain it to me like I was a five-year-old?


Thanks for your asskissing, btw, positively noted.

chithappens
Originally posted by Bardock42
I am sorry, but "Workers would lose jobs, boo hoo" is not a good reason to take suffering things as slaves. In fact, it's very similar to arguments Slave owners would use against negros...oh well, I guess makes sense, you are more conservative than libertarian anyways. (also explains why you think you are a capitalist).

Comparing human "negros" with AI is moronic to me. First off, if you understood how labor worked in the U.S. at that time (slaves were not given the more dangerous jobs that, say, an immigrant from Eastern Europe was sent to do, because the negro slave was worth too much money), you probably would not say such ignorant stuff. Just look into it in your own time.

What Dwarf is getting at is AI replacing humans in the work force. A machine would be able to do things more efficiently than a very capable human being because that would be what they are specifically programmed to do.

When the technology is available, all humans could be replaced with machines that are much better at the job than a hard-working human. The "economist" would cut costs and hire the robot without having to worry about salary (unless robots start having living expenses, but that's another topic altogether). It would have to be a serious, serious concern.

And this:

Originally posted by Symmetric Chaos
The exact same conditions apply to sentient robots. People who lose their jobs to them will simply be trained to do something else.

is greatly oversimplifying the issue.
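The cost-cutting logic described in this post can be sketched as a toy calculation. All figures, rates, and the cost model itself are hypothetical, purely for illustration of why an employer minimizing cost would pick the machine:

```python
# Toy model of the hiring decision described above.
# Every number here is made up; only the comparison logic matters.

def annual_cost_human(salary: float, benefits_rate: float = 0.3) -> float:
    """Yearly cost of a human worker: salary plus a benefits overhead."""
    return salary * (1 + benefits_rate)

def annual_cost_robot(purchase_price: float, lifespan_years: float,
                      upkeep_per_year: float) -> float:
    """Yearly cost of a robot: purchase price amortized over its
    working lifespan, plus maintenance. No salary, no benefits."""
    return purchase_price / lifespan_years + upkeep_per_year

human = annual_cost_human(salary=40_000)
robot = annual_cost_robot(purchase_price=100_000,
                          lifespan_years=10,
                          upkeep_per_year=5_000)

# A purely cost-cutting employer picks whichever is cheaper.
cheaper = "robot" if robot < human else "human"
print(cheaper)  # prints "robot"
```

Even with a steep purchase price, the amortized machine undercuts the recurring salary, which is the concern being raised here.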

Bardock42
Originally posted by chithappens
Comparing human "negros" with AI is moronic to me. First off, if you understood how labor worked in the U.S. at that time (slaves were not able to do the more dangerous jobs that say an immigrant from Eastern Europe was sent to do because the negro slave was worth too much money) you probably would not say such ignorant stuff. Just look into it in your own time.

What Dwarf is getting at is AI replacing humans in the work force. A machine would be able to do things more efficiently than a very capable human being because that would be what they are specifically programmed to do.

When the technology is available, all humans could be replaced with machines that are much better at the job than a hard working human. The "economist" would cut cost and hire the robot without having to worry about salary(unless robots start having living expenses but that's another topic altogether). It would have to be a serious, serious concern.

And this:



is greatly oversimplifying the issue.

I am not sure what you are getting at. I wasn't comparing black slaves to robots (or their working conditions), I was comparing the arguments for not granting rights to blacks to those WD puts forth to not granting Self-Conscious Robots rights. Sorry if it offended you, wasn't my intention.

Bardock42
Additionally, I think it is ridiculous to assume that we would build sentient robots (similar to humans or above) which can feel and be creative, just to make them work low-wage factory jobs. I am sure if it ever is an issue, it won't be for robots that carry boxes and drill holes in plastic cups.

dadudemon
Originally posted by Bardock42
I don't have a problem to admit that I misunderstood your post, I just still don't see a different interpretation. I get what you were saying in the whole context (I think), but I don't see how it changes the fact of what seems to be said in that first sentence. Could you explain it to me like I was a five year old.

Okay. How about I treat you like a college attending adult and use some logic?

"is arguing a case for himself as being capitalist by claiming to invest in a free market"=a

a is a justification for p

p=conclusion

Alone, a does not logically conclude to p.

However...

"only because you believe his arguments are pathetic or rather, of the things you two arguing about, his arguments are pathetic"= b

if b precedes a, then a = p. However, in the absence of b, a does not always = p

In other words, because (as you supposed) his arguments were weak and feeble, when he tried to further justify his position as being capitalist, you saw through his facade and were able to call his argument for himself pathetic. (Again, that's what you thought...not that I think WD's arguments were weak or strong.)


Originally posted by Bardock42
Thanks for your asskissing, btw, positively noted.

At first, I figured I'd just include a bit about English not being your original language. Then I thought that you might make a counter-argument to nullify that point by saying that you can speak English better than most others whose native tongue is English. I also concluded that you might say, "thanks, but I don't need your patronizing excuses." So instead of leaving it with an open-ended interpretation of patronization, I figured I would leave other little tidbits so that, holistically, your conclusion would lead you to think that I wasn't passive-aggressively insulting you.

Now....

After reading all that about my ass-kissing, you may want to again conclude, "there he goes again...pretending to have masterful plans". If you concluded that, you'd be utterly wrong. That would be a very pessimistic and close-minded interpretation. The correct interpretation is actually in the form of another compliment. You would be concluding correctly if you concluded as follows: "Wow. He thinks so highly of my intelligence that he has to go to great lengths to make sure that, even if I'm just joking with him, his positive comments can be just that, positive."

WrathfulDwarf
Originally posted by chithappens


What Dwarf is getting at is AI replacing humans in the work force. A machine would be able to do things more efficiently than a very capable human being because that would be what they are specifically programmed to do.

When the technology is available, all humans could be replaced with machines that are much better at the job than a hard working human. The "economist" would cut cost and hire the robot without having to worry about salary(unless robots start having living expenses but that's another topic altogether). It would have to be a serious, serious concern.



Thank you chit! Those things trouble me for the future.



This is just babbling....

But in Star Wars....the Empire isn't evil. They still employ humans to do the work....or use clones. Droids are for service.

Ha! Bardock should read Dune. Great novel.

Bardock42
Originally posted by dadudemon
Okay. How about I treat you like a college attending adult and use some logic?

"is arguing a case for himself as being capitalist by claiming to invest in a free market"=a

a is a justification for p

p=conclusion

Alone, a does not logically conclude to p.

However...

"only because you believe his arguments are pathetic or rather, of the things you two arguing about, his arguments are pathetic"= b

if b precedes a, then a = p. However, in the absence of b, a does not always = p

In other words, because (as you supposed) his arguments were weak and feeble, when he tried to further justify his position as being capitalist, you saw through his facade and you were able to call his argument for himself as pathetic. (Again, that's what you thought...not that I think WD's arguments were weak or strong.)
Yes, that's how I understood it. And my answer to that was that you misunderstood, because I never called that pathetic. I only called our argument pathetic (our argument being about robot rights). If you remember, I called the "investing in free markets" thing "classic" in a sarcastic manner.

Do you understand now why I replied to you the way I did, and why it was correct to call you wrong about what I said?

Oh, and don't worry about the ass-kissing thing, just a little joke. I don't think you were scheming much there, seems natural to add that, as I obviously would have objected to a comment solely aimed at English being my second language.

Bardock42
Originally posted by WrathfulDwarf
Thank you chit! Those things trouble me for the future.



This is just babbling....

But in Star Wars....the Empire isn't evil. They still employ humans to do the work....or use clones. Droids are for service.

Ha! Bardock should read Dune. Great novel.

I read Dune.

WrathfulDwarf
Originally posted by Bardock42
I read Dune.

hug

Bardock42
Originally posted by WrathfulDwarf
hug

You're welcome.

chithappens
Originally posted by Bardock42
I am not sure what you are getting at. I wasn't comparing black slaves to robots (or their working conditions), I was comparing the arguments for not granting rights to blacks to those WD puts forth to not granting Self-Conscious Robots rights. Sorry if it offended you, wasn't my intention.

Well, I was not really offended by the negro part (we are talking about all humans anyway), but the example does not work when the complete context of pre-Civil War labor in the U.S. is included.

Bardock42
Originally posted by chithappens
Well I was not really offended about the negro part (we are talking about all humans anyway), but the example does not work when complete context of pre-Civil War labor in the U.S. is included.

I am not referring to a "Robots would steal jobs" vs. "Blacks would steal jobs", but "Robots will do our work cheaper so they shouldn't have rights" vs. "Blacks would make our production more expensive, so they shouldn't have rights".

Meaning, just a monetary issue, not one about what is humane for sentient beings.


Jesus. I need to be clearer with what I say, I have way too much explaining to do.

chithappens
No no. I understood that. But slaves WERE THE ROBOTS OF THE TIME. That's the part you are misunderstanding.

In this sense, you pay for a slave: they work until their dying day. The hard jobs were reserved for poor whites and unfortunate immigrants.

*If this still does not make sense, excuse me. I can be more thorough later but the girlfriend wants to talk... sorry

Bardock42
Originally posted by chithappens
No no. I understood that. But slaves WERE THE ROBOTS OF THE TIME. That's the part you are misunderstanding.

In this sense, you pay for a slave: they work until their dying day. The hard jobs were reserved for poor whites and unfortunate immigrants.

*If this still does not make sense, excuse me. I can be more thorough later but the girlfriend wants to talk... sorry

I...I don't get it.

Robots would be slaves.
Blacks were slaves.

In that way it is similar...is that what you are saying?

chithappens
That's what I mean also but there is more to it. (again doing this half assed)

I can't go into great depth now, but the pre-Civil War situation does not necessarily fit as a direct comparison because of different variables.

If applied today though, machines (which would obviously be programmed to be job-specific) would drive the price of human labor down: humans would have to take whatever was given to them, because machines could do the work way better.

Bardock42
Originally posted by chithappens
That's what I mean also but there is more to it. (again doing this half assed)

I can't go into great depth now but the pre-Civil War situation does not necessairily fit as a direct contrast because of different variables.

If applied today though, machines (who would obviously be programmed as job specific) would drive the price of human labor down because humans would have to take whatever was given to them because machines could do it way better.

So?

I am sorry, but you don't seem to form a coherent thought yet. I am not sure what exactly I said you disagree with.

dadudemon
Originally posted by Bardock42
Yes, that's how I understood it. And my answer to that was that you misunderstood, because I never called that pathetic. I only called our argument pathetic (our argument being about Robot rights). If you remember I called the "investing in free markets" thing "classic" in a sarcastic manner.

Do you understand now why I replied to you what I replied. And why it was correct to call you wrong about what I said?

Oh, and don't worry about the ass-kissing thing, just a little joke. I don't think you were scheming much there, seems natural to add that, as I obviously would have objected to a comment solely aimed at English being my second language.

I completely understand now. It makes perfect sense. It all fits together. Your second sentence, which had the word pathetic, was basically unrelated to his comment on free markets. Your "classic" comment was a mockery of his using it as an example, again, unrelated to your next sentence.


Damn. I don't think you could have done anything to make your second sentence less confusing...even if you pressed enter a bunch of times to try to indicate you were commenting on something else, it could still be confused as another comment on free markets.

I think that this is another example of you excluding specific references like pronouns or nouns, which has caused me to misunderstand you. This stuff happens all the time between you and me. DAMN IT! You could have said, "Sorry, this argument about AI is just too pathetic to me, I am sure someone else will hand you your ass some more in due time."

You had what I'd call another ambiguous pronoun reference. I erroneously interpreted "this" to reference the free market argument. If you had said "Sorry, this whole argument..."...it would still have made more sense.


All of this is tangential...but still, it would have bugged the hell out of me until I understood it. Luckily, you guys dropped that argument, or else it would have bugged me even more.

Bardock42
Originally posted by dadudemon
I completely understand now. It makes perfect sense. It all fits together. Your second sentence, which had the word pathetic, was basically unrelated to his comment on free markets. Your "classic" comment was a mockery of his using it as an example, again, unrelated to your next sentence.


Damn. I don't think you could have done anything to make your second sentence less confusing...even if you pressed enter a bunch of times to try to indicate you were commenting on something else, it could still be confused as another comment on free markets.

I think that this is another example of you excluding specific references like pronouns or nouns, which has caused me to misunderstand you. This stuff happens all the time between you and me. DAMN IT! You could have said, "Sorry, this argument about AI is just too pathetic to me, I am sure someone else will hand you your ass some more in due time."

You had what I'd call another ambiguous pronoun reference. I erroneously interpreted "this" to reference the free market argument. If you had said "Sorry, this whole argument..."...it would still have made more sense.


All of this is tangential...but still, it would have bugged the hell out of me until I understood it. Luckily, you guys dropped that argument, or else it would have bugged me even more.

I do make overly ambiguous statements.

I am glad we cleared it up, though.

Adam_PoE
Suppose a man is struck by a car while crossing the street. He is rushed to the hospital, but the doctors are unable to resuscitate him. To save his life, the doctors upload his mind into a sophisticated android. Is he less a man because he is now made of silicon and wires rather than of flesh and blood, and if so, why?

chithappens
Did he lose his penis also? This is the key question

dadudemon
Originally posted by chithappens
Did he lose his penis also? This is the key question

Indeed. Life without being led around by my androgens (tee hee) would not be the same.

WrathfulDwarf
I don't see how putting a person's brain inside a machine would be any different than a person plugged into a life-support machine. As I recall, some people are in favor of pulling the plug on someone who is on life support. So....

As for the penis...I would be more worried about the person losing his sanity. Last thing we need is a human with an android body seeking revenge.....Oh gawd! This would make a sweet Sci-fi/Detective novel.

(gets copyrights)

Symmetric Chaos
Originally posted by WrathfulDwarf
I don't see how putting a person's brain inside a machine would be any different than a person plugged into a life-support machine.

Because the person would still be able to do everything he could before except have kids, only he'd be made of metal. So, as he now lacks DNA, you would support taking away the man's rights, correct?

WrathfulDwarf
Originally posted by Symmetric Chaos
Because the person would still be able to do everything he could before except have kids, only he'd be made of metal. So, as he now lacks DNA, you would support taking away the man's rights, correct?

Well, this might be different, since a machine is being used to save a person's life. Here the machine is not a replica of a person. It's only a host for the person's mind. I would classify this one in another area.

Let's just cut to the chase. This is Robocop all over again.

(damn! there goes my previous idea)

Symmetric Chaos
Originally posted by WrathfulDwarf
Well, this might be different, since a machine is being used to save a person's life. Here the machine is not a replica of a person. It's only a host for the person's mind. I would classify this one in another area.

You made it quite clear that being organic and having DNA was the important thing. His body is gone; the machine preserves his mind by effectively making him an AI. That qualifies him to lose his rights by your system.

WrathfulDwarf
Originally posted by Symmetric Chaos
You made it quite clear that being organic and having DNA was the important thing. His body is gone; the machine preserves his mind by effectively making him an AI. That qualifies him to lose his rights by your system.

It is important. A person's life is also very important. He does not lose his rights. He was a human BEFORE his body was turned into a machine. As a matter of fact, this scenario CAN be changed. The human could later choose to transfer his brain to a clone. The cyborg body can only be a temporary station.

No matter how you put it, the machine is secondary here. The human still gets priority.

If they have the technology to transfer human thoughts into a machine...gee, might as well go with a more practical choice... a clone.

inimalist
the neural activity of a human put into a computer capable of simulating that ability is no longer human

human is NOT brain activity

dadudemon
Originally posted by inimalist
the neural activity of a human put into a computer capable of simulating that ability is no longer human

human is NOT brain activity


So you don't subscribe to the concepts presented in Ghost in the Shell?


I fully agree that it would no longer be human. You, among all of us, know how many chemicals are involved in just being human. Emotions? Shouldn't really exist in a simulated body, should they? There are just waaaaaaaay too many things about a human being flesh that wouldn't work right if we only had our RSI transferred to a machine.

I guess you could have a fundamental OS that calibrates to the person's emotions (with a pre-upload sync or something) and adapts specific subroutines to simulate those hormones n'stuff that make people more human. If it doesn't calibrate properly, there'd be a loss of personality at some level...we'd end up with something between a human and Data from Star Trek.
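A toy sketch of what that "calibration" idea might look like, purely illustrative: every function name, parameter, and number below is invented for the sake of the example, not a real proposal.

```python
# Purely illustrative toy: "calibrating" a simulated emotional profile
# toward a pre-upload baseline. All names and numbers are invented.

def calibrate(simulated, baseline, rate=0.5, rounds=20):
    """Repeatedly nudge each simulated parameter toward the recorded baseline."""
    current = dict(simulated)
    for _ in range(rounds):
        for key, target in baseline.items():
            current[key] += rate * (target - current[key])
    return current

def personality_drift(current, baseline):
    """Total absolute mismatch; a large drift ~ 'loss of personality'."""
    return sum(abs(current[key] - baseline[key]) for key in baseline)

# Pretend levels recorded before the upload:
baseline = {"serotonin": 0.7, "dopamine": 0.5, "cortisol": 0.2}
# A fresh upload starts uncalibrated:
fresh_upload = {"serotonin": 0.0, "dopamine": 0.0, "cortisol": 0.0}

tuned = calibrate(fresh_upload, baseline)
print(personality_drift(tuned, baseline))  # tiny residual after tuning
```

The point of the sketch is just the shape of the idea: a sync step records a target profile, and the "OS" iterates until its simulated parameters converge on it; if convergence fails, the drift number is the "loss of personality."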

Symmetric Chaos
Originally posted by WrathfulDwarf
It is important. A person's life is also very important. He does not lose his rights. He was a human BEFORE his body was turned into a machine. As a matter of fact, this scenario CAN be changed. The human could later choose to transfer his brain to a clone. The cyborg body can only be a temporary station.

No matter how you put it, the machine is secondary here. The human still gets priority.

If they have the technology to transfer human thoughts into a machine...gee, might as well go with a more practical choice... a clone.

So you're going back on this:

Originally posted by WrathfulDwarf
For me such machines should be able to reproduce. Meaning a female robot have to give birth. If it could do it...then I would say they can be equal or even given rights.

Until then...it remains no different than my Xbox 360.

As a human mind in a machine he cannot have kids. Not human by your standard. He's just a fancy X-Box 360.


This would have to be thrown out too:

Originally posted by WrathfulDwarf
It does matter. They're machines. Period! There is no DNA in them.

No DNA. Not human. He loses his rights in your system regardless of what he was before. He's a machine. Period!


This poses a problem as well:

Originally posted by WrathfulDwarf
Technology should help people in their jobs. Not take them away. It's going to be far worse when robots with human capacities take over. Even worse is giving them absurd rights that they don't deserve. Those rights should go to humans.

The human mind in a robot body will be taking jobs from normal organic people. Thus, in the system you have been talking about for pages, it would be absurd to give him rights.

Bardock42
Hahahahaha


I'm a ****ing Prophet.

Originally posted by Bardock42

Sorry, this is just too pathetic to me, I am sure someone else will hand you your ass some more in due time.

WrathfulDwarf
Originally posted by Symmetric Chaos
So you're going back on this:



As a human mind in a machine he cannot have kids. Not human by your standard. He's just a fancy X-Box 360.


This would have to be thrown out too:



No DNA. Not human. He loses his rights in your system regardless of what he was before. He's a machine. Period!


This poses a problem as well:



The human mind in a robot body will be taking jobs from normal organic people. Thus, in the system you have been talking about for pages, it would be absurd to give him rights.

No, I'm not going back on anything. I still stand by my opinion. What you're doing here is stretching the topic. All those machines did not begin as humans. This is a very different scenario. We're talking about saving the life of a human. As I said earlier, that is still very important. He has not lost his ORIGINAL rights, because he was born human. Because of this accident there was no choice but to put him inside a machine. He has the right to be given medical help.


Again, there is also the option of cloning the person. The machine remains secondary and the human gets priority. This case is similar to amputations. Even with a wooden limb, the person is still a person. And continues to have rights.

Originally posted by Bardock42
Hahahahaha


I'm a ****ing Prophet.

Failure.

Symmetric Chaos
Originally posted by WrathfulDwarf
No, I'm not going back on anything. I still stand by my opinion. What you're doing here is stretching the topic. All those machines did not begin as humans. This is a very different scenario. We're talking about saving the life of a human. As I said earlier, that is still very important. He has not lost his ORIGINAL rights, because he was born human. Because of this accident there was no choice but to put him inside a machine. He has the right to be given medical help.

But he has lost everything that gives him those rights. Letting him keep them is insane.

WrathfulDwarf
Originally posted by Symmetric Chaos
But he has lost everything that gives him those rights. Letting him keep them is insane.

I'm sure he has family that can claim him. He has lost nothing. Once he dies, then you can consider that he has lost his rights.

Bardock42
There is a difference between coming second to humans and being given no rights whatsoever.

dadudemon
Originally posted by Bardock42
I'm a ****ing Prophet.

hmm

A sexually active prophet? Sounds like the good kind to me. big grin

Symmetric Chaos
Originally posted by WrathfulDwarf
I'm sure he has family that can claim him.

But only as an object, a fancy X-Box for his kids.

Originally posted by WrathfulDwarf
He has lost nothing. Once he dies then you can consider that he lost his rights.

No DNA. No ability to reproduce. No biological material.

Those are your standards for being human and deserving rights. He has none of them.

Bardock42
Originally posted by dadudemon
hmm

A sexually active prophet? Sounds like the good kind to me. big grin

I warn you. I am not cheap.

dadudemon
Originally posted by Bardock42
I warn you. I am not cheap.


Damn...I guess you are just like every other evangelical...charging people for "salvation". 313

Symmetric Chaos
Originally posted by dadudemon
Damn...I guess you are just like every other evangelical...charging people for "salvation". 313

Everyone deserves a happy ending 131

Bardock42
Another sexual double entendre shifty

Symmetric Chaos
http://i304.photobucket.com/albums/nn178/iamrighturwrong/OhYOU.jpg

inimalist
Originally posted by dadudemon
So you don't subscribe to the concepts presented in Ghost in the Shell?


I fully agree that it would no longer be human. You, among all of us, know how many chemicals are involved in just being human. Emotions? Shouldn't really exist in a simulated body, should they? There are just waaaaaaaay too many things about a human being flesh that wouldn't work right if we only had our RSI transferred to a machine.

I guess you could have a fundamental OS that calibrates to the person's emotions (with a pre-upload sync or something) and adapts specific subroutines to simulate those hormones n'stuff that make people more human. If it doesn't calibrate properly, there'd be a loss of personality at some level...we'd end up with something between a human and Data from Star Trek.

neuroscience really doesn't understand the architecture of emotional experience, so I can't necessarily come up with a viable solution.

However, there is nothing special about emotions that would make them any harder to simulate than any other faculty of the mind.

dadudemon
Originally posted by inimalist
However, there is nothing special about emotions that would make them any harder to simulate than any other faculty of the mind.

But what about "monoamine neurotransmitters"?

You know...saturation of all sorts of receptors that drive our emotions, subconscious desires, and our actions? (love n'stuff) I guess that could all be simulated, like I've indicated in my post...

But still, we need all of that to be human.

inimalist
Originally posted by dadudemon
But what about "monoamine neurotransmitters"?

You know...saturation of all sorts of receptors that drive our emotions, subconscious desires, and our actions? (love n'stuff) I guess that could all be simulated, like I've indicated in my post...

But still, we need all of that to be human.

I think it is foolish to try and describe humanity by the quality of their subjective experiences.

Humanity is a species, and as such, has a strictly biological definition. The conscious experience of what it is to be human is different for all people.

Regardless of what a person experiences subjectively, should they maintain the biology that falls in line with that which is defined as human, they are human. No matter how exactly another being's conscious state matches the human experience, should they not have the biology, they aren't human.

Also, with regards to emotions, I would speculate against any specific neurotransmitter and say you should look at the role of the amygdala in information processing. But, put simply, so long as emotions are based on the interactions of physical phenomena, there is no hypothetical reason why we shouldn't be able to simulate them.
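That last claim can be made concrete with a deliberately crude sketch: if an emotion is just physical state dynamics, the same dynamics can be stepped in software. Everything below (the "fear" variable, the decay and gain constants) is invented for illustration, not a validated model of anything.

```python
# Toy "emotion as physical dynamics" sketch. Each step, the fear level
# decays toward calm and rises in response to a threatening stimulus.
# All names and constants are made up for illustration.

def step(fear, stimulus_threat, decay=0.8, gain=0.6):
    """One update of the simulated fear state."""
    return decay * fear + gain * stimulus_threat

fear = 0.0
for threat in [0.0, 0.9, 0.9, 0.0, 0.0]:  # a brief scare, then calm
    fear = step(fear, threat)

print(round(fear, 3))  # → 0.622, still elevated but decaying back down
```

The fear level spikes while the threat is present and then relaxes, which is the whole argument in miniature: nothing about the update rule cares whether it runs in neurons or in a loop.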

dadudemon
Originally posted by inimalist
I think it is foolish to try and describe humanity by the quality of their subjective experiences.

Agreed. We are not just the sum of our memories, the interpretation of those memories, the product of environment coupled with the interpretation of memories, etc. It is a holistic combination of them all.

Originally posted by inimalist
Humanity is a species, and as such, has a strictly biological definition. The conscious experience of what it is to be human is different for all people.

Originally posted by inimalist
Regardless of what a person experiences subjectively, should they maintain the biology that falls in line with that which is defined as human, they are human. No matter how exactly another being's conscious state matches the human experience, should they not have the biology, they aren't human.

I agree, I think. I agree that to be human means, literally, to be the biological organism. Would "synthetic human" be kosher to you?

Originally posted by inimalist
Also, with regards to emotions, I would speculate against any specific neurotransmitter and say you should look at the role of the amygdala in information processing. But, put simply, so long as emotions are based on the interactions of physical phenomena, there is no hypothetical reason why we shouldn't be able to simulate them.


Agreed. Which was my point about it. It isn't as simple as throwing our "sync cording*" into androids and gynoids. It wouldn't be us. There'd have to be a calibration to make it "us" again. Then we could throw that "sync cording" into a biological form. If no human could tell the difference, then I don't see a problem.

*"The 6th Day", with Ahhhhhhhnold.

inimalist
Originally posted by dadudemon
Agreed. We are not the sum of our memories, the interpretation of those memories, the product of environment as coupled with the interpretation of memories, etc. It is a holistic combination of all.


Indeed. However, the "we" that we are is in no way related to the biology that makes us "human".


Originally posted by dadudemon
I agree, I think. I agree that to be human means, literally, to be the biological organism. Would "synthetic human" be kosher to you?

I don't think "human" is that important when discussing morality...

Or are you asking what I would call a robot with the mind of a person downloaded into it? I don't know; it's not something I'm overly concerned with. Technically, it would be a machine. I don't know if "robot" qualifies, but these things are fairly simple to determine without needing to talk about emotions and the like. The biological human is not defined by the ability to feel emotions.

Originally posted by dadudemon
Agreed. Which was my point about it. It isn't as simple as throwing our "sync cording*" into androids and gynoids. It wouldn't be us. There'd have to be a calibration to make it "us" again. Then we could throw that "sync cording" into a biological form. If no human could tell the difference, then I don't see a problem.

*"The 6th Day", with Ahhhhhhhnold.

It's future science, might as well be magic, so I can't speculate. I think it is going to be much more difficult than anyone thinks, but not impossible.

Adam_PoE
Originally posted by WrathfulDwarf
I don't see how putting a person's brain inside a machine would be any different than a person plugged into a life-support machine. As I recall, some people are in favor of pulling the plug on someone who is on life support. So....

If a man retains his humanity when his mind is placed in an artificial body, then how is the contrapositive of this any different? You have already conceded that it is his mental state, not his body, that defines his personhood.

Grand_Moff_Gav
There are two types of human.

Human, the species, for which the biological reference is Human. Obviously an android or robot would not be a Human in this sense.

Human, the humanity: the emotions, belief systems, consciousness, and sentience. Whether you're a Human or a Klingon, you might still have Humanity. So then, can an artificial being?

inimalist
Originally posted by Grand_Moff_Gav
Human, the humanity: the emotions, belief systems, consciousness, and sentience. Whether you're a Human or a Klingon, you might still have Humanity. So then, can an artificial being?

i would agree with this, but would split hairs over terms. I'd prefer sentience, but it is just as ambiguous cool

Grand_Moff_Gav
Originally posted by inimalist
i would agree with this, but would split hairs over terms. I'd prefer sentience, but it is just as ambiguous cool

Indeed, there should be concrete terms haha, perhaps we need to find them (or make them).

inimalist
Originally posted by Grand_Moff_Gav
Indeed, there should be concrete terms haha, perhaps we need to find them (or make them).

lol

oh don't worry, I'm on it

dadudemon
Originally posted by inimalist
lol

oh don't worry, I'm on it

Careful, there's people around these parts that make fun of people who make up words for things not yet defined. Serious business.
