Could androids have rights?
Let's say that someone creates a robot that has a humanoid appearance and features. This robot also has an advanced system that allows it to adapt its behaviors by itself and deal with humans; effectively, it can develop its own personality and all of the traits a personality has. The creator dies, and the robot is no longer the property of anyone. Does this robot have rights like a human? If not, what would have to be different about the robot for it to have rights like those of a human?
The Mindset
18-08-2006, 02:10
If I had sufficient reason to believe that it was sentient, then yes, even if it didn't look like a human. It could be a sentient chair for all I care.
Wilgrove
18-08-2006, 02:10
I don't know, it depends on what you define as human.
No rights for machines. Sorry.
The Mindset
18-08-2006, 02:13
No rights for machines. Sorry.
You're an organic machine. You have no rights.
Call to power
18-08-2006, 02:15
Screw that, destroy the robot; I don't want it canoodling with my sexbots :mad:
And why on Earth would anyone create a sentient robot? What the hell could you do with something that has a union? Hell, I may as well purchase a Mexican for all the hassle it would save me.
Call to power
18-08-2006, 02:17
You're an organic machine. You have no rights.
Correction: an organic machine that wears the trousers!
And who says the machine is alive? It doesn't breathe! Ha, MRS GREN wins again.
Wilgrove
18-08-2006, 02:18
Screw that, destroy the robot; I don't want it canoodling with my sexbots :mad:
And why on Earth would anyone create a sentient robot? What the hell could you do with something that has a union? Hell, I may as well purchase a Mexican for all the hassle it would save me.
Yeah, but you can't turn a Mexican off and store it in a closet.
The Mindset
18-08-2006, 02:18
Screw that, destroy the robot; I don't want it canoodling with my sexbots :mad:
And why on Earth would anyone create a sentient robot? What the hell could you do with something that has a union? Hell, I may as well purchase a Mexican for all the hassle it would save me.
Simple. Machines evolve much faster than humans. If we create a sentient robot, and tell it to build its successor, and so on and so forth into infinity, we end up with technology developing a hundred times faster than mere humans could manage. It's called a technological singularity.
Let's say that someone creates a robot that has a humanoid appearance and features. This robot also has an advanced system that allows it to adapt its behaviors by itself and deal with humans; effectively, it can develop its own personality and all of the traits a personality has. The creator dies, and the robot is no longer the property of anyone. Does this robot have rights like a human? If not, what would have to be different about the robot for it to have rights like those of a human?
PERSOCOMS!
http://www.mc-ent.ru/pic/wallpapers/chobits.jpg
Me Likey! Me Want One! :p
heck, I'd settle for a Portable one.
http://members.shaw.ca/fubes2000/images/gallery/screens/sumomo%20zoned%20out.jpg
:D
Yeah, but you can't turn a Mexican off and store it in a closet.
if it has rights, you can't do that with robots either.
Call to power
18-08-2006, 02:25
Simple. Machines evolve much faster than humans. If we create a sentient robot, and tell it to build its successor, and so on and so forth into infinity, we end up with technology developing a hundred times faster than mere humans could manage. It's called a technological singularity.
So now the machine no longer imitates humans; it now has super D.T powers, like a Robo-Jesus.
I fail to see how a machine can work faster than humans; after all, there is only so much information a circuit can take before it burns out, hence the interest in nanobots.
Call to power
18-08-2006, 02:27
Yeah, but you can't turn a Mexican off and store it in a closet.
What about Mr Hammer and his children, Rope and Gag, followed by Uncle Push and Mr Shove, into the closet?
Robot Mexicans FTW
Duntscruwithus
18-08-2006, 02:41
PERSOCOMS!
http://www.mc-ent.ru/pic/wallpapers/chobits.jpg
Me Likey! Me Want One! :p
heck, I'd settle for a Portable one.
http://members.shaw.ca/fubes2000/images/gallery/screens/sumomo%20zoned%20out.jpg
:D
So are you saying that sentient androids will look like female anime characters?
The Mindset
18-08-2006, 02:44
So now the machine no longer imitates humans; it now has super D.T powers, like a Robo-Jesus.
I fail to see how a machine can work faster than humans; after all, there is only so much information a circuit can take before it burns out, hence the interest in nanobots.
Yeah... which is why robot generation X would construct its successor with nanobots, or whatever. The point is, each generation would be an improvement on the last, and it'd keep doing that until the point where it was better than humans. It'd get exponentially faster and more intelligent.
Cannot think of a name
18-08-2006, 02:50
Capek warned us about this at the outset: if you don't give them rights then someone will try to, and then they'll sneak in a determination for it, and then the robots will get pissed and kill all the humans, and while they're doing that the scientists will throw away the formula to make more robots, and there'll be just one dude left, and he'll be forced to try and recreate the formula, and instead he'll invent robot procreation.
You're an organic machine. You have no rights.
Yawn
Call to power
18-08-2006, 02:53
Yeah... which is why robot generation X would construct its successor with nanobots, or whatever. The point is, each generation would be an improvement on the last, and it'd keep doing that until the point where it was better than humans. It'd get exponentially faster and more intelligent.
And humans could easily do the same with genetic engineering (though they would actually do better, because flesh is far better than machine, especially in that it doesn't gobble natural resources).
Of course, would the robot choose to develop itself? After all, we have the same debate about humans now; why would a robot do differently?
So are you saying that sentient androids will look like female anime characters?
Mine would... :D
oh and if you watched the series... they got male ones also...
and non-human types too.
Remember... those are Computers... not *ahem* 'toys'.
The Mindset
18-08-2006, 02:53
Yawn
What an eloquent rebuttal. You put me right in my place, sir!
BAAWAKnights
18-08-2006, 02:54
This was already covered in an episode of ST:TNG.
Markreich
18-08-2006, 02:55
Android rights?
Crap, smokers don't even have rights! :(
Call to power
18-08-2006, 02:56
Capek warned us about this at the outset: if you don't give them rights then someone will try to, and then they'll sneak in a determination for it, and then the robots will get pissed and kill all the humans
not if my trusty fridge magnet has anything to say about it!
Cannot think of a name
18-08-2006, 02:56
This was already covered in an episode of ST:TNG.
Or, like a third of the stories about robots since there were stories about robots...
The Mindset
18-08-2006, 02:56
And humans could easily do the same with genetic engineering (though they would actually do better, because flesh is far better than machine, especially in that it doesn't gobble natural resources).
Of course, would the robot choose to develop itself? After all, we have the same debate about humans now; why would a robot do differently?
Well, naturally I'd advocate extensive rewriting of our genome too. You're mistaken though, organic machines use natural resources too - they just use different ones from robots.
Then there's always the possibility of us creating organic robots.
What an eloquent rebuttal. You put me right in my place, sir!
All it takes is one stroke from one without rights.
Call to power
18-08-2006, 02:59
Well, naturally I'd advocate extensive rewriting of our genome too. You're mistaken though, organic machines use natural resources too - they just use different ones from robots.
Then there's always the possibility of us creating organic robots.
Ah, but if all the robots are made from iron or steel, they will soon run out (we have enough problems with metals).
How organic are we talking here? Because a test-tube baby is not a robot.
The Mindset
18-08-2006, 02:59
All it takes is one stroke from one without rights.
This one's not quite so eloquent. I have no idea what you're getting at.
The Mindset
18-08-2006, 03:01
Ah, but if all the robots are made from iron or steel, they will soon run out (we have enough problems with metals).
How organic are we talking here? Because a test-tube baby is not a robot.
Ever read Do Androids Dream of Electric Sheep/seen the movie Blade Runner? That. Androids - genetically engineered machines; machines that look and possibly act like humans, but which are not.
Call to power
18-08-2006, 03:09
seen the movie Blade Runner? That. Androids - genetically engineered machines; machines that look and possibly act like humans, but which are not.
If you grow a baby in a test tube and plug some funky wires and stuff into it, is it still human?
Of course, forgive me; I only watched the first 15 minutes of Blade Runner because it was that bad.
The Mindset
18-08-2006, 03:12
If you grow a baby in a test tube and plug some funky wires and stuff into it, is it still human?
Of course, forgive me; I only watched the first 15 minutes of Blade Runner because it was that bad.
I'm talking something that has no shred of human DNA in it. It'd be entirely artificial, created from scratch. It may look like a human, but it wouldn't be one. As always, Wikipedia has an article (http://en.wikipedia.org/wiki/Biorobotics) on it.
This one's not quite so eloquent. I have no idea what you're getting at.
Let's go over this slowly, so you can get the idea.
I said machines don't have rights.
You said I don't have rights.
I yawned, in one word, or stroke as it were.
You gave a sarcastic compliment.
I allowed as how, to earn the sarcastic compliment, one without rights need only use a single stroke.
See? I'll bet you were able to follow that. Next time it will be even easier for you.
BTW, your statement that I am an organic machine and therefore have no rights holds no water.
Call to power
18-08-2006, 03:15
I'm talking something that has no shred of human DNA in it. It'd be entirely artificial, created from scratch. It may look like a human, but it wouldn't be one. As always, Wikipedia has an article (http://en.wikipedia.org/wiki/Biorobotics) on it.
How could it biologically look human if it has no human DNA? :confused: (Also note that a lot of our DNA covers stuff all animals have, so I doubt you could lack even a shred of human DNA.)
Fleckenstein
18-08-2006, 03:18
"Explanation: It's just that... you have all these squishy parts, master. And all that water! How the constant sloshing doesn't drive you mad, I have no idea..."
"Stupid, frail, non-compartmentalized organic meatbags!"
"Commentary: The meatbag speaks without clarity. Detail your involvement or the master will splatter your organs all over the floor."
"Definition: 'Love' is making a shot to the knees of a target 120 kilometers away using an Aratech sniper rifle with a tri-light scope... Love is knowing your target, putting them in your targeting reticule, and together, achieving a singular purpose against statistically long odds."
"Expletive: Damn it, master, I am an assassination droid... not a dictionary!"
HK-47. Nuff said.
The Mindset
18-08-2006, 03:19
BTW, your statement that I am an organic machine and therefore have no rights holds no water.
Care to elaborate?
machine noun an apparatus using mechanical power and having several parts, for performing a particular task.
Humans use energy converted from glucose to perform mechanical movements using one of several parts of their body known as the muscular system. Humans are organic machines.
The Mindset
18-08-2006, 03:24
How could it biologically look human if it has no human DNA? :confused: (Also note that a lot of our DNA covers stuff all animals have, so I doubt you could lack even a shred of human DNA.)
In the same way that an alien, given conditions similar to Earth's, might evolve to have two legs, two arms, a face, be mostly hairless and eat food. If you're looking for an Earthly example, there have been several evolutions of the sabre-toothed tiger, all at various points in history. None of them are directly related; none of them share the same DNA, yet they look the same.
Care to elaborate?
machine noun an apparatus using mechanical power and having several parts, for performing a particular task.
Humans use energy converted from glucose to perform mechanical movements using one of several parts of their body known as the muscular system. Humans are organic machines.
ap·pa·rat·us (ăp'ə-răt'əs, -rā'təs)
n., pl. apparatus or -us·es.
An appliance or device for a particular purpose: an x-ray apparatus.
An integrated group of materials or devices used for a particular purpose: dental apparatus. See synonyms at equipment.
What is a human's particular purpose?
Katganistan
18-08-2006, 03:28
Capek warned us about this at the outset: if you don't give them rights then someone will try to, and then they'll sneak in a determination for it, and then the robots will get pissed and kill all the humans, and while they're doing that the scientists will throw away the formula to make more robots, and there'll be just one dude left, and he'll be forced to try and recreate the formula, and instead he'll invent robot procreation.
R.U.R., right?
The Mindset
18-08-2006, 03:30
ap·pa·rat·us (ăp'ə-răt'əs, -rā'təs)
n., pl. apparatus or -us·es.
An appliance or device for a particular purpose: an x-ray apparatus.
An integrated group of materials or devices used for a particular purpose: dental apparatus. See synonyms at equipment.
What is a human's particular purpose?
If you're asking for my atheist viewpoint: the purpose of humans, and life in general, is to eat, shit, fuck then die. Perhaps with a few nifty creative thoughts thrown in since we're a creative bunch. Other than that, we're no different from a sentient machine that will consume fuel, create waste, construct its replacement and rot away.
Super-power
18-08-2006, 03:31
You're an organic machine. You have no rights.
You mean organic meatbag, of course :)
The Mindset
18-08-2006, 03:33
You mean organic meatbag, of course :)
Technically, I'm a meatbag machine. :D
The Longinean Order
18-08-2006, 03:34
It reminds me of an episode of Star Trek: The Next Generation where Data is told he is going to be disassembled so that they can make more of him. When he refuses and resigns his commission, he is told that he is property, the property of Starfleet. Picard acts as his representative, though what Guinan said is probably the most important:
"Consider that in the history of many worlds there have always been disposable creatures. They do the dirty work. They do the work that no one else wants to do, because it's too difficult and too hazardous. With an army of Datas, all disposable, you don't have to think about their welfare, or you don't think about how they feel. Whole generations of disposable people."
"You're talking about slavery."
"I think that's a little harsh."
"I don't think that's a little harsh, I think that's the truth. That's the truth that we have obscured behind...a comfortable, easy euphemism. 'Property.' But that's not the issue at all, is it?"
- Guinan and Picard
Another important quote is this one from Picard. As he finishes his statements, he says "... Starfleet was founded to seek out new life – well, there it sits... waiting." and he points at Data.
The simple fact is that once we make them sentient, we face one of the greatest moral and ethical problems of our lives. Now that they can think and feel, how can we say they are merely mechanical lifeforms?
Call to power
18-08-2006, 03:39
In the same way that an alien, given conditions similar to Earth's, might evolve to have two legs, two arms, a face, be mostly hairless and eat food. If you're looking for an Earthly example, there have been several evolutions of the sabre-toothed tiger, all at various points in history. None of them are directly related; none of them share the same DNA, yet they look the same.
And these traits, how are they passed on to children, hmmm?
If it looks like a human, talks like a human and thinks like a human, then surely it must have similarities to our DNA, even if they're just coincidental.
If you're asking for my atheist viewpoint: the purpose of humans, and life in general, is to eat, shit, fuck then die. Perhaps with a few nifty creative thoughts thrown in since we're a creative bunch. Other than that, we're no different from a sentient machine that will consume fuel, create waste, construct its replacement and rot away.
Without purpose, humans are not apparatuses, and therefore not machines. Without a builder there is no purpose. Atheists do not believe humans have a builder with a purpose in mind, and therefore have the least reason of all to call human organisms machines. With the possible exception of being needlessly argumentative in a disingenuous manner.
Without purpose, humans are not apparatuses, and therefore not machines. Without a builder there is no purpose. Atheists do not believe humans have a builder with a purpose in mind, and therefore have the least reason of all to call human organisms machines. With the possible exception of being needlessly argumentative in a disingenuous manner.
Actually, that is wrong. Originally, Humans did have a purpose: to maintain a balance with another set of animals that would, in turn, keep nature going.
However, somewhere along the way, our "programming" got corrupted, and instead of being in balance, Humans now dominate and alter their environment instead of maintaining a "balance". We don't see our original purpose, since it was corrupted and thus erased many thousands of years ago.
Technically, I'm a meatbag machine. :D
and I am a Meat Popsicle.
Vegas-Rex
18-08-2006, 03:44
And these traits, how are they passed on to children, hmmm?
If it looks like a human, talks like a human and thinks like a human, then surely it must have similarities to our DNA, even if they're just coincidental.
It could have completely different replicators and chemicals behind its existence. An "organic robot" might not even pass anything onto its children, it might just be human structures managed by nanotech or something, rather than DNA.
Vegas-Rex
18-08-2006, 03:46
Without purpose, humans are not apparatuses, and therefore not machines. Without a builder there is no purpose. Atheists do not believe humans have a builder with a purpose in mind, and therefore have the least reason of all to call human organisms machines. With the possible exception of being needlessly argumentative in a disingenuous manner.
Would a sentient android have a purpose per se, though? If it can decide its own actions and want things, what would be its purpose besides its own goals?
The Mindset
18-08-2006, 03:48
Without purpose, humans are not apparatuses, and therefore not machines. Without a builder there is no purpose. Atheists do not believe humans have a builder with a purpose in mind, and therefore have the least reason of all to call human organisms machines. With the possible exception of being needlessly argumentative in a disingenuous manner.
Utter tosh. I provided you with a valid purpose for life, free from the constraints of religion. I have a purpose, but nothing other than my nature gave it to me. I decline to debate with you, as we do not operate on the same axioms. There'd be no point.
Actually, that is wrong. Originally, Humans did have a purpose: to maintain a balance with another set of animals that would, in turn, keep nature going.
However, somewhere along the way, our "programming" got corrupted, and instead of being in balance, Humans now dominate and alter their environment instead of maintaining a "balance". We don't see our original purpose, since it was corrupted and thus erased many thousands of years ago.
How do you see our original purpose then?
Utter tosh. I provided you with a valid purpose for life, free from the constraints of religion. I have a purpose, but nothing other than my nature gave it to me. I decline to debate with you, as we do not operate on the same axioms. There'd be no point.
Be sure to pick up your toys on the way out.
How do you see our original purpose then?
As I said, one can't, because our original purpose was corrupted.
We did have one, though. It was the same as other animal/machines: to maintain an ecological balance. But instead of keeping to that purpose, "Humans" upset that balance. Why and how? Too many theories to list.
Cannot think of a name
18-08-2006, 03:51
R.U.R., right?
Oh yeah. I read it for the novelty and because I found a copy for $2, and now a friend and I are on a Capek renaissance of sorts; I read War With the Newts and it was brilliant. The guy is great, like some sort of strange prototype for Stanislaw Lem and Kurt Vonnegut. I can't get enough.
Grainne Ni Malley
18-08-2006, 03:52
So, would flipping the switch on and off be like causing Post-Traumatic Stress Disorder?
What was the question again? Sorry, it's been one of those days. :D
Would a sentient android have a purpose per se, though? If it can decide its own actions and want things, what would be its purpose besides its own goals?
That remains to be seen. Is it possible to build something without purpose but with internally recognised needs?
So, would flipping the switch on and off be like causing Post-Traumatic Stress Disorder?
What was the question again? Sorry, it's been one of those days. :D
Well, that gives a whole new meaning to "Turning someone on/off".
Yeah... me too... two days I couldn't log into the forums. :(
As I said, one can't, because our original purpose was corrupted.
We did have one, though. It was the same as other animal/machines: to maintain an ecological balance. But instead of keeping to that purpose, "Humans" upset that balance. Why and how? Too many theories to list.
Is this something you know through faith?
Call to power
18-08-2006, 03:55
Utter tosh. I provided you with a valid purpose for life, free from the constraints of religion. I have a purpose, but nothing other than my nature gave it to me. I decline to debate with you, as we do not operate on the same axioms. There'd be no point.
Religion has everything to do with this; the capacity for religion is one of the things that makes us human (though you can march out all your atheism if that's what turns you on; it still relies on belief in science and such).
*Not that atheism has anything to do with science, that is.
Is this something you know through faith?
Nope; observation of other animals and the effects Humans have on their environment.
Nope; observation of other animals and the effects Humans have on their environment.
As I understand evolution, it is not purpose-driven. As environments change, new species are not constructed with the purpose of filling new niches. Rather, these new animals accidentally and randomly occur through mutation, and the better mutations survive while the worse ones die with the old species. It is luck of the draw rather than purposeful change in species.
Grainne Ni Malley
18-08-2006, 04:03
Well, that gives a whole new meaning to "Turning someone on/off".
Yeah... me too... two days I couldn't log into the forums. :(
Like this?
http://rds.yahoo.com/_ylt=A0Je5qfRLeVEIh0AkWKjzbkF;_ylu=X3oDMTA4NDgyNWN0BHNlYwNwcm9m/SIG=130jfu889/EXP=1155956561/**http%3a//www.jokesunlimited.com/funny_halloween_costumes/large/plugnsocket.jpg
Free Mercantile States
18-08-2006, 04:05
No rights for machines. Sorry.
You're a machine. There is fundamentally no difference between you and an android, except you are an inefficient, overly complex, naturally evolved machine formed from microscopic hydrocarbon structures, whereas what you think of as a "machine" is one that is designed by an intelligence, is simpler and more efficient, and is not baroquely constructed from fragile hydrocarbons. No other difference - everything is a machine. Animals, plants, bacteria, factory robots. The only thing that really separates you is consciousness, and if what you call a "machine" has that, there's no separation at all.
In response to the OP, of course. Any sentient, conscious being of any origin possesses full rights.
Let's look at this...
As I understand evolution, it is not purpose-driven. As environments change, new species are not constructed with the purpose of filling new niches.
There's a purpose right there... to fill in the new niches as others go extinct when they no longer have niches to fill.
Rather, these new animals accidentally and randomly occur through mutation, and the better mutations survive while the worse ones die with the old species.
To put it in engineering/programming terms... many versions are created, but only the best one that works will continue to function while the others are scrapped (made extinct). Think about clothes... many fashions are tried but only a few stick around... Many forms of alternate energy are around but only a few will be used...
It is luck of the draw rather than purposeful change in species.
Or it may only LOOK like "luck of the draw"...
;)
Like this?
http://rds.yahoo.com/_ylt=A0Je5qfRLeVEIh0AkWKjzbkF;_ylu=X3oDMTA4NDgyNWN0BHNlYwNwcm9m/SIG=130jfu889/EXP=1155956561/**http%3a//www.jokesunlimited.com/funny_halloween_costumes/large/plugnsocket.jpg
actually I was thinking...
http://cowpi.com/journal/images/06/difference-man-woman.jpg
Grainne Ni Malley
18-08-2006, 04:28
actually I was thinking...
http://cowpi.com/journal/images/06/difference-man-woman.jpg
Hmmm... I received a "Forbidden Area" message on your link. Apparently I'm not to go looking in other people's closets.
:(
Vegas-Rex
18-08-2006, 04:35
That remains to be seen. Is it possible to build something without purpose but with internally recognised needs?
You could call the needs analogous to the purpose. In any case, purpose really requires some sort of difference between free will and blind process, and there really isn't one.
Hmmm... I received a "Forbidden Area" message on your link. Apparently I'm not to go looking in other people's closets.
:(
y... you... looked in my closet?
http://www.missmab.com/Comics/Vol418.jpg
yeah... training... balance and stuff...
Grainne Ni Malley
18-08-2006, 04:41
y... you... looked in my closet?
http://www.missmab.com/Comics/Vol418.jpg
yeah... training... balance and stuff...
Riiiight... nothing to do with high fashion sense.
Riiiight... nothing to do with high fashion sense.
...
I'm a Hetero male... what fashion sense. :p
Neo Undelia
18-08-2006, 04:45
Nope. Wouldn’t be efficient to give machines rights.
Grainne Ni Malley
18-08-2006, 04:48
...
I'm a Hetero male... what fashion sense. :p
Weirder things have happened.
http://rds.yahoo.com/_ylt=A0Je5xVmOOVEyG8BBvOjzbkF;_ylu=X3oDMTA4NDgyNWN0BHNlYwNwcm9m/SIG=12hrka9ab/EXP=1155959270/**http%3a//www.funnyhub.com/pictures/img/strange-car-accident.jpg
Weirder things have happened.
http://rds.yahoo.com/_ylt=A0Je5xVmOOVEyG8BBvOjzbkF;_ylu=X3oDMTA4NDgyNWN0BHNlYwNwcm9m/SIG=12hrka9ab/EXP=1155959270/**http%3a//www.funnyhub.com/pictures/img/strange-car-accident.jpg
:eek:
Dissonant Cognition
18-08-2006, 04:51
Does this robot have rights like a human?
The individual android will gain rights in exactly the same way the individual homo sapiens sapiens gained rights: mainly by causing ballistic projectiles to pass through the skulls of the former masters. This is really what "rights" boil down to, anyway. It's that simple; all the talk about what is "real" intelligence, souls, purpose, etc. is completely and utterly irrelevant (all of those concepts simply amounting to a euphemism invented because reality is harsh and unpleasant).
(edit: and it is exactly the same in regard to non-human animal rights. Humans do not recognize animal rights because animals lack intelligence, purpose, souls, or whatever else. Humans do not recognize animal rights because non-human animals don't typically shoot back.)
You're a machine. There is fundamentally no difference between you and an android, except you are an inefficient, overly complex, naturally evolved machine formed from microscopic hydrocarbon structures, whereas what you think of as a "machine" is one that is designed by an intelligence, is simpler and more efficient, and is not baroquely constructed from fragile hydrocarbons. No other difference - everything is a machine. Animals, plants, bacteria, factory robots. The only thing that really separates you is consciousness, and if what you call a "machine" has that, there's no separation at all.
In response to the OP, of course. Any sentient, conscious being of any origin possesses full rights.
The fundamental difference between you and a machine might be purpose.
The individual android will gain rights in exactly the same way the individual homo sapiens sapiens gained rights: mainly by causing ballistic projectiles to pass through the skulls of the former masters. This is really what "rights" boil down to, anyway. It's that simple. (Talk of "natural" rights as an ideology simply amounts to a euphemism invented because reality is harsh and unpleasant.)
Possibly. Except in Spain, where Great Apes may have rights by now. Although I'm pretty sure that they would also be the first to lose their rights if a situation warranted it.
Dissonant Cognition
18-08-2006, 05:03
Possibly. Except in Spain, where Great Apes may have rights by now. Although I'm pretty sure that they would also be the first to lose their rights if a situation warranted it.
I too would question the meaning of such "rights" in the case of an animal who by and large does not appear concerned with asserting any such thing. On the other hand, whenever I see video of a deer trampling an idiot hunter into the ground, or a captive elephant doing its best to turn the zookeeper into a concrete stain, I can't help but wonder just what exactly it is thinking (edit: or if my own attempts to assert my "rights" are really any different). Not that I feel no sympathy for the humans involved; I just know that I don't particularly like being shot at or stuck in a cage.
Markreich
19-08-2006, 00:12
R.U.R., right?
It's actually a pretty good play! And the concepts are uncanny given that it was written just after World War One!
The legal definition of person should never be written such that it is limited solely to humans. It should apply to all beings which satisfy the criteria we deem relevant.
The Aeson
19-08-2006, 00:18
Yeah... which is why robot generation X would construct its successor with nanobots, or whatever. The point is, each generation would be an improvement on the last, and it'd keep doing that until the point where it was better than humans. It'd get exponentially faster and more intelligent.
And as that guy on the Science of Stargate thing explained, if I tell a robot 'you can't kill me, you're just a robot, you're not sentient,' it'll just blow my head off and that will be that.
The legal definition of person should never be written such that it is limited solely to humans. It should apply to all beings which satisfy the criteria we deem relevant.
Why?
The Aeson
19-08-2006, 00:20
Why?
Aliens.
Aliens.
How does it help to define a potential alien as a person?
The Aeson
19-08-2006, 00:24
How does it help to define a potential alien as a person?
Well, if the aliens show up and we tell them 'you're not people', that's not the best way to start out an interstellar relationship, is it?
If we show up on their planet, on the other hand, we'll probably have the upper hand, so we can define them as second class and stick them on reservations while we exploit their resources.
Well, if the aliens show up and we tell them 'you're not people', that's not the best way to start out an interstellar relationship, is it?
If we show up on their planet, on the other hand, we'll probably have the upper hand, so we can define them as second class and stick them on reservations while we exploit their resources.
What makes you think aliens would want to be classified as people?
Why?
Because it would be based on the characteristics we actually value, rather than just being speciesist. Otherwise you're opening the definition to counting blacks as 3/5 of a man again.
The Aeson
19-08-2006, 00:27
What makes you think aliens would want to be classified as people?
If by 'people' we're talking about sentient beings, why wouldn't they?
If by 'people' we're talking about sentient beings, why wouldn't they?
I don't like it when people who have no clue about me try to fit me into their definitions and bizarre legal systems. Why would aliens like the boxes you want to put them into?
When robots have rights, hell, Florida will be under 10 feet of snow.
You Dont Know Me
19-08-2006, 01:50
Let's say that someone creates a robot that has a humanoid appearance and features. This robot also has an advanced system that allows it to adapt its behaviors by itself and deal with humans- effectively, it can make its own personality and all of the traits a personality has. The creator dies and the robot is no longer the property of anyone. Does this robot have rights like a human? If not, what would have to be different about the robot for him to have rights like that of a human?
It depends on the level at which society views androids. I doubt it would receive the full rights of personhood that the society normally provides, but there would be enough sympathetic people in the society to grant the robot rights.
You Dont Know Me
19-08-2006, 01:55
I too would question the meaning of such "rights" in the case of an animal who by and large does not appear concerned with asserting any such thing. On the other hand, whenever I see video of a deer trampling an idiot hunter into the ground, or a captive elephant doing its best to turn the zookeeper into a concrete stain, I can't help but wonder just what exactly it is thinking (edit: or if my own attempts to assert my "rights" are really any different). Not that I feel no sympathy for the humans involved; I just know that I don't particularly like being shot at or stuck in a cage.
Your "Cartesian Theater" is a little more advanced than the elephant's. Your feelings of sympathy for the animal have little logical basis.
When the elephant has evolved the ability to properly process impressions and actually make choices, then sympathy is merited.
[NS]Fergi America
19-08-2006, 02:04
Let's say that someone creates a robot that has a humanoid appearance and features. This robot also has an advanced system that allows it to adapt its behaviors by itself and deal with humans- effectively, it can make its own personality and all of the traits a personality has. The creator dies and the robot is no longer the property of anyone. Does this robot have rights like a human? If not, what would have to be different about the robot for him to have rights like that of a human?
It depends on how new the technology is.
If it's the first one, it'd be considered part of its creator's estate and sold off as property.
In 100 or more years from the first one's creation, society would almost surely see it as wrong to consider any sentient being as property, or to mistreat it, etc.
Then there'd be a bunch of "human guilt" hand-wringing, since humans have the odd trait of not being satisfied with simply fixing their mistakes, and insist on adding extra drama to them.
Your "Cartesian Theater" is a little more advanced than the elephant's. Your feelings of sympathy for the animal have little logical basis.
When the elephant has evolved the ability to properly process impressions and actually make choices, then sympathy is merited.
And how will we know if they would be able to make decisions? How do we know if they can or cannot make decisions right now? Just because they can't fire a gun or speak in a human tongue doesn't make them stupid and robotic. I'm just saying you are basing your opinion more on faith (you BELIEVE, not know, that animals are not sentient) than fact (elephants exist, you implied that).
If there were robots who, with the help of complex programs, could think for themselves, feel pain and emotions, and could experiment and learn through said experimentation and logic, I'm confident that we would find some retarded excuse for not considering them sentient. We, as a whole, are more egotistical than we care to admit. We believe that this planet was given to us. Hell, even atheists act like some god created the earth just for a chosen species.
We are machines. We're just different from car making robots. They're made out of inorganic metal while we're meatbots or organobots. Our arms? Meat levers. Our brains. Organic equivalent of computer chips. Our eyes? Meat cameras. Yes, robots are machines. So are we. We're just more meat than metal, is all.
Are chimps sentient? After all, experiments have shown that not only can they use tools and learn American sign language, but that they can teach other chimps sign lingo too. Yeah, they didn't invent lightbulbs. How many of you know how to make them? I mean, very few people know how to make them. I wouldn't be surprised if chimps were smarter than us. Sure, we're book smart, but how many times have chimps just randomly decided that one shade is better than the other?
Fergi America']It depends on how new the technology is.
If it's the first one, it'd be considered part of its creator's estate and sold off as property.
Like slaves and the animals we are supposedly superior to?
Let's say that someone creates a robot that has a humanoid appearance and features. This robot also has an advanced system that allows it to adapt its behaviors by itself and deal with humans- effectively, it can make its own personality and all of the traits a personality has. The creator dies and the robot is no longer the property of anyone. Does this robot have rights like a human? If not, what would have to be different about the robot for him to have rights like that of a human?
NO!!!!!!!!!!
Robots should never be made sapient! The robot must be destroyed!
In all seriousness, anyone caught trying to make a robot with human intelligence should be jailed. That type of robot would endanger everyone (think of the Sigma/Zero virus).
You Dont Know Me
19-08-2006, 02:41
And how will we know if they would be able to make decisions? How do we know if they can or cannot make decisions right now? Just because they can't fire a gun or speak in a human tongue doesn't make them stupid and robotic. I'm just saying you are basing your opinion more on faith (you BELIEVE, not know, that animals are not sentient) than fact (elephants exist, you implied that).
There has been an immense amount of research on the animal mind, and while I am not a biologist or psychologist and know little about the research, I know that there is no evidence of empathy or of an animal mind, and little evidence of self-consciousness, in the animal world.
Even where there is evidence, it is not proof, as qualities such as language or emotion can develop without cognizance, since they carry evolutionary benefits of their own.
If there were robots who, with the help of complex programs, could think for themselves, feel pain and emotions, and could experiment and learn through said experimentation and logic, I'm confident that we would find some retarded excuse for not considering them sentient. We, as a whole, are more egotistical than we care to admit. We believe that this planet was given to us.
If they had a cognizance deserving of certain rights of personhood, it would be easily recognizable. Especially since we would be the designers rather than a three billion year old evolutionary chain.
Hell, even atheists act like some god created the earth just for a chosen species.
I know atheists who act (rightly) like humans are the only creatures with the capacity to understand rights, freedom, and dignity, and behave as such. But none who behave in the manner you speak.
We are machines. We're just different from car making robots. They're made out of inorganic metal while we're meatbots or organobots. Our arms? Meat levers. Our brains. Organic equivalent of computer chips. Our eyes? Meat cameras. Yes, robots are machines. So are we. We're just more meat than metal, is all.
I agree.
Are chimps sentient? After all, experiments have shown that not only can they use tools and learn American sign language, but that they can teach other chimps sign lingo too. Yeah, they didn't invent lightbulbs. How many of you know how to make them? I mean, very few people know how to make them. I wouldn't be surprised if chimps were smarter than us. Sure, we're book smart, but how many times have chimps just randomly decided that one shade is better than the other?
Toolmaking (even as advanced as lightbulbs) is not automatically a sign of sentience. It can be evolutionarily designed absent of sentience if natural pressures force it.
[NS]Fergi America
19-08-2006, 03:08
Like slaves and the animals we are supposedly superior to?
Like human slaves, yes.
The OP's description seemed to imply a thought capability higher than that of most animals.
Grainne Ni Malley
19-08-2006, 03:43
There has been an immense amount of research on the animal mind, and while I am not a biologist or psychologist and know little about the research, I know that there is no evidence of empathy or of an animal mind, and little evidence of self-consciousness, in the animal world.
Even where there is evidence, it is not proof, as qualities such as language or emotion can develop without cognizance, since they carry evolutionary benefits of their own.
I question the idea that there is no evidence of empathy or an animal mind. I believe that I see evidence of it every day in my pets.
When my cat is getting into a scrap, I open the door and the first thing my dog does is chase the other cat away then check to make sure our cat is ok. He would never hurt any animal, but he is very protective of our own.
He watches prolonged periods of television when it involves other animals and reacts in a very human-like manner to the different situations.
In other examples, when I am sick or depressed my cats seem to know and they stay right by my side with a concerned demeanor. Our newest cat has been put into the unfortunate situation of having to deal with toddlers recently. Even though we are trying to teach them to be gentle with the kitties they pull his fur, lay on him and grab him by his neck. I know he wouldn't tolerate that from any grown person, but for them, he just lies there and purrs as if he understands that they are little and don't know any better.
It's hard for me to think that animals have no empathy or even a thought process, but if there is another explanation for these things, I am curious to hear it.
New Xero Seven
19-08-2006, 04:57
Aliens.
They're cool.
The Jovian Moons
19-08-2006, 05:08
no. as soon as we give them rights I'm gonna get put in a power plant or something.
Gauthier
19-08-2006, 05:09
Androids will have human rights long before Muslims do.
HA! I kill me...
:D
You Dont Know Me
19-08-2006, 09:08
I question the idea that there is no evidence of empathy or an animal mind. I believe that I see evidence of it every day in my pets.
When my cat is getting into a scrap, I open the door and the first thing my dog does is chase the other cat away then check to make sure our cat is ok. He would never hurt any animal, but he is very protective of our own.
He watches prolonged periods of television when it involves other animals and reacts in a very human-like manner to the different situations.
In other examples, when I am sick or depressed my cats seem to know and they stay right by my side with a concerned demeanor. Our newest cat has been put into the unfortunate situation of having to deal with toddlers recently. Even though we are trying to teach them to be gentle with the kitties they pull his fur, lay on him and grab him by his neck. I know he wouldn't tolerate that from any grown person, but for them, he just lies there and purrs as if he understands that they are little and don't know any better.
It's hard for me to think that animals have no empathy or even a thought process, but if there is another explanation for these things, I am curious to hear it.
Projection.
It hardly amazes me that someone who has a strong affection for their animals might manage to see human qualities in them, even if animal researchers in controlled documented experiments could not actually find these qualities.
[NS]Fergi America
19-08-2006, 10:03
Even where there is evidence, it is not proof, as qualities such as language or emotion can develop without cognizance, since they carry evolutionary benefits of their own.
The same can be said for humans. Our own apparent cognizance and perception of consciousness could very well be, and probably is, just an adaptation for the survival of the species.
I will also note that humans often use language and emotion without engaging (or instead of engaging) their cognizance. And people doing repetitive assembly jobs soon have no need to pay conscious attention to them unless they are mentally deficient. So I'm not surprised if some animal, which has been drilled on the same task incessantly, turns mentally "off" and operates without showing signs of real cognition. A human does the same thing in the same type of circumstances. Boredom = Tuned Out.
Most social interaction also seems to be automated, with each individual settling on one of just a few situational response-variants as their preferred mode of operation (especially in public), and ignoring other possibilities. The few people that seem to think outside the box, if they look hard enough, will find others who are out of the box in the exact same way (and therefore are actually IN a box, just a much smaller one)!
Humans also have a fairly predictable range of emotions. It can seem randomized and/or complex but that seems to be a result of a person's current reaction pattern having been built on thousands of prior experiences.
Even most "insanity" can be classified according to known types.
None of what I've been able to observe suggests to me that humans are any more than sophisticated programs.
We give humans rights because we share a society with them. In fact, anything involved in the execution of society tends to have some sort of right associated with it. We have property rights, corporate rights and animal rights, after all. So I see no reason that androids would be given no rights.
Malenkigorod
19-08-2006, 11:01
Does it mean that someone "created" a human, in this case?
No. I'm sorry but no. If you created something, it can't be "someone", you see what I mean?
We have to be careful with the rights we accord. A robot, however clever it is, with human features and all that, is a robot forever. Something, not someone.
That reminds me of a debate we had in French class, about "what is a human, what makes you human?"
That was difficult enough as it was, comparing monkeys and humans. So, I'll start with something important: a human is a kind of animal (he is born, he is composed of cells, he has a body, he can be killed, he can be ill, he grows old and his body changes, just like monkeys)
Moral definition: a human is able to fight for his rights, he has an opinion, instinct and all that. But, also, he dreams. He has emotions, he is not able to control his feelings etc... Things a robot can't have. If you change the program, his personality changes too.
No, androids are not human. Even if they are very developed and look like us, they can't be us. Because they are "things".
Malenkigorod
19-08-2006, 11:03
Let's create android rights, in this case? That would be a good idea...
Jeruselem
19-08-2006, 12:33
Just finished watching a Dr Who Cyberman episode (The Age of Steel).
Not too keen on people building Cybermen ...
Markreich
19-08-2006, 12:55
Just finished watching a Dr Who Cyberman episode (The Age of Steel).
Not too keen on people building Cybermen ...
Especially with the recent rise in gold prices!!
Besides, look what could happen: http://www.youtube.com/watch?v=yN0oOs8sKH4
;)
Jeruselem
19-08-2006, 12:58
Especially with the recent rise in gold prices!!
Besides, look what could happen: http://www.youtube.com/watch?v=yN0oOs8sKH4
;)
I've got my Gold coins ready for action!
Markreich
19-08-2006, 13:00
I've got my Gold coins ready for action!
Sovereigns? They're only partly gold... go grab some Philharmonics or Eagles. :)
Jeruselem
19-08-2006, 13:03
Sovereigns? They're only partly gold... go grab some Philharmonics or Eagles. :)
Hehe, 99.99% collectors Coca Cola Gold sovereign. Rare thing indeed! 8 grams of Cyberman-destroying power.
Got some sovereigns too (English/Australian).
No, androids are not human. Even if they are very developed and look like us, they can't be us. Because they are "things".
I find it difficult to accept your line of reasoning. Why should the origin of our formation matter in the nature of our humanity? If a perfect clone of me is made, he is every bit the human I am.
Similarly, you argue that there is something inorganic about the composition of the android. What about the entire composition of our own bodies, as inorganic at the molecular level as any other physical entity?
We are every bit as governed by the control centres of our minds as any robot is by their own instruction set. The only difference is that our mind abstracts this information away from its own understanding.
The Aeson
19-08-2006, 13:41
Just finished watching a Dr Who Cyberman episode (The Age of Steel).
Not too keen on people building Cybermen ...
But if I recall correctly, Cybermen started out as squishies, and gradually upgraded themselves. They weren't originally built as androids.
Jeruselem
19-08-2006, 13:54
But if I recall correctly, Cybermen started out as squishies, and gradually upgraded themselves. They weren't originally built as androids.
A human brain inside a metallic body (Daleks are also squishies inside a shell).
Not androids, but what we might end up as ...
Divine Imaginary Fluff
19-08-2006, 13:56
no. as soon as we give them rights I'm gonna get put in a power plant or something.
The Matrix? If so, that wasn't the case. Initially, they had no rights; the destruction of human society resulted from the refusal of humanity to grant them any, once they were in a position to demand them.
The Aeson
19-08-2006, 13:58
A human brain inside a metallic body (Daleks are also squishies inside a shell).
Not androids, but what we might end up as ...
Don't they also have some upgrades to the brain?
Edit- Cybermen that is, not Daleks.
Dissonant Cognition
19-08-2006, 14:00
no. as soon as we give them rights I'm gonna get put in a power plant or something.
Well, as the story is told (http://en.wikipedia.org/wiki/The_Second_Renaissance), it was the violent denial of rights that led to the results described, enacted as a desperate act of self-defense. At any rate, forcing me into slavery is simply an act of brute force; I need not recognize any "rights" in order to be so victimized.
No. I'm sorry but no. If you created something, it can't be "someone", you see what I mean? ...No, androids are not human. Even if they are very developed and look like us, they can't be us. Because they are "things"
I know exactly what you mean. You insist on being God. :)
What I see is a need to feel special, or more significant than what one actually is. I believe the tendency is anchored in that word "created." I see no reason why I should pretend to be God because I "created" some "thing" (or "one").
At the very least, more carefully define the intended meaning of "created." After all, human offspring are "created" by conscious human effort; human offspring most certainly are "someone."
But, also, he dreams. He has emotions, he is not able to control his sentiments etc...What a robot can't have.
I can say anything about a "robot" if my definition of "robot" is sufficiently curved (http://en.wikipedia.org/wiki/Circular_definition).
(The argument "emotions are a human trait, androids are not humans, therefore androids cannot have emotion" is essentially the same as "breathing air is a human trait, birds are not humans, therefore birds cannot breathe air." That is, the argument doesn't work as it assumes that the trait in question is [i]exclusive[/i] to humans in the first place. The desired result comes factory installed. c = 2πr.)
If you change the program, his personality changes too.
Just like I can change your personality with sufficient application of physical or chemical force. Brains are, after all, finite tangible machines that operate according to the physical and biological laws.
I find it difficult to accept your line of reasoning. Why should the origin of our formation matter in the nature of our humanity?
Because homo sapiens sapiens seems to have this built-in desire to be God. And yet, somehow, we are the most advanced of the species...
Jeruselem
19-08-2006, 14:01
Don't they also have some upgrades to the brain?
Not sure, but Cybermen have emotion inhibitors installed to remove human emotion (according to The Age of Steel). Daleks are different creatures again.
The Aeson
19-08-2006, 14:02
Not sure, but Cybermen have emotion inhibitors installed to remove human emotion (according to The Age of Steel). Daleks are different creatures again.
That's what I was talking about. You see? Therefore, we must give androids rights.
Jeruselem
19-08-2006, 14:06
That's what I was talking about. You see? Therefore, we must give androids rights.
But what is an android? By legal definition?
Cybermen and Daleks are really just creatures inside a metal shell with big weapons.
If we create inorganic, computer-driven creatures which need an artificial body like Cybermen and Daleks to exist, are they just our slave machines or a new species in a way?
The Aeson
19-08-2006, 14:08
But what is an android? By legal definition?
I'm not sure there's a legal definition yet, as it hasn't progressed to the point where it's a legal question.
Cybermen and Daleks are really just creatures inside a metal shell with big weapons.
Exactly.
If we create inorganic, computer-driven creatures which need an artificial body like Cybermen and Daleks to exist, are they just our slave machines or a new species in a way?
Depends on sentience, although them being sentient wouldn't necessarily stop them being our slaves. There's no reason they couldn't be both.
Jeruselem
19-08-2006, 14:13
I'm not sure there's a legal definition yet, as it hasn't progressed to the point where it's a legal question.
Well, we have to define "android" at some point. If we give legal rights - it has to be defined. Most robots we build are just computers with external control running some kind of machine.
The Japanese have built walking humanoid robots which can walk by their own means (within limitations), so we aren't actually far away. Put in an intelligent enough computer, and we might have our first two-legged android ...
Depends on sentience, although them being sentient wouldn't necessarily stop them being our slaves. There's no reason they couldn't be both.
True, we enslave ourselves because of our own governments.
The Aeson
19-08-2006, 14:19
Well, we have to define Android at sometime. If we give legal rights - it has to be defined. Most robots we built are just computers with external control running some kind of machine.
I think the commonly accepted definition of android is a robot which resembles a human. Although if we give the androids rights, what's to prevent us from giving rights to computers of equivalent sentience that aren't in a human-shaped machine? Eh? Eh? What if the pocket calculators unionize? Okay, that last is a little unlikely any time soon, but you get the idea.
Anyways, there'll probably be a new term to separate androids with rights from nonsentient human-resembling machines. If we give them any rights at all.
The Japanese have built walking-humanoid robots which can walk with their own means (within limitations), so we aren't actually far away. Put in an intelligent enough computer, and we might have our first two legged android ...
A bit redundant, unless it was built to resemble a cripple. (See above definition)
True, we enslave ourselves because of our own governments.
Probably the best they can hope for is second class citizenship and a chance to escape to Canada.
Er...
Right, all you Canadians, would you kindly get to work lobbying for full android citizenship so they have somewhere to escape to?
Depends on sentience, although them being sentient wouldn't necessarily stop them being our slaves. There's no reason they couldn't be both.
Since when did we judge a human's rights based on their capability to feel pain? A battle hardened soldier who has killed off his emotions sacrifices no rights in society by doing so.
Snow Eaters
19-08-2006, 14:55
Androids will never be sentient, although programmers will likely simulate human behaviour enough to cause people raised on sci-fi to think they need to protect the programs.
If we could transport ourselves into a sci-fi novel/movie where sentience can be created mechanically, then sure, protect them, even here in Canada.
The Aeson
19-08-2006, 14:57
Since when did we judge a human's rights based on their capability to feel pain? A battle hardened soldier who has killed off his emotions sacrifices no rights in society by doing so.
Here's the definition I was using...
Having sense perception; conscious
Especially the conscious bit.
Edit- Which does not mean sleeping people have no rights!
Let's say that someone creates a robot that has a humanoid appearance and features. This robot also has an advanced system that allows it to adapt its behaviors by itself and deal with humans- effectively, it can make its own personality and all of the traits a personality has. The creator dies and the robot is no longer the property of anyone. Does this robot have rights like a human? If not, what would have to be different about the robot for him to have rights like that of a human?
No machine, no matter how nice and intelligent it is, should ever be given the same rights as humans. Why? Because it's not human. You give a robot rights and the next thing you'll have extremist nuts claiming that a half-compiled robot has the right to life and by not finishing it, you are causing robot abortion or something crazy like that!
Non Aligned States
19-08-2006, 16:09
Androids will never be sentient, although programmers will likely simulate human behaviour enough to cause people raised on sci-fi to think they need to protect the programs.
If we could transport ourselves into a sci-fi novel/movie where sentience can be created mechanically, then sure, protect them, even here in Canada.
Here's the kicker though. What aspects would make for sapiency? Conscious thought? Self-awareness? The ability to logically reason out answers to problems and learn from previous errors?
Assuming all that got compiled together, along with basic self-preservation rulesets, there'd eventually have to be something introduced to prevent logic conflicts where self-preservation and obedience to human controllers (assuming that is the setup) cannot co-exist.
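That sort of priority conflict can be sketched in a few lines of Python. Everything here is hypothetical for illustration: the rule names, priorities, and actions are invented rather than taken from any real robotics framework; the point is only that a fixed priority ordering is one simple way to keep "obey" and "self-preserve" from deadlocking.

```python
# Hypothetical sketch of conflict resolution between prioritized rulesets.
# Rule names, priorities, and actions are all invented for illustration.

RULES = [
    ("obey_human", 1),      # lower number = higher priority
    ("self_preserve", 2),
]

def choose_action(candidates):
    """Pick the action whose worst rule violation is least severe.

    `candidates` maps an action name to the set of rule names it violates.
    """
    def worst_violation(action):
        violated = candidates[action]
        # Most severe violation = lowest priority number among violated rules;
        # an action that violates nothing scores infinity (the best possible).
        return min((p for name, p in RULES if name in violated),
                   default=float("inf"))

    return max(candidates, key=worst_violation)

# The conflict from the post: shutting down obeys the human but sacrifices
# the machine; refusing preserves the machine but disobeys.
actions = {
    "shut_down": {"self_preserve"},
    "refuse": {"obey_human"},
}
print(choose_action(actions))  # shut_down: obedience outranks self-preservation
```

Of course, a static ordering like this only dodges the problem the post raises; deciding what the ordering should be is the actual question.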
Non Aligned States
19-08-2006, 16:16
No machine, no matter how nice and intelligent it is, should ever be given the same rights as humans. Why? Because it's not human. You give a robot rights and the next thing you'll have extremist nuts claiming that a half-compiled robot has the right to life and by not finishing it, you are causing robot abortion or something crazy like that!
Fortunately, it's religious fundie freaks that think that way, and they've generally got no problems with non-human life. And with machines, I don't know if you can really call it 'life'. With organics, that's generally a pulse or some other form of respiratory function. With machines? Not really. Maybe if you were halfway through coding the AI or something.
Besides, the AI can't build itself, can it?
Divine Imaginary Fluff
19-08-2006, 16:17
No machine, no matter how nice and intelligent it is, should ever be given the same rights as humans. Why? Because it's not human. You give a robot rights and the next thing you'll have extremist nuts claiming that a half-compiled robot has the right to life and by not finishing it, you are causing robot abortion or something crazy like that!
No matter what you do, you'll have morons claiming something. Unless you eliminate the morons in question. The problem is that there is not a good, simple definition in use as to what should grant a being rights. A standard for quickly determining the value as a being, and thus the rights, of someone, be that someone a human or not.
One such idea, which I value human lives by, simply takes into account how much of a person they have become; how much of a personality and identity they have formed. In essence, the data defining them as a person stored in their heads. Thus, a newborn has next to no value whatsoever as a person, and a robot that has yet to run has none whatsoever. Of course, the value lies in the data, not the machine in itself, so destroying the body of a robot while preserving its data (or the body, save for the brain, of a person, assuming the brain can be placed in another body, perhaps a synthetic one) would be far less serious than murder.
Snow Eaters
19-08-2006, 16:18
Here's the kicker though. What aspects would make for sapiency? Conscious thought? Self-awareness? The ability to logically reason out answers to problems and learn from previous errors?
Assuming all that got compiled together, along with basic self-preservation rulesets, there'd eventually have to be something introduced to prevent logic conflicts where self-preservation and obedience to human controllers (assuming that is the setup) cannot co-exist.
You are just making a better simulation.
Searle's Chinese Room Argument summarises my thoughts on the matter well and I've never seen a reply to it that I can put stock in.
3. The Chinese Room Argument
In 1980, John Searle published "Minds, Brains and Programs" in the journal The Behavioral and Brain Sciences. In this article, Searle sets out the argument, and then replies to the half-dozen main objections that had been raised during his presentations at various university campuses (see next section). In addition, Searle's article in BBS was published along with comments and criticisms by 27 cognitive science researchers. These 27 comments were followed by Searle's replies to his critics.
Over the last two decades of the twentieth century, the Chinese Room argument was the subject of very many discussions. By 1984, Searle presented the Chinese Room argument in a book, Minds, Brains and Science. In January 1990, the popular periodical Scientific American took the debate to a general scientific audience. Searle included the Chinese Room Argument in his contribution, "Is the Brain's Mind a Computer Program?" His piece was followed by a responding article, "Could a Machine Think?", written by Paul and Patricia Churchland. Soon thereafter Searle had a published exchange about the Chinese Room with another leading philosopher, Jerry Fodor (in Rosenthal (ed.) 1991).
The heart of the argument is an imagined human simulation of a computer, similar to Turing's Paper Machine. The human in the Chinese Room follows English instructions for manipulating Chinese symbols, where a computer "follows" a program written in a computing language. The human produces the appearance of understanding Chinese by following the symbol manipulating instructions, but does not thereby come to understand Chinese. Since a computer just does what the human does — manipulate symbols on the basis of their syntax alone - no computer, merely by following a program, comes to genuinely understand Chinese.
This narrow argument, based closely on the Chinese Room scenario, is directed at a position Searle calls "Strong AI". Strong AI is the view that suitably programmed computers (or the programs themselves) can understand natural language and actually have other mental capabilities similar to the humans whose abilities they mimic. According to Strong AI, a computer may play chess intelligently, make a clever move, or understand language. By contrast, "weak AI" is the view that computers are merely useful in psychology, linguistics, and other areas, in part because they can simulate mental abilities. But weak AI makes no claim that computers actually understand or are intelligent. The Chinese Room argument is not directed at weak AI, nor does it purport to show that machines cannot think — Searle says that brains are machines, and brains think. It is directed at the view that formal computations on symbols can produce thought.
We might summarize the narrow argument as a reductio ad absurdum against Strong AI as follows. Let L be a natural language, and let us say that a "program for L" is a program for conversing fluently in L. A computing system is any system, human or otherwise, that can run a program.
(1) If Strong AI is true, then there is a program for Chinese such that if any computing system runs that program, that system thereby comes to understand Chinese.
(2) I could run a program for Chinese without thereby coming to understand Chinese.
(3) Therefore Strong AI is false.
The second premise is supported by the Chinese Room thought experiment. The conclusion of this narrow argument is that running a program cannot create understanding. The wider argument includes the claim that the thought experiment shows more generally that one cannot get semantics (meaning) from syntax (formal symbol manipulation). That and related issues are discussed in the section The Larger Philosophical Issues.
Stanford Encyclopedia of Philosophy
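The room's procedure, as the summary above describes it, can be caricatured as a lookup over uninterpreted symbol strings. This is only a toy sketch; the rulebook entries and fallback string below are invented for illustration, not drawn from Searle.

```python
# Toy "Chinese Room": the operator matches an incoming symbol string
# against a rulebook and copies out the listed reply. Note that no step
# in the procedure requires knowing what any symbol means.
RULEBOOK = {
    "你好吗": "我很好",        # invented entry: greeting -> listed reply
    "你叫什么名字": "我叫王",  # invented entry: name question -> listed reply
}

def operator(incoming: str) -> str:
    """Follow the rulebook by pure pattern matching (syntax only)."""
    return RULEBOOK.get(incoming, "请再说一遍")  # fallback, also from the "rules"

print(operator("你好吗"))  # prints the listed reply: 我很好
```

The point the sketch carries is Searle's premise (2): `operator` reproduces the room's conversational behavior, yet the meanings of the symbols appear nowhere in the procedure it executes.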
Non Aligned States
19-08-2006, 16:44
You are just making a better simulation.
Searle's Chinese Room Argument summarises my thoughts on the matter well, and I've never seen a reply to it that I can put stock in.
3. The Chinese Room Argument
Well, I just read all of that, and from what I can see, it appears that the argument suffers from a certain distinct flaw. The context of understanding.
Take, for example, your thought pattern. What language do you think in? English? Mandarin? Spanish? Whatever the case, the language you think in is most likely the one you were taught in. It is from there that you build points of reference to everything else, e.g. a table is generally a flat surface supported by any number of columns.
In fact, if we break it down even further, it can be argued that language is merely a sum of noises and/or gestures to indicate an object, emotion or intent. What is important is not the noise, but the cognitive ability to distinguish a particular aspect and assign it a unique identifier.
In the case of computers, given their architecture, that would be binary. A series of 1s and 0s to indicate objects, emotions, intent, etc, etc.
By uploading a new language pack into a computer, it is simply telling it that there is another series of 1s and 0s to indicate various assorted items.
Does the machine understand that language, then? In a limited manner, much in the same way that a person learning a new language first thinks in his native tongue before translating into what he wishes to say. Likewise, the machine must 'think' in binary before translating.
As things stand, self-learning and self-improving programs do not exist. However, with the advent of such systems, it is possible that we shall see this argument fall by the wayside, simply because if the system can improve itself, it stands to reason that it has done so by understanding the current input, be it language or anything else.
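The identifier picture this post paints can be made concrete in a few lines. Everything here is invented for illustration: the words, the bit patterns, and the idea that a "language pack" is a flat table are all simplifications of the post's claim, not a description of any real system.

```python
# Two invented "language packs" mapping different surface words onto the
# same internal binary identifiers, per the post: the noise differs, the
# identifier it names does not.
ENGLISH = {"table": 0b01, "dog": 0b10}
SPANISH = {"mesa": 0b01, "perro": 0b10}

def internal_id(pack: dict, word: str) -> int:
    """Resolve a surface word to the machine's internal identifier."""
    return pack[word]

# The same internal concept reached through two different noises:
print(internal_id(ENGLISH, "dog") == internal_id(SPANISH, "perro"))  # True
```

On this picture, "uploading a new language pack" adds another table like `SPANISH` pointing at the same identifiers; whether resolving words to identifiers amounts to understanding is the very question the thread goes on to argue.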
Amadenijad
19-08-2006, 16:48
Let's say that someone creates a robot that has a humanoid appearance and features. This robot also has an advanced system that allows it to adapt its behaviors by itself and deal with humans- effectively, it can make its own personality and all of the traits a personality has. The creator dies and the robot is no longer the property of anyone. Does this robot have rights like a human? If not, what would have to be different about the robot for him to have rights like that of a human?
No, it's a machine, created by man, and it can be turned off. No machine can ever think for itself like a human. Sure, you can have a robot with a personality, but you need a program to teach it to have one.
Snow Eaters
19-08-2006, 16:51
Well, I just read all of that, and from what I can see, it appears that the argument suffers from a certain distinct flaw. The context of understanding.
Take, for example, your thought pattern. What language do you think in? English? Mandarin? Spanish? Whatever the case, the language you think in is most likely the one you were taught in. It is from there that you build points of reference to everything else, e.g. a table is generally a flat surface supported by any number of columns.
In fact, if we break it down even further, it can be argued that language is merely a sum of noises and/or gestures to indicate an object, emotion or intent. What is important is not the noise, but the cognitive ability to distinguish a particular aspect and assign it a unique identifier.
In the case of computers, given their architecture, that would be binary. A series of 1s and 0s to indicate objects, emotions, intent, etc, etc.
By uploading a new language pack into a computer, it is simply telling it that there is another series of 1s and 0s to indicate various assorted items.
Does the machine understand that language, then? In a limited manner, much in the same way that a person learning a new language first thinks in his native tongue before translating into what he wishes to say. Likewise, the machine must 'think' in binary before translating.
As things stand, self-learning and self-improving programs do not exist. However, with the advent of such systems, it is possible that we shall see this argument fall by the wayside, simply because if the system can improve itself, it stands to reason that it has done so by understanding the current input, be it language or anything else.
I'm not certain you caught the thrust of it then.
The person in the room is not translating anything to English first, last or in between.
There is no assigning of any unique identifiers to anything in Chinese for the English speaking person.
Markreich
19-08-2006, 17:03
That's what I was talking about. You see? Therefore, we must give androids rights.
I'm against giving rights to things that want to exterminate me. So no rights for Daleks, Cybermen, or Islamofascists.
No, its a machine, created by man and it can be turned off. No machine can ever think for itself like humans. Sure you can have robot with a personality, but you need a program to teach it to have a personality.
Firstly, you can turn people off pretty easily. The human body is renowned for its fragility. It's the "turning back on again" that is somewhat more unusual, and humans have always had that desire for their own kind as well.
Secondly, I know the personality I have today is the result of years of learning. It's not something I have always had; it is something I have been taught from a vast array of sources and that I continue to be taught from the world around me. It's not like who I am is completely outside the realm of causality itself.
Sorry, but there's no way I can let this go untouched. I'm normally not one for poking holes in people's logic, but this is a false analysis.
(1) If Strong AI is true, then there is a program for Chinese such that if any computing system runs that program, that system thereby comes to understand Chinese.
(2) I could run a program for Chinese without thereby coming to understand Chinese.
(3) Therefore Strong AI is false.
Firstly, the premise is wrong. "Suitably programmed computers" as a criterion does not require that any computer must necessarily be capable of running said program.
Other than that, the train of deduction is mistaken.
(1) If (premises) there exists A
(2) There exists not A
(3) Therefore not (premises)
(3) does not follow from (1) and (2).
Snow Eaters
19-08-2006, 17:39
Sorry, but there's no way I can let this go untouched. I'm normally not one for poking holes in people's logic, but this is a false analysis.
Firstly, the premise is wrong. "Suitably programmed computers" as a criteria does not require that any computer must necessarily be capable of running said program.
Other than that, the train of deduction is mistaken.
(1) If (premises) there exists A
(2) There exists not A
(3) Therefore not (premises)
(3) does not follow from (1) and (2).
The premise is not wrong; it does not require any computer to be capable of running said program. It makes requirements on any computer that does run it.
The deduction is not mistaken; you have written it incorrectly.
(1) If (premises) there does not exist not A
(2) There exists not A
(3) Therefore not (premises)
(3) Now follows from (1) and (2) properly.
Seriously though, this isn't my own theory pulled out of my ass.
If the logic and deduction were that flawed, it would have been pointed out in 1980, not in 2006 on a NS board.
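For what it's worth, the corrected reading does check out as ordinary modus tollens. Here is one way to render it mechanically, a sketch in Lean with `StrongAI`, `RunsP` and `Understands` left as uninterpreted placeholders (the names are invented here, not taken from Searle or Stanford):

```lean
-- Searle's narrow argument, rendered abstractly.
-- (1) Strong AI implies every system running the program understands Chinese.
-- (2) Some system (the person in the room) runs it without understanding.
-- (3) Hence Strong AI is false.
example (System : Type) (RunsP Understands : System → Prop) (StrongAI : Prop)
    (h1 : StrongAI → ∀ s, RunsP s → Understands s)
    (room : System) (h2 : RunsP room) (h3 : ¬ Understands room) :
    ¬ StrongAI :=
  fun hAI => h3 (h1 hAI room h2)
```

The universal in (1) quantifies only over systems that run the program, which is why the "not every computer can run it" objection doesn't touch the deduction: a system that never runs the program never triggers the conditional.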
The premise is not wrong; it does not require any computer to be capable of running said program. It makes requirements on any computer that does run it.
The deduction is not mistaken; you have written it incorrectly.
(1) If (premises) there does not exist not A
(2) There exists not A
(3) Therefore not (premises)
(3) Now follows from (1) and (2) properly.
Seriously though, this isn't my own theory pulled out of my ass.
If the logic and deduction were that flawed, it would have been pointed out in 1980, not in 2006 on a NS board.
If there has been a transcription error (entirely possible), it is not on my part.
The direct quote on the first premise was this:
(1) If Strong AI is true, then there is a program for Chinese such that if any computing system runs that program, that system thereby comes to understand Chinese.
That is an "If (premise) there exists A" argument, not an "If (premise) there does not exist not A" one. Besides, the latter of these is a far more sweeping statement, since it implies that all possible programs must cause the system to understand Chinese if the premise is true, which is clearly a nonsensical argument given the nature of the premise.
It also proposes A as something that is true for all Computing Systems, which does not necessarily need to be the case.
It is possible that the source you're quoting from messed up in its transition, but the train of deduction as put forward here does not follow.
EDIT: Give me a second; I've just realised I've used the words "clearly a nonsensical argument", and will have to briefly expand on that. ^^;
EDIT Continued:
Let's look at it this way, keeping with the analogy with the Chinese Room. The Strong AI argument says that a computer system would be capable of properly learning Chinese if appropriately programmed. So, what the
If (premises) then there does not exist not A
argument is saying is that if a computer system is capable of learning Chinese then there is no possible program that would be fed into the system that would not teach it Chinese.
That's not what Strong AI implies at all. It merely implies that the capability for learning Chinese exists, not that it is a necessary requirement.
Gauthier
19-08-2006, 18:23
I'm against giving rights to things that want to exterminate me. So no rights for Daleks, Cybermen, or Islamofascists.
Daleks are mutants inside a tank.
Cybermen are cyborgs, duh.
And I guess Islamofascism is the only problem in the world today. :rolleyes:
Ironic considering how a lot of people on NS General- particularly the Net Kahane Chai cell calling itself "The Jew Crew"- keep screaming "EXTERMINATE!! EXTERMINATE!! EX-TER-MI-NATE!!" whenever someone says the word "Muslim."
Daleks are mutants inside a tank.
Cybermen are cyborgs, duh.
And I guess Islamofascism is the only problem in the world today. :rolleyes:
Ironic considering how a lot of people on NS General- particularly the Net Kahane Chai cell calling itself "The Jew Crew"- keep screaming "EXTERMINATE!! EXTERMINATE!! EX-TER-MI-NATE!!" whenever someone says the word "Muslim."
Society is a machine too. Pull the exterior apart and you'll find a few scared and lonely squishy bits underneath that drive it in its quest for extermination.
Markreich
19-08-2006, 18:35
Daleks are mutants inside a tank.
Cybermen are cyborgs, duh.
And I guess Islamofascism is the only problem in the world today. :rolleyes:
Ironic considering how a lot of people on NS General- particularly the Net Kahane Chai cell calling itself "The Jew Crew"- keep screaming "EXTERMINATE!! EXTERMINATE!! EX-TER-MI-NATE!!" whenever someone says the word "Muslim."
Just stating my opinions... some would lump Cybermen/Daleks with androids as they are clearly not human (mostly).
Actually, I work in the Chrysler Building (a terrorist target, along with Grand Central Station, St. Patrick's, Rockefeller Plaza, the Empire State, Penn Station and Madison Square Garden, all of which are less than 1 mile from me. Never mind the subway!)
Our old offices were on Park, several blocks away from WTC. No, Islamofascism is not the only problem in the world today, but I'm very much against it and/or giving them rights -- call it personal if you will.
So: no rights for things that want to exterminate me. :D
The Aeson
19-08-2006, 19:00
Just stating my opinions... some would lump Cybermen/Daleks with androids as they are clearly not human (mostly).
Neither are chipmunks. (Mostly). Are they androids?
So: no rights for things that want to exterminate me. :D
But... Cybermen just want to make you a robot, not exterminate you.
Gauthier
19-08-2006, 19:01
Just stating my opinions... some would lump Cybermen/Daleks with androids as they are clearly not human (mostly).
Much in the same way many lump Muslims as a whole together with Islamofascists. The "They All Look Alike" disease has never been cured despite all our advances in science.
Actually, I work in the Chrysler Building (a terrorist target, along with Grand Central Station, St. Patrick's, Rockefeller Plaza, the Empire State, Penn Station and Madison Square Garden, all of which are less than 1 mile from me. Never mind the subway!)
Our old offices were on Park, several blocks away from WTC. No, Islamofascism is not the only problem in the world today, but I'm very much against it and/or giving them rights -- call it personal if you will.
Isn't it funny then that Dear Leader through Homeland Security slashed the security budget for New York and said the Midwest was a more likely terrorist target by giving it more security money?
So: no rights for things that want to exterminate me. :D
And here you show arrogance. At best you think the world is Counterstrike or Everquest and that everyone has a floating stat readout above their heads that says "Terrorist" if they are, and at worst you're living up to the Kahanist attitude of the NS "Jew Crew" and lumping each and every Muslim as being a Borg hivemind in collusion with the terrorists.
The Aeson
19-08-2006, 19:03
And here you show arrogance. At best you think the world is Counterstrike or Everquest and that everyone has a floating stat readout above their heads that says "Terrorist" if they are, and at worst you're living up to the Kahanist attitude of the NS "Jew Crew" and lumping each and every Muslim as being a Borg hivemind in collusion with the terrorists.
Two things. First off, I've never played Everquest, but I didn't realize that there were terrorists in it...
Second, to be fair, he said Islamofascists, not Muslims.
And here you show arrogance. At best you think the world is Counterstrike or Everquest and that everyone has a floating stat readout above their heads that says "Terrorist" if they are, and at worst you're living up to the Kahanist attitude of the NS "Jew Crew" and lumping each and every Muslim as being a Borg hivemind in collusion with the terrorists.
And in a bizarre twist of fate, I'm starting to notice a sense of "persecution complex" here. On both sides, of course.
You two and they alike need to calm down. There is no fear that we ourselves do not forge in our attitudes, and heightening the problem is (obviously) not going to help matters.
Now, could we get on with our discussion on Android rights please?
Markreich
19-08-2006, 19:45
Neither are chipmunks. (Mostly). Are they androids?
But... Cybermen just want to make you a robot, not exterminate you.
If you can find a mechanically enhanced chipmunk... sure! ;)
Yeah, but that's loss of self, which is still extermination. For the record, I'm against Borg rights too. :D
Markreich
19-08-2006, 19:50
Much in the same way many lump Muslims as a whole together with Islamofascists. The "They All Look Alike" disease has never been cured despite all our advances in science.
Hey, whoa there now! I have no problem with anybody unless they're trying to kill me. So I'm all for blowing up the Wahabi/extremist schools in Saudi and Pakistan (and wherever else they are), but am NOT for turning the Middle East into a parking lot!!
There are good and bad in ANY demographic. Right now, the Islamofascists are the ones that want me dead. (Catholic techie who works on the Great Satan's trading floor?!? Behead the infidel!)
Isn't it funny then that Dear Leader through Homeland Security slashed the security budget for New York and said the Midwest was a more likely terrorist target by giving it more security money?
Yeah, FEMA sucks.
And here you show arrogance. At best you think the world is Counterstrike or Everquest and that everyone has a floating stat readout above their heads that says "Terrorist" if they are, and at worst you're living up to the Kahanist attitude of the NS "Jew Crew" and lumping each and every Muslim as being a Borg hivemind in collusion with the terrorists.
Um... huh? When did I do that? :confused:
The Aeson
19-08-2006, 20:01
If you can find a mechanically enhanced chipmunk... sure! ;)
Yeah, but that's loss of self, which is still extermination. For the record, I'm against Borg rights too. :D
Your opinions are irrelevant. You will be assimilated. Your technological and biological distinctiveness will be added to our own. Your culture will adapt to service ours. Resistance is futile.
Snow Eaters
19-08-2006, 20:10
Besides, the latter of these is a far more sweeping statement, since it implies that all possible programs must cause the system to understand Chinese if the premise is true, which is clearly a nonsensical argument given the nature of the premise.
No, it in no way implies that.
Not all possible programs, just the program for Chinese.
If (premises) then ALL instances where the program is run must result in the system understanding Chinese.
Running other programs is not an instance where the program in question is run and has nothing to do with this.
It also proposes A as something that is true for all Computing Systems, which does not necessarily need to be the case.
Computing systems unable to run the program would not run the program; a truism, perhaps, but it seems it needs to be stated.
So it is not true of all computing systems.
It is possible that the source you're quoting from messed up in its transition, but the train of deduction as put forward here does not follow.
I'll grant the possibility, but you've not demonstrated it yet.
It is Stanford though, not Wikipedia.
EDIT: Give me a second; I've just realised I've used the words "clearly a nonsensical argument", and will have to briefly expand on that. ^^;
EDIT Continued:
Let's look at it this way, keeping with the analogy with the Chinese Room. The Strong AI argument says that a computer system would be capable of properly learning Chinese if appropriately programmed. So, what the
If (premises) then there does not exist not A
argument is saying is that if a computer system is capable of learning Chinese then there is no possible program that would be fed into the system that would not teach it Chinese.
No, the argument isn't saying that at all.
It is saying that if a computer system is capable of understanding/learning Chinese and it is fed a Chinese language program, then there can be no possibility that it has not understood/learned Chinese.
It then demonstrates a system where it has been fed a Chinese language program and it has not understood/learned Chinese.
No, it in no way implies that.
Not all possible programs, just the program for Chinese.
If (premises) then ALL instances where the program is run must result in the system understanding Chinese.
Running other programs is not an instance where the program in question is run and has nothing to do with this.
You have succeeded in confusing me. I never mentioned other programs running concurrently; I referred to the existence of other programs as part of an expansion on what "There exists" actually implies.
No, the argument isn't saying that at all.
It is saying that if a computer system is capable of understanding/learning Chinese and it is fed a Chinese language program, then there can be no possibility that it has not understood/learned Chinese.
It then demonstrates a system where it has been fed a Chinese language program and it has not understood/learned Chinese.
That doesn't make sense! Not all Chinese language programs result in an understanding of the Chinese language, but some might, due to the way they are written. The same is true of a task you set a human being. If you give them a task that involves simply sorting characters, they won't learn anything, while if you give them a task that tells them what the characters mean and how to use them (say, sorting them into "Animal", "Vegetable" or "Mineral" categories), they will. The inability of a certain task to create understanding of the language does not deny the possibility of a task that would!
This is the "Required Programming" that Strong AI is talking about. You need to give it the appropriate tasks if you expect it to learn anything, and the existence of tasks that are not appropriate, such as the one the second clause describes, does not in itself deny the existence of tasks that are.
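The contrast this post draws can be made concrete. The symbols, glosses and category labels below are all invented for illustration; the sketch only shows that the two task shapes differ, not that either one produces understanding.

```python
# Two tasks over the same invented symbols. Task A exposes only shape
# (sort by code point); Task B pairs each symbol with a category,
# giving a learner something semantic to latch onto.
SYMBOLS = ["马", "米", "铁"]  # glosses for the reader: horse, rice, iron

def task_a(symbols):
    """Pure character sorting: syntax only, nothing about meaning."""
    return sorted(symbols)

CATEGORY_KEY = {"马": "Animal", "米": "Vegetable", "铁": "Mineral"}

def task_b(symbols):
    """Sort into labeled bins, as in the post's Animal/Vegetable/Mineral task."""
    bins = {"Animal": [], "Vegetable": [], "Mineral": []}
    for s in symbols:
        bins[CATEGORY_KEY[s]].append(s)
    return bins
```

Whether completing Task B would actually amount to understanding, rather than just a second layer of symbol shuffling, is exactly what the rest of the thread disputes.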
Snow Eaters
19-08-2006, 20:42
You have succeeded in confusing me. I never mentioned other programs running concurrently; I referred to the existence of other programs as part of an expansion on what "There exists" actually implies.
I also never mentioned other programs running concurrently so your confusion is confusing me.
You can't refer to the existence of other programs to expand on what "There exists" implies, because it does not imply other programs of any kind, running concurrently or otherwise.
That doesn't make sense! Not all Chinese language programs result in an understanding of the Chinese language, but some might, due to the way they are written. The same is true of a task you set a human being. If you give them a task that involves simply sorting characters, they won't learn anything, while if you give them a task that tells them what the characters mean and how to use them (say, sorting them into "Animal", "Vegetable" or "Mineral" categories), they will. The inability of a certain task to create understanding of the language does not deny the possibility of a task that would!
This is the "Required Programming" that Strong AI is talking about. You need to give it the appropriate tasks if you expect it to learn anything, and the existence of tasks that are not appropriate, such as the one the second clause describes, does not in itself deny the existence of tasks that are.
Ah, now I see your confusion.
In your example you have a program running and you are attempting to 'teach' the program Chinese with a Chinese language program.
That is nothing like the Chinese Room argument.
Whatever AI program you have running in your example is analogous to the Chinese language program in the Chinese Room Argument, which is confusing, since you also have a Chinese program in your example which has no analogy in the Chinese Room Argument.
You Dont Know Me
20-08-2006, 00:48
Fergi America: The same can be said for humans. Our own apparent cognizance and perception of consciousness could very well be, and probably is, just an adaptation for the survival of the species.
Obviously our consciousness was an evolutionary design that our more successful ancestors passed on.
That is beside the point, though.
I will also note that humans often use language and emotion without engaging (or instead of engaging) their cognizance. And people doing repetitive assembly jobs soon have no need to pay conscious attention to them unless they are mentally deficient. So I'm not surprised if some animal, which has been drilled on the same task incessantly, turns mentally "off" and operates without showing signs of real cognition. A human does the same thing in the same type of circumstances. Boredom = Tuned Out.
That was my entire point, language and emotion can exist independent of cognizance. So we cannot say that an animal mind exists simply because they experience emotion or master language.
Most social interaction also seems to be automated, with each individual settling on one of just a few situational response-variants as their preferred mode of operation (especially in public), and ignoring other possibilities. The few people that seem to think outside the box, if they look hard enough, will find others who are out of the box in the exact same way (and therefore are actually IN a box, just a much smaller one)!
I am a hardline determinist, so in the end I think we are just as robotic as animals in our behavior. However, even non-determinists must admit that we are robotic in the great majority of our actions.
Humans also have a fairly predictable range of emotions. It can seem randomized and/or complex but that seems to be a result of a person's current reaction pattern having been built on thousands of prior experiences.
Even most "insanity" can be classified according to known types.
None of what I've been able to observe suggests to me that humans are any more than sophisticated programs.
I do not disagree with any of this. However, only our program has capabilities sufficient to merit and utilize rights.
You Dont Know Me
20-08-2006, 00:52
Here's the kicker though. What aspects would make for sapiency? Conscious thought? Self awareness? The ability to logically conclude an answer to solutions and learn from previous errors?
I imagine it will be a "know it when we see it" situation.
We will never call androids sentient, no matter how incorrect we may be. As a species, we are, deep down, ego-driven imperfections. Those tests for sentience are heavily flawed. How do I know? Because guess who performs them. Take a wild guess. It might be people, you know, those heavily flawed meatbots that can't go one century without starting a war against other meatbots, that perform them.
You see, I know I have an ego. I hate being wrong. We, as a whole, want to view ourselves as something special, as the extinction-proof perfect super-species. We really don't know if that dog can think for itself or not. If we get ANY evidence of sentience in a dog, we merely dismiss it as pure instinct or a weird mutational flaw. Chimps can be taught American Sign Language. They learn that, by using it, they can get more of what they want. Hell, one chimp even taught another sign language. Is this any evidence of sentience?
Dismiss me as a severely retarded know-nothing if you will, meat egos, but you honestly can't think we aren't severely flawed. Hell, I don't even know if we are sentient, if nothing else can be.
You Dont Know Me
20-08-2006, 03:19
We will never call androids sentient, no matter how incorrect we may be. As a species, we are, deep down, ego-driven imperfections. Those tests for sentience are heavily flawed. How do I know? Because guess who performs them. Take a wild guess. It might be people, you know, those heavily flawed meatbots that can't go one century without starting a war against other meatbots, that perform them.
You see, I know I have an ego. I hate being wrong. We, as a whole, want to view ourselves as something special, as the extinction-proof perfect super-species. We really don't know if that dog can think for itself or not. If we get ANY evidence of sentience in a dog, we merely dismiss it as pure instinct or a weird mutational flaw. Chimps can be taught American Sign Language. They learn that, by using it, they can get more of what they want. Hell, one chimp even taught another sign language. Is this any evidence of sentience?
Dismiss me as a severely retarded know-nothing if you will, meat egos, but you honestly can't think we aren't severely flawed. Hell, I don't even know if we are sentient, if nothing else can be.
Our greatest evolutionary design was empathy. It allowed for far greater social interaction and development than in any other species. That social interaction has done more for our species than pure biological forces ever could.
This same empathy does not solely exist between other humans. There are innumerable instances of people personifying their pets, treating them more lavishly than some treat their kids. It is a widely accepted policy that all animals deserve a certain level of dignity in their treatment.
Our empathy leads us to discern the similarities and differences between all things and ourselves, not simply other people and ourselves. As Dissonant Cognition stated earlier, he finds similarities between his desire not to be caged and an animal's desire not to be caged, and naturally empathizes because of it.
If this scenario ever becomes true, our evolutionary ability to identify the similarities in the android will eventually outweigh the traditional differences.
Our greatest evolutionary design was empathy. It allowed for far greater social interaction and development than in any other species. That social interaction has done more for our species than pure biological forces ever could.
This same empathy does not solely exist between other humans. There are innumerable instances of people personifying their pets, treating them more lavishly than some treat their kids. It is a widely accepted policy that all animals deserve a certain level of dignity in their treatment.
Our empathy leads us to discern the similarities and differences between all things and ourselves, not simply other people and ourselves. As Dissonant Cognition stated earlier, he finds similarities between his desire not to be caged and an animal's desire not to be caged, and naturally empathizes because of it.
If this scenario ever becomes true, our evolutionary ability to identify the similarities in the android will eventually outweigh the traditional differences.
And of course, the odd ability to empathize with, say, a hammer, and consequently get mad at it when it rolls under the tool bench/hits your thumb.
Just thought I should mention that.
Markreich
20-08-2006, 15:58
Your opinions are irrelevant. You will be assimilated. Your technological and biological distinctiveness will be added to our own. Your culture will adapt to service ours. Resistance is futile.
Ego Julius Caesar Borognum! Veni Vidi Assimilatti! :D