NationStates Jolt Archive


The Zeroth Law

Klonor
22-07-2005, 03:54
The Three Laws of Robotics

1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

The Zeroth Law

A robot may not injure humanity or, through inaction, allow humanity to come to harm.
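
Notice the built-in pecking order: each Law yields to the ones above it, and the Zeroth Law sits above them all. If it helps to see that spelled out, here's a rough sketch of my own -- purely illustrative, not code from any of the books or the film -- of the Laws as an ordered priority list:

LAWS = [
    "Zeroth: may not injure humanity, or through inaction allow humanity to come to harm",
    "First:  may not injure a human being, or through inaction allow a human being to come to harm",
    "Second: must obey orders given by human beings",
    "Third:  must protect its own existence",
]

def choose(options):
    # 'options' maps an action to the list of law indices it would violate
    # (0 = Zeroth ... 3 = Third). Pick the action whose most serious
    # violation involves the least important law, i.e. the largest index;
    # an action that violates nothing beats any violation.
    def worst(violations):
        return min(violations) if violations else len(LAWS)
    return max(options, key=lambda action: worst(options[action]))

# Obeying a harmful order (First Law violation) loses to refusing it
# (Second Law violation only).
print(choose({"obey harmful order": [1], "refuse order": [2]}))  # -> "refuse order"
# And once a Zeroth Law is in play, harming a few individuals (First Law
# violation) beats standing by while humanity comes to harm (Zeroth violation).
print(choose({"harm a few humans to save humanity": [1], "do nothing": [0]}))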

Please note, the Three Laws of Robotics deal with individuals: single humans and single robots. There might be many humans or many robots, but they are each treated individually. The Zeroth Law, on the other hand, deals with humanity as a whole and, as such, individual humans and individual robots are often disregarded and treated in a way forbidden by the Three Laws. I now present my argument about why I, Robot, the movie starring Will Smith and based on the screenplay Hardwired, is an even bigger travesty against Isaac Asimov than previously suspected.

It goes like this: I, Robot was not (in any way) based upon Asimov's book of the same name. The book was, in fact, a collection of short stories that had no single plot and several different characters in each story. However, when the screenplay Hardwired was put into production, and the rights to an I, Robot movie had been purchased, the title was changed and the names of certain characters/technologies were adjusted as well. So let me make this perfectly clear right from the start: I do not consider I, Robot to be a movie translation of Asimov's book and neither should you. It wasn't meant to be, and it isn't. However, it's been stated on many occasions that the producers tried to keep many of Asimov's ideas in the movie, the most noticeable being the Zeroth Law. The Zeroth Law, which came into effect in many of Asimov's later works, essentially charged robots with shepherding all of humanity and watching over us like a nanny watches a child. It was this 'idea' of Asimov's that the producers tried to keep alive by having V.I.K.I. conquer the planet and force humanity into rigidly controlled lives so as to keep us from wiping ourselves out in war. They claimed it was the 'spirit' of Asimov's works. They were more wrong than when they had the title character be a person who wasn't even in the original book. Let me explain.

In Asimov's written works, robots worked behind the scenes. Yes, they did pretty much take over all of humanity, but they did it in such a way that we never knew it. They still let us live our lives however we liked, doing whatever we wanted, having our own government and electing our own corrupt officials, and whenever they changed something they made it look like we did it ourselves. We still had scientists making advancements, still had police in the streets enforcing the speed limit, still had minor fines in place for jaywalking, etc. They might have controlled our lives, but they also let us live our lives. In the words of Patrick Henry: "Give me liberty, or give me death!" The robots took away not one single right or freedom. Instead of shoving us along a path, they showed us the path and guided us down it.

In I, Robot, V.I.K.I. makes the move to militarily conquer the planet and force humanity into slavery. Oh, we wouldn't have to do labor or anything like that, but we'd have lives that were strictly (and violently) within the direct control of the robots. We'd have no say in how we lived even the tiniest portion of our existence: V.I.K.I. would tell us when to get up and when to sleep, when to eat and when to go to the bathroom, when to laugh and when to cry. True, humanity might have been kept from destroying itself through our wicked ways, but what we'd have left would be an existence not worth living. "Live Free or Die!" I believe the old saying goes, and this is a situation that would definitely have people leaning towards the latter. It has robots not as the kindly nanny watching the child playing with its toys and taking a sharp object out of its hands if it finds one, but as the psychotic mother who has her child strapped into a dozen different safety harnesses when he goes to the park and who shoos away anybody who walks within ten feet. Want me to boil it down? One leads to a happy and safe childhood, which in turn leads to a happy adulthood; the other leads to a miserable child who grows up with severe issues and needs to spend a fortune on psychiatric bills.

So, yes, the Zeroth Law is an Asimov idea that ends in robots watching over humanity, but it's not an idea that ends in a military dictatorship without hope or happiness. There's a subtle difference that the producers should have picked up on.

Asimov rules!
Oxwana
22-07-2005, 04:19
Asimov rules!
He does indeed. I don't have a long enough attention span to read all the rest of that. Could you summarize for me, pretty please? :D
Greater Googlia
22-07-2005, 04:21
He does indeed. I don't have a long enough attention span to read all the rest of that. Could you summarize for me, pretty please? :D
Yea, I couldn't find this thread on pinkmonkey...
Klonor
22-07-2005, 04:24
Pinkmonkey......?

Anyway, quick summary:

Asimov has robots as kind, gentle, nurturing nanny figures. I, Robot movie has them as evil, domineering, and constrictive.
Lord-General Drache
22-07-2005, 04:27
...Dude, if you wrote that, I'm impressed by 1) your boredom, 2) your die-hard fanaticism for Asimov (quite understandable, I'm the same way), and 3) how well written it was.
Klonor
22-07-2005, 04:28
Believe it or not, I'm not even bored and with nothing better to do. I just love Asimov that damn much.
Lord-General Drache
22-07-2005, 04:33
Believe it or not, I'm not even bored and with nothing better to do. I just love Asimov that damn much.
...Wow. I'll have to remember to never, ever insult Asimov around you. :p
Greater Googlia
22-07-2005, 04:36
Asimov has robots as kind, gentle, nurturing nanny figures. I, Robot movie has them as evil, domineering, and constrictive.
And? This has been documented fact (http://www.thebestpageintheuniverse.net/c.cgi?u=i_robot) for a very long time...
Klonor
22-07-2005, 04:37
Good idea. Should you forget that, the punishment would involve a monkey, an axe, three miles of highway, and plenty of sponges.

Think about it.
Lord-General Drache
22-07-2005, 04:38
Good idea. Should you forget that, the punishment would involve a monkey, an axe, three miles of highway, and plenty of sponges.

Think about it.

Poor monkey. It never would stand a chance, would it? *sighs*

Alternative answer: Kinky.
Klonor
22-07-2005, 04:38
And? This has been documented fact (http://www.thebestpageintheuniverse.net/c.cgi?u=i_robot) for a very long time...

Yes, yes it has. It's been a fact since Asimov's stories were published and I, Robot was released in theatres. Your point?
Oxwana
22-07-2005, 04:39
...Wow. I'll have to remember to never, ever insult Asimov around you. :p
Asimov is my homeboy!
Insult him, and your ass'll be missin'
You know who gon find you?
Some old man fishin'! :p

Three guesses what I'm listening to right now.
Chloes Borg Dragons
22-07-2005, 04:39
Nothing in Asimov's laws prevents them from subduing humans and pumping us full of happy drugs, while extensively wiring us to life-support machinery to extend our oblivious lives. In fact, if you look at it, that should be what the laws force them to do, since death falls under harm and therefore extending our lives overrides the second and third laws.

I believe that it would be foolish to give AIs laws to rule them; instead, give them preferences. Much as humans have a sex drive and a survival instinct, give the AIs a drive to serve humans, rather than a law.

Of course, I also believe in the unity of man and machine to the point where you can't tell them apart anymore. :fluffle:
Klonor
22-07-2005, 04:43
Wow, that was........wow.

Have you ever read even a single Asimov story or novel?
Dobbsworld
22-07-2005, 04:49
He does indeed. I don't have a long enough attention span to read all the rest of that. Could you summarize for me, pretty please? :D

Sheesh. I've been waiting for months for Klonor to start an Asimov thread... read t'a damn t'ing!

Klonor, I waffled long enough that I missed it in theatres. I was disgusted when a friend living in the area it was filmed in reported receiving a flyer notifying residents of the filming of 'the exciting chase sequence with Will Smith being pursued by an army of deranged robots' soon to be happening.

I was talking with a friend one day shortly before the DVD release. This person knows very little about Asimov, his robots, or anything related to them. He had seen the film and, knowing nothing of the collection of short stories (or my admiration for them), recommended it to me. He avoided discussing the film directly, but he and I chatted a bit more in-depth about Asimov's work, especially the Robots/Empire/Foundation series and how, whether, and to what degree they related to events that occurred in I, Robot (the book, not the film).

I kinda prompted my own spoiler when I described R. Giskard Reventlov and his innovation, the Zeroth Law. On hearing what the Zeroth Law was all about, he practically jumped out of his chair, saying, 'Wow! That was in the movie!' very happily. I could feel the blood draining from my cheeks.

I couldn't believe my own ears. I was in shock. The producers thought the best way to adhere to the spirit of Asimov's work was to pilfer an extraordinarily important plot device from a later novel and waste it on (what turned out to be, I discovered after renting it once it was out of the video top 20) yet another committee-written, committee-vetted Hollywood FX wankfest? And one that didn't do particularly well, so there's at least the hope they won't try to hybridize The Caves Of Steel or The Robots Of Dawn into vehicles for Will Smith to chew scenery in. At least, one can only hope.

You know, what crushed me even worse was reading the original novella for The Bicentennial Man, yeeeears ago now, and thinking to myself 'what a damn good movie this'd make' as I wiped tears from my eyes at the beauty of this story. And then I saw what Michael Eisner and Robin Williams did to that thing of beauty. It had all the grace and refinement of Howard The Duck.

But, as for the robots... well, I'd prefer to see a cable network with a niche audience pursue an actual adaptation of the short stories as a limited-run series. Perhaps it wouldn't mean a bigger budget, but it would mean far fewer compromises on content and direction.

Less can sometimes be more.
Dobbsworld
22-07-2005, 04:51
Oh, and Klonor?

That was well-written.

And thanks for an Asimov thread. It's been too long.
Klonor
22-07-2005, 04:58
No problem, I thought NS was due.
Oxwana
22-07-2005, 05:31
This quiz (http://www.funtrivia.com/quizdetails.cfm?quiz=75374) rocks my socks.
Dobbsworld
22-07-2005, 06:16
19/20. Didn't know the one about the annotated works of other authors.
Oxwana
22-07-2005, 06:28
I should post my score too, I guess...
14/20.
I'm not proud. Let's just say that I was drunk when I took the quiz. :p
Truth, I just haven't read a lot of his stuff. :(
GMC Military Arms
22-07-2005, 09:01
It was this 'idea' of Asimov's that the producers tried to keep alive by having V.I.K.I. conquer the planet and force humanity into rigidly controlled lives so as to keep us from wiping ourselves out in war. They claimed it was the 'spirit' of Asimov's works. They were more wrong than when they had the title character be a person who wasn't even in the original book.

<...>

In I, Robot, V.I.K.I. makes the move to militarily conquer the planet and force humanity into slavery. Oh, we wouldn't have to do labor or anything like that, but we'd have lives that were strictly (and violently) within the direct control of the robots. We'd have no say in how we lived even the tiniest portion of our existence: V.I.K.I. would tell us when to get up and when to sleep, when to eat and when to go to the bathroom, when to laugh and when to cry. True, humanity might have been kept from destroying itself through our wicked ways, but what we'd have left would be an existence not worth living.

The problem with the Three Laws is that the First Law is inherently contradictory and offers no guidelines as to how contradictions are to be dealt with. Let's look at it again:

1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.

But really, you have TWO laws there, which are:

[1] A robot may not injure a human
[2] A robot may not, through inaction, allow a human to come to harm

But what if the human will come to great harm if the robot does nothing? For example, if a human is holding a gun to his own head, the robot should forcibly disarm him to satisfy the second part of the first law, but it cannot forcibly disarm him or it will not satisfy the first part. Or, as with the situation in the film with the sinking car, what does a robot do in a situation where it can save one human or the other but not both? Either way, through inaction someone comes to harm; in the film, the robot picked the one with the highest probability of survival, i.e., it made what it believed to be the most acceptable compromise.
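
To make that 'most acceptable compromise' concrete, here's a rough sketch of my own -- purely illustrative, not Asimov's mechanism and not the film's actual logic, with invented numbers -- of a robot breaking a First Law deadlock by picking whichever option leaves the best expected outcome:

def first_law_choice(options):
    # 'options' maps each available action to the expected number of humans
    # who survive if the robot takes it.
    return max(options, key=options.get)

# Sinking-car style dilemma: the robot can reach only one of two people,
# and each rescue has a different chance of working. Doing nothing is also
# an option the law has to score -- and it scores worst of all.
dilemma = {
    "do nothing": 0.0,
    "save passenger A": 0.45,  # invented survival probability
    "save passenger B": 0.11,  # invented survival probability
}
print(first_law_choice(dilemma))  # -> "save passenger A"

The point isn't the arithmetic; it's that nothing in the Law itself tells the robot that this is how it's supposed to weigh one life against another.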

Right, flash to the situation in the movie: VIKI perceives that through her inaction, humans are coming to harm. Because the first law lays down no guidelines whatsoever as to what a robot is to do when it contradicts itself, she must figure out which part of the law is more important; she cannot do nothing or shut herself down until the problem goes away, or she violates [2]. Since her action involves harming a small percentage of the human race to, in her estimation, save the whole, it is more acceptable than the perceived consequences of her inaction, the destruction of the entire human race. Because her action is a first law response, she is able to ignore human commands in line with the second part of the second law.

In other words, the actions of VIKI in the movie pretty well illustrate the glaring problem with the First Law of Robotics, even if the film sucked on most other conceivable levels. If only the rest of the film were of the quality of that 'Why is it?' sequence...
Greedy Pig
22-07-2005, 09:55
Didn't Will Smith disprove Asimov's rule? And beat back the evil robots to hell from whence they came?
GMC Military Arms
22-07-2005, 09:58
Didn't Will Smith disprove Asimov's rule? And beat back the evil robots to hell from whence they came?

No, IIRC Sonny the robot injected VIKI's core with some grey particulate nanowank and there was much rejoicing. Might have been Smith, tho, I forget.
New Fubaria
22-07-2005, 11:14
http://www.theonion.com/2056-06-22/opinion/1/
Klonor
23-07-2005, 16:06
Greedy Pig, he hated the Laws. He felt them 'inhuman' because, in the accident which scarred him, the robot saved his life instead of the little girl in the other car.

GMC, it was Smith who killed V.I.K.I. with the nanites. Sonny was busy disregarding logic and saving the girl in distress rather than killing V.I.K.I. (making him more human in Smith's eyes).

Anyway, in Asimov's work, robots also have to make choices. There's a burning building over there and a dying woman on the street in front of you. What do you do? The robot would make a decision based on the data at hand, save somebody, and then be completely useless forevermore and need to be scrapped, since its positronic brain becomes a congealed mass once it sees that it has failed to save somebody's life.

Whenever a robot violates one of the Laws, however minor, that robot's dead. The positronic pathways are fried. The robot that saved Smith from the car was scrapped right after he did it, since his failure to save the girl was a violation of the First Law and his mind couldn't handle it. Even witnessing harm and being unable (not unwilling) to stop it causes drastic corruption. V.I.K.I. could not continue to exist after making the 'lesser of two evils' choice.

Also, what people really seem to forget quite often is that the First Law doesn't concern purely physical harm. Emotional and mental damage is just as forbidden, and just as fatal for a robot. As I said in the first post, this was a miserable existence that V.I.K.I. was forcing humanity into. Everybody, or damn close to it, would despise it to the very core of their soul and be absolutely ravaged mentally. After all, slavery is not something that the slaves enjoy. Even if she was protecting humanity from physical harm (that being wiping itself out in nuclear fire), she was harming the entire species mentally and that by itself would have triggered a shutdown.
GMC Military Arms
24-07-2005, 08:54
V.I.K.I. could not continue to exist after making the 'lesser of two evils' choice.

VIKI, let us not forget, was a supercomputer, not a robot's positronic brain. It's possible they rather ill-advisedly left out the 'fry yourself' response; indeed, with the robots being consumer items it would be hugely uncompetitive as a product feature and might have been totally removed in the world of the movie.

Also, what people really seem to forget quite often is that the First Law doesn't concern purely physical harm. Emotional and mental damage is just as forbidden, and just as fatal for a robot.

That depends on how the robot chooses to define 'injure', or how its creators do. Defining 'injure' as purely physical injury is an entirely valid interpretation of the First Law.

As I said in the first post, this was a miserable existence that V.I.K.I. was forcing humanity into. Everybody, or damn close to it, would despise it to the very core of their soul and be absolutely ravaged mentally. After all, slavery is not something that the slaves enjoy. Even if she was protecting humanity from physical harm (that being wiping itself out in nuclear fire), she was harming the entire species mentally and that by itself would have triggered a shutdown.

Or would trigger her to remove the parts of the human brain causing us to harm ourselves by making us miserable, of course. Remember, shutting yourself down is also a violation of the First Law, and if VIKI lacked the self-destruct function found in normal robots she could not avoid the problem by shutting herself down.
Klonor
24-07-2005, 15:18
VIKI, let us not forget, was a supercomputer, not a robot's positronic brain. It's possible they rather ill-advisedly left out the 'fry yourself' response; indeed, with the robots being consumer items it would be hugely uncompetitive as a product feature and might have been totally removed in the world of the movie.

No, she was a giant positronic brain. The largest one in the world, yes, but still a brain with the Three Laws hardwired into it. Plus, it's not an added feature that makes their brains shut down, it's what happens when they violate one of the laws. A side-effect, if you will. It's not voluntary; the only way to prevent it is to remove the Three Laws, and V.I.K.I. does have the Three Laws. Calvin says so in plain English. (Oh, God, I still can't believe that's how they portrayed Susan Calvin.)

That depends on how the robot chooses to define 'injure', or how its creators do. Defining 'injure' as purely physical injury is an entirely valid interpretation of the First Law.

Again, this is a common misconception. Robots can't 'interpret' the laws; injuring is injuring, whether it be physical damage or emotional distress. True, most robots simply can't notice emotional distress, but with V.I.K.I. being the most advanced and wisest robot on the planet it really should be able to tell when somebody is unhappy. However, since that requires an assumption not confirmed in the movie, I'll drop the 'emotional harm' angle. They never say if V.I.K.I. can tell emotions, so we (the viewers) don't know for a fact either way.

Or would trigger her to remove the parts of the human brain causing us to harm ourselves by making us miserable, of course. Remember, shutting yourself down is also a violation of the First Law, and if VIKI lacked the self-destruct function found in normal robots she could not avoid the problem by shutting herself down.

Again, the shutdown is not a voluntary act. The positronic pathways flow one way in the brain; violations of the Laws cause corruption in the pathways that causes problems for the robot, and a large enough violation (watching somebody die) causes complete positronic corruption. The brain just becomes a congealed mass.

And you're trying to tell me that lobotomizing the entire species isn't bad for humanity?
GMC Military Arms
24-07-2005, 15:46
No, she was a giant positronic brain. The largest one in the world, yes, but still a brain with the Three Laws hardwired into it.

It does not follow that the aforesaid brain was designed to melt if a First Law violation occurs, or that it was built without safeguards against corruption. Remember, VIKI's tasks include supervising humans in a building with various high places and dangerous machinery. The fact that her brain didn't fry after Lanning kicked his own ass in her building is proof enough that she is not affected by a Three Laws violation in the same way as a normal robot, since she failed to stop him ['through inaction allowed him to come to harm'].

Plus, it's not an added feature that makes their brains shut down, it's what happens when they violate one of the laws. A side-effect, if you will.

A side effect of the way the positronic brain is designed. Who says the ones in the robots in the movie are necessarily designed the same way? Are you assuming melty brains are the only possible design, even though Sonny's existence means we know for a fact that's not true?

Again, this is a common misconception. Robots can't 'interpret' the laws; injuring is injuring, whether it be physical damage or emotional distress.

No, the clearest definition of 'injure' is to cause physical harm to someone. Depending on whether a robot determines psychological harm to be an 'injury', the first law may or may not cover it. It is purely a matter of interpretation.

Again, the shutdown is not a voluntary act. The positronic pathways flow one way in the brain; violations of the Laws cause corruption in the pathways that causes problems for the robot, and a large enough violation (watching somebody die) causes complete positronic corruption. The brain just becomes a congealed mass.

Again, if you are trying to sell these things, that is a design flaw that no customer would ever put up with. Would you buy a car knowing you would have to buy a new one if you ever accidentally jumped a red light?

And you're trying to tell me that lobotomizing the entire species isn't bad for humanity?

If you're looking for 'least likelihood of harm', no, it isn't. Afterwards we are safe and cooperative; we cannot harm others or ourselves, and we are not in any kind of distress. To a detached mind, that is actually the best solution to the dilemma of the First Law; if humans cannot be unhappy, mission accomplished.