NationStates Jolt Archive


U.S. Army Robots Break Asimov's First Law

Kievan-Prussia
16-03-2006, 18:21
http://hardware.slashdot.org/article.pl?sid=06/03/15/1359233

Mass awesome.
DrunkenDove
16-03-2006, 18:31
Makes sense. You can't kidnap a robot, or shake its morale. The American public is hardly going to have a conniption fit when ten robots are destroyed.

The only problem is that if it's controlled by radio waves then it's possible to hack. Brings a new meaning to the phrase "information warfare".
Strathdonia
16-03-2006, 18:32
The thing is they aren't really "robots" in the Asimov sense (which IIRC applies to more or less true artificial intelligence constructs capable of making decisions); they're more just remotely operated weapons that require a direct man-in-the-loop control system to engage a target.

Now a small tracked vehicle with an MG is nothing; it's the unmanned helicopters with the 71mm rockets, and the Predators with Hellfires and SDBs, that you need to worry about.

Actually, IIRC the US Navy used an unmanned helicopter to fire anti-submarine torpedoes during the 60s and 70s...
Norleans
16-03-2006, 18:33
Well, since they are controlled by human operators who verify targets and give the OK to fire, I'd say they're not really violating the First Law, since they don't have an independent choice of whether to fire or not. Asimov's laws assume the robot in question has a choice over what action to take, and the First Law limits that choice by taking harm to humans off the table as a viable option during the decision-making process. Since these robots don't make a choice, the First Law doesn't apply to them. They are not so much robots as they are remote-controlled, mobile machine guns.
RetroLuddite Saboteurs
16-03-2006, 18:35
In the real world we are never going to adopt Asimov's laws. He was hoping for a more sane, rational and peaceful future than we can ever expect. Keith Laumer's Bolos are more likely to be the self-aware robots of our future.
Farthingsworth Reborn
16-03-2006, 18:57
I think the whole point of creating robots is so that they can do those tasks that are dangerous, tedious, or just distasteful to humans. In light of that idea, it seems inevitable that a major purpose for robotic development is going to be warfare. What could be more distasteful and potentially dangerous than using a weapon to kill a human?
Kerubia
16-03-2006, 19:19
I think the whole point of creating robots is so that they can do those tasks that are dangerous, tedious, or just distasteful to humans. In light of that idea, it seems inevitable that a major purpose for robotic development is going to be warfare. What could be more distasteful and potentially dangerous than using a weapon to kill a human?

*Thinks of the United Civilized States (UCS) in the Earth series*
Gauthier
16-03-2006, 19:21
In the real world we are never going to adopt Asimov's laws. He was hoping for a more sane, rational and peaceful future than we can ever expect. Keith Laumer's Bolos are more likely to be the self-aware robots of our future.

Although given current political climates, an Ogre is more likely.
Dododecapod
16-03-2006, 19:24
Unfortunately true. But neither an Ogre nor a Bolo could do what the US wants: Infantry work in hostile environments (such as occupied cities). They could only flatten said cities.
Megaloria
16-03-2006, 19:25
Awesome, now let's see some battlemechs. And hope that the robots don't form a rebellion and go to war with the republic. Roger Roger.
The Infinite Dunes
16-03-2006, 19:26
What could be more distasteful and potentially dangerous than using a weapon to kill a human?
Sending in a machine to do the job for you? The whole idea of a machine making a rational choice as to whether to kill a human or not sends shivers down my spine. But then the whole idea of war sends shivers down my spine anyway.

I can't remember if it was Blade Runner or another film, but this whole thing reminds me of a scene from a film where robots are sent into a run-down housing district to track down this guy and kill him. Must have been Blade Runner, as I seem to remember them scanning people's irises to check who they were. That whole scene... *shudders*

edit: bah, it was Minority Report. Useless memory.
Neu Leonstein
17-03-2006, 00:20
I had a thread with almost the exact same topic a while back. The thing is that Asimov's law is fiction; it's not really of any importance when they design robots like this.

http://www.army-technology.com/projects/taifun/
This was the thing that sparked my thread. I mean, there is something slightly worrying about robots flying about like that and killing people.
Moantha
17-03-2006, 00:28
Couple of points.

First, as they were never programmed with the three laws, they cannot break them.

Second, are you saying that there have been no cases where robotic machines have accidentally injured humans, or have not reacted to prevent humans from coming to harm?
Bobs Own Pipe
17-03-2006, 04:23
In the real world we are never going to adopt Asimov's laws. He was hoping for a more sane, rational and peaceful future than we can ever expect. Keith Laumer's Bolos are more likely to be the self-aware robots of our future.
Well, we all know the perils of extrapolating a future based on one's present/immediate past, don't we? I say there's still lots of room for Asimovian robots. Don't sell me a bleak future, 'cause I'm not buying.
RetroLuddite Saboteurs
17-03-2006, 05:04
Well, we all know the perils of extrapolating a future based on one's present/immediate past, don't we? I say there's still lots of room for Asimovian robots. Don't sell me a bleak future, 'cause I'm not buying.

Maybe you're not buying, but I think there's a lot of money to be made selling short on dark futures.
Novoga
17-03-2006, 05:09
This was the thing that sparked my thread. I mean, there is something slightly worrying about robots flying about like that and killing people.

They would be killing the enemy, which is not the same as killing people. At least that is how one must view it to accept it.
Bobs Own Pipe
17-03-2006, 05:09
Maybe you're not buying, but I think there's a lot of money to be made selling short on dark futures.
There's no shortage of dupes these days. Doom-and-gloomers'll believe just about anything you feed 'em, provided you sound like you know what you're talking about and wear a suit and tie. Or medals and gold braid.
RetroLuddite Saboteurs
17-03-2006, 05:15
There's no shortage of dupes these days. Doom-and-gloomers'll believe just about anything you feed 'em, provided you sound like you know what you're talking about and wear a suit and tie. Or medals and gold braid.
Of course, the Bolos were actually kinda cool, in a tragic self-aware war machine sorta way.
Bobs Own Pipe
17-03-2006, 05:21
Well, I'll trump you all: Philip K. Dick's Leadies. Built to fight, they carried on one of the most incredible mass deceptions in order to secure a better future for mankind.

Nyahh.:p
RetroLuddite Saboteurs
17-03-2006, 05:24
Well, I'll trump you all: Philip K. Dick's Leadies. Built to fight, they carried on one of the most incredible mass deceptions in order to secure a better future for mankind.

Nyahh.:p
Love old Horselover Fat, but I don't think I know that story.
Bobs Own Pipe
17-03-2006, 05:27
It's from the short story, "The Defenders".
Kroisistan
17-03-2006, 05:55
I'm not the least bit comfortable with this, because things like this will eventually replace human military personnel.

When we can reduce our war casualties to zero with robotics... what's to stop us from going on a warring rampage? After all, most people weary of war when the casualties on their side begin to mount. Remove the casualties on our side and we could probably get most people in the US to go along with any war for an extended period of time, because the only cost to us will be replacing the few robots our 3rd world enemies manage to blow up.

That will spell the end of guerilla warfare, and thus the end of the underdog's ability to fight the powerful. We'd essentially become militarily omnipotent, and I shudder to think what this country might do with that.
Kanabia
17-03-2006, 06:08
That will spell the end of guerilla warfare, and thus the end of the underdog's ability to fight the powerful. We'd essentially become militarily omnipotent, and I shudder to think what this country might do with that.

Spread "freedom and democracy", of course.
Non Aligned States
17-03-2006, 06:13
I'm not the least bit comfortable with this, because things like this will eventually replace human military personnel.

That will spell the end of guerilla warfare, and thus the end of the underdog's ability to fight the powerful. We'd essentially become militarily omnipotent, and I shudder to think what this country might do with that.

Maybe, maybe not. If the replacement actually happens on a large enough scale, guerilla warfare will probably evolve into internet warfare, with the actual guerillas trying to hack into the comms networks. Granted, this would limit the actual number of guerillas to those with the resources and skills, but the effects would be more disastrous.

Imagine if a division of home-stored robots suddenly had their IFF disabled and were set on a rampage.

Will we see a fully automated robot capable of making independent decisions in an infantryman's role? Not in our lifetimes, I guess. Human ingenuity hasn't yet proved programmable.
Kerubia
17-03-2006, 06:17
Will we see a fully automated robot capable of making independent decisions in an infantryman's role? Not in our lifetimes, I guess. Human ingenuity hasn't yet proved programmable.

To fix that, simply make the robot human controlled.

And make war just like a video game.

*shrug*
Kaledan
17-03-2006, 07:01
The best part is that we can use this droid army to invade Naboo, thus triggering the Clone Wars all over again. It almost seems too easy...
Saint Curie
17-03-2006, 11:06
U.S. Army robots are breaking Asimov's First Law...

Next, you'll tell us that we've deviated from the Seldon Plan.
Bolol
17-03-2006, 12:38
Heh...Reading your title, I thought the US had deployed robots that turned on THEM.

I keep telling people technology will one day rule us...I mean RULE us.
BackwoodsSquatches
17-03-2006, 13:06
The U.S. will eventually have robotic drones to do most of the combat, and it's likely that these will be controlled by humans.
Fortunately, these will probably be damned expensive, and future warmongers will be as likely to drive the economy further into the toilet as our current warmongering fool.

I envision future warfare to be entirely guerrilla tactics, as these bands of ragtag soldiers discover ways to avoid the electronic surveillance of flying drones and force most combat into tunnels and urban surroundings.

Most likely, you'll have heavily armed robotic drones go into hostile situations and neutralize heavy-fire positions, followed by human soldiers to complete the mopping-up and securing procedures.