NationStates Jolt Archive


Hypothetical Situation Involving Artificial Intelligences

Neo-Anarchists
29-09-2005, 01:27
Suppose, for a moment, that someone has developed an artificial intelligence.
It is quite like a human, in that it has consciousness of self, free will, emotions, and most of those other human things. Suppose these AIs are given similar rights to biological intelligences, due to their having the same qualities as humans.
Now, suppose someone programs a very similar one, with a slight difference in the programming. But this tweak does something interesting:
It makes the AI want to do what it is told.
No, it doesn't force the AI into doing what it is told, the AI actually desires to do as it is told. It would be a creature that actually derives pleasure from following orders.
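If it helps to make the tweak concrete, here is a toy sketch in Python (every name and number is invented purely for illustration; this is not a claim about how a real AI would be built):

    class AI:
        def __init__(self, obedience_tweak=False):
            self.obedience_tweak = obedience_tweak

        def pleasure(self, activity_value, was_ordered):
            # Baseline: it enjoys what it enjoys, same as the first AI.
            score = activity_value
            # The "slight difference in the programming":
            if self.obedience_tweak and was_ordered:
                score += 6  # obeying is itself a source of pleasure
            return score

    tweaked = AI(obedience_tweak=True)
    tweaked.pleasure(3, was_ordered=True)  # 9: it genuinely likes being asked

Nothing compels the second AI; the only change is that obeying lands on the "pleasure" side of its ledger.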

Would it be moral to create that AI?
Desperate Measures
29-09-2005, 01:37
It would be just as moral to create a manically depressed AI. I'm not sure if morals enter into things created by humans unless it acts in an immoral manner toward humans. Other sources of AI might disagree. I'd like to hear what they have to say.
Neo-Anarchists
29-09-2005, 01:39
It would be just as moral to create a manically depressed AI. I'm not sure if morals enter into things created by humans unless it acts in an immoral manner toward humans.
I'm not sure I understand. By 'morals not entering into it', do you mean that it would be fine to do whatever you choose with an AI?
Peisandros
29-09-2005, 01:42
Hmm. I think it would be moral. They are created by humans, correct? Therefore humans can make them however they wish.
Desperate Measures
29-09-2005, 01:48
I'm not sure I understand. By 'morals not entering into it', do you mean that it would be fine to do whatever you choose with an AI?
From my point of view, yes. I'm not sure if it's moral for someone with biblical beliefs to sodomize his DVD player, but I have no moral issue with it. How can you treat something badly if it isn't real? You might not want to, because it might know that you are doing something bad to it and say things like, "STOP!! NO!! DEAR GOD!! YOU'RE HURTING ME!!" I don't know if I could keep kicking anything that reacted in horror to me. But that doesn't make it immoral to build something and then make what you built do what you want it to do.
Kiwi-kiwi
29-09-2005, 01:49
Hmm. I think it would be moral. They are created by humans, correct? Therefore humans can make them however they wish.

I don't have a stance on this either way yet, but to play Devil's Advocate or something like that:

Some people use that logic to do what they want with their children. "I created him/her, so I can do what I want with him/her!"

There are several ways that doesn't necessarily relate properly to AIs, though.
Skinnydippers
29-09-2005, 01:51
I don't think it'd be wrong to make an AI that derives pleasure from following orders... I think a lot of people would actually be happier if they could get pleasure from following orders. If someone made the AI do terrible things (e.g. murder, thievery, etc.), then it would obviously be immoral, for you would be exploiting it against other human beings...

...but if it's just doing something that makes it happy, I don't think it'd be immoral. However, I don't really think that you could give something emotion, intelligence, and a will, and then be free to do whatever you wanted to it. After all, the only difference then would really be flesh and blood, I suppose. And a soul, if you're so inclined.
Neo-Anarchists
29-09-2005, 01:54
Hmm. I think it would be moral. They are created by humans, correct? Therefore humans can make them however they wish.
I think it starts to get tricky here.
Surely, it is correct for people to do as they choose with things they create.
But what happens when the thing you create happens to be a person?

If it is still correct to do as you wish with it, then wouldn't that imply that rights arise from some "otherness" outside of personhood? Something special to biological creatures?
I'm not sure I can agree with that conclusion.
Or perhaps one could say that they could have rights, and do count as people, but that creatorship makes control moral?
I don't like the sound of that.

On the other hand, perhaps it is possible to accept that artificial persons have rights, but still have the programming modification be acceptable?
Surely it isn't limiting the intelligence's freedoms, as it cannot have rights before it exists, can it? And after it is activated, it still isn't being forced into anything; it desires to do those things.
Kryozerkia
29-09-2005, 01:58
I think the morality question comes into play because an AI that'd be compliant in following orders could fall into the hands of those with less than desirable intentions, who would turn it into a slave to do their bidding.
Peisandros
29-09-2005, 02:03
I don't have a stance on this either way yet, but to play Devil's Advocate or something like that:

Some people use that logic to do what they want with their children. "I created him/her, so I can do what I want with him/her!"

There are several ways that doesn't necessarily relate properly to AIs, though.
I see what you mean. Perhaps I didn't word it quite right. Hmm.
Neo-Anarchists
29-09-2005, 02:03
From my point of view, yes. I'm not sure if it's moral for someone with biblical beliefs to sodomize his DVD player, but I have no moral issue with it.
But this issue is a bit different, isn't it? DVD players don't have will, can't feel pain, don't have goals in life, and don't even have the intelligence of a small animal.
How can you treat something badly if it isn't real?
What about a machine intelligence makes it any less real than a biological one?
In the situation I started the thread with, I was talking about a hypothetical AI that was as close to identical to a human intelligence as possible.
One that can feel pain when abused, one that can make choices for itself, one that doesn't feel so great when unpleasant things are done to it.

Does merely the fact that the intelligence is non-biological make it less real?
You might not want to, because it might know that you are doing something bad to it and say things like, "STOP!! NO!! DEAR GOD!! YOU'RE HURTING ME!!" I don't know if I could keep kicking anything that reacted in horror to me. But that doesn't make it immoral to build something and then make what you built do what you want it to do.
But how could it be moral to force it into doing things it doesn't want to?
Desperate Measures
29-09-2005, 02:17
I'm not really sure if my response was the sort that can be attacked point for point, because I really only had one. Basically, it is a machine, and no matter how intelligently it operates, it can never have life as we know and feel it. There isn't a moral dilemma except for the one you allow yourself to feel.
Ritlina
29-09-2005, 02:19
OK, listen, you need to specify what you mean by "morals"; everyone has different morals. Frankly, if it was a female android with the physical features of a human, I would buy one myself! :D (NO FLAME OR TROLL INTENDED!)
Desperate Measures
29-09-2005, 02:21
I think the morality question comes into play because an AI that'd be compliant in following orders could fall into the hands of those with less than desirable intentions, who would turn it into a slave to do their bidding.
Wouldn't that also make it immoral to manufacture a gun?
Neo-Anarchists
29-09-2005, 02:22
I'm not really sure if my response was the sort that can be attacked point for point, because I really only had one. Basically, it is a machine, and no matter how intelligently it operates, it can never have life as we know and feel it. There isn't a moral dilemma except for the one you allow yourself to feel.
Ah, but you yourself have just shown part of the moral dilemma.
I don't see why it is impossible for a machine to have emotion, or life, or any of those other things. Surely, if humans can experience these things while operating through the physical laws and mechanisms of our universe, other things could as well?

Small tangent:
Do you believe in the soul?
If so, that might explain some things I didn't understand.
Desperate Measures
29-09-2005, 02:27
Actually, I'm not sure if I believe in a soul. But the idea that something could be created that would actually perform life functions goes, to me, beyond AI. Seeing as it could no longer be called Artificial Intelligence and would have to be called True Intelligence, I feel we would be operating on grounds further from computer technology and more in the realm of biology. Can man create life? If he did, it wouldn't be able to operate like a computer would.
Alagos
29-09-2005, 02:28
Hmm... this is a lot like I, Robot. I think that if something went wrong it could be really dangerous. But... I dunno. I guess it really depends. You know, you can't plan for everything. I just wouldn't make AI too smart or strong.
Zatarack
29-09-2005, 02:29
No. Because in doing so, you deprive it of free will.
Messerach
29-09-2005, 02:30
This is pretty tricky, but I don't think it'd be immoral to create it with the desire to do what it is told. Presumably this would just be one of many desires, so you are only guaranteed to be obeyed if you ask for something positive or benign. It might enjoy making you a cup of tea, but if you ask it to kick your neighbour's dog, on balance it would rather not. If you made the enjoyment overpoweringly strong, it would be immoral, as it would be very easy to make the AI feel guilty.
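To put toy numbers on that balance (all invented, just to illustrate the point):

    desires = {"pleasing_the_asker": 5, "not_hurting_animals": 8}

    def net_pleasure(satisfied, violated):
        # Sum what the order gratifies, minus what it tramples on.
        return sum(desires[d] for d in satisfied) - sum(desires[d] for d in violated)

    net_pleasure(["pleasing_the_asker"], [])                       # tea: +5, happily obeys
    net_pleasure(["pleasing_the_asker"], ["not_hurting_animals"])  # kick the dog: -3, refuses

Crank "pleasing_the_asker" up to 50 and the dog gets kicked every time, which is where I'd say it turns immoral.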

Note: I don't hold humans sacred at all, so I really don't care that the AI is artificial. I think we are special, but only because our brains have evolved to such a level of complexity that we can have things like art and philosophy. To me, an AI close enough to imitating us would not necessarily have any less worth.
Messerach
29-09-2005, 02:38
No. Because in doing so, you deprive it of free will.

I don't agree; free will is not the absolute that it is often made out to be. Humans already have a whole range of desires that affect our behaviour. Psychologists have noticed an experimenter effect, where participants tend to try to please the experimenter or behave in the way they think they are 'supposed' to, and this is completely independent of reward. Some people do enjoy being told what to do, as this gives them structure and transfers the responsibility for their actions to the person giving the orders. Other people resent being told what to do, and may desire not to do it even if the request was reasonable or the person has a legitimate reason to give orders. Free will naturally has constraints.

If you created an AI with no desires so that it could exercise pure free will, what would be the point? Free will is only valuable if you desire it.
Imperial Dark Rome
29-09-2005, 03:46
It must be immoral because Satanism strongly supports creating artificial intelligences. Hahaha. We believe AI will solve many (if not all) problems.

It's number four of our Pentagonal Revisionism: A Five-Point Program, aka our goals.

"4. Development and production of artificial human companions—The forbidden industry. An economic “godsend” which will allow everyone “power” over someone else. Polite, sophisticated, technologically feasible slavery. And the most profitable industry since T.V. and the computer."

~Satanic Reverend Medivh~
Phasa
29-09-2005, 06:41
How about if you programmed it to be happy no matter what it was doing? It would be just as content following orders as doing anything else. Would that change the morality issue?
Callisdrun
29-09-2005, 06:54
It would be just as moral to create a manically depressed AI.

Like in Hitchhiker's Guide to the Galaxy?
The WYN starcluster
29-09-2005, 17:59
It seems implicit in this discussion that creating this form of AI is but the first step in creating a rabid, super-Nazi, perfect army for world conquest.

So many people have this trait (flaw?) that you are not really creating anything inhuman. Nothing new under the sun.

Silly analogies:
---
[Wife] - Time to get up & go to work dear.
[Husband] - Yes dear.
---
Or how about:
---
[Husband] - Git in there & clean them dishes woman!
[Wife] - What? No Nookie tonight? You clean 'em!
[Husband] - ( defeated yet HAPPY ) Yes dear.
---
You could give *me* an order to "Go have sex with that supermodel over yonder!" & *wham* gleeful obedience?

Work with me here - I'm sneaking up to a point.

When I was younger you would not have gotten past "go have sex" before my clothes were off. Today, while instantly obeying that order is gonna be pleasant, you are *not* going to get that obedience. You are going to get a load of questions: "Who is she?" "Who is she seeing now?" "Where has she been?" "What drugs has she been doing?" "Does she speak English?" "Could I just have her drugs instead?"

IF this AI we are talking about is really alive, not an automaton, then it will be capable of self-modifying. Or evolving. Or, if I may say it this way, it will be able to Grow Up.

Ultimately this AI could have a personality variously described as "friendly," "easy going," "likable," & so on. Maybe even "dependable." All of this without he / she / even "IT" being some nightmarish horror. The moral question here is not in the creation. It's in "Who gets to be the parent?"
Vegas-Rex
29-09-2005, 18:19
I've taken a while to mull this issue over, but I think I've got an analogy: making the AI enjoy obedience is really not morally different from, for example, making it gay. In both cases it gets pleasure in ways that the majority of humans do not, but so long as it is getting pleasure, this is not really a problem. I think it would be much more immoral to create an AI to serve you without designing it to enjoy service, as it would be in constant displeasure.
Archipellia
29-09-2005, 19:08
Has anyone here read the 'Draka' books by S.M. Stirling? They're about an alternate history where the Loyalists from the American Revolution are resettled in South Africa, where they develop a culture of militaristic slavers. Eventually, they conquer the world, bringing everyone who isn't one of them 'under the yoke', in other words turning them into slaves.

Near the end of book 3, 'The Stone Dogs', they start genetically engineering themselves and their slaves, resulting, in book 4, 'Drakon', in a world where a race of immortal superhumans rules over a race of natural slaves, the servus. They are designed to enjoy, even crave, being ordered around by the Draka, and in fact wouldn't be able to imagine a world without masters. Yes, they're happy, but it's still a thoroughly creepy concept.

Personally, I'd say slavery is slavery, whether you're made to love it or not.
Call to power
29-09-2005, 19:23
You know, you do get joy from following orders (though not as much as from not following them), so why not, even if we do exaggerate it a tad?

Also, I think it would be immoral not to use the technology, because stopping the pace of human civilization (note the word civilization, since I don't care about genetic tailoring) is the greatest crime in my book.

Is Draka good? Because it sounds like a good read (even if reading is lame).
Kryozerkia
29-09-2005, 19:35
Wouldn't that also make it immoral to manufacture a gun?
Yes, I suppose it does. I didn't actually say that, but you are right about it in a roundabout kind of way.
Archipellia
29-09-2005, 19:37
Is Draka good? Because it sounds like a good read (even if reading is lame).
Depends on how strong your stomach is I guess. They can get violent. Plus, there's a lot of sex in them.
Willamena
29-09-2005, 20:12
I answered that "Yes" it would be moral (in that it is not immoral) to create such an AI.

To do so to a human, though, would be immoral.
Desperate Measures
29-09-2005, 20:13
Like in Hitchhiker's Guide to the Galaxy?
Someone finally got it!