NationStates Jolt Archive


A thought on civilization

South Lizasauria
14-08-2008, 04:55
Suppose that in the future mankind has mastered space travel and has found many barren, lifeless worlds that are rich in minerals. By then humankind would have already secured many resource nodes, so as an experiment they create AI robots designed to take in and refine minerals, using them to repair damage and to build other robots in their likeness. As mankind watches from a distance, these AIs begin to construct a civilization. A decade later, however, mankind discovers a chilling truth: they too were engineered and designed by aliens who were running a similar experiment, only with bio-engineering instead of machinery (obviously).

So, taking that into consideration, does that make the AIs like people? And if the above were true, how would it affect your views?

Discuss.
Bann-ed
14-08-2008, 04:58
The AI could never be people. They can be similar to people, however. A lot of this depends on whether or not you believe humans have souls, or something that makes them more than a biological mass. However, since you basically outlined that we were created as an alien experiment, we probably don't have souls and are then in a similar situation to the AI robots.

This would not change my views really.
Science has failed to do that much so far.
Free Soviets
14-08-2008, 04:59
is the question whether constructed things are persons, or is it whether computers are persons like bio-engineered people are?
South Lizasauria
14-08-2008, 05:00
The AI could never be people. They can be similar to people, however. A lot of this depends on whether or not you believe humans have souls, or something that makes them more than a biological mass. However, since you basically outlined that we were created as an alien experiment, we probably don't have souls and are then in a similar situation to the AI robots.

This would not change my views really.
Science has failed to do that much so far.

But what if those robots were designed to have minds, emotions, personalities, and intelligence like humans? What if mankind purposely made these experiments sentient, as the aliens did with them?
Free Soviets
14-08-2008, 05:02
The AI could never be people.

why not?
Liminus
14-08-2008, 05:02
*shrug* I consider myself a functionalist in terms of philosophy of mind and such. They'd be "people" as much as we are people. Self-editing sets of data with such and such parameters is all it takes for personhood, imo.
Bann-ed
14-08-2008, 05:04
why not?

Maybe I should have said humans. How would you define a 'person' or 'people'?
South Lizasauria
14-08-2008, 05:16
is the question whether constructed things are persons, or is it whether computers are persons like bio-engineered people are?

The question is whether or not sentient machines created by mankind would be equal to a human.

The part about mankind discovering they were bio-engineered is the second half of the question. How would that discovery change things?
Gartref
14-08-2008, 05:19
How would that discovery change things?

It means that denying their equality would make us hypocrites - and humans would never be that!
Lunatic Goofballs
14-08-2008, 05:19
Suppose that in the future mankind has mastered space travel and has found many barren, lifeless worlds that are rich in minerals. By then humankind would have already secured many resource nodes, so as an experiment they create AI robots designed to take in and refine minerals, using them to repair damage and to build other robots in their likeness. As mankind watches from a distance, these AIs begin to construct a civilization. A decade later, however, mankind discovers a chilling truth: they too were engineered and designed by aliens who were running a similar experiment, only with bio-engineering instead of machinery (obviously).

So, taking that into consideration, does that make the AIs like people? And if the above were true, how would it affect your views?

Discuss.

I would look for the Goofball robot and together we'd plot. *nod*
Anti-Social Darwinism
14-08-2008, 06:26
Not been watching BSG, have we.

Also, never seen Space: Above and Beyond either, have you.

Or Star Trek: The Next Generation.

Or 2001: A Space Odyssey.

Or Andromeda.

Or read any Isaac Asimov or Robert Heinlein or many, many, many other sci fi authors.

They all address this topic or similar topics. They all come up with similar responses - AIs that show self awareness are defined as "human."
Skalvia
14-08-2008, 06:30
Don't worry, I trust Skynet...
Free Soviets
14-08-2008, 06:42
AIs that show self awareness are human.

wouldn't a computer intelligence stand a good chance of being almost totally alien? personhood, sure, but i don't know about human.
Anti-Social Darwinism
14-08-2008, 06:48
wouldn't a computer intelligence stand a good chance of being almost totally alien? personhood, sure, but i don't know about human.

It's a broad definition of human. You could read White's Hospital Station. While the science is iffy at best, the theme of the story is about what it takes to define human. Also, Data the android on ST:TNG is a computer intelligence and is, ultimately, defined as human. Just as the ship, Andromeda, is human and even the Borg partake of a certain "humanness." Mycroft Holmes, the computer in The Moon is a Harsh Mistress, is so human he cracks jokes and starts revolutions. The definition, broad as it is, includes alien intelligences as "human" so long as they are both sapient and sentient.
South Lizasauria
14-08-2008, 15:14
Not been watching BSG, have we.

Also, never seen Space: Above and Beyond either, have you.

Or Star Trek: The Next Generation.

Or 2001: A Space Odyssey.

Or Andromeda.

Or read any Isaac Asimov or Robert Heinlein or many, many, many other sci fi authors.

They all address this topic or similar topics. They all come up with similar responses - AIs that show self awareness are defined as "human."

Cylons don't count as much, since only the leadership castes seem to have acquired personhood; the rest are drones that maintain the hive. In 2001: A Space Odyssey humanity wasn't engineered, only watched and observed. I have yet to see the Star Trek and Andromeda episodes you are referring to.
Liminus
14-08-2008, 18:39
Cylons don't count as much, since only the leadership castes seem to have acquired personhood; the rest are drones that maintain the hive. In 2001: A Space Odyssey humanity wasn't engineered, only watched and observed. I have yet to see the Star Trek and Andromeda episodes you are referring to.

Cylon Centurions have an inhibitor that, in the most recent episodes, was removed by a Cylon faction, and so they are now all self-aware.

Still, though, unless you are positing the existence of a soul, I don't see how you can define personhood in anything other than functionalist terms. If you are hinging your argument on the existence of a soul, then you have a different view of reality from mine and there is no room for debate, as belief in a soul seems to disregard, well, empiricism and all that nice stuff that is necessary for real debate.

It's a question similar to the Chinese Box/Room (http://en.wikipedia.org/wiki/Chinese_room) thought experiment, I'd say.
Dontgonearthere
14-08-2008, 18:53
One might define simply being self-aware as having a 'soul', whether in the religious sense or not.
I do think it could well be possible for an artificial (in the sense of 'non-organic') sentience to exist. It would probably require a computer far more advanced than anything we have today, but even now we've managed to at least imitate intelligence to a degree. Just don't ask any modern AIs to, say, write a book and expect it to be any good.

As to humanity being designed by aliens, I don't really think it'd have that big of an effect in the long term. In the short term a few religions based around alien worship might pop up, and probably some people would commit suicide/fly off into space/whatever, but the majority of humanity would probably say, "Alright, how does it affect ME?" and then realize that it doesn't really change anything. Unless the aliens came back, in which case we'd have an awesome science fiction space battle or something.
Which would be awesome.

On a side note, did you recently read the Uplift series? It deals with a somewhat similar concept (minus the sentient machines, who are only briefly mentioned, more's the pity).
Ashmoria
14-08-2008, 20:45
i would expect them to worship us as we worship our creators.

they wouldn't be people; they would be creations who are people to each other.
Kyronea
15-08-2008, 02:33
Any AI that reaches sapient intelligence will most likely have some semblance of emotions and morality as well, either because it was programmed into the base coding or because it would develop automatically.

Of course, the computing technology required for a sapient AI would mean said AI would be able to grow just like a human, so obviously current computer systems couldn't accomplish it.

But once an AI reaches sapient intelligence, that AI is a person, as far as I'm concerned, just as any other sapient being would be, whether it's a human or some sort of alien life form or something artificial.
Antilon
15-08-2008, 03:43
I would accept any sentient entity (biological or otherwise) as an equal. As a human being, who am I to decide any less or more?
Ascelonia
15-08-2008, 03:57
Suppose that in the future mankind has mastered space travel and has found many barren, lifeless worlds that are rich in minerals. By then humankind would have already secured many resource nodes, so as an experiment they create AI robots designed to take in and refine minerals, using them to repair damage and to build other robots in their likeness. As mankind watches from a distance, these AIs begin to construct a civilization. A decade later, however, mankind discovers a chilling truth: they too were engineered and designed by aliens who were running a similar experiment, only with bio-engineering instead of machinery (obviously).

So, taking that into consideration, does that make the AIs like people? And if the above were true, how would it affect your views?

Discuss.


Only if AIs can love.

Sure, thinking is cool, but can they love?
Ascelonia
15-08-2008, 03:58
I would accept any sentient entity (biological or otherwise) as an equal. As a human being, who am I to decide any less or more?

As a human, I consider everything else lesser. Seriously, this isn't racism/bigotry against other species. This is simple logic. We can mate with people of other "races"/skin colors, but we cannot mate with other animals or aliens (if we could, with their approval, I would approve of that entity).
Antilon
15-08-2008, 04:24
Only if AIs can love.

Sure, thinking is cool, but can they love?

I agree with how the Matrix defined "love."
Neo: Is that what you're doing here?
Kamala: Rama, please
Rama-Kandra: I do not want to be cruel, Kamala. He may never see another face for the rest of his life.
Neo: I'm sorry. You don't have to answer that question.
Rama-Kandra: No. I don't mind. The answer is simple. I love my daughter very much. I find her to be the most beautiful thing I've ever seen. But where we are from, that is not enough. Every program that is created must have a purpose; if it does not, it is deleted. I went to the Frenchman to save my daughter. You do not understand.
Neo: I just have never...
Rama-Kandra: ...heard a program speak of love?
Neo: It's a... human emotion.
Rama-Kandra: No, it is a word. What matters is the connection the word implies. I see that you are in love. Can you tell me what you would give to hold on to that connection?
Neo: Anything.
Rama-Kandra: Then perhaps the reason you're here is not so different from the reason I'm here.
Cameroi
15-08-2008, 09:19
civilization is a myth, but the absence of the dominance of aggressiveness is a really good idea, at ANY "tech level".

as for the other thing: an awareness is an awareness, whatever artifact or life form it may happen to occupy, or even whether it occupies any sort of physical form or object at all.
Sleepy Bugs
15-08-2008, 10:14
I liked Stanislaw Lem's take in The Invincible, where "AI" devolves to the most rudimentary system possible in this universe of Tooth and Claw we inhabit.

My Little Pony universes will probably turn out dif'rently.
Vydro
15-08-2008, 11:08
As long as the AI doesn't become self-aware and take over.

Because then we have to wait for a suitable martyr before we can launch the holy Jihad to destroy all machines made in the likeness of the human mind, and that might take a thousand years!
South Lizasauria
16-08-2008, 07:19
As long as the AI doesn't become self-aware and take over.

Because then we have to wait for a suitable martyr before we can launch the holy Jihad to destroy all machines made in the likeness of the human mind, and that might take a thousand years!

But in the hypothetical situation, they were self-aware upon activation. How can anyone be sentient yet not self-aware?