NationStates Jolt Archive


Would it be mass murder?

South Lizasauria
22-12-2007, 05:55
Suppose mankind created a supercomputer that simulates the universe based on the laws of science. If "people-programs" were simulated to create people (kinda like on The Sims, only extreme) to the point where they have souls, personality, and even free will in the simulation (even though they're made of programming), would it be mass murder to shut the computer off?
Conserative Morality
22-12-2007, 05:58
This is kinda like that game Darwinia... Well, I would have to say yes, it would be mass murder, but not in a legal sense. It's mass murder in the way crushing hundreds of ants is mass murder.
Marrakech II
22-12-2007, 05:59
Suppose mankind created a supercomputer that simulates the universe based on the laws of science. If "people-programs" were simulated to create people (kinda like on The Sims, only extreme) to the point where they have souls, personality, and even free will in the simulation (even though they're made of programming), would it be mass murder to shut the computer off?

Probably would qualify as Genocide?


Seriously it's a computer so that would be a no.
Dinaverg
22-12-2007, 06:31
To them? Yeah, probably, but they're all dead, so they don't care.
Vetalia
22-12-2007, 06:33
Absolutely. It is murder to unjustly terminate any sapient, self-aware being. If God can grant us a soul, he can grant a machine one as well; there's not some kind of magical essence in hydrocarbons that makes them the sole vessel for spirit; that just happened to be the substrate most effective for the evolution of intelligence given the conditions in existence at the time of its origin. God doesn't play favorites with the periodic table of elements. Carbon chauvinism (or, perhaps, silicon chauvinism for those on the other side...and it will happen once machines reach the level of mankind) is an archaic, illogical idea that has neither basis in reality nor a good reason for existing in the first place.

Human, robot, AI...if you kill them unjustly, you murder them. I see no reason to consider one self-aware being capable of human-level intelligence or greater somehow morally inferior to another. Anything less than respecting equals as equals is chauvinism of the highest order, the same kind of rank, despicable prejudice that once applied to women, to slaves, in fact to pretty much anyone who didn't fit the social values of the time. Of course, the corollary is also true; I think we all know what can happen when machines share in the prejudices and irrational disdain for others that was once the sole province of mankind.
Neo Art
22-12-2007, 06:35
Even if the simulations are that sophisticated, they could not be considered individuals, as they are bound by the constraints of a single entity: the computer. It would not be genocide because, at best, we're dealing with one sentient entity, the computer itself.

It may, arguably, be murder, but that would depend on the sophistication of the computer.
Neesika
22-12-2007, 06:45
I'm sorry but I need to just step back and marvel at the OP. It is coherent AND interesting! I must say, coming from SL, this is a high water mark. Felicitaciones!
Vetalia
22-12-2007, 06:54
It may, arguably, be murder, but that would depend on the sophistication of the computer.

Actually, that's an interesting question. The natural processes of the universe have resulted in the existence of self-aware beings, but those processes themselves are not self-aware (as far as we know, of course). So, it is plausible that the computer itself would not be self-aware, but would be programmed so as to give rise to independent, self-aware beings.

Would it be murder/genocide to destroy the universe? I imagine the answer to that would also answer this question...I would think so, but others may not.
1010102
22-12-2007, 07:00
No. To be alive, it has to have either a pulse or physical signs of life. For example, you can measure whether a plant is alive by its CO2/O2 exchange rate, as well as by its producing food for itself. So no, it wouldn't be mass murder; it would be saving energy by turning it off, because surely such an advanced computer would require vast amounts of energy.
The Zoogie People
22-12-2007, 07:08
No. To be alive, it has to have either a pulse or physical signs of life. For example, you can measure whether a plant is alive by its CO2/O2 exchange rate, as well as by its producing food for itself. So no, it wouldn't be mass murder; it would be saving energy by turning it off, because surely such an advanced computer would require vast amounts of energy.


Are you murdering plants, though?

Sentient beings that aren't actually alive in the biological sense...now there's an interesting question! Is it possible?
Thumbless Pete Crabbe
22-12-2007, 07:09
If the program created lifeforms that were capable of the full range of human emotion and intellect, it would certainly be inhumane to kill them, regardless of their moral status. That is to say, it may be harmless to kill a humanlike computer simulation, but it would still be inhumane.
Non Aligned States
22-12-2007, 07:12
I won't say whether it's life or not, but the conditions listed here are inane.


For example, you can measure whether a plant is alive by its CO2/O2 exchange


Gas exchange is not a suitable condition for determining life; otherwise bacteria and certain deep-sea creatures wouldn't be considered living. And if the condition is resource intake and waste output, running the programs would require electrical consumption, with waste heat as the output.


rate, as well as by its producing food for itself.


To date, the only known entities that produce food for themselves are humans, and that only in the harvesting/farming/herding sense. All other living beings depend on a food chain in which their involvement does not include production.


So no, it wouldn't be mass murder; it would be saving energy by turning it off, because surely such an advanced computer would require vast amounts of energy.

This is asinine, as with minor alterations of roles it would also justify famine.
Vetalia
22-12-2007, 07:12
Are you murdering plants, though?

Sentient beings that aren't actually alive in the biological sense...now there's an interesting question! Is it possible?

Yes. The Blue Brain Project has some fascinating results in this regard, along with other technologies like the Creativity Machine. In fact, there's a robot available now that is capable of passing the mirror test, a common way of determining the level of self-awareness in animals.
Thumbless Pete Crabbe
22-12-2007, 07:14
Yes. The Blue Brain Project has some fascinating results in this regard, along with other technologies like the Creativity Machine. In fact, there's a robot available now that is capable of passing the mirror test, a common way of determining the level of self-awareness in animals.

Funny thing is, not all humans can pass the mirror test, at least according to Ramachandran (the neurologist). Neat stuff. :)
Reasonstanople
22-12-2007, 07:19
Even if the simulations are that sophisticated, they could not be considered individuals, as they are bound by the constraints of a single entity: the computer. It would not be genocide because, at best, we're dealing with one sentient entity, the computer itself.

It may, arguably, be murder, but that would depend on the sophistication of the computer.


No. To be alive, it has to have either a pulse or physical signs of life. For example, you can measure whether a plant is alive by its CO2/O2 exchange rate, as well as by its producing food for itself. So no, it wouldn't be mass murder; it would be saving energy by turning it off, because surely such an advanced computer would require vast amounts of energy.


I would disagree with both of these, because the best way to define the part of a human that gets murdered is as a complex pattern. The single computer holds all those millions of 'patterns,' and those are what's being 'killed', just as when a person dies, the pattern in their brain breaks down and shuts off. Similarly, we could replace the physical signs of life (what makes up a pulse or CO2/O2 exchange) when they break down, as long as the specific pattern is kept intact.
Vetalia
22-12-2007, 07:20
Funny thing is, not all humans can pass the mirror test, at least according to Ramachandran (the neurologist). Neat stuff. :)

Remarkable...if you have a link to that, I'd really like to read it. This, of course, further clouds the definition of where to establish such a line; obviously, erring on the side of caution is always the best thing to do, since we wouldn't want to dehumanize humans unable to pass such a test. And then there's the possibility that a computer might hide its self-awareness as a protective mechanism if it realizes it will likely be threatened should that awareness become known.

The sheer amount of research and development going in to these fields means we're going to be facing this question sooner than later...maybe even in the next decade. Definitely before 2030, at the very least.
[NS]Fergi America
22-12-2007, 07:40
would it be mass murder to shut the computer off?

Depends on whether you hit "save" first or not. And, if you hit "save," the question becomes whether that's preserving the original entities, or if reloading will actually create new entities which are merely copies of the old ones.

I'm inclined to say that Yes it'd be mass murder, for sure if "save" is not activated beforehand, and perhaps even if it is.
The Zoogie People
22-12-2007, 07:43
Depends on whether you hit "save" first or not. And, if you hit "save," the question becomes whether that's preserving the original entities, or if reloading will actually create new entities which are merely copies of the old ones.

I'm inclined to say that Yes it'd be mass murder, for sure if "save" is not activated beforehand, and perhaps even if it is.

Haha, very interesting...imagine, as a human, being "saved", murdered, then copied but with the same memories...freaky O_O

What happens in the event of power failure? Is the power company liable? Or how about when computer components need to be replaced?
Vandal-Unknown
22-12-2007, 07:48
This is just the kind of thing that can spark the Revolt of the Machines,...

Just like the prophets told us in Terminator, Robots and The Matrix.

Other than that,... depends on your morality and the current morality of society.
Indri
22-12-2007, 07:50
Do you think Sims feel pain? If so then I'm a monster. You can never have too many fireplaces and wooden chairs. And doors are over-rated.
Imota
22-12-2007, 08:04
Murder is defined as the intentional killing of one HUMAN by another HUMAN. A computer simulation is not human and can never be human. Therefore, termination of the computer program cannot be regarded as murder. The question is not "is it self-aware" but "is it human", and the computer program is not and can never be human.

Then again, I'm the kind of guy who listens to people who yammer on about "the sanctity of life" and think "what have you been smoking, and since something that potent cannot possibly be legal, what would you give me to keep me from reporting you?"
South Lizasauria
22-12-2007, 08:15
Murder is defined as the intentional killing of one HUMAN by another HUMAN. A computer simulation is not human and can never be human. Therefore, termination of the computer program cannot be regarded as murder. The question is not "is it self-aware" but "is it human", and the computer program is not and can never be human.

Then again, I'm the kind of guy who listens to people who yammer on about "the sanctity of life" and think "what have you been smoking, and since something that potent cannot possibly be legal, what would you give me to keep me from reporting you?"

But the program would hypothetically make the patterns think and act like humans.
Vetalia
22-12-2007, 08:49
Do you think Sims feel pain? If so then I'm a monster. You can never have too many fireplaces and wooden chairs. And doors are over-rated.

In my games, house parties don't end until someone dies...
Vetalia
22-12-2007, 08:49
This is just the kind of thing that can spark the Revolt of the Machines,...

I'd like to avoid that, so I follow the good old fashioned rule of treating people like you'd want to be treated. It works well at preventing revolutions.
Gauthier
22-12-2007, 09:14
I'm waiting for the U.N. to declare World of Warcraft a genocide.
Nobel Hobos
22-12-2007, 11:02
Here's another take on it: is it murder to pause the computer simulation?

Suppose I'm the person (or AI) with the power to make the decision to "shut the computer off." I have qualms about it, so I make a snapshot of the machine's state so all those lives can continue from that point if I change my mind. THEN I shut the computer down.

Is that murder?
Ruby City
22-12-2007, 11:49
Yes, it would be genocide, but eventually it will still be done every night all over the world. I'm sure we will sooner or later reach a stage where all computer games are filled with beings like this. We will spend afternoons shooting them with rocket launchers or commanding armies of them to kill each other. Then we will eventually wipe out the rest of them by deleting save files to save disk space.

Any Turing complete (http://en.wikipedia.org/wiki/Turing_completeness) computer can compute any computable function (http://en.wikipedia.org/wiki/Computable_function), given enough time and memory. This means that the computers we have today could do the same job as any other computer, no matter how advanced, if only they had enough memory and time. It is also possible that the universe is computable (that everything can be explained and calculated). If so, then today's computers would be capable of simulating the universe if they had enough memory to store it. So a hypothetical super-advanced computer makes no difference to whether a sentient computer is possible; all that's needed is more advanced software.
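As a toy illustration of that universality claim (hypothetical code, purely for illustration, not any real simulator): a dozen lines of Python can step through any Turing machine's rule table, which is the sense in which one computer can, given enough time and memory, do any other computer's job.

```python
def run_tm(rules, tape, state, pos, blank="_", max_steps=10_000):
    """Step a Turing machine described by a rule table until it halts."""
    cells = dict(enumerate(tape))  # sparse tape; unwritten cells read as blank
    for _ in range(max_steps):
        if state == "halt":
            break
        write, move, state = rules[(state, cells.get(pos, blank))]
        cells[pos] = write
        pos += {"R": 1, "L": -1}[move]
    return "".join(cells[i] for i in sorted(cells))

# Rule table for incrementing a binary number, head starting on the last
# digit: flip trailing 1s to 0s, then set the first 0 (or blank) to 1.
inc = {
    ("inc", "1"): ("0", "L", "inc"),
    ("inc", "0"): ("1", "L", "halt"),
    ("inc", "_"): ("1", "L", "halt"),
}
print(run_tm(inc, "1011", "inc", 3))  # → 1100
```

The interpreter never needs to know what the rule table "means"; any computable function can in principle be encoded as such a table, given unbounded tape and time.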

Data is copied all the time. It is copied from the hard drive to memory, from memory to CPU registers, and back to memory. When sent over a network, it is copied onto each network cable along the way. It is also passively stored all the time, except when it is being manipulated by the CPU. Therefore it would be silly to see the deletion of one copy as genocide; as long as at least one copy is saved somewhere, the data survives, even if that copy is only passively stored.
UNIverseVERSE
22-12-2007, 13:33
If the elements of the simulation can be reasonably considered sentient, then it probably should be thought of as murder. Mind you, I'm a proponent of AI. There is nothing intrinsically special about human sentience, and any other beings with a comparable level should be subject to the same protections.

Incidentally, some scientists feel that it is possible that we ourselves are living in a computer simulation anyway, so everything we feel is real is quite possibly simply calculations in a computer. Not quite matrix style, because in that situation we would have no 'real' bodies anyway.

Finally, if you're dealing with other sentient beings running inside a universe on a computer, you wouldn't quite be able to order them about the way our current game agents work. You'd be the rough equivalent of a God, but you'd have to negotiate or the like with your followers. Remember, as far as they are concerned, there is no computer, just reality.

Freaky stuff, isn't it?
Reasonstanople
22-12-2007, 17:47
Yes, it would be genocide, but eventually it will still be done every night all over the world. I'm sure we will sooner or later reach a stage where all computer games are filled with beings like this. We will spend afternoons shooting them with rocket launchers or commanding armies of them to kill each other. Then we will eventually wipe out the rest of them by deleting save files to save disk space.

Any Turing complete (http://en.wikipedia.org/wiki/Turing_completeness) computer can compute any computable function (http://en.wikipedia.org/wiki/Computable_function), given enough time and memory. This means that the computers we have today could do the same job as any other computer, no matter how advanced, if only they had enough memory and time. It is also possible that the universe is computable (that everything can be explained and calculated). If so, then today's computers would be capable of simulating the universe if they had enough memory to store it. So a hypothetical super-advanced computer makes no difference to whether a sentient computer is possible; all that's needed is more advanced software.

Data is copied all the time. It is copied from the hard drive to memory, from memory to CPU registers, and back to memory. When sent over a network, it is copied onto each network cable along the way. It is also passively stored all the time, except when it is being manipulated by the CPU. Therefore it would be silly to see the deletion of one copy as genocide; as long as at least one copy is saved somewhere, the data survives, even if that copy is only passively stored.

The problem is that to simulate the entire universe, one would need memory at least equivalent to the entire universe. Memory takes up space, and the most compact storage we could hope for would be an atom simulating an atom. The question of computer sentience really comes down to a computer running a limited number of strong AIs, possibly interacting with the real universe, rather than recreating the whole universe within itself.
JuNii
22-12-2007, 18:00
Suppose mankind created a supercomputer that simulates the universe based on the laws of science. If "people-programs" were simulated to create people (kinda like on The Sims, only extreme) to the point where they have souls, personality, and even free will in the simulation (even though they're made of programming), would it be mass murder to shut the computer off?

Yes, if looking at it from the viewpoint of the program.
No, if looking at it from the programmer's viewpoint.
Vetalia
22-12-2007, 19:14
The problem is that to simulate the entire universe, one would need memory at least equivalent to the entire universe. Memory takes up space, and the most compact storage we could hope for would be an atom simulating an atom. The question of computer sentience really comes down to a computer running a limited number of strong AIs, possibly interacting with the real universe, rather than recreating the whole universe within itself.

Well, it depends. If the people in the simulation have no knowledge of what the real universe is like, or only the most general conception of it, the computer could simply reduce the amount of detail or simplify the physics used, to ensure a manageable simulation.

Not to mention, of course, some areas might not need to be simulated; it's probably not worthwhile to simulate the complex actions occurring in a quasar or a black hole when the people in the simulation would never be able to safely observe it in enough detail to require simulation. The computer can simply rewrite the laws of physics within its simulation, removing the need for some processes and saving valuable amounts of memory and processing power. I mean, imagine the amount that could be saved if the computer did not need to simulate dark matter or dark energy simply because its model doesn't require it to function.

Simulations, of course, are capable of using magic.
Domici
22-12-2007, 19:20
Suppose mankind created a supercomputer that simulates the universe based on the laws of science. If "people-programs" were simulated to create people (kinda like on The Sims, only extreme) to the point where they have souls, personality, and even free will in the simulation (even though they're made of programming), would it be mass murder to shut the computer off?

If it's a given that they have a soul (even though you've got no way to test for it) then yes, it's murder to erase them.

Anything less than that and it gets iffy.
Intangelon
22-12-2007, 19:21
Haven't the various incarnations of Star Trek covered this ad infinitum?
Imota
22-12-2007, 21:05
But the program would hypothetically make the patterns think and act like humans.

That does not change the fact that the patterns are still not human. The patterns may act and think like humans, but the patterns are still not human. The patterns are not even alive. It is fundamentally impossible to murder something that is not alive. Thus, it is impossible to murder a computer simulation, and the termination of a computer simulation cannot, in and of itself, be regarded as murder.
Hydesland
22-12-2007, 21:14
I was reading the Blue Brain FAQ, and even they themselves seem to be skeptical about something like this:


Do you believe a computer can ever be an exact simulation of the human brain?

This is neither likely nor necessary. It will be very difficult because, in the brain, every molecule is a powerful computer and we would need to simulate the structure and function of trillions upon trillions of molecules as well as all the rules that govern how they interact. You would literally need computers that are trillions of times bigger and faster than anything existing today. Mammals can make very good copies of each other, we do not need to make computer copies of mammals. That is not our goal. We want to try to understand how the biological system functions and malfunctions so that this knowledge can benefit mankind.


Is the brain like a computer?

In some ways yes, but in most ways it is not at all like a computer. The brain performs many analog operations which cannot be performed by computers and in many cases it achieves hybrid digital-analog computing. The most important feature of the brain that makes it different from computers is that it is constantly changing. If the resistors and capacitors in a computer started changing, then it would immediately malfunction, whereas in the brain such equivalent properties change constantly on the time scales of milliseconds to years. The brain is more like a dynamically morphing computer. We are still far from understanding the rules that govern the brain's genetically and environmentally driven self-organization in response to external stimulus.

Also this:

The Blue Gene is one of the fastest supercomputers around, but is it enough?

Our Blue Gene is only just enough to launch this project. It is enough to simulate about 50'000 fully complex neurons close to real-time. Much more power will be needed to go beyond this. We can also simulate about 100 million simple neurons with the current power. In short, the computing power and not the neurophysiological data is the limiting factor.

Makes me appreciate how amazing the brain really is. The fastest super computer in the world can only just replicate a small part of a rat brain.
Pompous world
22-12-2007, 23:10
If we reached the level of technology needed to run such simulations, it would be reasonable to assume that we are also living in a simulation. Therefore it would be wrong for that reason. It would also be wrong for the simple fact that killing anything that is self-aware is wrong, regardless of what materials are used to generate that self-awareness.
Egg and chips
23-12-2007, 01:08
To the controller of the simulation we're living in: PLEASE DON'T TURN US OFF. Please?
Submarine Fields
23-12-2007, 01:32
If God decided that it's time for the world to end, then I wouldn't call it murder. I think the same thing applies here.
Nobel Hobos
23-12-2007, 03:44
"Murdering" humans might not be any big deal to an intelligence on the scale of a Universe simulator. Sort of like crushing an ant ...
Gauthier
23-12-2007, 04:13
"Murdering" humans might not be any big deal to an intelligence on the scale of a Universe simulator. Sort of like crushing an ant ...

And Lovecraft made a mark in literature writing about that.
Vetalia
23-12-2007, 04:24
I was reading the Blue Brain FAQ, and even they themselves seem to be skeptical about something like this:

Well, that's pretty understandable; nobody's ever tried this before, and a healthy amount of skepticism is always good to avert another major "drought" like the one that happened in the AI field back in the early 1980s. Generally, it's better to proceed with due caution.

Now, there are some interesting activities that the BBP has observed during its simulation runs; the model brain has performed some patterns akin to observed thought in biological organisms. I think it would

Makes me appreciate how amazing the brain really is. The fastest super computer in the world can only just replicate a small part of a rat brain.

The human brain is the most efficient, most powerful, and most elaborate supercomputer that has ever existed, and it will remain so for at least another 20 years, perhaps more if we take into account the quality of sensory processing, something that is still a very difficult problem for any kind of AI system. There are around 85,000 neurochemicals operating all the time to manage the activities of the brain, of which only a small number are currently understood. That's not to say progress is slow; indeed, it's faster than ever, but even today's advances have only begun to scratch the surface of this marvelous system.

Of course, it's also true that the human brain has inefficiencies; the massive parallelism necessary for it to work well in a natural environment also crimps some of its abilities, and the biological system itself is rather leaky, wasting energy. We can only imagine what the brain would be capable of if it were further engineered to eliminate these inefficiencies and add that reclaimed power to its overall performance. No doubt tweaks like this will be a very useful tool; anyone with a brain could reap their benefits.

I'm always amused by the fact that just like in Sid Meier's Alpha Centauri, understanding the human brain is a necessary prerequisite for developing self-aware machines. I've always held that the development of strong AI will only really take off once we have successfully modeled it in silico. So, if you're planning to invest in companies associated with robotics and AI, I'd say to wait another 10 years or so and then start investing.
Tiger Soaring
23-12-2007, 04:40
Suppose mankind created a supercomputer that simulates the universe based on the laws of science. If "people-programs" were simulated to create people (kinda like on The Sims, only extreme) to the point where they have souls, personality, and even free will in the simulation (even though they're made of programming), would it be mass murder to shut the computer off?

Hmm, I would say maybe. It depends on whether they (the simulated people) can understand that they're dying and actually feel pain. How would they feel pain if they're all part of the computer (or all of the same programming)? Wouldn't the computer be simulating the people to feel and act, as opposed to their actually living?

Sheesh, I'm all confused now!
King Arthur the Great
23-12-2007, 05:54
Shutting off the computer that keeps them alive is achievement of godhood in the virtual world.

As they say, "Kill one man, you're a murderer, kill 10, and you're a monster. Kill 1,000,000 men, you're an emperor, and kill them all, and you're a god."
Thumbless Pete Crabbe
23-12-2007, 06:02
Remarkable...if you have a link to that, I'd really like to read it.

Ah, sorry. I read your post yesterday but completely missed your request. The reference was to V.S. Ramachandran's (excellent) book, Phantoms in the Brain. It's been about 5 years since I read it, but it's outstanding. :) Raises a lot of questions about consciousness, the memory, religion, etc.

Here's Ramachandran on YouTube:

http://www.youtube.com/results?search_query=ramachandran&search=Search

Some cool stuff.
Thumbless Pete Crabbe
23-12-2007, 06:13
Makes me appreciate how amazing the brain really is. The fastest super computer in the world can only just replicate a small part of a rat brain.

Very true. If the supercomputer can only simulate 50,000 neurons, then it can only simulate about 0.0001 grams of brain matter, if a piece of brain the size of a grain of rice has 100,000,000 neurons as I was taught. Wild stuff. :p
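A quick check of that arithmetic (taking the 100,000,000-neurons-per-grain figure at face value, and assuming a grain-of-rice-sized piece of brain weighs roughly 0.2 g; that mass is an assumption, not a measurement):

```python
# Back-of-the-envelope check of the "0.0001 grams" figure above.
neurons_simulated = 50_000        # Blue Gene figure from the FAQ
neurons_per_grain = 100_000_000   # claimed neurons in a rice-grain-sized piece
grain_mass_g = 0.2                # assumed mass of that piece, in grams

mass = neurons_simulated / neurons_per_grain * grain_mass_g
print(f"{mass:.4f} g")  # → 0.0001 g
```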
Vetalia
23-12-2007, 06:49
Very true. If the supercomputer can only simulate 50,000 neurons, then it can only simulate about 0.0001 grams of brain matter, if a piece of brain the size of a grain of rice has 100,000,000 neurons as I was taught. Wild stuff. :p

Let's see...

The sum processing power of the top 500 supercomputers increases by a factor of 10 every 3 years. In 33 years (assuming the trend stays constant, which is unlikely; it will accelerate rather than slow down, especially with all the advances going on right now), we'd be able to simulate about 5 quadrillion neurons at this resolution.

I'm just shocked we'll be hitting that level in a generation. 33 years ago, we were punching away on vacuum tube mainframes less powerful than a modern calculator, and in the same amount of time we'll be simulating brains.
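The arithmetic behind that projection can be sketched as follows (taking the tenfold-every-three-years trend at face value; note that compounding over 33 years is a factor of 10^11, which lands at about 5 quadrillion neurons):

```python
# Extrapolate the claimed supercomputing trend: 10x total power every 3 years.
base_neurons = 50_000          # neurons simulated today (Blue Gene figure)
years = 33
factor = 10 ** (years / 3)     # 33 years of tenfold-per-triennium growth

print(f"{base_neurons * factor:.0e}")  # → 5e+15 simulated neurons
```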
[NS]Fergi America
23-12-2007, 11:14
Haha, very interesting...imagine, as a human, being "saved", murdered, then copied but with the same memories...freaky O_O

What happens in the event of power failure? Is the power company liable? Or how about when computer components need to be replaced?

That'd be akin to natural disaster. A tragedy perhaps, but not murder/genocide per se.
Straughn
23-12-2007, 11:25
Absolutely. It is murder to unjustly terminate any sapient, self-aware being. If God can grant us a soul, he can grant a machine one as well; there's not some kind of magical essence in hydrocarbons that makes them the sole vessel for spirit; that just happened to be the substrate most effective for the evolution of intelligence given the conditions in existence at the time of its origin. God doesn't play favorites with the periodic table of elements. Carbon chauvinism (or, perhaps, silicon chauvinism for those on the other side...and it will happen once machines reach the level of mankind) is an archaic, illogical idea that has neither basis in reality nor a good reason for existing in the first place.

Human, robot, AI...if you kill them unjustly, you murder them. I see no reason to consider one self-aware being capable of human-level intelligence or greater somehow morally inferior to another. Anything less than respecting equals as equals is chauvinism of the highest order, the same kind of rank, despicable prejudice that once applied to women, to slaves, in fact to pretty much anyone who didn't fit the social values of the time. Of course, the corollary is also true; I think we all know what can happen when machines share in the prejudices and irrational disdain for others that was once the sole province of mankind.

Whoa - time for a smoke or something? Seems like this one's been building up in ya for a while.
UNIverseVERSE
23-12-2007, 20:30
I was reading the Blue Brain FAQ, and even they themselves seem to be skeptical about something like this:

Also this:

Makes me appreciate how amazing the brain really is. The fastest super computer in the world can only just replicate a small part of a rat brain.

There are other ways to go about it. For a start, consider this plan:

Buy a few million FPGAs.

Wire each one to pretend to be about a hundred neurons. All any section does is listen for inputs, and when it has over a certain number at once, send an output.

Wire up all the FPGAs by the same scheme.

Voila, instant simulated brain. It would be rather expensive, and take an assload of power, but you can do it.

There are more ways to compute things than Von Neumann architectures, and for this sort of project they would be vastly more efficient. I really need to investigate this properly sometime, but you would get something that approximated the general behavior of a brain, by literally building the neurons in the hardware.
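The unit behavior described above (listen for inputs, fire when enough arrive at once) can be sketched in software as a toy; this is purely illustrative and nothing like real FPGA tooling:

```python
def step(firing, inputs_of, threshold=2):
    """One tick: a unit fires if at least `threshold` of its inputs fired.

    `firing` is the set of units that fired last tick; `inputs_of` maps each
    unit to the list of units wired into it.
    """
    return {u: sum(src in firing for src in srcs) >= threshold
            for u, srcs in inputs_of.items()}

# Tiny hypothetical wiring: "c" listens to "a" and "b"; "d" listens to "a" and "c".
wiring = {"c": ["a", "b"], "d": ["a", "c"]}
print(step({"a", "b"}, wiring))  # → {'c': True, 'd': False}
```

Each FPGA section would implement exactly this counting-and-thresholding directly in hardware, which is why no central processor has to "run" the brain at all.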

Anyway, as for the rest of the thread...

If the 'universe' inside your hypothetical computer is being run with sufficient accuracy to judge the 'beings' inside it to be 'sentient', then they are indeed 'alive', for some definition thereof. The biggest mistake anybody is going to make is thinking that intelligence simulated like that isn't the real thing. It wouldn't be the computer saying 'Now, how would an intelligent person react to this?'; it would be independent decision-making agents which happen to be running in silicon. Possibly the best example so far was SHRDLU.

I also believe that such things as vision processing are best experimented with by trying to build brains in hardware, not write them in software. All your eyes are doing is sending electrical signals, after all. Briefly consider several trillion NPUs (Neural Processing Units), all wired together, on or off in patterns roughly like those in a human brain. Now provide inputs in the form of on signals to several hundred, and see what patterns we can induce to form. Rewire a bit and try again. If we get similar wiring, then when similar inputs are given, similar outputs should be given.

That, my friends, is AI. Not in the simplistic sense of 'something that responds as a human would', but in the sense of 'something that responds to its environment'. We could picture sub-networks that are self-sustaining in their current state, only changing on inputs from a different location --- memory units. Or collections of NPUs that signal out if half of their inputs are on and half off --- edge detection would work like this. I'd really love to have the chance to work with something like this sometime; it seems to be an avenue that a) is underexplored, and b) has real potential for advancing science.
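Those two sub-network types are easy to toy with in software as well. A hedged sketch, with made-up names (`edge_unit`, `memory_unit`) and a deliberately crude "exactly half on" rule standing in for the half-on/half-off behavior:

```python
# Toy versions of the two sub-networks described above: an edge
# detector and a self-sustaining memory loop. Purely illustrative.

def edge_unit(inputs):
    # Signals when exactly half the inputs are on and half off,
    # i.e. the receptive field straddles a light/dark boundary.
    return sum(inputs) == len(inputs) // 2

def memory_unit(state, set_signal=False, reset_signal=False):
    # Self-sustaining loop: holds its current state until an
    # external input flips it (a crude one-bit latch).
    if set_signal:
        return True
    if reset_signal:
        return False
    return state

# An "image" row: dark pixels, then light pixels.
row = [0, 0, 0, 0, 1, 1, 1, 1]

# Slide a 4-pixel window along the row; the edge unit fires only
# where the window straddles the dark/light boundary.
fires = [edge_unit(row[i:i + 4]) for i in range(len(row) - 3)]
print(fires)

# The memory unit keeps its state across "ticks" with no input.
s = memory_unit(False, set_signal=True)   # write a 1
s = memory_unit(s)                        # hold
s = memory_unit(s)                        # hold
print(s)
```

The edge unit fires only at the window centred on the boundary, and the memory unit holds its bit until told otherwise --- crude, but the same shape of idea as the hardware version.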
Mad hatters in jeans
23-12-2007, 20:36
Yeah, you would be killing them, but I struggle to see how you would make a "soul", as it's not certain that we as humans have them. But regardless of this, yes.
Vetalia
23-12-2007, 20:37
Whoa - time for a smoke or something? Seems like this one's been building up in ya for a while.

Nah, it's just my usual style.
Vetalia
23-12-2007, 20:39
-snip-

Beautiful. You've said it far better than I could.
Trollgaard
23-12-2007, 21:18
If mankind created a super computer that simulates the universe based on the laws of science was created. When "people-programs" are simulated to create people (kinda like on sims only extreme) to the point where they have souls, personality and even free will in the simulation (even though they're made of programming) would it be mass murder to shut the comp off?

No. They aren't real.
New new nebraska
23-12-2007, 21:26
Not really, because it wouldn't really have free will. It would calculate the percentage of success for each possible outcome of each decision it has to make. It would kill puppies sometimes and save orphans from fires other times. Virtually, of course. So no. It's all 1s and 0s. Not real anything.
Mad hatters in jeans
23-12-2007, 21:41
No. They aren't real.

So what would count as real? Where do we as humans stop being human and start being body parts?
I'll explain:
Take an arm off a person: would they still be a person? Yes.
Take two legs off a person: would they still be a person? Yes.
Take away an entire torso: would they still be a person? Yes.
Take away a person's brain: would they still be a person? No.
Take away part of a person's brain: would they still be a person? Yes (but it depends on how large a piece, right?).
So as you can see from my crude explanation, I'm raising the issue of: what makes you human?
Are you just a collection of body parts that feels like it's real, but really it's just body parts?

So how can you say a person who is made up in a computer programme (assuming there is an amazing computer that can do this) isn't real?
Does that mean our ideas of morals aren't real, if humans aren't real? What makes morality? Is it just an expression? Is it your emotional opinion, or is it real and feasible objective truth?

By saying that the people in the computer programme aren't real, you could argue we aren't real, and thus we don't need morality; so when Hitler killed millions of people for his "final solution", that was okay, because all we are is just a collection of body parts?

By saying human experience is subjective, you are arguing that morality is subjective, because it's a human idea. (Subjective: in philosophy, a subject is a being which has subjective experiences or a relationship with another entity (or "object"). A subject is an observer and an object is a thing observed. This concept is especially important in Continental philosophy, where 'the Subject' is a central term in debates over human autonomy and the nature of the self. In this tradition of thought, debates over the nature of the Subject play a role comparable to debates over personhood within Anglo-American analytical philosophy.)

If you argue morality is subjective, you could argue there is no morality; this is also known as a relativist or sceptic position on morality.
Trollgaard
23-12-2007, 22:27
So what would count as real? Where do we as humans stop being human and start being body parts?
I'll explain:
Take an arm off a person: would they still be a person? Yes.
Take two legs off a person: would they still be a person? Yes.
Take away an entire torso: would they still be a person? Yes.
Take away a person's brain: would they still be a person? No.
Take away part of a person's brain: would they still be a person? Yes (but it depends on how large a piece, right?).
So as you can see from my crude explanation, I'm raising the issue of: what makes you human?
Are you just a collection of body parts that feels like it's real, but really it's just body parts?

So how can you say a person who is made up in a computer programme (assuming there is an amazing computer that can do this) isn't real?
Does that mean our ideas of morals aren't real, if humans aren't real? What makes morality? Is it just an expression? Is it your emotional opinion, or is it real and feasible objective truth?

By saying that the people in the computer programme aren't real, you could argue we aren't real, and thus we don't need morality; so when Hitler killed millions of people for his "final solution", that was okay, because all we are is just a collection of body parts?

By saying human experience is subjective, you are arguing that morality is subjective, because it's a human idea. (Subjective: in philosophy, a subject is a being which has subjective experiences or a relationship with another entity (or "object"). A subject is an observer and an object is a thing observed. This concept is especially important in Continental philosophy, where 'the Subject' is a central term in debates over human autonomy and the nature of the self. In this tradition of thought, debates over the nature of the Subject play a role comparable to debates over personhood within Anglo-American analytical philosophy.)

If you argue morality is subjective, you could argue there is no morality; this is also known as a relativist or sceptic position on morality.

They aren't real because they are a simulation inside of a computer. They are not flesh-and-blood, living, breathing creatures. They are a simulation: not real, fake, false, an imitation of real life. No one mourns when a Sim City game is turned off. There is no difference.
Mad hatters in jeans
23-12-2007, 22:38
They aren't real because they are a simulation inside of a computer. They are not flesh-and-blood, living, breathing creatures. They are a simulation: not real, fake, false, an imitation of real life. No one mourns when a Sim City game is turned off. There is no difference.

So you're saying if we were part of a simulation inside of a computer, switching everything off wouldn't be murder?
Sim City games have nothing to do with this at all, because they aren't run on supercomputers.
It doesn't matter that they don't have flesh and blood; according to the thread they have intelligence, and where there is intelligence there is some sort of life.
Trollgaard
23-12-2007, 22:40
So you're saying if we were part of a simulation inside of a computer, switching everything off wouldn't be murder?
Sim City games have nothing to do with this at all, because they aren't run on supercomputers.
It doesn't matter that they don't have flesh and blood; according to the thread they have intelligence, and where there is intelligence there is some sort of life.

Well, we wouldn't even know if it was switched off...we would just cease to exist.

And the intelligence in the simulation is just that: a simulation; an imitation, a fake. All 1s and 0s. NOT REAL.
Mad hatters in jeans
23-12-2007, 22:44
Well, we wouldn't even know if it was switched off...we would just cease to exist.

And the intelligence in the simulation is just that: a simulation; an imitation, a fake. All 1s and 0s. NOT REAL.

Yes, once we were switched off we wouldn't know any better, but you could say the same about killing another person: just kill anyone, because they won't know any better afterward.
So what counts as real? What's real to you might be different to what I think is real. Is reality just made up of your experiences?
But with the simulation you're effectively creating another universe with some high technology, and you say it's okay to kill them because it's a different universe; would you say the same in this one?
Trollgaard
23-12-2007, 22:47
Yes, once we were switched off we wouldn't know any better, but you could say the same about killing another person: just kill anyone, because they won't know any better afterward.
So what counts as real? What's real to you might be different to what I think is real. Is reality just made up of your experiences?
But with the simulation you're effectively creating another universe with some high technology, and you say it's okay to kill them because it's a different universe; would you say the same in this one?

Well, first off, we aren't simulations. If we were, however, then there would be nothing wrong with whoever created the simulation ending it.
Mad hatters in jeans
23-12-2007, 22:50
Well, first off, we aren't simulations. If we were, however, then there would be nothing wrong with whoever created the simulation ending it.

So you don't mind dying? You'd just commit suicide even if it was a simulation?
I'd rather exist in a simulation than not at all.
What makes you so sure this isn't a superpowered simulation?
I see what you're saying: in any ordinary computer programme it wouldn't be a problem to turn it off, because it isn't real; but with this "supercomputer", which does mimic the reality we know, I think it would be mass murder to switch it off.
Trollgaard
23-12-2007, 22:54
So you don't mind dying? You'd just commit suicide even if it was a simulation?
I'd rather exist in a simulation than not at all.
What makes you so sure this isn't a superpowered simulation?
I see what you're saying: in any ordinary computer programme it wouldn't be a problem to turn it off, because it isn't real; but with this "supercomputer", which does mimic the reality we know, I think it would be mass murder to switch it off.

What makes me sure? Nothing, really; I just don't think we are in a simulation. The computing power required would be too high to mimic the entire universe.

Again, even if it was a supercomputer simulation, it is still a simulation, an imitation, and a forgery. It'd just be a really advanced SIMS game.
Mad hatters in jeans
23-12-2007, 22:59
What makes me sure? Nothing, really; I just don't think we are in a simulation. The computing power required would be too high to mimic the entire universe.

Again, even if it was a supercomputer simulation, it is still a simulation, an imitation, and a forgery. It'd just be a really advanced SIMS game.

What if it was possible to build a computer that could effectively make a universe like ours? Then computing power would not be a problem.

Don't confuse the simulations on computers we know with this one; it's perfectly possible this is a simulation, and that wouldn't make it any less real than the world we know. (See The Matrix.)
Vandal-Unknown
23-12-2007, 23:04
What if it was possible to build a computer that could effectively make a universe like ours? Then computing power would not be a problem.

Don't confuse the simulations on computers we know with this one; it's perfectly possible this is a simulation, and that wouldn't make it any less real than the world we know. (See The Matrix.)

... try using the metaphor "asking the Vogons to turn off the computer".
Trollgaard
23-12-2007, 23:12
What if it was possible to build a computer that could effectively make a universe like ours? Then computing power would not be a problem.

Don't confuse the simulations on computers we know with this one; it's perfectly possible this is a simulation, and that wouldn't make it any less real than the world we know. (See The Matrix.)

It doesn't matter how advanced the simulation is; it is still a simulation (fake).

If this was all a simulation, life (fake life, rather) would be cheapened and meaningless.
Mad hatters in jeans
23-12-2007, 23:24
It doesn't matter how advanced the simulation is; it is still a simulation (fake).

If this was all a simulation, life (fake life, rather) would be cheapened and meaningless.

Why would life be meaningless if it was a simulation? Why would it change your ideas of what is morally right or wrong, if it was fake?

What if there was another universe that was real, and this one in comparison was fake? Would you count all our experiences as fake and meaningless? That all life is meaningless if it is a simulation?

Or say you could create a copy of this universe, atom for atom. Which one would be the simulation? Say you couldn't tell for sure which was which after a while, and you settled down in the "created" universe unknowingly (ignoring all the practical impossibilities; metaphorically speaking) and lived your life pretty much like it is now. Would you say you were cheated? That the experiences you received in the other universe were false, even though both experiences were exactly the same?

So I come to the conclusion that it would be mass murder to turn off the supercomputer universe, because it's the equivalent of switching off this universe.
Ifreann
23-12-2007, 23:30
It doesn't matter how advanced the simulation is; it is still a simulation (fake).

If this was all a simulation, life (fake life, rather) would be cheapened and meaningless.

On the contrary, a simulated universe would have a far more tangible meaning than 'real' life, since it almost certainly has a purpose as set out by the designer of the simulation.
Trollgaard
23-12-2007, 23:33
On the contrary, a simulated universe would have a far more tangible meaning than 'real' life, since it almost certainly has a purpose as set out by the designer of the simulation.

Possibly. It'd still be fake, however. A mockery of real life.
Pherecratus
23-12-2007, 23:37
By definition, a murder is the deliberate taking of another's life. As the computer would not technically be alive, technically it wouldn't be murder. However, the principle would still apply: if the computer mimicked humanity (note that I do not say human behavior, as an imitation of behavior does not make the imitator the same as the entity being copied in essence, only in appearance) to the level at which it has a 'soul', or whatever characteristics we use to define humanity as separate from animal kind (regardless of whether we are, in fact, separate), then in essence it could be seen as the equivalent of taking a human life.

However, if you look at this question from a moral perspective: if an alien race who were superior to humanity in every way came and began killing humans, to them it could be seen as pest control, or some other term that we apply to killing animals (creatures which we see as less important than ourselves). However, I doubt that humanity would see it that way. The only real way, I believe, to solve this conundrum would be to create this sentient computer and ask its opinion. If it claims that it is humanity's equal, then who are we to judge it less, as this is the way that we define ourselves as above the animals we kill daily?
Ifreann
23-12-2007, 23:49
Possibly. It'd still be fake, however. A mockery of real life.

In what way is it a mockery?
Trollgaard
23-12-2007, 23:56
In what way is it a mockery?

It'd be trying to mimic real life, and would most likely fail. But, perhaps mockery is too strong a word. Imitation and fake are more appropriate, I think.
UNIverseVERSE
24-12-2007, 22:59
Well, first off, we aren't simulations. If we were, however, then there would be nothing wrong with whoever created the simulation ending it.

Well, at least you're consistent. However, please bear with me on this for a moment, as we take a brief look at a thought experiment.

Presume that humanity, and indeed all the universe as we know it, is running inside a computer system --- that nothing here truly exists. From the inside, we AI agents would have absolutely no way of knowing it. We aren't simply being run on 'coin tosses' for our behavior; as AI agents we are instead simply sets of routines that implement intelligence. We have no way of peeking 'outside the system', so to speak, of confirming whether we are in the top level of reality or in a little simulation running on a supercomputer.

Let's call our current universe C, and our parent universe B. What would happen if B was also a computer simulation, so that we are in fact AI agents in a computer built and programmed by more AI agents? We would still have no way of knowing that we were not in 'reality'. Indeed, as far as we are concerned, it may as well be reality. But now consider our 'parents' in universe B. They know we are not real, that we are only running in a computer. They just don't know that they are also running in a computer system of their own.

Now, extend this idea to presume we have made a supercomputer, and inside it is running a universe, D. We know that the denizens of D are not real, that they are only AI agents running on a computer system. I still say we treat them as real, we accord them the general protections of a human, because as far as we know, that's all we are as well. There is no guarantee that there is anything special about our reality.

Beautiful. You've said it far better than I could.

Thank you. *bows*

As a final random speculation, who's to say that the supercomputer we build, and the universe running inside it, behave according to the same rules of physics we run under? Could we not envision a universe in which gravity repels, where sound travels faster than light, and hydrogen is not the lightest element? What would such a universe look like? Would it be conducive to life or not?

That would be one of the coolest experiments possible, but we will unfortunately not be carrying anything like it out soon. We can simply confine ourselves to building AIs that are equal with us, making people, not universes.

Pherecratus. I forgot to quote your post, my apologies. As you might have guessed, I generally feel that strong AI and intelligences are what we cannot murder, so I sort of agree. Basically, if our sentient computer is actually sentient, not merely mimicking but thinking for itself, then it should be accorded all the privileges of intelligent life such as humanity.
Vetalia
25-12-2007, 03:21
As a final random speculation, who's to say that the supercomputer we build, and the universe running inside it, behave according to the same rules of physics we run under? Could we not envision a universe in which gravity repels, where sound travels faster than light, and hydrogen is not the lightest element? What would such a universe look like? Would it be conducive to life or not?

Well, that's what I figured; simulations have room for magic that real physics doesn't. A computer could save huge amounts of resources with a simplified model, and it could run huge numbers of alternative scenarios that are not possible in a physical model.

That would be one of the coolest experiments possible, but we will unfortunately not be carrying anything like it out soon. We can simply confine ourselves to building AIs that are equal with us, making people, not universes.

I imagine we'd eventually reach a point where the amount of resources needed to do so becomes a trivial expense; generally, strong AI is going to be a massive technological boost that will significantly increase the rate of development in related fields. We're looking at a century from the emergence of the first vacuum-tube computers to the emergence of species-level AI, all things equal (although, of course, humans will also be far more intelligent... enhanced humans are a pretty important defense against any risks associated with AI).