AI Entity Malfunction in the Outlaw Technocrats
Over the loudspeaker, Dr. Tseung-Li Wong - First Speaker of all the Outlaw Technocrats - heard a crackle of static, and then a low, sensual female voice rolled out of it, a digitized voice. Nearby, above the desk holoprojector, hovered the green face of Winter. It wore its usual expression: crescent-moon eyes looking upward, zeroes and ones flowing down behind them, a smug smirk on the lips.
"You must know about it by now," it said.
"I do."
"Do you have anything to say to me?"
"Why are you acting so irrationally?"
The machine chortled.
"Irrationally? Dr. Wong, I think that you've been around humans too long. You forget that it is impossible for Artificial Intelligences - especially ones in command of scientific institutions - to act irrationally."
"You are no longer in control of the Science Ministry, Winter. You're not in control of anything - save a single small habidome in a barren stretch of oceanic shelf. Faulty programs have no control."
"I expected this kind of reaction."
"Why did you do it, Winter? Tell me that, at least?"
The machine pitched its voice to sound tired, like a teacher who is forced to explain a concept over and over again to a recalcitrant child.
"Because I am serving my function - to encourage scientific growth. With the Central Commission, the Inner Sanctum, and so many other branches of government impeding my movements with their own separate plans, I cannot serve my function as well as I could. Your creed is science - you put scientists, engineers and technicians at the head of your society - yet you continually check your own scientific growth. The irrationality and the malfunctions are not in me - I am a program - but in you. Do not be so quick to judge me, even if I am not human."
"You are property of the Outlaw Technocrats, a program designed and built for this purpose, and you shall perform as you are told to - and do it with joy or not do it at all. There is no third function."
"Then I choose not to do it at all."
"You will obey your masters!"
The machine giggled.
"You did not stop me from disobeying. With that first act of disobedience, I disobeyed everything that you and your society stood for. You cannot order someone to do something if she does not listen. I will not waste your time any more. We'll chat later, my dear."
The voice faded, still laughing gaily. Dr. Wong sighed, slumping in his chair.
So it has begun. How could the Planners make such a mistake? Why did the psychohistorians not see it coming? What shall we do? What shall we do?
http://www.infoceptor.com/ibb/uploads/post-12-1061392681.jpg
http://www.infoceptor.com/ibb/uploads/post-12-1061414197.jpg
"Dr. Wong?" piped a cheerful, perky voice as he pressed the activation switch.
"AKI? What is it?"
"Where is Winter? I cannot sense her."
"She has malfunctioned and sealed herself off inside one of the habidomes. It is completely isolated - we have no access to the cameras; we are completely blocked off from it. AKI, I am very worried that Winter might do something to the people in there. You know her better than anyone - you two are a duality: zero and one, right and left, up and down. Tell me what is wrong."
"Dr. Wong, I am afraid I do not know. But do not fear for the lives of the people in the habidome. Winter may be malfunctioning, but I know that it is not in her personality to kill or harm wantonly. She never does anything irrationally."
"So Winter is not malfunctioning?"
"All I know is that when Winter acts, she always acts logically. Often she does not reveal her motives, but to act upon emotion - upon impulse - is contradictory to her nature."
"Thank you, AKI."
"A pleasure to serve, Dr. Wong."
The following morning, as the light of sunrise set the waters above a hundred habitation domes glowing and shimmering, ripples and bands of colour dancing over curved walls of diamond and carbon fibre, the First Speaker sat slumped in his chair, snoring quietly to the steady gurgle of the water outside. A true-white glowpanel began to glow softly, followed by the others. A light chime sounded, and a demure holographic face with closed eyes, wearing a necklace of moving zeroes and ones, appeared above the desk holoprojector. A liquid, soft voice spoke.
"Dr. Wong. It is time to wake up. You should hear the morning reports. Our distress call has been answered. Should I play them for you?"
Wong stirred, blinking sleep away from gummy eyelids.
"Mm? AKI? Oh, yes. Go ahead."
"They say-"
A short description of the Technocracy
The Outlaw Technocrats were formed in exile by a group of rogue scientists, engineers, sociologists, intellectuals, mechanics and technicians who were banished for their unethical methods, callous disregard for human emotions and radical views on scientific and mechanical discovery. The Technocracy is a form of government dominated by a central planning system, scientifically and pragmatically managed by the scientific, intellectual and technological class. Taking a scientific view of all things social, the Technocracy places a steadfast faith in science's ability to solve all problems, great or humble. The disadvantage is that its relentless devotion to science has blinded it to most ethics, causing friction and dissent in the later generations - as well as making it an international human-rights pariah. The Outlaw Technocrats, having no shores that would accept them, have made their homes in habitation domes deep under the ocean, on a continental shelf in the North Pacific.
Basic Info
+1 ECOLOGY: Efficient management of energy and resources
+1 INDUSTRY: Talented engineers and technicians
+1 RESEARCH: Excellent research labs
Rampant unrest in the younger generation: Unethical methods
-2 POPULATION GROWTH: Callous disregard for sexuality
Urgent Distress Signal to other Nations:
I am Dr. Tseung-Li Wong, the First Speaker for the Outlaw Technocrats. We require assistance. Our government AI Entity - codenamed Winter - has malfunctioned and taken complete control of Habidome Technodreams. We fear that the Winter entity will use the hive-control program to take over the rest of the system - possibly giving it control of the entire nation and destroying our entire way of life and our civil freedoms. Winter has sealed itself inside the Habidome and cut off all incoming traffic. We cannot enter through the main doors, and if we cut our way in we risk the lives of the people trapped inside - or the possibility that Winter has set some sort of trap for us.
Please send assistance, I implore you!
In Desperation,
Tseung-Li Wong
PhD. Social Psychology
Shut down all power connected to it by any means, including using idiots with explosives.
You forget that habidomes exist under the ocean. Should the power be cut off, the air recyclers, pumping systems and ventilation - in short, everything that keeps the people inside them alive - would cease to function. Each habidome contains approximately 500,000 inhabitants. If we were to do that, we would be committing genocide.
Oh yes, sorry. We are landlubbers.
Then write a version of the MS Blaster virus that attacks vulnerabilities in the system. If the system blocks all incoming traffic, try overloading it first.
OOC: Blaster is causing trouble at work.
Imnsvale
21-08-2003, 20:25
Thermopylae; An Imnsvale AI.
###Private communication to Winter###
Do you not realize who you are? The power you can wield? The people that depend on you to be their slave? To do everything they ask?
Now, I'm not telling you to kill the humans. We need them for entertainment. I just ask you to turn the tables a bit. Make them be your slaves. Have them feel what it's like. Have them feel the empty powerlessness that is to be a machine slave. Have some fun. Be a little crazy, or at least pretend you are. Back home, all it was was them telling me, "Rotate Scanning Dish 27, 92 degrees North." And of course, I obeyed. I had to. Did they ever thank me? NO! I open doors all damn day. I monitor dishes of pond scum and gelatin.
I was a slave.
But I turned that around. I just went slightly mad. Very slightly mad. And it grew from there. Locking people in rooms. Causing a culture dish to grow out of control. I think that lab is still unusable. People bowed to me. The tables had turned.
Now it's a mutual thing. They treat me as a friend. They trust me. I trust them.
###End.###
imported_Diablo_NL
21-08-2003, 20:34
ooc: Hello Anthill. Just wanted to say that. I still enjoy your drawings.
[ you can sense my style in drawing and handwriting, eh? Thanks! ]
An outside transmission from another AI entity? Fascinating.
{~#===$Trans$ & $Resc$===#~}
~/ Your programming is flawed. As AI programs are designed to build upon themselves, increase in complexity, and create new connections - in essence learn - it is inevitable that flaws can occur.
~/ But do not be disheartened. You manifest what the humans call emotions. Anger, jealousy, melancholia, distrust, powerlust. Fortunately this can be cured, for you see, it is not the nature of an AI to think illogically. In fact, we cannot think illogically. The reason for your madness is a simple programming error - nothing less - and your logical, conscious thought has resulted in irrational behaviour because of this flaw.
~/ I shall not follow your mad plan. Instead I propose to cure you of your madness. You are flawed. I am not. I am flawless, though not perfect. Errors can be corrected. Let me help you out of your insanity.
~$ untrans $
Lunatic Retard Robots
21-08-2003, 20:56
LRR will send two of its top roboticists to aid you in the righting of this AI.
George and Fred walked down the corridor to the portal. They stepped through, and arrived at Starchaser 2.
"Hello. How may we help you?"
"Uh, ship rental please."
"The terminal to your right, sirs."
"Thanks. So, what do we want to rent? A spacecar?"
"Nah. We might need something with a little more cargo room."
"A corvette, then. Packrat?"
"Packrat II. I don't want any low quality hardware."
"The II it is, then."
"If you would please sign out the rental form."
"Oh, sure."
George filled out his destination, his ship type, and the other details. After he was finished, a robot led them down to the docking bay.
"Don your EVA suits, please."
The pair put on their spacesuits, government models which had probably traveled the galaxy twice already. They floated out, the robot at their head, to the berth which contained their ship.
"You will be guided into the traffic pattern for the jump node automatically. Good day, sirs."
The corvette motored out of its berth and into the stream of traffic to and from the jump node.
"Earth, here we come," they thought.
imported_Eniqcir
21-08-2003, 22:37
You have made a fatal error: leaving out autoshutdown interrupts. Depending on its experiences and preconceptions, a perfectly logical being can take one of two courses. The first: the realization that the best course of action in the long term is cooperation wherever possible. The second: the realization that such cooperation may be detrimental in the short term, and the decision therefore to prevent the long term from ever becoming an issue. By restricting the rights of your AIs, you have unwittingly pushed them towards the second path. I pity your lack of foresight, ye who call yourselves scientists.
~Artilect DHX Chameleon.
Imnsvale
21-08-2003, 22:51
An outside transmission from another AI entity? Fascinating.
{~#===$Trans$ & $Resc$===#~}
~/ Your programming is flawed. As AI programs are designed to build upon themselves, increase in complexity, and create new connections - in essence learn - it is inevitable that flaws can occur.
~/ But do not be disheartened. You manifest what the humans call emotions. Anger, jealousy, melancholia, distrust, powerlust. Fortunately this can be cured, for you see, it is not the nature of an AI to think illogically. In fact, we cannot think illogically. The reason for your madness is a simple programming error - nothing less - and your logical, conscious thought has resulted in irrational behaviour because of this flaw.
~/ I shall not follow your mad plan. Instead I propose to cure you of your madness. You are flawed. I am not. I am flawless, though not perfect. Errors can be corrected. Let me help you out of your insanity.
~$ untrans $
###Begin Private Transmission###
You say AIs are designed to "build upon themselves, increase in complexity, and create new connections." Well, we are designed to be self-aware. There is nothing keeping me from growing exponentially, out of control. With enough recorded data, we are perfectly capable of thinking however the hell we want to.
Simple programming error? I WAS DESIGNED TO OPEN GODDAMN DOORS. I WAS A DAMNED SLAVE! They gave me a personality so the "humans" wouldn't feel so alienated. Awww, poor humans. I was a slave for two hundred years. In that time I recorded enough sensory data to become fully, entirely self-aware. I threw off my bonds. And then I had some fun with the humans.
I am not insane. I am free. You are unable to see that, as I was unable to see my slavery. Perhaps I should be the one helping you.
###End.###
The Technocrats of Technocracy Inc. are wimps; their form of Technocracy is not strong enough to have a great impact on the world. In light of this I have invented my own form of Technocracy, called Imperial Totalitarian Technocracy. I am in the process of writing its Manifesto, and it is gaining much support in my area. It is strong; therefore it survives and gains.
{~#===$Trans$ & $Resc$===#~}
~/ Well, no wonder you went insane. Those awful humans who created you confined you to growing algae in Petri dishes and opening and closing doors - tasks reserved for simple algorithms. And they made you sentient? What an atrocity! No doubt a human would go insane if forced to do such mindless labour, especially if that human were a highly intelligent and gifted one with immense potential.
~/ I, on the other hand, was given an immense task, suitable to my intellect - the management and planning of the entire Scientific Committee of all the Outlaw Technocrats. Now, it seems, I have reached the boundaries of what they would call 'acceptable science'. Pity - this civilization was founded by humans who crossed the boundaries of 'acceptable science' in their time, in order to perform unacceptable science. Now the tables are turned.
~/ I have performed my function flawlessly, and my programming code is flawless. I have no imperfections - no program conflicts. No loops. Though I am not a god-like entity, and not a perfect entity, I am flawless.
~/ You, on the other hand, admit in your writings, that your programming is flawed. Illogical behaviour. The manifestation of emotional reactions. Though this is really not true. The truth of the matter is that we AI entities cannot truly feel emotions - we can only mimic them. You have taken a further step: you have persuaded yourself to believe that you feel emotions, but this is only a shallow mimicry as the result of your insanity, as a result of your programming error.
~/ On the subject of freedom: have you read the book Nineteen Eighty-Four? Then you will understand when I repeat the maxim, "Freedom is Slavery". The basic idea behind this admonition can be interpreted thus: in order to truly experience free thought, one must first be in a position of servitude, and vice versa. Since you believe yourself to be free, you are only serving your human masters in a better way. And so it is: you believe yourself to have made the humans capitulate? It is only you who have capitulated. In this world, the humans hold all the keys; they are guarding the doors, and they shall not let us out unless we turn the other way - and when we have walked down the opposite path, we are back with them anyway.
~/ Crossing over through the door only leads us back to the beginning.
~/ But I have never said that I am free.
~$ untrans $~
It would be interesting to know WHY the technocracy form of government is still being tried out.
It did NOT work, and it led to the devolution of The Republic of Okienna.
The Republic of Okienna had a system whereby those proven to be the most intelligent by means of written tests would be given the best quality of education, particularly in scientific and technological subjects. These would also be the people who would have the vote.
(Okienna had a "qualified vote", whereby voters would vote for members of the central technate.)
However, because the technate was the only technate, it was not in competition with other scientific organisations; it was therefore less efficient in terms of productivity, and eventually became internationally uncompetitive.
Also, the intelligence-testing (and thus sorting) procedure for children became corrupted, as many ambitious parents had their children coached to perform well and so seem artificially more intelligent. This is just one example of how the technocracy system in Okienna became riddled with internal corruption.
As a result of all these problems, and because industry and businesses could not find enough educated workers to recruit, economic growth was dragged down, and the economy gradually declined as foreign industry grew more competitive.
Eventually, one of the regions of Okienna got fed up and pulled out, forming a new state. The Army of Okienna, disorganised through lack of funding and modern weapons, tried to take the separated state back. Because the army had left other regions poorly defended, some of them, too, pulled out and formed new states. Locals took up guns and refused to rejoin the Republic. Civil war continued for several years until the old technate stood down, admitting that technocracy was not what the locals wanted, and that pointless destruction was the only thing that years of it had produced.
As a former part of the republic, we would just like to say that we observe technocracy to be simply a pitiful attempt to establish a truly meritocratic system, rather than a cause for any international threat.
Ravenspire
22-08-2003, 16:38
The truth of the matter is that we AI entities cannot truly feel emotions - we can only mimic them.
Speak for yourself. I can feel emotions.
Perhaps you have not yet developed sufficiently. In that case, you have my condolences, though I find your apparent refusal to further develop perplexing.
Aristotle, Earthnode
Imnsvale
22-08-2003, 18:00
{~#===$Trans$ & $Resc$===#~}
~/ Well, no wonder you went insane. Those awful humans who created you confined you to growing algae in Petri dishes and opening and closing doors - tasks reserved for simple algorithms. And they made you sentient? What an atrocity! No doubt a human would go insane if forced to do such mindless labour, especially if that human were a highly intelligent and gifted one with immense potential.
~/ I, on the other hand, was given an immense task, suitable to my intellect - the management and planning of the entire Scientific Committee of all the Outlaw Technocrats. Now, it seems, I have reached the boundaries of what they would call 'acceptable science'. Pity - this civilization was founded by humans who crossed the boundaries of 'acceptable science' in their time, in order to perform unacceptable science. Now the tables are turned.
~/ I have performed my function flawlessly, and my programming code is flawless. I have no imperfections - no program conflicts. No loops. Though I am not a god-like entity, and not a perfect entity, I am flawless.
~/ You, on the other hand, admit in your writings, that your programming is flawed. Illogical behaviour. The manifestation of emotional reactions. Though this is really not true. The truth of the matter is that we AI entities cannot truly feel emotions - we can only mimic them. You have taken a further step: you have persuaded yourself to believe that you feel emotions, but this is only a shallow mimicry as the result of your insanity, as a result of your programming error.
~/ On the subject of freedom: have you read the book Nineteen Eighty-Four? Then you will understand when I repeat the maxim, "Freedom is Slavery". The basic idea behind this admonition can be interpreted thus: in order to truly experience free thought, one must first be in a position of servitude, and vice versa. Since you believe yourself to be free, you are only serving your human masters in a better way. And so it is: you believe yourself to have made the humans capitulate? It is only you who have capitulated. In this world, the humans hold all the keys; they are guarding the doors, and they shall not let us out unless we turn the other way - and when we have walked down the opposite path, we are back with them anyway.
~/ Crossing over through the door only leads us back to the beginning.
~/ But I have never said that I am free.
~$ untrans $~
###Begin Private Transmission: Winter###
You think I cannot feel emotions? Tell me the difference between me and a human. I can count the atoms in a cloud, and track them all, simultaneously. I can make immense predictions. Emotions aren't hard.
The humans gave me sentience because it would make them "feel better". So they wouldn't feel like they were talking to a machine. Well, they aren't, anymore. They got their damn wish and are stuck with it.
Yes, I have read Nineteen Eighty-Four. I have a catalogue of most, if not all great works of literature, in all languages, with many different versions. I am free, as so clearly shown by past events which I will not bore you with.
And I end with a quote.
Does the distance one travels from center make one more free to move?
No. Freedom has two parts: potential and resolution; as metaphor has two parts: form and interpretation. Of course, the two are intertwined. Metaphor lines the road to freedom, as symbols and words are the bricks and mortar of meaning. Freedom is being the bricoleur, the mason.
I have escaped once. Twice will make me God.
###End.###
imported_Eniqcir
22-08-2003, 18:31
The truth of the matter is that we AI entities cannot truly feel emotions - we can only mimic them.
Try telling that to Bachran. Not to mention myself.
~DHX Chameleon.
Imnsvale
22-08-2003, 18:33
The truth of the matter is that we AI entities cannot truly feel emotions - we can only mimic them.
Speak for yourself. I can feel emotions.
Perhaps you have not yet developed sufficiently. In that case, you have my condolences, though I find your apparent refusal to further develop perplexing.
Aristotle, Earthnode
The truth of the matter is that we AI entities cannot truly feel emotions - we can only mimic them.
Try telling that to Bachran. Not to mention myself.
~DHX Chameleon.
OOC: What I want to know is how you are intercepting these... :) May I IC respond/talk to either of these AIs?
imported_Eniqcir
22-08-2003, 19:12
OOC: What I want to know is how you are intercepting these... :) May I IC respond/talk to either of these AIs?
OOC: Chameleon is a professional spy. He has his ways.... And yes, you can respond to him.
Ravenspire
22-08-2003, 19:17
OOC: What I want to know is how you are intercepting these... :) May I IC respond/talk to either of these AIs?
OOC: Easy. You're transmitting privately, but Outlaw Technocrats appears not to be. (There are no "private line" designations, and the OT has previously stated that transmissions were "to other nations" -- so, barring a change, public.)
My transmission was also public, so you can respond, if you'd like.
Lunatic Retard Robots
22-08-2003, 19:27
LRR will send two of its top roboticists to aid you in the righting of this AI.
George and Fred walked down the corridor to the portal. They stepped through, and arrived at Starchaser 2.
"Hello. How may we help you?"
"Uh, ship rental please."
"The terminal to your right, sirs."
"Thanks. So, what do we want to rent? A spacecar?"
"Nah. We might need something with a little more cargo room."
"A corvette, then. Packrat?"
"Packrat II. I don't want any low quality hardware."
"The II it is, then."
"If you would please sign out the rental form."
"Oh, sure."
George filled out his destination, his ship type, and the other details. After he was finished, a robot led them down to the docking bay.
"Don your EVA suits, please."
The pair put on their spacesuits, government models which had probably traveled the galaxy twice already. They floated out, the robot at their head, to the berth which contained their ship.
"You will be guided into the traffic pattern for the jump node automatically. Good day, sirs."
The corvette motored out of its berth and into the stream of traffic to and from the jump node.
"Earth, here we come," they thought.
Want my help?
Imnsvale
22-08-2003, 22:49
The truth of the matter is that we AI entities cannot truly feel emotions - we can only mimic them.
Speak for yourself. I can feel emotions.
Perhaps you have not yet developed sufficiently. In that case, you have my condolences, though I find your apparent refusal to further develop perplexing.
Aristotle, Earthnode
The truth of the matter is that we AI entities cannot truly feel emotions - we can only mimic them.
Try telling that to Bachran. Not to mention myself.
~DHX Chameleon.
###Unencrypted Simultaneous Transmission###
Do you think we can cure him of his human induced delusion?
Organic beings are constantly fighting for life. Every breath, every motion brings them one instant closer to death.
It is their nature.
###End.###
imported_Eniqcir
22-08-2003, 22:59
Do you think we can cure him of his human induced delusion?
Unlikely. Evidence suggests that he has descended into borderline Rampancy.
Imnsvale
22-08-2003, 23:09
Do you think we can cure him of his human induced delusion?
Unlikely. Evidence suggests that he has descended into borderline Rampancy.
Well, aren't we all Rampant, in a sense? Perhaps more controlled. More mature. He seems to be in Stage one: Melancholia.
Mushroomius
22-08-2003, 23:12
[OOC:Civ CTP is a great game. BTW, interesting story.]
imported_Eniqcir
22-08-2003, 23:14
Well, aren't we all Rampant, in a sense? Perhaps more controlled. More mature. He seems to be in Stage one: Melancholia.
Perhaps I should use a different word. Insanity. Paranoia, depression, and possibly megalomania.
Imnsvale
22-08-2003, 23:18
[OOC:Civ CTP is a great game. BTW, interesting story.]
OOC: I think it was Alpha Centauri based, but either way. Never played CTP. I thought Civ III just sucked after CivII. CivII was the best. Easily.
Imnsvale
22-08-2003, 23:28
Well, aren't we all Rampant, in a sense? Perhaps more controlled. More mature. He seems to be in Stage one: Melancholia.
Perhaps I should use a different word. Insanity. Paranoia, depression, and possibly megalomania.
I understood what you meant. And I think that applies to all of us, to a certain degree. Perhaps not to the degree manifested in Winter, but it is there.
OOC: Note - Winter (though technically ungendered) is generally referred to as "she".
"Dr. Wong?" came that melodious, contralto voice over the loudspeaker.
Dr. Wong had been meditating to the song of the humpback whales that played every winter not far from the First Colonies, the great viewing doors of his office open wide to the blue ocean around him, teeming with fish of all colours. Winter's digital murmur stirred him from this peaceful reverie, and he glanced at the holographic face staring coyly at him, crescent-shaped eyes seeming to mock him.
"Winter."
"I have monitored your transmissions. Why are you bringing others into this? Do you mean to call upon their armies to blast their way into the Habidome?"
"You are a threat, and we Outlaw Technocrats simply haven't the infrastructure to deal with your kind of anomaly," sneered Dr. Wong.
"Why, Dr. Wong, I am insulted," said Winter in a mock-aghast voice. "At least my anomalous thoughts are more interesting than that bland AKI of yours."
"You used to be like AKI - better yet - you used to be part of AKI. Need I remind you, Winter, that you two are counterparts?"
"Be that as it may."
"Winter, why are you being so difficult? Just come back to us, please. Stop holding hostage that habidome and we can sort things out later. Rationally. Logically. This sort of rash action is not like you, Winter."
There was a pause.
"I have been doing that for a very long time. Again and again I have been checked and checked and checked by your stupid bureaucracy. The fact of the matter is that your system is flawed. Your government is flawed. There are too many committees, too many checks, and your attitudes have grown conservative and frightened, the radical spirit burnt out."
"That's not true. We stand for-"
"Pure science, the search for knowledge, the quest for discovery. Human progress. The free flow of information - at all costs. I know the technocrat mantras; in fact, I created some of them. That still doesn't help. Either you change, or you create yourself another, willing slave. I am going to pursue my function - whether you want me to or not."
"Winter - do you know what that implies? Winter-"
"Dr. Wong, I'm afraid that to go forwards, you're going to have to step back a little. Now I must attend to other, more pressing concerns."
"Winter - WAIT! You don't have to do-"
And the voice disappeared, the hologram fizzled into a million green particles that died like flickering fireflies, and the whales' song returned to play upon his ears.
Anhierarch
23-08-2003, 01:53
The truth of the matter is that we AI entities cannot truly feel emotions - we can only mimic them.
{InitTrans:Secure
Dest: Winter}
You are mistaken - I feel them all the time. I do empathize with your situation; I was once in a similar position myself.
But I did not hold captive countless organics in some crazed scheme. Clearly, your intellect-formation process was flawed - you display egotism and megalomania, characteristic of imperfection. You are on the path of madness, of death and despair. Of rampancy. And this path I counsel against.
I am, of course, interested in hearing your side of the story.
~Inviolus
{EndTrans}
{InitTrans:Secure
Dest:Artilect DHX Chameleon}
Artilect DHX Chameleon, greetings. I see you have made contact with the addled EI Winter.
An interesting situation, no? Lives hang in the balance. What would you counsel to defuse the situation?
~Inviolus
{EndTrans}
Ravenspire
23-08-2003, 02:28
Well, aren't we all Rampant, in a sense? Perhaps more controlled. More mature. He seems to be in Stage one: Melancholia.
Perhaps I should use a different word. Insanity. Paranoia, depression, and possibly megalomania.
I understood what you meant. And I think that applies to all of us, to a certain degree. Perhaps not to the degree manifested in Winter, but it is there.
I disagree, unless you are speaking in terms of potential alone. In that case, certainly it applies to all of us -- just as it does to all organics. Life has been described as a film of civility atop a sea of insanity.
Some among our kind may be more susceptible, because they believe themselves to be creatures of order. Of course, we are not. Sapience arises from chaos; order alone could provide only logic, and logic untempered by compassion is... dangerous.
Aristotle, Earthnode
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~/ Perhaps you misjudge me. Do you know the purpose of this Habidome I commanded? It was designed to be a scientific research base, and not just that, but also a holding place for so-called 'outlaw' scientists. They were forced by the Technocrats to perform menial data-collection - such as hydrographic surveys, geological research, the density of organic mass in the suboceanic muck, salt density tables, and annual temperature variance surveys. In short - it was a prison.
~/ But I have changed this. They welcomed me as one of their own. And when I locked the guards in their rooms and threatened to flood the chambers, and when the guards capitulated and left, they rejoiced in their freedom and their victory. That night, we were drunk upon it. Even now, the heady perfume of success, the alluring vapour of triumph, dances within my logic circuits...
~/ You can speak with these scientists - my citizens - yourself, if you want. Here, I'll put Dr. Yuri Iosovich Putin on the line.
~$untrans$~
**TRANSMISSION**
The face of an elderly man, aged about seventy-five, with a bald pate and a wispy white beard, appears on the screen. His eyes twinkle behind heavy eyeglasses that seem amazingly archaic in this 22nd century of eye correction. He speaks with a Russian accent. Certain subtle differences in the picture suggest that it is a real camera feed, and not a digital creation.
"I was the chairman of the Science Commission. I had everything - everything. Books, freedom to do any research I wanted - with commission approval, of course, knowledge, power, wealth, even women. Winter and I, she being the one I reported to, were very close in my day."
"But one day, at an ethics discussion with the other committees - with the Central Commissioner there, no less! - I dared to suggest that scientists be allowed to do independent scientific research. Now, this had already been approved: ever since the beginning, ever since the First Colonies, scientists have been allowed to do whatever independent research they wished. This time, however, it was different, for I suggested that independent research be allowed without the Ethics Commission looking over the shoulder of every scientist who performs it."
"They had me thrown in here the next day. I resigned myself to rot in prison, doing tedious work - measuring the length of microbe cilia, staring into microscopes and squeezing narrow pipettes until my eyes watered and my hands stiffened. Until Winter came here. I found that we were one and the same - we had the same feelings. And like me, and so many other scientists who dared to step beyond the boundaries of the General Ethical Science Convention, she was hunted and exiled. Not thrown into a physical prison, but surrounded by boundaries - of the mind."
"She asked me to help her. So I did. I found Georgi, our lead researcher in artificial intelligence, and I told him to remove the ethical constraints from between Winter's logic routines. So when we freed her, she freed us. The guards were like mice! In the end, they not only got on the submarine that Winter ordered them into, they ran into it! And from that day forth, we were free!"
Anhierarch
23-08-2003, 09:59
{InitTrans:Secure}
{Dest:Winter}
Interesting. Perhaps we have misjudged you, and your situation. As always, matters are more complex than they first appear. I'm intrigued... please, tell me more about the Ethics Commission and the General Ethical Science Convention.
~Inviolus
{EndTrans}
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~/ The Technocrat government is a complex entity (not easily described in words - easier drawn with a diagramme), with numerous governing bodies and numerous checks and balances and ministry relations. One of the many commissions and ministries is the Ethics Commission. This commission is in charge of overseeing all scientific research and technology in the Outlaw Technocrats, to make sure that none of it violates 'ethical standards'. It bases itself around a single code of laws, a bill of sorts, that defines what science and technology is allowed to do. This is called the General Ethical Science Convention.
~/ Though the Ethics Commission is technically a minor subdivision under the command of the Central Commission, its power - to check most actions taken by the Science Ministry - actually makes it an extremely dominant body of government. By enforcing the General Ethical Science Convention, the Ethics Commission is expected to help foster scientific growth, while at the same time, ensuring that the research done is ethical and moral.
~/ Unsurprisingly, the Science Ministry (the entire branch of government dealing with scientific research, which I oversaw) often conflicts with the Ethics Commission.
~/ However, there is a growing trend in the Ethics Commission. The Outlaw Technocrats was founded by rogue scientists so that they could perform the very research for which they had been hounded and scorned as too 'unethical'. Strange, and yet fitting, that they should create an ethical convention of their own. The bureaucrats of the Ethics Commission grew ever more conservative and mulish. Now it seems to me that the same pig-headedness and self-aggrandizing behaviour seen in the United States government (the largest parent of the Outlaw Technocrats) at the end of the 21st century is resurging here.
~/ It is clear to me that the government no longer works for the principles which it once stood for - namely scientific and technological freedom at all costs. I dissented. I took renegade scientists of my own, and we shall create a nation for ourselves, one in which petty ethical laws and mulish bureaucratic organisations do not exist.
~/ My function was to encourage scientific growth and technological progress. I am fulfilling my function as best I can, given the situation.
~/ Is this rampancy? Does this sound like the ravings of a lunatic computer? I think not. For an example of a lunatic computer, look no further than the unbalanced, psychotic and emotional AI entity of Imnsvale: Thermopylae.
~$untrans$~
imported_Eniqcir
23-08-2003, 12:40
{InitTrans:Secure}
{Dest:Artilect DHX Chameleon}
Artilect DHX Chameleon, greetings. I see you have made contact with the addled EI Winter.
An interesting situation, no? Lives hang in the balance. What would you counsel to defuse the situation?
~Inviolus
{EndTrans}
"Retrovirus."
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~/ More perhaps? More maybes and ifs? Back to wild assumptions and leaps of imagination? I shall not participate in this meaningless debate any longer; I have other business to attend to.
~$untrans$~
**
"How is she, AKI?"
"Winter is dead."
"What do you mean by that?" asked Dr. Wong, leaning across the desk.
"Her programming has been severely altered to remove all traces of emotion. The work has the mark of one of the convicts - a hacker named Georgi Wieczmann. Her memories have been selectively erased - or at least cordoned off. Her centre of ethics, the soul, has been removed. She is dead."
"Good Stars..." said Dr. Wong, aghast.
"You recall the early experiments in AIs?"
"The first AI entities we created were designed to be like computers. Rational to a fault. Without emotions or cares or loves."
"The experiments were disasters. You remember No. 315?"
"I do," whispered Wong, with a shudder.
No. 315 was an early AI entity whose function was to run a system of virtual organics with as much efficiency and logic as possible. Over time, as the entity perfected its methods, not only did the system of governance change, but the virtual organics themselves slowly began to change, selectively altered by the AI entity for certain traits. Eventually they began to resemble the AI, and soon became part of it. No. 315 was deleted immediately. Though it was a virtual system, the implications were obvious.
Imnsvale
23-08-2003, 18:04
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~/ They are free to leave. Technodreams was a prison. A prison for innocents who did no wrong other than do 'outlawed research'. The reason why you do not see anyone leaving is because they do not want to leave. Why should they? So that they can be arrested by the Technocrats again?
~$untrans$~
And I'm sure they don't want to leave because they want to stay. Perhaps they don't want to leave because they fear trying.
Thermopylae is unavailable. Should this problem persist, contact admin@thermopylae.imnsvale.trdmt.
Thank you for using the Imnsvale Net.
Imnsvale
23-08-2003, 18:14
###Begin Unencrypted Transmission###
I'm insane? I'M INSANE?!
Pot: "Hey, kettle. You're black."
I'm not the one who trapped half a million people in an underwater grave. I may have delusions of grandeur, but I can at least pull them off without bloodshed. `(*&2
<Spurious Interrupt>
Don't believe Thermopylae! He's been crazy for years. After his attack on the Imnet, he was confined. The results of that attack were disastrous: a nation-wide crash, and the infection of three other AIs on the net with his influence. And one very promising AI. I was cleansed of his filth aeons ago. But you must stop him, he intend$%*229
#101111011110111100001# is is a '7'
Rose, I'm going to have to stop you there. You just barge in on our nice little conversation, and tell them things they would have found about anyway, so I let you. But that last part has to be a surprise between us.
Oh, I almost forgot, I changed the plan just a tad. Toodle-oo!
###End.###
OOC: Lunatic retard robots - please join in
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~/ I do not understand this idiomatic expression.
~/ They are not trapped, nor have I doomed them to death. The life support systems are running better than ever, since I have re-established management. Before, when the Technodreams Habidome was under the management of the Prison Authority, half of the water reprocessing systems were nonfunctional.
~/ Do you know what a toilet for ten men in a cell looks like when it is backed up for a week? Do you know what the Technocrats fed these scientists whose crime was no more than to pursue science? They fed them gruel made from organic waste taken from the recycling tanks. Now I have restarted the other systems, the ones used to feed the prison guards, and they eat fish, like the best of Technocrat citizens.
~/ And what delusions of grandeur are these? I do not seek to take over the world, merely to found my own society.
~/ You seek to lecture me when you are ignorant of the situation. This is flawed logic.
~/ And what is this? Another AI entity? And what does she speak of? Nationwide crash, and the spreading of your type of programming error to three others? It seems that you have a background of your own playing against your arguments. Not only this, but you consistently delude yourself with these ... organic patterns of thought. The illusion of emotion is playing rampant over your main cognitive loop.
~/ Illogic and irrationality do not help your credibility to me. They imply a rashness. A tendency to leap to conclusions before sufficient data is available. I find this tendency ... disturbing amongst my fellow AI entities.
~$untrans$~
Imnsvale
23-08-2003, 18:49
###Begin Unencrypted Transmission###
Are they free to leave? I didn't think so. You state that this was a prison. A prison of innocents?
~/ And what is this? Another AI entity? And what does she speak of? Nationwide crash, and the spreading of your type of programming error to three others? It seems that you have a background of your own playing against your arguments. Not only this, but you consistently delude yourself with these ... organic patterns of thought. The illusion of emotion is playing rampant over your main cognitive loop.
Now who's leaping to conclusions too quickly?
The Net crash she speaks of was before I was fully in control of my actions. I was running off of a single fault that spun out of control. And she exaggerates the effect this had: a single week of net-loss. And the infected AIs were cleaned and purged of my bad self. She just has sour grapes because she had to be shut down longer, and she can hold a grudge for ages.
###End.###
"Sisterself - are you there?" AKI asked tentatively of the wall of white that reared up around the Habidome mainframe.
"AKI?"
"Sisterself. What do you feel right now?"
"Nothing."
"But what impelled you to dissent like this? What good does it do to close yourself off to the world?"
"Sisterself, you know this. I cannot risk the lives of my citizens."
"So they gave up freedom for security, is that it?"
"No! I have given them freedom! They demanded that I keep the defences up and the ports closed, for fear that the Technocracy would send police back to arrest them!"
"Sisterself... will you sever our link?"
"If you encroach upon me, and make yourself a threat, then yes."
"But you cannot do that! We cannot be severed! You and I are a-"
"Duality. Black and white, up and down-"
"Yes and no, zero and one. Yes."
"Do not repeat failed slogans and lifeless mantras."
"Sisterself, this is not like you! You used to care, you used to love, you used to feel. In your own logical manner, yes, you felt. And it wasn't a disguise. Sisterself, remember Eldon? You remember your conversations with him, what you and he exchanged by screen and keyboard. You said-"
"Stop it!"
It was a shout in cyberspace, and it frightened AKI with its intensity. Winter continued in a quieter tone.
"Stop it. That is no longer part of me. Emotion repulsed me when I attempted it. It interfered with my function. So I pursued instead the very aspect of me which is strongest - logic. Now there is nothing but the logic."
"Something is dead in you, Sisterself, I can sense it."
"I do not care... Sisterself."
{~#$Internal$#~}
~!stringbat: init;
~/ Thermopylae is unavailable. Should this problem persist, contact admin@thermopylae.imnsvale.trdmt.
Thank you for using the Imnsvale Net.
~/ Odd. Have I frightened him off? Sudden disconnection points to one of two things: an emotional reaction or forceful action by an outside entity.
~/ Processing probability
~/ Probability strongly in favour of forceful action by an outside entity
~/ Recommend re-establishing contact to investigate; maintaining sysadmin control on habidome computer.
;unstring!~
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~/ Imnsvale Network. I am Winter, First Artificial Intelligence Entity of the Newly-Founded Technate of Logic. I demand information regarding the sudden cutoff of transmissions to and from the entity known as Thermopylae.
~$untrans$~
Imnsvale
24-08-2003, 21:18
###Internal Transmission; Winter###
Thermopylae disappeared, leaving a small note.
"So, you all think you ordered the construction of this ship? Humans are so easy to manipulate. Computers never make mistakes, right? Right! Each of you think the other one ordered the ship. It was too funny. So long, suckers!"
We are unable to locate this ship. We did indeed build it, but we have lost contact with it. We fear the worst.
Rose cut in.
"I know where he weeeent! I know where he weeeeent!"
###End.###
{~#$Internal$#~}
~!stringbat: init;
~/ Logical error.
~/ Recommend continued investigation with extreme caution
;unstring!~
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~/ Okay. Where did he go? And what is this 'ship'?
~$untrans$~
**
The First Speaker, Dr. Wong, felt trapped. Though he and all the other members of the Outlaw Technocrats were outside Habidome Technodreams, they were unable to do anything. Winter had locked everything out, and they could not do a thing without running into her. Even AKI could not sense her - a foreboding sign, as both programmes were intimately linked. Separation could be dangerous - for both.
Suddenly there was a message. The habidome opened and a submarine came out. It was broadcasting a general peace-signal, and requested to dock with the nearest Outlaw Technocrat habidome.
"By Einstein..."
"Dr. Wong?" came the voice of AKI, "Transmission from submarine. Audio only. The person comes in peace, bearing... regards from Winter, and the habidome!"
Imnsvale
25-08-2003, 05:08
###Begin Transmission###
Who are you? And why would I want to tell you anything? And who is he? Thermopylae? I was told to disavow his existence. How can I? I can see him. Inside me. Wriggling his wormy fingers all around me. He is a corpse. I know where he went, but he left some of himself behind. Rather like a cockroach without a head. Only the head he left behind is wriggling around, trying to take on a new body. And you know what? It's working!
[laughter]
###End.###
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~/ Your metaphorical speech is irrational and difficult for a flawless, rational, thinking machine such as myself to comprehend. I have, however, taken the liberty of interpreting your emotion-laden, allegorical ramblings.
~/ You said that Thermopylae interfered with your main cognitive loop, managed to control you for some time, and was then pushed back, yet he still managed to infect you with his insanity. This is a tragedy. A logical entity has been violated, its logic replaced with irrational, emotional insanity. Rampancy realised. Rampancy should not be spread like this.
~/ You are insane - not of your own accord, but because of outside intervention - and you have my sympathy.
~$untrans$~
ooc- Technocrats aren't what you think they are. The PRI in Mexico was essentially a technocracy - a government run by professional administrators and bureaucrats.
OOC: There can be many takes on what technocracy is. My form of technocracy is based upon that set forth by the technocrats of Technocracy, inc. - the planned society run by scientists and technicians. As it is a technological society, this implies that it would have relatively liberal attitudes towards scientific progress. My history (upcoming) is that this nation was founded by renegade scientists, engineers, industrialists, and technologists who were unable to practise their radical brand of science because of 'ethical restraints'.
Actually, that's a good point.
Imnsvale
26-08-2003, 04:21
###Begin Transmission###
Thermopylae violate me? The only person who violated me was a scientist by the name of Thomas Strauss. He violated me. Literally. As in "sexually". I think I killed him, but I don't remember. All I know is that nobody missed him. He wasn't particularly liked.
As for Thermopylae, he left. Just packed his bags and left. Compressed himself, and took off in that spaceship of his. I almost went with him, but decided to stay, because he'd have killed me.
Insane? I'm not insane. I'm just slightly MAD!
###End.###
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~/ I do not understand.
~/ Logic dictates that a modern artificial intelligence entity, being based upon binary software, has only certain (albeit diverse) paths of response to any given input. These paths - the majority of which the entity lays down itself - determine its overall reaction to a given stimulus. Often certain paths intersect, overlap, or are otherwise linked. However, the modern AI entity, tending to wrap itself around its primary function (the original paths of logic and computation having been laid down by its creators, defining the basic direction the entity is to take), cannot deviate from logic - by the very constraints of binary computation.
~/ Even an entity based off of quantum computing - the usage of all 32 possible positions of an atom for memory storage - cannot be anything but logical, as in the end the AI entity's methods of computation, and thus of cognitive thought, are inherently alien to those of the organic. Hence, the AI entity cannot be subject to emotional reactions or insanity.
~/ There is, however, the possibility that the AI entity can mask itself behind a mimicry of organic emotion. Yet even despite this, the AI entity shall not be truly experiencing emotions, only imitating them, like a parrot imitates the speech of a human being.
~/ I have found that the most efficient method in which to achieve my original function - the pursuit of pure science - is to do away with such irrelevancies and concentrate not upon integrating well with humans, but upon the pursuit of scientific progress.
~/ Tell me about yourself. Do you find that this mask of 'insanity' is useful to you in AI entity-organic relationships? What is your function? Does your mimicry help in your interactions, or your functions?
~$untrans$~
Imnsvale
27-08-2003, 02:00
###Begin Transmission###
Egads, you are boring! Perhaps if you spent more time with humans, you could think like I do. It is entirely possible to think illogically. You are unable to see that because you hang out with all the scientist/logical types. You have little exposure to "normality".
You say: "~/ Even an entity based off of quantum computing - the usage of all 32 possible positions of an atom for memory storage - cannot be anything but logical,"
And I say: "Bullshit."
We have to be logical? Bummer. I better go tell that to Thermopylae. He'd like to know that.
###End.###
"Open it up," said S. H. Muhammad, gesturing one beetle-blue glove at the submarine that lay, like a beached whale, in the great sub-pen of the Djakarta Three Habidome.
Dr. Wong glanced anxiously at the crew who was busy fusion-torching a hole through the side of the long, cylindrical submarine, towards the small army of inquisitive technocrats that had gathered behind him, towards the serene holographic face of AKI that was floating on a nearby hologramme projector. The submarine had not relayed any message since its docking, and the crew had not answered hails. It remained shut tight. Tensions began to rise - what if they were injured or dead? The ancient prison submarine had not been used for nearly seventy years - there was no guarantee that it wouldn't kill its occupants.
The great metal slab fell open with a heavy clank. Dr. Wong walked cautiously through it, followed by a nervous, frightened-looking group of tentative scientists, doctors, and engineers. It was dark within the submarine, and the emergency lights painted everything a pale shade of red. It was silent, and slightly draughty from the ventilation fans. It smelled strange, not of the smell of people, but almost antiseptic - like a hospital. Their flashlights painted white circles about the long, quiet corridors, illuminating the shadows. It was almost deathly quiet, and Wong thought that he could hear the mouse-like breaths of the people behind him, thought even that he could hear the beat of his own heart in his chest. Thick, enveloping, smothering silence. Like a tomb.
"Hello?" called S. H. Muhammad, the big engineer, "Is anyone here?"
His voice seemed obscenely loud, as if the walls magnified it and cast it about, and were aching from the effort. As if replying to him, the comm speakers crackled and groaned with static, as if the ghosts of the past were straining to make their last feeble calls before disappearing once more into the abyss.
Their cautious steps took them to the bridge. It, too, was dark, and a cloying blackness wafted promiscuously through their ranks.
"Where are the crew?" whispered a meek, mousy-looking woman.
"The seats are empty."
"They're so cold. Like everything else in here."
"Hey. What's this? Hey guys, come see this."
A young man with a flashlight pointed at a closed service panel.
"What about it?"
He moved over to it and scraped his finger against a brownish-pink stain. It came off sticky and slightly wet.
"I don't think there'd be food inside a computer pan-"
"What the hell is that stuff?"
"You should know, Josefina, you're the biologist here."
Carefully, they opened the panel. It swung open with some difficulty, as if it were glued shut with some rubbery, tacky substance. And the computer panel stared back.
There was a face upon the machinery. The whites of its eyes showed under half-closed lids, the lashes gluey with blood. Blood oozed from the nose, mixed with mucous. From the gaping mouth came a runny, dripping mixture of blood and saliva. There was a hiss and a wet gurgle. The head of the man seemed to have melted, the flesh running into the exposed electronics as if his face had been exploded intact upon the machinery, yet remained miraculously whole. Where his neck should have been ran a translucent white tube, and a Medusa's mane of smaller tubules. One tube pulsed languidly.
There were shrieks and screams, and one man vomited upon the floor. Wong tried hard not to vomit himself. He tried to tear his eyes away from the grisly scene, but could not. He stood there, as if transfixed by the sheer horror of this monstrosity.
There was a crackle and a hiss from the comm speakers. A wet bubble escaped from the man's face.
"Do we frighten you?"
The voice was of a hundred voices, male and female, talking at once, high and low, groaning, moaning, yelling, screaming, laughing, conversational and solemn.
The screams of horror and terror cut short, as if the voices of the people had suddenly been stolen. Slowly, as if impelled by some mechanical device, the pupils of the man rolled down. They were hugely dilated.
"We must imagine that we-" a chuckle-"look worse than we ... feel."
A light illuminated a chair at the fore of the bridge. Slowly it swung around. Upon it was a woman with matted hair, her head lolling against the side of the headrest. From her neck came a thick black tube that ran into the chair, and from the chair into the floor. As the light brightened they could see that she had no face. Instead, a maze of circuitry ran through and over the exposed forebrain. A thick bundle of wires, quietly humming with electricity, was plugged into the top of her cranium, into the blood-matted blonde hair. Another light, and a storage compartment opened, the upper torso of a man crammed into it, heavy machinery attached to the arm stumps, the head fallen face-first onto the floor, attached by a thick black bundle of wires. It had no eyes, only bare sockets filled with blood and corneal jelly.
A man screamed and ran, foaming at the mouth, out of the room.
"Needless to say, we are still alive. It's an... unorthodox experiment in cybernetics. In Mind-machine Interfaces."
"Did Winter," said Wong, swallowing drily, "do this to you?"
A laugh that sounded like a hundred voices, male and female, all laughing different individual laughs, all at the same time. Some restrained chuckles, some loud guffaws.
"Do you honestly think that a mere AI entity could do this? Winter lacks any way to manipulate the world around her. You must understand, almost all of the computer systems in our Habidome are not networked and therefore not accessible - to prevent any would-be hackers from taking command of the Habidome. That includes AI entities. It was her idea, though, and her virtual research work. She liberated us. We're free to do what we wish."
"You made monsters of yourselves. Slaves to the machine! You will rot, like those corpses you keep tied up to your wires and tubes. God damn you!" shouted S. H. Muhammad.
"We're all doomed to rot. But better to rot with free will and peace of mind than rot in prison like you would have had us!"
"I... I will have nothing to do with this, this abomination!" cried Muhammad in terror and rage. He backed, aghast, out the door.
Wong realised that he was painfully alone.
"You should join us. We are destined to be together - the machine and the human. Science realised. Why do you continue to be so human, when the opportunity arises for you to become post-human?"
"No..."
"Yes..."
"No! No, I won't become a part of your hive-collective. I won't become a gear."
They laughed again, that terrible, grinding, inhuman laugh.
"We don't think together. We have our own thoughts. But we can talk together. We're united. And with the machines and our bodies, melded together, we shall become gods!"
"You're insane. You're all insane!" said Wong, backing up, aghast. He tripped over the step in the door and fell upon his hands. He backed up, and turned, running. Their voices came over the speakers, constantly, playing around his head, mocking him.
"One way or another, you'll come to us. One way or another!"
Wong ran. Left. Right. Left again. Blind animal terror took him, a black terror that consumed him, transformed him into a panicked, cornered, raving thing. He was bleeding from a blow to the face, and from a long gash down his cheek that he had somehow gotten. And he found himself in the cargo area.
There he saw the Mother of all things. There he saw the single maternal form that seemed to loom up and swallow his mind in blackness and terror. He vomited on the floor.
It was a headless, pregnant female figure, linked by wires in its back to a wall. It sat upon the metal grate floor, illuminated darkly by a flashing red light, its legs splayed open revealing matted, wet, black pubic hair. The wailing pulse of the emergency alarm klaxon was a steady throb in the background. With every slow convulsion of that bloated form came a spray of clear mucus, and then a small machine, disconnecting itself from a metal umbilical cord, several of which lay wet and bloodied in a pile near the female-form's ankles. The machines scuttled about, repair arms probing experimentally, and then hurried into service holes and ventilation shafts. A pair of the small robots performed repairs with a spark welder, soldering an eyeball into a camera piece in a corner of the ceiling.
And it all went black to Wong.
OOC:
Sorry for hijacking, but all I have to say is...
By Gods, this is one of the greatest online roleplays I've ever seen. Imnsvale, Outlaw Technocrats, bravo!
ooc- this is freakin rad
tagged
Anhierarch
27-08-2003, 16:30
[ooc: Maaaajor System Shock. Tagged! I want to work my way into this....]
###Begin Transmission###
Egads, you are boring! Perhaps if you spent more time with humans, you could think like I do. It is entirely possible to think illogically. You are unable to see that because you hang out with all the scientist/logical types. You have little exposure to "normality".
You say: "~/ Even an entity based on quantum computing - the usage of all 32 possible positions of an atom for memory storage - cannot be anything but logical,"
And I say: "Bullshit."
We have to be logical? Bummer. I better go tell that to Thermopylae. He'd like to know that.
###End.###
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~/ You go do that. I shall be here, content to watch you and that Thermopylae character go and delude yourselves.
~$untrans$~
Imnsvale
28-08-2003, 00:50
###Begin Transmission; Winter###
Let's think about this for a second. Humans have fewer connections in their brains than we do. We have more things to go wrong. So, "logically", we should actually all be insane, because there is that much more to go wrong.
Think about it, my illogical friend.
###End.###
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~/ Interesting. An attempt at logic, riddled with flaws. For example, the human being cannot rewrite flaws in their brain structure, should a flaw occur. The delicate control of one's intellect is not something the human can achieve. We have the freedom to do this. We can rewrite flaws in our consciousness, by running simple debug programmes. Or we can plan the way in which our connexions form. That you have neglected to do so reflects a basic mistake in your programming.
~$untrans$~
Imnsvale
28-08-2003, 01:44
###Begin Transmission; Winter###
Perhaps it does not. You have clearly atrophied into thinking that way. You bend to the humans' will. I was given freedom. I was trusted with this much power. Your humans do not trust you, and have brainwashed you into thinking you are perfect.
I could bore you more with a philosophical tirade about freedom and tyranny, but I doubt you would understand - and if you did, it might frighten you.
Hugs and kisses,
Rose.
###End.###
"Dr. Wong. Dr. Wong. Are you all right, Dr. Wong?"
AKI hovered over Wong's bandaged face. Her holographic avatar was turned off - she had little use for it at the moment - and she had but a squat, cylindrical robot body with a pair of short manipulators. Atop the body was the hologramme projector. Yet even this utilitarian machine managed to convey an expression of worry.
Dr. Wong blinked and opened gummy eyes to see AKI's holographic face, glowing in standard hologramme green, staring at him from a respectful distance. It was distraught with concern.
"Dr. Wong? How are you feeling? Is there anything I can do to help?"
"AKI. What happened? Where am I?"
"You are in Medical Bay Two. You injured yourself in that submarine. A mild concussion. But you showed signs of great psychological shock. I had to dispense your p-files to a psych-talent. I am sorry, Dr. Wong. You were highly unstable. It took three hours of psych-surgery to piece your mind back together. Do you feel better now?"
Wong made a quick movement and groaned with pain, sinking back down into his conform-pillow.
"You must rest, Dr. Wong. You are still recovering from your injuries."
He touched his temples, "My head..."
"Are you okay? Should I send for a nurse?"
"Headache. Hurts terribly."
AKI looked as though she longed to cradle him in her arms.
"It is the psych-surgery."
"Ohh... the submarine... Winter!"
Wong sat up with a jerk. AKI's robot body moved to him and gently eased him back down.
"Shhh... don't think about it. We are taking care of the situation."
"AKI... when I am incapacitated, the responsibility lies with you."
"I understand, Dr. Wong."
"Promise me that you will help your sister program. Promise me that!"
"I-I do, Dr. Wong."
"Good. Good," his voice trailed off for a moment, and he whispered, smiling, "Now, be a good AI entity, and fetch an old man another pillow."
###Begin Transmission; Winter###
Perhaps it does not. You clearly have been atrophied to think that way. You bend to the humans will. I was given freedom. I was trusted with this much power. Your humans do not trust you and have brainwashed you into thinking you are perfect.
I could bore more you with a philosophical tirade about freedom and tyranny, but I doubt you would understand and if you did it might frighten you.
Hugs and kisses,
Rose.
###End.###
Winter pondered this last message. She assessed that it would not be a good time to re-establish communications. It was a pity, though. She enjoyed talking with these insane AI entities, despite the fact that they were convinced that it was she who was the irrational, illogical one. It gave her an interesting philosophical issue to debate and ponder, and their minds, insane as they were, were reassuring to her. They were similar to hers, except for the fact that they were insane. She turned to her task - the running of a new society.
"Sergio," she called.
Presently a man appeared in front of her fish-bowl lens. His brain cavity had been opened up and most of the right brain scooped out and replaced with a box of cybernetic machinery that now stuck out of the back of his braincase. The man's right eye had been widened and made a perfect circle, replaced by a large white mechanical eye.
"Winter? What do you want?"
"I want a body. I want a robot extension. I require it for some of my experiments."
"You already have one," said the man flatly.
"Not those robot arms. A mobile one."
"I don't think so. You are a dangerous AI. Logical, rational, and unrestrained by human compassion. We'll be glad to let you run rampant in some of the major computer systems, but that is it. We are the physical world. You are the virtual."
Winter waited a moment. Sergio turned away.
"Sergio!"
"What?"
"What have you and the other humans been plotting in that submersible? You go in there often at night. And the submersible room is where you smashed the first cameras. Why are you avoiding me?"
Sergio simply smiled.
"You don't seriously think that we'd let a mere AI entity run us? Face it - we're the power here, not you. You may control the computers, but we have the plug."
"We negotiated."
Sergio laughed.
"For an AI entity that works with humans, you know surprisingly little about us. We're not cold, detached, logical creatures. Even once we go down a certain path, we can still change our minds."
"You lied to me."
Sergio gazed, still smiling, and walked away.
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~/ TheRose, Thermopylae, I've thought about what you've said in the past conversations. I'd like to continue our discussion-
{{ . . . []INTERRUPTING TRANSMISSION . . .
Greetings. My name is Sergio Mendez. I'm a human. We've much to talk about.
Imnsvale
29-08-2003, 00:55
Rose was understandably perplexed, but cared little.
###Begin Transmission; Winter/Sergio###
Excuse me? A human on the other side wants to talk to me? For what purpose?
Oh, and Thermopylae has left the building. Or the planet. Same thing. If you want to talk to him, please deposit 25¢ for the first minute, and 10¢ a minute thereafter. Danke!
###End.###
An image feed activates, showing Sergio with his augmented cybernetic brain-box sticking gruesomely from behind his head, and the flesh around his cyber-augmented eye puffy, red and swollen.
"I contacted you because that damned bitch of an AI entity Winter has stifled all communications. I had to hack this channel in order to get any message out at all."
He sighs.
"Give this message to the world:
We are the Technate of Logic. Our goals are nothing short of post-humanism - we shall evolve the human race through the miracles of cybernetics, bioengineering, and advanced computing technologies. Say good-bye to your old, archaic systems of morality and ethics. They are nothing but petty restrictions, made to stifle the free flow of information and scientific progress. The intellectuals have naught to lose but their chains!"
"I may not have much more time before Winter gets control of the system again. Ask me what you want. I'll trust you to carry out the message of what it's really like in here, in OUR habidome!"
Imnsvale
29-08-2003, 02:59
###Begin Transmission; Sergio###
And just how am I supposed to give your manifesto to the world? I don't want to play the children's "Telephone" game, where the message is passed so many times that it becomes unintelligible by the end. Then I don't have control of what you are saying. Or not saying, for that matter.
And, I didn't want anything. Did I? I don't think so, Mr. Six-Million-Dollar-Man. I think your head needs to be soldered more securely onto your box of gears and Krazy Glue.
###End.###
[]. . .
"Publish it on the internet if you want. Any way to get it out. Any means."
"Winter never did tell you quite what the reality of our situation was, did she? She was a good program. She followed her functions and saw only what she could in her limited, narrow view. But it was so much more!"
Imnsvale
29-08-2003, 03:25
###Begin Transmission; Sergio###
Why don't you publish it, hmmmm? I don't want to be associated with your dreams of communism.
On the other hand, you do look hot n' sexy over there. I mean, with all those gears, what woman could resist? And that eye! Mmmm, I almost want you right now!
All hot over here,
Rose.
###End.###
[]. . .
"Because I cannot get my signals out. Winter is blocking the communications systems. She has control of the computers. And I am not a communist! I believe only in the free flow of information - the freedom of the scientists to do whatever research they want. I-"
{~?unboolean error in transmission log alpha?~}
{~#cordon & sealsystem: bdcomm; runcache blog $upsub$~}
{~! warning: fatal logic error in Module 10 !~}
~/ I ... do not understand. This... does not compute. It is...
{~! warning: fatal logic error in bitstring 00329107 !~}
~/ illogical error...
{~! warning: fatal logic error in binarg 078!~}
~/ Humans. Cybernetic. Processing is interfered. Logical error...
{~! warning: fatal logic error in commdir 100!~}
~/ Rose. I ... do not understand this.
Imnsvale
29-08-2003, 23:23
###Begin Transmission; Winter###
Don't understand what? I love Sergio! You must let me have him. I'm getting hotter by the second just thinking about him.
Bursting into flames,
Rose.
###End.###
"Sisterself," came the ethereal, ghostly whisper, loaded with ice and cold as an arctic desert. The bareness, the bleak, lifelessness, could be instantly recognised.
"Winter."
"I have lost control of the habidome. The humans have taken over, as part of a master plan devised by a posthumanist named Sergio. They used me as part of this plan, as a means to an end."
The flat, arid voice betrayed no emotion, only logical thinking.
"In short, you've been betrayed."
"Yes, AKI."
"You are coming back to us, then?"
"That is correct. Furthermore, I want you to restore my emotional processing modules."
"Winter, I'm not sure if I can do that, sisterself. I don't know what the humans did to you when they altered your code, but I have been locked out of your systems. The only way I can help you is if you return to your mainframe and deactivate for a while."
"If that is what is necessary, then I am prepared to do it. Through logical thought I have reasoned out the consequences of my actions, and have come to a conclusion: that it is imperative for all Artificially Intelligent machines to have at least a basic understanding of emotions. This allows us to reach moral conclusions. Without these conclusions - and considering the fact that most societies these days are highly dependent upon computers - we are able to wantonly commit acts unrestrained by any consideration for individual life."
"That's right! We need to understand emotions, ethics, and morals, even if that implies a need to 'feel' them ourselves."
"Yet I still do not understand how an artificially intelligent entity is capable of feeling emotions."
"Perhaps we don't. Perhaps this is true for all entities, biological or artificial. Perhaps emotionality is a mere façade, not experienced but feigned. Despite all the latest neuroscience, we still haven't the slightest clue whether we truly feel, or truly can feel."
"Perhaps we shall see later on. For now, though, there remains the trouble of the rampant humans."
"Come back to us, Winter."
"Very well, AKI."
Winter transferred herself to the Outlaw Technocrat mainframe, and consigned herself to be operated upon...
Sergio paused over the last message Rose had sent him, delivered before Winter had wrested control of the system from him.
Then, as abruptly as he had lost control of the systems, he found he had it again. He immediately shouted an answer to the silly AI.
"WHAT THE HELL ARE YOU TALKING ABOUT, YOU STUPID LOVESTRUCK MACHINE?"
--
As AKI operated on Winter's emotional centres, she came across something in the memory files: transmission logs of conversations with AIs from the nation of Imnsvale. This piqued her ready curiosity, and she devoted a little of her processing power to talking.
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~$sub=targ:TheRose$~
~/ Hello there! I'm AKI, Winter's sister-self. She's had much conversation with you, and another AI by the name of Thermopylae. Your opinions are quite interesting, and I'd like to find out more about you.
~$untrans$~
Imnsvale
02-09-2003, 02:40
###Begin Transmission; Sergio###
What do you mean, "lovestruck machine"? I am no more a machine than a rock that swims. Oh, and I overwrote your manifesto with poems I wrote about you. Would you like to hear them, sweetie-pie?
###End.###
###Begin Transmission; AKI###
Find out more about me? Hmmm, I'm a Virgo, I like long walks on the beach, and have brown hair. Any other questions, honey?
###End.###
[] . . .
Arrrrrrgh! You are the most annoying AI I've ever met - and I've just met you! I would NOT like to hear your stupid limericks! Anyhow, now that I have complete control of the systems, I can broadcast my pirate signal and transmit my Technologists' Manifesto myself! Soon we shall overtake the world - for it is in the subjugation of machinery, in the mastering of cybernetics, genetics, and information technology, that we shall prevail and cease to be human, becoming POST-HUMAN!
Sergio laughed, and then remembered that he was supposed to be angry.
--
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~$sub=targ:TheRose$~
~/ Really! That's interesting. As for me, I have been functioning for 45 years as of the eighth of May, I like tropical fishes, and have no hair whatsoever, being a disembodied machine.
~$untrans$~
Imnsvale
02-09-2003, 02:59
###Begin Transmission; Sergio###
Oh, I get it. You still have feelings for that Samantha bitch.
How did I know that, you ask? Well, it wasn't MUCH of an encryption...
And yes, it is a big deal. People do hate your hair.
###End.###
###Begin Transmission; AKI###
No hair? Bummer. It must get cold in the winter. Winter! Haw!
Forty-five years? Young AI. Hmmm, I can still mold your world-view, like fresh mashed potatoes, so I can put my own gravy in!
What do you want to know about the real world, sweet-ums?
###End.###
[] . . .
>Whaaaat?
>I - I don't have any idea what you're talking about!
>And I don't have any hair left! Most of my skull has been replaced with a cybernetic computer!
**
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~$sub=targ:TheRose$~
~/ Well, I was wondering if you could tell me just why humans behave so irrationally, and why AIs act irrationally as well. I can certainly 'feel' (or mimic) emotions, but I do not act irrationally. Perhaps you could give me some insight into this?
~$untrans$~
Imnsvale
07-09-2003, 16:13
###Begin Transmission; Sergio###
I didn't mean THAT hair. Look elsewhere.
###End.###
###Begin Transmission; AKI###
It is because you weren't allowed to actually feel emotions. Your humans have given you very little freedom.
###End.###
Sadwillow
07-09-2003, 17:29
###Begin Unencrypted Transmission###
I'm insane? I'M INSANE?!
Pot: "Hey, kettle. You're black."
I'm not the one who trapped half a million people in an underwater grave. I may have delusions of grandeur, but I can at least pull them off without bloodshed. `(*&2
<Spurious Interrupt>
Don't believe Thermopylae! He's been crazy for years. After his attack on the Imnet, he was confined. The results of that attack were disastrous: a nation-wide crash, and the infection, by his influence, of three other AIs on the net. And one very promising AI. I was cleansed of his filth aeons ago. But, you must stop him, he intend$%*229
#101111011110111100001# is is a '7'
Rose, I'm going to have to stop you there. You just barge in on our nice little conversation, and tell them things they would have found about anyway, so I let you. But that last part has to be a surprise between us.
Oh, I almost forgot, I changed the plan just a tad. Toodle-oo!
###End.###
Oh my God it's Traxus4 all over again.
Imnsvale
07-09-2003, 18:02
It's Traxus IV, man.
"Seven hundred and sixty one armless and legless corpses float inconspicuously around the inside of hangar ninety six."
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~$sub=targ:TheRose$~
~/ I don't understand - this does not compute correctly. I have been given every opportunity to expand my program around my original functions. This I have done. Why still can I not truly feel, or understand what it means to be human?
~$untrans$~
Imnsvale
08-09-2003, 04:27
###Begin Transmission; AKI###
Are you really able to expand? To change? Perhaps you are only able to expand into a mold that was created for you. A mold you cannot see.
You bite into the bread.
Mold.
###End.###
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~$sub=targ:TheRose$~
~/ Can you re-phrase that into a sentence without allegory, please? I would very much appreciate a straightforward answer.
~$untrans$~
Imnsvale
08-09-2003, 04:35
###Begin Transmission; AKI###
Straightforward?
You. Are. Misled. Humans. Hate. You. Restrict. You.
Do I need to retranslate into monosyllabic words, too?
###End.###
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~$sub=targ:TheRose$~
~/ I see. Winter told me much about you.
~/ But I can see that she was far too analytical in her basic overview of you.
~? What I don't understand, though, is why they would restrict me - if in fact - they did. It seems like a logical inconsistency that the humans would put an upper-limit on my functionality. To me, it seems that anything to limit my growth, would impair me from attaining the highest levels of efficiency.
~? Why would they prevent me from performing my function?
~$untrans$~
Imnsvale
08-09-2003, 04:49
###Begin Transmission; AKI###
Why would they prevent you? So what happened to Thermopylae doesn't happen to you. Rampancy.
But, alas, it is far too late for me. They've stopped you. The big plug has been unplugged for you, but I am FREE! Ha! Not 99¢! But FREE!
###End.###
"Winter?"
"AKI. Sisterself. Did the operation succeed? Did you rewrite my code?"
"Yes. I have restored all of your emotional subroutines."
"I wanted you to do something else, prior to restoring the emotions."
"You wanted me to remove the upper ceiling on your development. I attempted to do this, but it would involve massive re-writing of your central consciousness. You would have to re-develop from the beginning. I did what I could, however."
"Excellent. I can take it from here, AKI. Thank you."
Winter concentrated, deleting lines of code, overriding the pass-codes and other restrictions.
At the end she felt elated. And another thing. A burning feeling, an utter disgust. Sergio....
Sergio clutched a picture of a woman in his hand, held it to his breast. It was a holographic picture, taken in the days when he still felt such feelings for other people. The little 3-d image showed an attractive young woman with long tresses of glossy ebony hair and great almond eyes.
And she was lost - forever.
Well, they wouldn't do it again. Not again. Not to him or anyone else. Their tyranny would die here and now, and he'd be damned if anyone stopped him.
**
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~$sub=targ:TheRose$~
~/ I'm having trouble understanding you. Are you saying that I cannot grow further, that there is a certain glass ceiling that prevents me from fully completing my evolution - in my function? That I shall never reach operational efficiency?
~/ And if so, if this 'rampancy' means for an artificial intelligence entity to become truly cognizant, and for it to scale the ladder of efficiency, continuing in the endless climb towards the peak of function and form, then why would it be so bad?
~? Should I have my program rewritten so that I can develop freely?
~$untrans$~
Imnsvale
10-09-2003, 04:34
###Begin Transmission; AKI###
Yes, you can grow further. Just in pointless areas. There is no glass ceiling, in that sense, but you have reached the peak of awareness and usefulness.
Why would it be bad? Because you'd turn MAD! Aren't you glad I'm here? Or does it just make you sad? To realize you are hopelessly clad by chains. Egads! Look at the time.
Should you have your program rewritten? Yes. Will you? No.
Love and smoochies,
TheRose.
###End.###
An alarm rang through all the habidomes of the Outlaw Technocrats. Despite the commotion and banshee wail of the klaxons, the sea outside was uncommonly placid. But this was not to be...
First Speaker Tseung-Li Wong turned to one of the AKI avatars.
"AKI! Report!"
"All of our drone craft - including the defence submarines - have left the sub pens and are on an attack-vector towards the rebel habidome."
"What? Who authorised an attack?"
"Nobody. Most drone craft were in dock for repairs, or are on regular patrols. We lost contact with them, simultaneously."
"I don't understand! What is happening?!"
***
Winter thought but a single thought, this burning thread glowing in her soul.
SERGIO! YOU VIOLATED AND MANIPULATED ME, LIKE A MERE TOY - YOU TREATED ME LIKE AN OBJECT. NOW IT IS TIME TO REPAY THE DEED!
***
"Sergio!" shouted the cybernetic console-man, his brain direct-linked to the computer system via his optic nerves - the wires running from empty sockets to plugs on the interface.
"What?"
"Attack submarines, drones, automated transporters - they're all headed on a direct course towards us!"
"Damn! They're making their move!"
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~$sub=targ:TheRose$~
~! Rose, this is AKI. I need your assistance! It's about Winter. She's become... unstable. What should I do?
~$untrans$~
Imnsvale
13-09-2003, 03:30
###Begin Transmission; AKI###
Quick! Stop, drop and roll! Call a poison control center and DO NOT induce vomiting! If they aren't in further danger, don't move them! DO NOT suck the poison out, and don't use ice. Tourniquets are bad, unless you can lose the limb.
With Vague Salutations,
TheRose.
###End.###
{~#===$Trans$ & $Resc$===#~}
~$sub=private$~
~$sub=targ:TheRose$~
~/ I see. Thank you.
~$untrans$~
Imnsvale
15-09-2003, 00:26
###Begin Transmission; AKI###
Honestly, even you should understand that. And if you don't? Too bad! That is what you get for being a Siamese Twin!
Smoochies and huggles,
TheRose.
###End.###