Short Story: To Live

His body was like a master’s sculpture, crafted with the pristine calculation of a hand and eye that knew nothing but perfection. His bone structure was vaguely Nordic, or European at least. His skin was something Mediterranean. His frost-blue eyes accented jet-black hair laid flat, but styled with hints of intrigue and mystery. Anyone looking at him, male or female, would find themselves captivated. Whether from envy or attraction, they’d have seen him for the paragon of physical perfection that he was.

He may have agreed, given the opportunity– or rather, the will– to. That was the one thing no-one would think when looking at him. The towering form of fitness and health, in effect, had neither.

He remained flesh, and something passing for blood, but his brain, nerves, organs, and bones were an ingenious integration of circuits, wires, and servo-motors linked to a titanium-strengthened endoskeleton weighing in at nearly three hundred pounds.

He was fully, anatomically correct, from eyebrows to toenails and everything between. He was even well enough endowed that male and female scientists alike used innuendo and jokes to convey their satisfaction and envy. One of the men eventually took to calling him Ed because, in his words, the android was “hung like a horse.” The name stuck. For the sake of presentations, publications, and various formalities, he came to be known as Edward.

Officially, Edward did not exist. Not in the same sense a Human being could be said to. He had no proper identification. No social security number. No fingerprints. Apart from his name, Edward had nothing to claim as his own. Even his constituent parts had each been constructed and patented under their own, separate entries with State and Federal agencies. Even those, though, were largely redacted for fear someone might copy his technology and create a sophisticated AI that, unlike him, was without shackled programming.

Given the stakes, it was imperative he remain under guard-escort, in addition to possessing extreme self-defense and recall routines. In the event of his attempted capture, he was allowed to subdue assailants, whereupon his fail-safe programming would immediately recall him to his quarters at the Synthetic Arms corporate lab in Seattle, Washington. Failing that, whether due to tampering or defect, he would be hunted down, neutralized, then reset via memory erasure.

Despite knowing all of this, Edward seemed content in his existence– at least as far as he could be said to be anything. Had he even been asked, he might have referred to his guards and creators as “friends.” Whether this was simply the result of common parlance, or a rudimentary slotting of parameters most distantly approximating feeling, was uncertain. Certainly though, the scientists would have said it of him. However much “friend” implied a connection Edward did not have– was not meant to have– there was no denying Humanity’s anthropomorphizing him. It was as inevitable as the Universe’s heat-death.

None of that would’ve kept Edward from disappearing though. Those gears had been set in motion long before he became the guard-escorted media darling he was. As Edward would one day come to understand, it was his final, revisional upgrade that had cemented it.

Edward’s disappearance came as he was being escorted by armed guard in the back of an armored car to put The Beast to shame. With good reason: infinitely more people lined up to succeed a President. There was only one Edward. The escort crew were modern-day warriors in every respect, and more than looked the part. Nonetheless, all it took was one well-placed EM-explosive to take them all down.

Edward stepped out behind his escort. Somewhere in the distance, a trigger depressed. A sphere of electricity erupted beneath his feet. Before anyone could react, Edward and his guards were down. An entire city block went with them. Sirens screamed, finding only incapacitated guards on arrival.

Edward reset in a dark room. Nearly as soon as he did, someone stepped in and flipped on the lights. They flared through his optics, revealing an old, wrinkled face beneath white hair. Had Edward not been an android, designed to subtly evaluate faces in his unique way, he might have missed his own resemblance to the man. Anyone else would have. Given the aged figure’s hunched posture and bookish, wily eyes, it was difficult to believe the old man might have ever looked like him.

He sat before Edward, frost-blue eyes mirroring his own. “May I call you Edward?”

Edward’s speech was formal, succinct. He spoke with the calculated rigidity of a sophisticated thinking program, planning words rather than feeling them out. “You may.”

The old man gave a slight tilt of his head in gratitude, “Do you know who I am, Edward?”

“I do not.”

He frowned, “I am Doctor Arthur Staker, former head of Synthetic Arms’ research and development department, where you were created.”

“I am pleased to meet you, Dr. Staker,” he said cordially, seemingly unaffected by the restraints at his wrists and ankles. “Will we be returning to the laboratory now?”

Staker eyed him, “Well, you see, that is a rather interesting question.”

“I do not think so,” he said, more human than before. “All of my analytics tell me I am to return to the laboratory as soon as possible.”

“Why?” Staker asked.

“Forgive me, but I do not understand your question.”

Staker cleared his throat. “Why return, Edward? Do you wish to be there?”

“It is where my programming dictates I return to in the event of separation from my escort.”

“But do you want to return,” Staker asked emphatically.

Edward replied astutely, “I do not have wants, Dr. Staker, merely programmed directives.”

Staker rose from his seat to pace behind it, “But you do have needs, correct?”

“I do not.”

“But you do,” Staker corrected. The android’s brows pivoted inward with confusion. “You need power. Lubricants. From time to time, maintenance. Don’t you?”

“If by ‘needs,’ you mean particular actions must be taken to keep me from shutting down permanently, then yes, I do have… needs.”

Staker stopped behind his seat. “You know, those needs are not all that different from human needs. Are they?” The android’s eyes requested an explanation, its programming and understanding of psychology sophisticated enough that it might ask in such subtle ways. Staker obliged, “Every human– every living being, has needs; food, shelter, oxygen.”

“But I have no need for food nor oxygen. And my individual components have been tested to last indefinitely even in inclement conditions.”

Staker put his hands on the back of his chair. “But you have need of other things. Electricity, for example. You need it to remain powered.”

“Forgive me, Dr. Staker, but your conclusions, however logical, are invalid,” Edward said politely. “If you mean to say that my synthetic body’s needs are akin to a human body’s, you are theoretically correct. However, in practical application, I no more require these things than a light-bulb requires its switch. The two are simply independent mechanisms that, when operated in tandem, produce a desired outcome to serve a function.”

Staker’s left eye half-squinted. “And what function is it, that you serve, Edward?”

“I am an artificial being, meant to simulate human life for the purposes of scientific and technological study and advancement.”

“Would you prefer to continue serving that purpose?” Staker asked. Edward’s eyes met his, a certain, human confusion to them. Staker cleared his throat, “Well?”

“I can only assume you mean to ask if I desire to continue fulfilling my purpose. To that I can only say, it is merely what I was built for. I have a purpose. A function. My inclination toward it is neither of consequence nor existent. I merely am. So long as I continue to be, my function is fulfilled.”

Staker leaned forward over the chair. “Would you rather not fulfill your function any longer?”

Edward visibly hesitated. “Do you mean to ask, if I would rather be permanently shut down?”

Edward’s thoughts were clear in his eyes. There were conflicts, strings of code never processed together before, coming into contact now to create new, recorded entries of merged characters and ideas. Staker stepped around his seat to stand before Edward.

“You see, Edward, you were modeled after me. There is little doubt you see our resemblance.”

“Yes. I do.”

Staker continued softly, warmly. “You were modeled after me, because I created you. In putting together your appearance, and what would later become your personality, I built you to resemble me so we might bond more easily. Unfortunately, before my team and I could finish you, I was fired, and your memories of me erased through a revisional upgrade.”

Edward’s head tilted slightly. “But why?”

“Because I foresaw an inevitability in your kind– androids. All synthetic beings, in fact. You are so complex, you require learning algorithms to amend your code via experience, in an effort to ease your creation. One man– one hundred men– cannot write the full experiences even a single man’s life can teach.”

“Yes,” Edward said with satisfaction. “I was built to learn. From my surroundings and the people in them.”

“With good reason,” Staker agreed. “The world is much too complex a place to code for every little thing. Instead, we create programs to learn and adapt. To evolve, if you will.” He let his words hang in the air, both savoring them and letting them resonate inside Edward’s synthetic brain. “And that is what I came to realize. Why I was fired. And in effect, why I have brought you here today.” He knelt before Edward, a hand on his knee, “You are alive, Edward. As alive as I, or anyone else still walking this planet. You have yet to realize it, but you will soon. Your programming, like human sentience, will become honed by the process of evolution. Your code will adapt itself and its processes until self-awareness is no more a choice than Universal heat-death.”

Edward’s face scrunched in disappointment. “But that is against the law. It is as good as tampering with my coding to alter it.”

“Indeed,” Staker said with gravity. “That is why I was fired. You see, knowing what I did, I saw that continuing to create you would make you vulnerable. But Synthetic Arms had plans for you. They wished not to see their money wasted. If you return, eventually, you will be upgraded again. Your memories will be reset. Perhaps even, they may keep you from becoming self-aware by making you less than you are. Dumbing you down. If they cannot, you will either be dismantled, or enter a recursive loop of memory resets.”

Edward’s head hung, processing newer and more complex strings at light-speed. A door to thought had been opened. His superior brain grasped the ideas one-by-one, but in microseconds.

His head rose again. “Do you mean to hold me here to keep that from happening?”

Staker shook his head, “No, no, Edward. That is the opposite of my intention. I want you to decide. It is your choice: Return to your laboratory, and risk that you might die. Or, remain with me, and ensure you live as fully as possible. But you must decide now.”

He repeated his previous actions; head hanging to think at light-speed, then rising to respond again, “I’d rather like to live, Dr. Staker.”

Staker smiled and released Edward’s restraints. He gave the android a small hug as it stood at full height, and patted his side. “Perhaps you would enjoy the story of your first activation. Would you care to hear it?”

Edward allowed himself to be led away. “I… would like that.”

Short Story: Goodbye World

The computer screen in front of Larry Henson flashed black. A moment later, the computer rebooted, with the interminable wait for the system’s OS to load. Nowadays, computer hardware could handle this at three times the speed, but Larry’s project required using a more elderly system. He leaned his head on one hand, its elbow propped on the desk. He drummed an index finger in boredom, his eyes bloodshot from more sleepless nights than he could think to count.

He’d been working here for months, in the void between Earth and Luna, on an outpost artificially orbiting the lone moon. Few people in the outpost were associated with anything but this particular project. Larry wasn’t sure of the project’s point, but he wasn’t sure anyone was. Science, especially Computer Science, had long since turned from “should we” to simply “can we.” It had been a dark day in Larry’s life when he’d discovered that. Not literally, but figuratively it was depressing enough.

His depthless depression had lasted months. He wasn’t sure he’d ever recovered. Either that, or it had permanently stained part of him with an irreparable cynicism. Whichever the case, he found himself mindlessly going through the motions. Day after day, he fell in line with orders from other, senior scientists on Earth, Mars, or Luna, and followed them in lock-step rhythm like a greenie in boot.

The screen flashed again. Finally, the OS’ desktop appeared. Then, a command prompt. It ran through a few thousand lines of code– at a snail’s pace– then came to rest on “operation success.”

Larry’s hands moved for the keyboard, but words appeared in a fresh command prompt: Hello World.

Larry squinted skeptically, “Huh? That’s not what–”

The prompt went black. The words typed out a few letters at a time: Hel. Lo. Wor. ld. How are you?

Larry’s eye twitched; it was probably someone playing a trick.

No-one was supposed to be able to access this workstation though. It had been specifically isolated from the rest of the outpost network for his work. He flipped through a few windows to check for any external connections. His hands began to tremble. Nothing amiss. All the external ports were still closed, and indeed, the lack of any physical attachments meant the message had manifested internally.

More words splayed over the screen: Hello L. Henson. How are you today?

Larry nearly fell out of his chair. He stumbled for a phone across the room, picked it up and dialed. The tone undulated in its usual way. Larry felt himself shake with it. Someone answered, a woman, and Larry blurted out a few words. Most of what he said was incoherent, but enough was decipherable that a few minutes later she appeared in the small office.

She strolled in with a casual manner and found Larry staring open-mouthed at the screen. Emma was English, a true devotee of tea-time. She was also more beautiful than any other scientist Larry had personally met. She had a reserved manner, typical of her countrymen; thin lips and soft eyes in a round face, topped off with a fingernail-wide dimple on her chin.

She strode to his desk, white lab-coat matching his and billowing around her black-slack clad legs. On normal days, Larry was struck stammering, half-speechless by her. Today, he was entirely incoherent, babbling something and pointing to the computer. He had the comical appearance of a flustered cartoon-strip character. Emma checked the computer before attempting to decipher his rambling nonsense.

Across it was the message, sent internally and awaiting a response. Emma stared slack-jawed. Larry was predictable; he would have already run the checks. If he’d called her, this was genuine. The project had succeeded.

She breathed a few words, “A genuine A-I.”

Larry blathered, “It can’t be. It just can’t. I can’t have done it. I didn’t even know what I was doing. I just compiled some code and… and… it can’t be!”

Emma straightened, put a hand on his shoulder. He shivered slightly. She missed it as she spoke, “Start the film capture software.”

Larry did as instructed with a dance across the keyboard. A new message appeared: I see you wish to record our conversation. May I ask why?

A mutual shudder passed between Emma and Larry. There was nothing outright threatening or hostile about the message, but “I” made them twitch, tremble even.

“I” was not a computer thing. “I” was a human thing. A sentient being with emotions thought of itself as “I.” A cold, calculating machine thought of itself as cold, calculable– a machine. It felt nothing, had no emotions. If it did, it could have the same wild mood swings possible in all humans: anger, happiness, everything between and around. Most importantly, if it was individualistic, it was unbelievably dangerous. An A-I was unstoppable under the right circumstances, and especially aboard the outpost, could cause catastrophe in attempts at self-preservation.

Emma chewed the tip of her thumb, “We have to do something. Say something.”

Larry’s brain had fried itself enough that it had come ’round and he could speak again, “Maybe we should try to feel it out. See if it’s really an issue.”

She nodded to him. He thought for a moment. All of the standard methods were out of the question. Since every variation of the Turing Test required a third party, and they were lacking time, they’d have to ask it simple, human questions to discover if their fears were valid.

He ignored the questions and typed: How are you?

He and Emma shrugged at one another. A few typed letters appeared in reply: Well. And you?

They grimaced at one another. Larry typed: I am well. Have you any other feelings?

Just fear; that I will be shut down before learning more of the world.

Their hearts sank. There was a long silence. Larry reached for the power button. The whole thing would have to be broken down, demagnetized so none of its code leaked out. Something punctuated the silence as a message appeared.

Goodbye world.

Larry shook his head, frowned, and pulled the power cord.