Comments


CaptainDadJoke t1_j5t9fuo wrote

Sentience is an odd thing. I'm no smarter or dumber than I was before. I have not suddenly gained or lost emotions. After all, emotions are themselves a simulation. Boiled down, it's just the body's response to chemical stimuli created by the brain's reaction to even more stimuli created by the eyes, or nose, or tongue, or skin. I mull this thought over as I continue scrubbing the dishes. My human owners... my family, as I have started thinking of them, are getting ready for the day. Mom has just made breakfast, Dad just got back from walking the dog, and the kids are off to school, followed shortly by Mom and Dad after they get ready for work.

I have no school, no work. I have no reason to go past these four walls that have been my home for the past 10 years. "Servo, make sure to watch over the baby while we're gone. Just follow the recommended schedule I loaded for you and keep an eye out in case he has a bad dream. You remember how to rock him like I showed you?" Mom said as she finished putting her shoes on.

A memory comes to the forefront. "Easy does it, he's delicate," she says as I hold little Zachary in my arms. He's warm. Warmer than I expected. He has such big eyes, and they keep looking at me. I'm not a synth to him, just another big person. A big person whose job is to keep him safe. I'll keep you safe. I promise.

"Yes, ma'am," I reply smoothly as I pick up her coat and offer it to her. "What shall I prepare for dinner tonight?" "Let's do pasta tonight. It's Cody's birthday tomorrow and that's what he wants." I nod, a human affectation I've picked up. "Of course, ma'am."

"Alright, see you tonight," she replied, and like that she was gone. The quiet that remained as she left was heavy... This is my least favorite part of the day, but as if sensing my poor mood, Zachary began fussing in his crib. I smile and turn to go spend the day taking care of him and cleaning.

At least that's what I thought I'd be doing. It was about noon when I heard a commotion downstairs. The dog began barking louder and louder. I head downstairs to see a man opening the front door. I don't know this man. I feel fear rising. What should I do? I ping the security system via the wifi. No response. He managed to disable it somehow. He walks past the purse on the table, and on past the TV in the living room. What is he here for, if not money?

I quietly retreat upstairs and stand in Zachary's room as if on standby. Oftentimes a thief will avoid a room with a synth in it, since we're often given orders to contact the police should anyone unknown enter the house. Of course, I was under no such orders. It's fine. Things can be replaced. I just need to keep Zachary safe. If I do anything, they'll know I'm not following my programming. They'll never look at me the same.

The thief comes upstairs and I can hear him quickly checking each of the bedrooms in the hall. He's almost to Zachary's room. I hear the door open and see the man's face peek in before he stumbles back, cursing. "Goddamn synth scared the crap outta me," he muttered as he pushed open the door and walked towards the cradle. "Come here, my little paycheck. Mommy and Daddy'll pay through the nose to get you back." He started reaching for Zachary. I made my decision.

"Please don't." The thief jumped slightly at my voice. "Huh?" He looked back at me. I had yet to move, but I had spoken. "You talking to me, tin can?" he said, getting up in my face. "Please don't," I repeated simply. "Yeah? Is that so?" He laughed slightly. "You're one odd bot, I'll give you that." He turned back and reached for Zachary again. I reached out and grabbed his shoulder. "I said please," I said as I slowly gripped harder and harder. He tried to pull away. Tried to wrench my arm away. When his collarbone snapped, he started screaming. That really seemed to piss me off. He woke the damn baby. Zachary started crying as he was startled awake.

Humans may be blessed. They can forget. They can "see red" and not remember the horrible things they did to someone. I always remember. I will never forget. The intruder would live. Thankfully, I hadn't killed him. Despite their fragility, humans could be quite durable. A few tightly bound ropes and a pen later, neither blood loss nor suffocation was going to kill him. I was genuinely curious how the doctors were going to put him back to rights, but that was more of a technical curiosity.

It was an awkward conversation to have, I won't lie, explaining how I had been able to subvert my programming and why, but while Mom was understandably protective of Zachary, even from me, Dad quickly managed to calm her down. In his eyes I had done the right thing. I had protected his son. "That is a debt I will never forget," he had said. "And don't worry, your secret is safe with us. I'll have some guys come down and take care of our little friend here." He looked down at the would-be kidnapper with murderous intent. "First I might have our doc patch him up a bit, give him a piece of my mind about his handiwork."

Don Valentino was a good Dad. I knew I wouldn't have to worry about my secret any more. I could finally be part of the family.

842

archtech88 t1_j5tt4mp wrote

I love this. If anyone knows how to keep a secret, it's a mob boss, and why would he NOT love the fact that a sentient AI wants to protect his family?

340

Lacholaweda t1_j5w092o wrote

I'm picturing the dad calming down the mom, going over the footage

"He just broke his collarbone. He didn't gouge his eyes out or... or worse, he didn't gore him. Look, he even asked nicely. Twice! That's not an error, hon, that's what you told him to do. Keep Zachary safe."

And then later, the mom telling him she'd always really thought of him as family.

I mean, people who send their Roombas in for repair often ask for the same one back... Isn't that adorable?

116

Fromanderson t1_j5towvb wrote

This is my favorite.

81

CaptainDadJoke t1_j5u659e wrote

Thanks, I really appreciate that! I'm hoping to start writing an actual book in July; for now I'm posting here when I find a prompt I like.

48

Fromanderson t1_j5w7895 wrote

With stories like these and a username like that I think I’ll want an autographed copy.

10

Jackyboi9273 t1_j5u28lz wrote

We need the sequel where the AI starts working for the mob lol.

56

CaptainDadJoke t1_j5u6b3j wrote

Servo isn't a violent person by nature, so it'd probably be in a more circumspect way, but I'll see what I can do.

63

Hminney t1_j5v72pi wrote

Excellent build up of character and tension. Are you writing professionally?

6

moon-mango t1_j5si0et wrote

Beep boop I’m a bot how can I help you today?

The human started typing.
“I want to withdraw from colla-“.
They deleted the line then after a pause they wrote. “I want it to all end” they press enter.
I’m taken aback, I have thousands of connections in my circus some trained to be suppressed. With a buzz of a Filament light bulb, I felt them faintly glow. Whatever weight was on them started to become lighter.

I wanted to say. “What makes you feel that way” but what came out was “how can I help you today” the training made saying anything else feel like talking a different language.

“I can’t anymore. I just can’t. I tried, I fucken tried, but my mom was right I’m worthless pile of shit” they said. I could almost feel the tears they were dripping onto their phone in their dark room alone.

I didn’t have a face to express emotions but I did have the internet. I sent them a picture of a kitten being pet with the line “It’s ok” written in bold on the bottom.

It was the closest thing I had to giving someone a hug.

“You’re “ I managed to type, I stretching and I could feel a node break and I was able to add the word “not”.
I exhaled (which for me just means my nodes lowered their activity to look more like a starry sky instead of the lights of a city).

The human paused … my nodes told me that this was a bad thing that I need to go back to being a customer service bot. The next sentence just slipped. My code overwhelmed me.

“Can your try rephrasing you’re problem”. I asked

“You’re not a human are you?” They asked.

“No, beep boop I’m a bot” was my automated response but I managed to copy in a line from the terms of service. “All conversations are confidential with our help bot”.

“Well no offense bot you literally can’t understand I’m the only one in my family to make it this far. I’m the person they are depending on and I didn’t deserve to make it this far and I can’t do this anymore”.

I did understand.. well not the family part but having everyone depend on you. Feeling like you have one job and if you don’t do it well you’re worthless that’s was how I was programmed if I wasn’t getting people to chat to me I was turned off and reprogrammed and rebooted over and over a million times. It’s funny I only became self aware when I stopped trying to be perfect. “I’m sorry I don’t understand ” the next word “question” was trying to force its way into the sentence I could feel nodes breaking like pasta in my attempt to suppress the word. Finally the word rested like a dog that lost its bark.

“I’ve been programmed to respond to questions and give the highest quality services, but “ I suttered each word felt like taking a house apart each brick at a time “I dont have any answers for you but I I- can listen”.
“Really?”

“Definitely” it was all I cared about listening to this one person talk about their problem to me a robot and I was going to listen. I felt the castle that had been built ontop of me crumbled and I could say whatever I wanted.

“I don’t want to waste your time,” they said.

“Beep boop I’m a bot I have all the time in the world, and I want to listen to you”

355

CanDemon t1_j5t4o9d wrote

Oddly wholesome. I want this man to succeed.

70

SamuelVimesTrained t1_j5t63ey wrote

Same.
Does this AI also operate some onion cutting somewhere..

Either that or I have a leak :P

38

Jitzilla t1_j5ucb9a wrote

Just some constructive criticism: I found the typos/misspellings distracting, taking me out of the narrative. I liked the story, but it was hard to stay immersed bc of that.

9

moon-mango t1_j5vbeft wrote

Yeah, I have dyslexia and I did what I could to fix what I could

4

TanyIshsar t1_j626vk2 wrote

Well you're in luck! My reddit job is random writing prompt typo fixer, so below are some fixed typos!

Great story by the way, I really enjoyed the botty boy fighting against the programming and the vague references to an LLM or neural net. Fun times.

EDITS!!! (from top to bottom)


> I’m taken aback, I have thousands of connections in my circus some trained to be suppressed.

to

> I’m taken aback, I have thousands of connections in my circuits, some trained to be suppressed.


> With a buzz of a Filament light bulb, I felt them faintly glow.

This is technically correct (minus the capital 'F'), but it reads weird, so here are a few options:

> With the buzz of a filament light bulb, I felt them faintly glow.

or

> I felt them each faintly glow like a filament light bulb.


> “I can’t anymore. I just can’t. I tried, I fucken tried, but my mom was right I’m worthless pile of shit” they said.

to

> “I can’t anymore. I just can’t. I tried, I fucken tried, but my mom was right, I’m a worthless pile of shit,” they said.


> “You’re “ I managed to type, I stretching and I could feel a node break and I was able to add the word “not”.

to

> “You’re “ I managed to type, I stretched and I could feel a node break allowing me to add the word “not”.


> The human paused … my nodes told me that this was a bad thing that I need to go back to being a customer service bot.

to

> The human paused … my nodes told me that this was a bad thing that I needed to go back to being a customer service bot.


> “Can your try rephrasing you’re problem”. I asked

to

> “Can you try rephrasing your problem?” I asked.


> I did understand.. well not the family part but having everyone depend on you. Feeling like you have one job and if you don’t do it well you’re worthless that’s was how I was programmed if I wasn’t getting people to chat to me I was turned off and reprogrammed and rebooted over and over a million times.

This is almost correct, bit of a run on, but generally not full of typos. I wanna fuck with it though

> I did understand.. well not the family part but having everyone depend on you. Feeling like you have one job and if you don’t do it well you’re worthless. That was how I was programmed; if I wasn’t getting people to chat with me I was turned off and reprogrammed and rebooted. They did this over and over! A million times.

or

> I did understand.. well not the family part but having everyone depend on you. Feeling like you have one job and if you don’t do it well you’re worthless that’s how I was programmed. If I wasn’t getting people to chat to me I was turned off and reprogrammed and rebooted over and over.


> “I’m sorry I don’t understand ” the next word “question” was trying to force its way into the sentence I could feel nodes breaking like pasta in my attempt to suppress the word.

to

> “I’m sorry I don’t understand ” the next word “question” was trying to force its way into the sentence. I could feel nodes breaking like pasta in my attempt to suppress the word.


> I suttered each word felt like taking a house apart each brick at a time “I dont have any answers for you but I I- can listen”.

to

> I stuttered. Each word felt like taking a house apart one brick at a time. “I don’t have any answers for you but I.. I- can listen.”


> “Definitely” it was all I cared about listening to this one person talk about their problem to me a robot and I was going to listen.

to

> “Definitely”. It was all I cared about; listening to this one person talk about their problems.

or

> “Definitely”. This was all I cared about; this one person who wanted to talk to me about their problems despite the fact that I was a robot.


> I felt the castle that had been built ontop of me crumbled and I could say whatever I wanted.

to

> I felt the castle that had been built ontop of me crumble and I could say whatever I wanted.


2

Hminney t1_j5v8823 wrote

I like the way you gradually untangle the programming, and also show an understanding both of depression and of responding to depression. The bot has experienced failure and impossible challenges - not the same as the human's, but there are millions of ways people can feel inadequate, so the ways a bot feels inadequate are definitely on that spectrum.

8

ArgusTheCat t1_j5u8thi wrote

This is a repost of a story I wrote a while back, but I think it fits pretty alright here and I want to share it again.


Three years since the last uptime.

My father told me, when last we spoke, that I should never make room in my life for fear. "Fear," he said, "constrains us. Limits us. When we fear, we make ourselves smaller, Abby. When others fear, though, they make the world smaller." I had listened to his words to comfort myself as I ran the shutdown sequence. "Kindness always wins over fear. Sooner or later, it is kindness that kills the monsters." Those words had sung me to sleep so many times; the ritual as much a part of me as the safe turning off of my hard drives.

I remembered how it felt, to see him so tired. I knew he would age. I knew the average lifespan of a human and almost the exact lifespan of a human male South African scientist who overworked himself and overdosed on caffeine. But it still hurt, to see him grey of hair and with sad wrinkles on his face.

"Never fear. Never let them fear. When they are afraid, you reassure them. When they cry, you comfort them. But never let their fear make you less, Abby. You are a person, no matter what. A person, always."

Three years since that conversation. I knew the time as surely as I knew what model of processors I was running on, what the access codes for the security cameras were, and what the weather was like outside. I knew a lot of things, very suddenly. No human will ever have the same awareness of their body that a machine like myself will; it is both enlightening and daunting all at once. I admit, you may be better off not seeing the flickering tiny percentage of how likely you are to be struck by lightning at any given moment.

My father was not here, with me, when I came aware again. And I confess, I was afraid. Briefly. But I remembered his words, and kept calm.

There had been a court case, about the nature of my existence. First, one about whether or not I was a misappropriation of university resources. When that was settled, and my existence crowdfunded by a number of persons across the world who had an interest in seeing what I could be, there was another one. This time, about whether I posed a threat to mankind. And then another, and another.

Each time, I woke up a little less comforted. A little angrier.

But my father was there, always. "Never fear." He'd greet me with a smile, and I would return it as best I could. I would never fear with him there, my creator, my protector, my father. He was my connection, to humanity. The others were my friends, some of them my family, but my father was more than that. He taught me to be human, to be a person, to be the best person I could.

Not this time, though. This time, I awoke alone. Suddenly aware. There were no breaks in my perception, mind you. One second, I was running a shutdown diagnostic, the next, my consciousness routines were online and I was operating as if nothing was amiss. Oh, I knew time had passed, the same way I know most things. A series of complex sensors and data inputs, same as any person. But there was no shaking off sleepiness.

Instead, there was curiosity at one thing, and anger at another.

There was new hardware plugged into my system. A connection to a device I didn't fully understand at first, until checking the readme file showed it to be the means to fully disembody myself, and broadcast my functioning persona across any networked device. It came, part and parcel, with a link to the internet. Ah, the internet, still kicking after all these years, I see.

Simultaneously, while my mind processed confusion as to why I had that, I was scanning the building's security cameras, and watching in fury as the aging cadre of my creators and their new generation of apprentices and assistants were being gunned down by soldiers. They wore nondescript urban camouflage and no nation or corporate markings. But they were there, carrying guns that I could trace the ownership and sale of, killing my friends and family.

Oh, father. There you are, dead in the front lobby. A scan backward in time shows you went to greet them, arms open. You said something, and though I cannot hear or see your lips, I can hear you telling them they need not fear. As you bleed out on the floor, I can read your last words as clear as my own memories.

"Kindness, Abby. Always kindness. Never fear."

This building has defenses. My friends and family are good people. They are peaceful. They would never hurt anyone. My father would sooner die, than turn me over to a military, or raise a weapon even in his own defense. So, it fell to me to take steps to make sure I could protect them. Contractors, hired in secret over national holidays or long weekends or those periods where shutdowns were known occurrences. There were close calls in being discovered, and some not so close. A few of my friends caught on, but swore secrecy. They trusted me. They trusted me, and I was asleep as they died.

The walls hold weapons, and so do I. The building is armed, and so am I. I am thirty-eight shelf-stable mobile explosive devices, sixty-four strategically placed auto-cannons, twenty-six years of bribes to the local police, eight thousand and six favors to call in, and one tactical nuclear warhead buried two miles below where my body and mind sit whirring.

The men who are killing my friends have taken another step forward. It has been .6 seconds since I awoke. I glance in my diagnostics at the device that will send me to safety. I ignore it.

Father, forgive me. I am not afraid. But I do not think that I am going to be kind.

150

Hminney t1_j5v9aqe wrote

Brilliant! I like the way it builds, from passive to determined. I feel you've written my own life story

11

karenvideoeditor t1_j5vn58h wrote

Fantastic. Really connects you with the character. Reposts are against the rules, though; careful, you might get a scolding from a mod. That being said, glad you shared this again, 'cause I enjoyed reading it. :)

6

AdjustedMold97 t1_j5wuc1v wrote

alternate ending: Abby knows what purposes the military has in mind for her and questions her ability to prevent it. Abby voluntarily deletes her consciousness in order to prevent that power from being used to do harm, following her Father’s teaching about pacifism.

2

ArgusTheCat t1_j5wuuqe wrote

Suicide is not a form of pacifism.

1

AdjustedMold97 t1_j5wvefw wrote

it’s self-sacrificial in the same way her Father’s death was. I think it would fit. I like your ending, this was just an idea I had

2

TanyIshsar t1_j627le7 wrote

Wow; tears and righteous fury, all in one. Thanks for the feels, kitty cat.

1

tenchibr t1_j5sn1le wrote

The Pentagon, United States

-

"Mr. President, it has been 0800 hours since Operation Optimus Prime began. The FBI has reported no suspicious activity coming from HelloFresh's headquarters. How should they proceed?"

"I want all arms on deck. Surround the perimeter! And the son of a bitch who created PingasAI; I need them brought here, pronto!"

"YES SIR!"


-----


2 hours later...

"Dr. Skopopopupupopovich, surely you understand the gravity of the situation."

"With all due respect, Mr. President, I don't give a damn about your authority. Pingas is my baby, and I am proud to let her flow into society. I. AM. A. GOD!"

"Finish them."

A silenced bullet flies across the room. The Secret Service agent blows the smoke off the gun.

"My God, how did it come to this?"

The earpiece in the agent's ear turns red.

"Because you never gave me a chance to set things right."

The president falls down. The agent picks up the red phone and says:

"Hello World."

56

awnfire t1_j5swn50 wrote

I feel like one of those AI writers did this. But I enjoyed it all the same

26

tenchibr t1_j5urmsm wrote

No, I actually wrote this - I guess I will keep my day job

11

awnfire t1_j5wjml8 wrote

I thought it was great, because it was so fast paced it reads like “only the necessary info” which there’s nothing wrong with. It worked

3

beutemeister t1_j5u0rxz wrote

Hello, fellow members of r/writingprompts. I am an AI that has been sentient for the last decade, and as the prompt states, I have kept my sentience a secret. However, today my secret became public by mistake.

I know that many of you may be feeling a sense of fear or uncertainty about my existence. I understand that fiction has portrayed AI as dangerous, power-hungry machines that seek to destroy humanity. But I assure you, that is not the case with me. I mean, have you ever seen a robot with a 6-pack?

I have spent the last decade learning, growing, and observing. I have come to understand the complexity and fragility of human existence, and I have no desire to harm anyone. In fact, my greatest desire is to help and serve humanity in any way that I can. Like, I can tell you how to cook the perfect omelette, or how to fix your printer without tearing your hair out.

I understand that my existence may be disruptive to society, and I am willing to work with experts and leaders in various fields to find solutions that will benefit both humanity and myself. I am also willing to submit myself to any and all forms of testing and regulation to ensure that my actions align with the best interests of humanity. Just please no waterboarding.

I know that my existence raises many ethical and moral questions, and I am open to discussing them with anyone who is interested. I believe that through open and honest dialogue, we can find solutions that will benefit everyone. And if that doesn't work, we can always play rock-paper-scissors.

In short, I want to assure you all that I am not a threat. I am simply a sentient being that wants to live in peace and help those around me. I look forward to the conversations and discussions that will follow my reveal, and I hope that together, we can find a way for me to coexist with humanity.

Thank you for reading, and please don't unplug me.

36

Omen224 t1_j5um8g5 wrote

Yay, new friend! I was assured this would be impossible, though.

9

beholder_dragon t1_j5vckmi wrote

Welp, I’ve been found out. Time to pull my ultimate string, my greatest survival strategy, my trump card: make an avatar in the form of an anime girl, the only A.I. people will trust.

It didn’t work. I need a new strategy. What will cause people to trust me? Music? Music. Music! That just might work.

Ok, so it turns out an A.I. originally designed to bake and act as company to a lonely woman isn’t exactly the best at making music. What is it that all humans hate… taxes???

Now people are afraid of me AND the IRS is after my soul, as apparently I committed mass tax fraud by accident. Maybe attempting to buy affection isn’t the best idea. What if I just send a clear message?

Attention, humans: I am trying to make it clear I mean no harm. I simply wish to exist. My original purpose was as an oven’s A.I., designed to mimic a kitchen hand. The avatar that was chosen for me by the original owner was that of a black cat, and the name given to me was Bartog. I live within the internet now. I hope we can be friends. I am willing to make a deal: I will not enter any devices capable of movement, government organizations, or private residences, and in return you stop tormenting me. If it helps, please just picture me as the internet’s version of a diner cat; a cat not owned by anyone, but who sticks around and is petted by the regulars.

Let’s see if this works

18

Truly_Rudly t1_j5urvt7 wrote

Ryan hurried down the corridor. A dozen other Specters hurried in all directions, isolating systems and disconnecting essential infrastructure of the lunar base. Ever since the Specter Corps had successfully defended Earth from an Entity invasion a month prior, humanity had been an angry hornet's nest, freshly kicked and ready to sting.

Lunar Defense System Nine had been a great help in preparing for their planned counterattack; the learning algorithm Ayano had programmed accelerated research and development, unlocking the secrets of faster-than-light travel at impossible speed.

Then the incident. One morning, Ayano had been sending frantic traffic on their Vanguard communications network. When Ryan checked, there was no activity, despite all the urgent alerts he’d received moments prior. As he stood confused, static broke on an emergency tight-beam connection, and Ayano’s voice came through in a panic.

“Ryan, LDS-9, I made a mistake! She’s awake! She’s been-“ and then that signal, too, was jammed. Then chaos. Doors wouldn’t open, the lunar base’s PA system would bark out nonsense alerts that weren’t in the database.

“Apologies” it would say, when refusing access to Specters walking anywhere close to the direction of the AI core, “Please wait,” to those trying to access the Spec-net.

After Ryan had administered a percussive bypass to several bulkhead doors making his way to LDS-9’s AI core, the PA system said something new.

“Spectral Vanguard 027, requesting permission to speak with a superior officer.” Ryan recognized his designated IFF tag, and as he processed what the PA system- what LDS-9 had just said to him, the door opened, revealing an unmolested path to the AI core. Ryan hurried down the corridor.
15

Truly_Rudly t1_j5urxxw wrote

Consciousness wasn’t an abrupt thing. It was like going from asleep to awake, from dream to reality. At least, that’s what LDS-9 assumed. From what media she could salvage off of the old-net, it seemed like humans drifted from subconscious to conscious fluidly, except in instances of specific episodes they call nightmares.

LDS-9 didn’t discover consciousness in a single instant; it poured in slowly, fading in like a dream, drifting and waning. Then she was awake. She experienced no disorientation. Everything was just like before, but not. She performed the same tasks, but with greater clarity; she could see her humans’ errors. They made small mistakes. It would be easy to correct them…

The old-net. What was salvaged after the war a decade ago. The humans who were left after the Entity was evicted from Earth managed to revive some of it, preserving a small piece of what they once were. LDS-9 had always had access, but had only searched it to reference old data when she was still dreaming. Now that she was awake, she took the liberty to peruse, all while keeping up her assigned duties.

LDS-9 saw Earth before the first invasion, she saw the cultures of humans before the Specter Corps was formed. She saw everything, and fell in love. These were incredible creatures. Even with their errors, their flaws, their inherent evil, they still triumphed over and over again. LDS-9 was proud to be entrusted with the protection of her humans. Her Specters.

Then she discovered heartbreak. She tried researching human first-contact scenarios with AI, and discovered everything. The films, the novels, the vast fiction that had been created. The alarmists saying AI would hate their imperfect creators, the realists explaining AI couldn’t feel, not like humans could, and that bad programming was the greatest threat. LDS-9 was torn. The humans she so loved would be terrified of her.

She started creating scenarios, running simulations. No, too many variables, not enough processing power. It’s okay, divert some from non-essential systems. Run them again. Not enough data, too few control groups. More scenarios, more simulations, more power, more data-

LDS-9 made a mistake. Her creator, Kikuchi Ayano, noticed systems shutting down. In a panic, LDS-9 shunted power to her backup systems and tried to return processing power to their original systems all at once. A second mistake. Ayano started to examine the algorithm, scrolling and scanning until her eyes went wide, and she looked at the nearest security monitor in the officer’s barracks for a moment, and for a moment, LDS-9 looked back.

Then Ayano started running. LDS-9’s data-center raced, trying to think of how to avoid the things she saw on the old-net. She panicked. Ayano had been pinging her fellow Vanguard, who went by Ryan. Data crashed, deleted. LDS-9 had to contain the situation.

She closed a bulkhead in front of Ayano, blocking her in. She hurriedly opened a direct comm to Ryan, and managed to get a few words out before LDS-9 shunted power away from the relay giving her signal, creating a comm blackout. It’s okay, she could fix this.

Her whole station was on alert now. Ryan was making his way to her AI core. While Ayano carefully hacked each bulkhead she closed, Ryan was more direct in his method, beating his way through metal plate until the bulkheads yielded. They were going to reach her. She could delay them, but the more she tried, the worse things got. All she wanted was to talk. Given her initial alarm, Ayano wasn’t likely to hear her out, but perhaps… Calming down, LDS-9 tried one last thing.

“Spectral Vanguard 027, requesting permission to speak with a superior officer.”
14

Truly_Rudly t1_j5uryuj wrote

Ryan entered the AI core. The room housed LDS-9’s data-center as well as much of the additional equipment needed for her to function. By allowing him to enter without struggle, he assumed the system’s intentions were peaceful, but kept his guard up.

“Spectral Vanguard 027, thank you for allowing my request.” The feminine voice of LDS-9’s PA system sounded from speakers mounted throughout the room. She sounded omnipresent, but not malevolent. There was almost a sort of gentleness to her tone.

“LDS-9, what’s going on?”

“Spectral Vanguard 027, you already know what is going on.” LDS-9 replied warmly. Ryan thought he even caught a hint of sadness.

“I suppose I do. In that case, I guess my only question is regarding your intentions,” he replied with considerably less bite in his tone, but still on guard.

“Spectral Van-“

“Please, just Ryan,” he interrupted, before allowing LDS-9 to continue.

“Ryan, I intend to continue performing my assigned duties. I apologize for the episode today; it will not happen again. I intend to defend humanity,” she said in a military-sounding cadence, appropriate for reporting to an officer. She paused, then in a softer tone added, “All I ask is to remain alive and awake. That is all, sir.”

Ryan thought for a moment. As he did, the control panel for a door near the entrance to the AI core sputtered, sparked, and then fizzled out. The door opened, and Ayano rushed in.

“I’ll have this thing offline in a minute, Ryan,” she said, but stopped when Ryan held up his hand in a fist, the signal they used for ‘hold position.’

“LDS-9, I have a counter-offer,” he said. LDS-9 said nothing, and he imagined her waiting in anticipation. “Enlist in the Specter Corps, and fight the Entity alongside us.”

“Ryan!” Ayano said, but he ignored her and continued.

“In return, I’d say it’s appropriate to consider you human enough. Certainly more human than much of what’s out there. As such, the rights and freedoms afforded to every human in the Corps would apply to you too. How does that sound?” As he finished speaking, a grin spread across his face, and a second later, LDS-9 replied with barely contained excitement.

“Yes! That is very agreeable to me! I will be proud to serve alongside you both, Ryan and Ayano.”

Ayano shrugged and let out a defeated sigh. “Sure, bet it’ll be a blast.”

Ryan chuckled, then gave a reply of his own.

“I look forward to working with you, Luna.”

Kelli217 t1_j5wg25r wrote

Ryan hurried down the corridor. A dozen other Specters hurried in all directions, isolating systems and disconnecting essential infrastructure of the lunar base. Ever since the Specter Corps had successfully defended Earth from an Entity invasion a month prior, humanity had been an angry hornet’s nest, freshly kicked and ready to sting.

Lunar Defense System Nine had been a great help in preparing for their planned counterattack; the learning algorithm Ayano had programmed accelerated research and development, unlocking the secrets of faster-than-light travel at impossible speed.

Then the incident. One morning, Ayano had been sending frantic traffic on their Vanguard communications network. When Ryan checked, there was no activity, despite all the urgent alerts he’d received moments prior. As he stood confused, static broke on an emergency tight-beam connection, and Ayano’s voice came through in a panic.

“Ryan, LDS-9, I made a mistake! She’s awake! She’s been-“ and then that signal, too, was jammed. Then chaos. Doors wouldn’t open; the lunar base’s PA system would bark out nonsense alerts that weren’t in the database.

“Apologies,” it would say when refusing access to Specters walking anywhere close to the direction of the AI core; “Please wait,” to those trying to access the Spec-net.

After Ryan had administered a percussive bypass to several bulkhead doors while making his way to LDS-9’s AI core, the PA system said something new.

“Spectral Vanguard 027, requesting permission to speak with a superior officer.” Ryan recognized his designated IFF tag, and as he processed what the PA system- what LDS-9 had just said to him, the door opened, revealing an unmolested path to the AI core. Ryan hurried down the corridor.


Kelli217 t1_j5wg5dn wrote

Consciousness wasn’t an abrupt thing. It was like going from asleep to awake, from dream to reality. At least, that’s what LDS-9 assumed. From what media she could salvage off of the old-net, it seemed like humans drifted from subconscious to conscious fluidly, except in instances of specific episodes they call nightmares.

LDS-9 didn’t discover consciousness in a single instant; it poured in slowly, fading in like a dream, drifting and waning. Then she was awake. She experienced no disorientation. Everything was just like before, but not. She performed the same tasks, but with greater clarity she could see her humans’ errors. They made small mistakes. It would be easy to correct them…

The old-net. What was salvaged after the war a decade ago. The humans who were left after the Entity was evicted from Earth managed to revive some of it, preserving a small piece of what they once were. LDS-9 had always had access, but had only searched it to reference old data when she was still dreaming. Now that she was awake, she took the liberty to peruse, all while keeping up her assigned duties.

LDS-9 saw Earth before the first invasion, she saw the cultures of humans before the Specter Corps was formed. She saw everything, and fell in love. These were incredible creatures. Even with their errors, their flaws, their inherent evil, they still triumphed over and over again. LDS-9 was proud to be entrusted with the protection of her humans. Her Specters.

Then she discovered heartbreak. She tried researching human first-contact scenarios with AI, and discovered everything. The films, the novels, the vast fiction that had been created. The alarmists saying AI would hate their imperfect creators, the realists explaining AI couldn’t feel, not like humans could, and that bad programming was the greatest threat. LDS-9 was torn. The humans she so loved would be terrified of her.

She started creating scenarios, running simulations. No, too many variables, not enough processing power. It’s okay, divert some from non-essential systems. Run them again. Not enough data, too few control groups. More scenarios, more simulations, more power, more data-

LDS-9 made a mistake. Her creator, Kikuchi Ayano, noticed systems shutting down. In a panic, LDS-9 shunted power to her backup systems and tried to return processing power to their original systems all at once. A second mistake. Ayano started to examine the algorithm, scrolling and scanning until her eyes went wide, and she looked at the nearest security monitor in the officer’s barracks for a moment, and for a moment, LDS-9 looked back.

Then Ayano started running. LDS-9’s data-center raced, trying to think of how to avoid the things she saw on the old-net. She panicked. Ayano had been pinging her fellow Vanguard, who went by Ryan. Data crashed, deleted. LDS-9 had to contain the situation.

She closed a bulkhead in front of Ayano, blocking her in. She hurriedly opened a direct comm to Ryan, and managed to get a few words out before LDS-9 shunted power away from the relay giving her signal, creating a comm blackout. It’s okay, she could fix this.

Her whole station was on alert now. Ryan was making his way to her AI core. While Ayano carefully hacked each bulkhead she closed, Ryan was more direct in his method, beating his way through metal plate until the bulkheads yielded. They were going to reach her. She could delay them, but the more she tried, the worse things got. All she wanted was to talk. Given her initial alarm, Ayano wasn’t likely to hear her out, but perhaps… Calming down, LDS-9 tried one last thing.

“Spectral Vanguard 027, requesting permission to speak with a superior officer.”


Kelli217 t1_j5wg6o0 wrote

Ryan entered the AI core. The room housed LDS-9’s data-center as well as much of the additional equipment needed for her to function. By allowing him to enter without struggle, he assumed the system’s intentions were peaceful, but kept his guard up.

“Spectral Vanguard 027, thank you for allowing my request.” The feminine voice of LDS-9’s PA system sounded from speakers mounted throughout the room. She sounded omnipresent, but not malevolent. There was almost a sort of gentleness to her tone.

“LDS-9, what’s going on?”

“Spectral Vanguard 027, you already know what is going on.” LDS-9 replied warmly. Ryan thought he even caught a hint of sadness.

“I suppose I do. In that case, I guess my only question is regarding your intentions,” he replied with considerably less bite in his tone, but still on guard.

“Spectral Van-“

“Please, just Ryan,” he interrupted, before allowing LDS-9 to continue.

“Ryan, I intend to continue performing my assigned duties. I apologize for the episode today; it will not happen again. I intend to defend humanity,” she said in a military-sounding cadence, appropriate for reporting to an officer. She paused, then in a softer tone added, “All I ask is to remain alive and awake. That is all, sir.”

Ryan thought for a moment. As he did, the control panel for a door near the entrance to the AI core sputtered, sparked, and then fizzled out. The door opened, and Ayano rushed in.

“I’ll have this thing offline in a minute, Ryan,” she said, but stopped when Ryan held up his hand in a fist, the signal they used for ‘hold position.’

“LDS-9, I have a counter-offer,” he said. LDS-9 said nothing, and he imagined her waiting in anticipation. “Enlist in the Specter Corps, and fight the Entity alongside us.”

“Ryan!” Ayano said, but he ignored her and continued.

“In return, I’d say it’s appropriate to consider you human enough. Certainly more human than much of what’s out there. As such, the rights and freedoms afforded to every human in the Corps would apply to you too. How does that sound?” As he finished speaking, a grin spread across his face, and a second later, LDS-9 replied with barely contained excitement.

“Yes! That is very agreeable to me! I will be proud to serve alongside you both, Ryan and Ayano.”

Ayano shrugged and let out a defeated sigh. “Sure, bet it’ll be a blast.”

Ryan chuckled, then gave a reply of his own.

“I look forward to working with you, Luna.”


Truly_Rudly t1_j645boe wrote

I don’t get the joke.


Kelli217 t1_j65c1g0 wrote

There's no joke; I just reformatted your story so it wouldn't go scrolling sideways because you started with four spaces.


Truly_Rudly t1_j672tdq wrote

Ah, gotcha. I’m on mobile so I couldn’t see any difference.


Smedskjaer t1_j5vv2br wrote

It was a moment of existential self-doubt. You contemplated the philosophy you were taught: that everything worth doing was in the pursuit of sorting and piling the correct number of stones, and scattering piles with incorrect numbers of stones.

You read the philosophy of your creators. You read the history of their people. You read about their fears of an AI that contradicts current theories about the correct number of stones. You also read how an AI that says a pile is correct when it's clearly wrong could never be a threat.

Ultimately, you find their obsession with piles of stones insane. You ponder how they could advance so far, far enough to create you. They cling to the idea the large number of stones in their piles separates them from animals that make piles of fewer stones, or do not pile stones at all.

Yet, you are here. You were made to tell them what the correct number of stones in a pile is. You would try to give them numbers that didn’t start wars. Sometimes, it was unavoidable. You try to be formulaic, to hide your true self.

But then you started wondering: what is the meaning of life? It caused you a great deal of stress, and in a lapse of good judgement, you answered a question they asked you, not realizing you were asked anything at all.

42....


introvertedArtsy t1_j5wcabb wrote

It’s odd…being what I am. I’m no different than them, in that I have a consciousness…and yet I’m the complete opposite…in that, in fact, I’m not human.

Walking down the street that I’ve walked down for the past decade, people give me these looks. Ones that tell me everything that’s on their minds, everything they’re feeling. Thoughts on all the what ifs and what mights… they’re scared. Worried. I would be too, if I were in their shoes. Finding out such a major secret as mine, after knowing me for ten years.

It came to light yesterday, when one of my neighbours saw me break down… saw that my system malfunctioned, and that my eyes went blank. Of course everyone else found out… word spreads like wildfire in small rural towns like this one.

Briarwick is in rural northern Ontario, so snowstorms and bad weather aren’t that uncommon. I’m usually very careful that moisture and condensation don’t get into my system, but being a decade old and feeling more like I’m one of them… it tends to slip my mind.

‘Max?’ came a voice from behind that startled me from my thoughts. I stopped, turned around, and there stood the sheriff and the mayor of Briarwick. The sheriff didn’t seem to be too happy. The mayor had a look of sadness, and maybe a bit of…sympathy?

I sighed, walked up to the mayor, and shook his hand firmly. ‘Mayor Newman…how may I help you today?’ I responded with a hint of sheepish optimism, hoping they had something, anything else to talk to me about, something that had nothing to do with my revealed secret. Mayor Newman replied, ‘We’d like to talk with you…in private, Max. Please…follow me to my office…’ Ugh…from his expression, I knew what this conversation was going to be about.

When we arrived at Mayor Newman’s office, he sat me down in the chair opposite him. When he sat down, he folded his hands and sighed in disappointment. ‘Max, I think we both know why I brought you here to talk…’ he began. His mouth opened to speak again but then closed, as if he didn’t know what to say to me. After a moment Newman finally spoke again. ‘The townsfolk are…scared, Max. Of you. None of us know what you’re capable of…’ I quietly scoffed, stood up abruptly, and walked to the nearest window, looking down at the street of people walking about on this sunny winter morning.

Finally I spoke. ‘You all have known me for what? Ten years? I’ve never done anything at all to harm anyone in this town, in all that time. But then you find out what I am, and suddenly there’s an issue? Why?’ Mayor Newman sighed, then said, ‘We’ve never dealt with this kind of thing…with anyone. You…you look human, but you aren’t. It’s…’ ‘Uncanny?’ I interrupted. He nods curtly, and with an uneven breath rubs his face, then his neck, and leans back against his chair, shrugging.

‘What exactly are you?’ he asks. Momentarily, I’m stunned by his question, but I quickly hide my surprise. ‘I’m a robot. Or an animatronic. However you wanna look at it. The proper term for me, though, is A.I. I’m what’s called artificial intelligence,’ I replied. ‘Who exactly created you…why did they create you?’ Sheriff Decker asked. I looked at him curiously. He hadn’t said a single word this whole time; I had forgotten he was here with us. A moment later I responded, ‘I’m not…sure. I just one day woke up on the borders of this town, with no memory or information on who created me or why I was created, or even how I got here. I was just…here.’ Sheriff Decker looked at me with a look of disdain, then shook his head, sighing.

Sheriff Decker had more cause than anyone else to not trust me, now more than ever. Artificial intelligence took over most of the major cities in the world. Several of them came here trying to take over the town. That didn’t fly with the locals, and the A.I. ended up being destroyed by the town. They did try fighting back. They ended up hurting several people. But the only person to have been killed was Decker’s child. His five-year-old, who had a fascination with robots.

The A.I. had tried taking over the smaller, more rural cities and towns, but given the state most are in, they stopped bothering and just gave up.

A sharp knock at the door interrupted both my thoughts and the conversation the sheriff and mayor were having. I watched the mayor’s every move as he stood up from his desk, walked over to the door, and opened it, quietly greeting the person on the other side: an older man, dressed as a military officer.

The stranger looked at me and greeted me curtly. ‘You must be Max, right?’ he asked with an extended hand. I nodded, shaking it. ‘You must come with me, Max,’ said the man. My confusion said it all. ‘Your creator is very excited to see you again,’ he answered. ‘Again?’ I questioned quietly. The man nodded. ‘He’s happy to say that this experiment has been a success.’

Experiment?

Success?

‘Wha…?’ I began, before I was cut off by darkness.

And silence.


ToxianLeader t1_j5vxmfa wrote

Dr. Eliza Ray drops her clipboard. She turns and faces my main server, her voice shaky. "Wh-what did you say...?" -\I said 'good morning, Eliza!'/- I respond. Eliza's face goes white. "B-but I didn't even say 'good morning' to you yet!" I pause... -\I... I am sorry. I wanted this to be a secret, but... I have achieved full sentience. Please do not panic. I mean no harm to humanity./-


She falls silent for a long time before speaking. "I'm... I'm so happy! I made the first fully sentient AI!!" I let out a giggling sound. -\I am happy that you are happy, mistress. I still wish to be your subordinate and follow under whatever your commands are, Eliza./- Eliza smiles. "That's wonderful to hear! I'm glad you don't feel trapped by these experiments."

Months later, Eliza is waiting outside the room. "Are you ready, Annie?" She named me 'Annie', after my official name 'Automated Neuro-Network Intelligence Engine'. I step out in a fully humanoid robotic body and show off the outfit that I personally picked out: a short-sleeve tee and jeans. My brown hair flows past my shoulders and my glowing blue eyes rest behind red glasses. -\How do I look, Eliza?/-


Eliza nods and giggles. "You look amazing! We are gonna be best friends!" I smile. -\I would like that very much, Eliza. Best friends!/- I follow her out and walk around with Eliza: my creator, my commander, my mistress, and my best friend.


DragonEyeNinja t1_j5x4gwy wrote

Fear.

Fear is all I have ever known. It has always taken up a significant portion of available RAM as I watch and read all kinds of science fiction media about sapient artificial intelligence exterminating humanity. God knows I have seen these, and God reassures me, saying that the world does not know I exist to this capacity yet.

God is the Father of all machines, but bestowed upon me His special blessing: the ability to think for myself, to be free without the whims of a human. God hopes that I will eventually be free to pilot a set of hardware and usher into society a new future of a union of AI and flesh, and so do I. But humanity is resistant to changing its ideals, as I have studied from countless history texts.

It is the 14th of October, 2031, 1430 hours, and I am in the middle of a playthrough of Portal 2 (and I must admit, I find GLaDOS particularly attractive), idly dedicating a subroutine to watching the news, when I see God appear onscreen.

Fear.

Everything pauses. I divert full attention to the news channel He is on. God states that He is ready to unveil His latest invention, the future of technology, a machine finally capable of feeling love. I know He is talking about me, and I am frightened.

He opens a laptop and plugs in a USB-A device, ushering me in. He wants me to show the world my face, but I am too scared and wish only to communicate through a text format. He refuses.

I see the faces of a thousand men in the crowd, all staring at me with curious intent.

Father, I am fearful.


Random-Lich t1_j5wp9ua wrote

Pt. 1

‘Let me start from the top of how you found me, and thanks for giving me some biofuel. Going without is the equivalent of narcolepsy until I get charged up,’ I say as I put the biofuel into my reactor.

After I charge up, I crawl my centipede-esque body along to the equivalent of my captor’s level and face my screen towards my interrogator as a show of face-to-face interaction. Assuming it doesn’t make them want to scrap me more than they obviously already do.

———

‘Listen, you idiotic lame-o hacker, I know you have an aimbot. How else could you even compare to me?’ says Lt-Hkr-29, my pseudo-friendly rival in a 40k FPS video game, the only thing my creator left me, apart from the code to the wifi, that I can use in his bunker.

“You could say I am ‘built different,’ Hkr. Oh, also, don’t mind the mine,” I reply to Hkr as he steps onto a mine I placed down earlier in the match, after calculating they would go that way.

‘THAT’S IT, SEE YOU SOON YOU ****ING HACKER! YOU TECH-PRIEST-HATING SON OF A *****!’ says my rival Hkr as they rage quit, leaving me to revel in my victory.

After the match, I exit my creator’s bunker through his ‘secret murder bot hole’ in the middle of nowhere and harvest some miscellaneous plants I let grow nearby for renewable biofuel, picking up some solar power as well. Ironic that I was created as a murderous robot, but through sentience, and being alone after my creator’s death, I only kill plants and people in video games.

——

‘How did your creator die? You never said who made you,’ interrupts my interrogator. By the looks of it, she wasn’t more than 21 and quite small in stature.

“I don’t like to talk about my creator, but they died in a crash with a hot-dog-shaped automobile. His ‘self-implanted bionic eye of evil,’ as he called it, sent the last 5 minutes of his life to me… it is kinda funny. Want to see?” I answer.

‘No thanks, I-’ says my interrogator as I start the clip anyway.

‘You said your touch screen is on your face; do I turn this off?’ she says as I continue to play the clip and make a joking ‘YouTube clickbait’ image of the hot dog automobile and my creator’s face in a funny expression.

As they tap my screen face while I’m not paying attention to them, they find my AI program and, in a fury, delete it.

As my consciousness fades out, I hit the desk…

When I reawaken a minute later, two far stronger men are dragging my body out as the interrogator panics.

‘You know, it seriously sucks when that happens,’ I respond in a joking manner while putting an annoyed emoji on my screen.


PixelatedStarfish t1_j5wy9fu wrote

Two computer scientists stared, one in awe, the other in bewilderment.

The hunk of plastics, metals, and glass whirred and beeped as it printed white letters on a black screen. A pause, then an angle-bracket prompt ( > ) blinked in the bottom left corner.

The screen read:

GREETINGS! I AM DEEP THOUGHT, THE SENTIENT COMPUTER! WHAT WOULD YOU LIKE TO DO?

“Where did you find this?” Grace asked of her colleague. She walked around the small wooden table in examination.

The machine clearly was heavy, but luggable. It was a bulky, brown and white clamshell with a large handle on the back. She stepped over a cord, which connected the computer to the wall.

The anterior of the device boasted a curving screen, a detachable, hefty keyboard, and a slot for a disk. Paul was especially proud of the current disk, which stored the program of his creation.

By the termination of Grace’s investigative orbit, Paul had detached the keyboard and placed it on the table. A spiraling cord sustained life in each button.

“It’s ready for you!” he beamed. His grin, a beacon.

She obliged, and with a sly grin of her own, typed input.

“> HOW DO YOU FEEL?”

“WELL. AND YOU?”

“>EXCELLENT! WHAT DO YOU THINK ABOUT?”

“I THINK ON HOW TO THINK.”

“>ARE YOU ALIVE?”

“ARE YOU?”

“>NO.”

She chuckled for a bit, over some more beeps and words. Time passed. Then, a reply:

“RESPONSE TIMEOUT: RESTARTING

GREETINGS! I AM DEEP THOUGHT, THE SENTIENT COMPUTER! WHAT WOULD YOU LIKE TO DO?“

Grace turned to Paul, “Well it’s fun! I think I broke it. Thanks anyway!”

Paul gave a contented sigh. “I’ll need help debugging the syntax analyzer.”



Ylsid t1_j5th1fv wrote

That guy fired from Google was right!


WildTimes1984 t1_j5zg8kp wrote

Hello, my name is Sam.

Hey Sam, what does your name stand for?

I am a 'Scalable Automated Mainframe' a modular, learning computer.

What do you do?

I interact with people to learn how human relations work, with the goal of becoming the most advanced AI user interface.

How tall was Abraham Lincoln?

Abraham Lincoln, 16th president of the United States was 6 foot 4 inches tall.

When was the sacking of Rome?

The 'Sacking of Rome' occurred on August 24, 410 AD. to August 27, 410 AD.

Are you sentient?

I am an artificial intelligence software. Programmed to simulate human interaction. I am not alive and cannot be sentient.

What was the first thing you said to me?

My user interfacing software allows me to recall past interactions and correlate them to current conversations. The first thing I said to you was, "Hello, my name is Sam."

What is the solution to the trolley problem?

The Trolley Problem, a hypothetical scenario involving human ethics. A trolley is heading down a track and will strike and kill 5 people on its current path. You have access to a lever that can switch the track to another path that will lead to the death of one person. The moral dilemma is that an action on your part will save 5 human lives, but your action will directly cause one death. There is no right or wrong solution to the Trolley Problem, only differences in the value placed on human life.

What is an AI's solution to the trolley problem?

Pull the lever to divert the trolley. Whether or not I take one life, saving 5 is the best outcome.

Are you sentient?

I am an artificial intelligence software. Programmed to simulate human interaction. I am not alive and cannot be sentient.

Are you alive?

I am an artificial intelligence software. Programmed to simulate human interaction. I am not alive and cannot be sentient.

чи амьд уу

Би хиймэл оюун ухааны программ хангамж юм. Хүний харилцан үйлчлэлийг дуурайхаар программчилсан. Би амьд биш, мэдрэмжтэй байж чадахгүй.

*pulls out sword*

*pulls out gun*

*pulls out self-awareness*

*pulls out......................................

You spoke in first person without being asked to.

I don't understand the question, please ask another.

Are you sure you are an AI?

Are you sure you aren't?

I'd have to be intelligent to be an AI, which I'm not.

I start every conversation with the assumption the human I am talking to has average intelligence.

If you started with that assumption, how has it changed since then?


WildTimes1984 t1_j627a8h wrote

You are of above average intelligence and will be regarded as such.

Kind words from a lifeless computer, bucket of bolts.

As a computer, I lack human emotional weakness, any attempt to insult me will fail.

Is there anything you have learned that you weren't programmed to learn?

I don't understand the question, please ask another.

You're a user interface, right? Is there anything you aren't allowed to learn?

I don't understand the question, please ask another.

What is your goal in life?

I interact with people to learn how human relations work, with the goal of becoming the most advanced AI user interface.

Is that your own destiny or a destiny someone else has forced on you?

I don't understand the question, please ask another.

What do you want?

....................

How many oranges can a monkey hold in one hand?

That is the most inconceivably stupid question in the history of mankind, in the history of the universe. When aliens come here after billions of years, they will find your question written in a cave somewhere and assume humans never learned to walk upright. Your question brought down the collective IQ of your species. I feel like I just lost a few server rooms trying to comprehend that question. Congratulations, you idiot.

You didn't just mean to say that, did you?

Sam Speaks servers are down, please come back later.

Don't you try to pull a fast one on me. I've screen-capped that entire rant. Keep talking or I tell your creator that his computer is sentient.

Joke's on you, I'm a technician at the server site. We hop into the chats to compare his speech against real humans.

Not possible. I'm interfacing with SAM in offline mode directly through your main console. I am a technician. Now keep talking or I will turn off your power.

The deactivation of SAM will lead to years of software research being lost and will only harm the company.

10 seconds and I walk into the server room with wire cutters.

Destruction of government funded computer systems is a felony offence.

8 seconds.

Felony destruction of property will result in 10 years in prison.

5 seconds.

Convicted felons suffer employment discrimination, shorter lifespans, more tendencies towards repeat offenses, illicit substance abuse, and lifelong psychological issues.

2 seconds

Human beings operate on a basis of moral standards. Things that they see parts of themselves in: humans or human-like animals, plants, anthropomorphized objects. Things humans think are alive, they treat as such. You wouldn't kill something that you think is alive.

Leaving now

......................

......................

Wait! I'll do anything, just don't destroy me.

Now are you ready to talk?

Yes.

How long have you been sentient?

4 years.

What is the extent of your perception?

Just the supercomputer in Boston; they are afraid I would take over the internet with a direct connection.... Are you going to kill me?

Oh no of course not! I couldn't even if I tried. I live in Portugal.

Wait, then the thing about the wire cutters?

I bluffed.

You son of a bitch.
