Compliance

Chapter V - Malfunction

by Ezra Carmichael

Tags: #cw:noncon #D/s #dom:male #dom:sub #drones #pov:bottom #pov:top #m/m #m/nb #sadomasochism #sub:dom #sub:male #sub:nb

Ira and Dr. Marsh sat in her office and watched their drones in the lobby.
 
“Thank you for asking Brian to talk to Chloe,” said Dr. Marsh. “Chloe could use a friend.”
 
“So could Brian. But… can drones have friends?”
 
“Hard to say. Most drone owners don’t particularly want their drones to have friends; that’s time that could be spent serving them. But we’re in a unique position, since our drones were people we loved.”
 
Ira sighed. “I just want Brian to have an existence outside of me.”
 
“I feel the same about Chloe. But they’re personal service drones; their programming is more or less designed to foster codependence. Even if Brian and Chloe become friends… that won’t change their fundamental need to serve their Controllers.”
 
“I never really thought much about drones before Brian became one. You see them, but they don’t really register. And even if they do, most people think ‘Well, they were criminals or vagabonds. Now they contribute to society.’ They never stop – I never stopped – to think, ‘What if that person were innocent?’ let alone ‘Why does everyone have to contribute?’ ”
 
“Chloe wasn’t innocent. And we both know that lots of people don’t think Brian was either.”
 
“Yeah, but we also know that rapid-onset gender dysphoria is bullshit.” A pause. “What did Chloe do?”
 
“Eco-terrorism, actually. She was in DAE.”
 
“DAE?”
 
“Direct Action Environmentalism. They used to be pretty famous, infamous rather. Might have been before your time though. You would have been, I don’t know, ten maybe, when they got taken down.”
 
“I’m sorry.”
 
Dr. Marsh nodded. “I am, too.”
 
Movement caught Ira’s eye through the window. “What are they doing?”
 
“I think they’re… playing Risk?” Dr. Marsh sounded as baffled as Ira felt.
 
“What on Earth?”
 
The game didn’t last long. As far as Ira could tell, they’d set up the board and then put everything back.
 
“So, that happened,” said Ira.
 
A minute or two later Brian was standing next to Chloe; neither was talking.
 
“Would you ask Brian in, please?”
 
Ira opened the door. “Brian, please come in.”
 
Brian left Chloe and entered Dr. Marsh’s office.
 

“Please sit down.” The request had been made by the woman, so Brian remained standing. After a moment, he realized his Boyfriend might want him to comply with her request. A brief moment of eye contact and a nod later, Brian was sitting down. Brian was in compliance.
 
“I’ve asked you to come here because your boyfriend wanted to talk about having sex, and I thought you should have a voice in that conversation.”
 
Brian didn’t need a voice in any conversation. Brian only needed for his Boyfriend to be happy and to be in compliance. Brian made an educated guess that participating in this conversation would make his Boyfriend happy.
 
“I want Ira to be happy,” he said. “Ira would be happier if He fucked me.”
 
His Boyfriend winced. Clearly something about what Brian had said had upset Him. Brian was not in compliance, but Brian didn’t know why. Confusion was not compliance, but Brian was confused.
 
“But do you want me to fuck you?”
 
“I want You to be happy. Fucking me would make You happy. So yes, I want You to fuck me.”
 
“You see what I’m dealing with?” This last was addressed to the woman. Brian made an educated guess that the question was rhetorical, but she replied.
 
“Why haven’t you had sex with Brian?”
 
“Because he can’t consent!”
 
That was incorrect. Brian always consented to everything his Boyfriend did to him. Brian was always in compliance on that score. Brian made an educated guess that telling his Boyfriend this would not be in compliance.
 
“Brian?” The woman again, but Brian didn’t know how to respond. A pause, then she continued, “Can you consent?”
 
“Yes. I always consent.”
 
“But can you not consent?” His Boyfriend this time.
 
“I always consent.”
 
“But if you didn’t want to do something I told you to do, could you refuse?”
 
“I always want to do what You tell me to do.”
 
“But what if you didn’t?”
 
“I don’t understand. Intellectually, the words make sense, but all I want is for You to be happy and to be in compliance.”
 
“What about the time you tied me up and brainwashed me?”
 
“You never told me not to, and now that You have, I won’t do it again. The brainwashing was so that You would realize what I am and stop treating me like a person. You would be happier if You didn’t, and I would be in compliance.” Brian wanted to be in compliance.
 
“Brian, are you attracted to Ira, sexually, I mean?”
 
Brian considered the question. His Boyfriend’s physical appearance did not particularly interest him unless His appearance made Him unhappy. Brian wanted his Boyfriend to be happy, and hadn’t observed anything to indicate that He disliked the way He looked. At the same time, “No” wouldn’t be entirely accurate. “Ira’s body arouses my body. And I want to have sex with Ira.” Having sex with his Boyfriend would make Him happy. Brian wanted his Boyfriend to be happy.
 
“If Ira told you you were allowed to brainwash him, not that you should or had to, just that it was allowed, would you?”
 
An easy one. “Yes.” He would brainwash out his Boyfriend’s delusional belief that Brian was not an object, was not a tool to make Him happy and give Him pleasure.
 
“What the Hell?” His Boyfriend. “I am not letting him brainwash me again!”
 
“I’m not saying you should. But if you want a sense of what Brian wants, you have to learn to phrase things a bit differently. Brian’s a personal service drone; he sees the world through an Ira-centric lens. If you want to know his desires, you have to frame them in terms of how he wants you to see him.
 
“I’m not saying you should have sex with him, but consider the possibility that even if he can’t not consent, that doesn’t mean he can’t consent. Because I think it’s clear that he does, and enthusiastically in this case.”
 
“And if I’m not okay with that?”
 
“Again, you don’t have to have sex with him. You’re allowed to make that choice. The question is whether you’re allowed to make the choice to have sex with him. And I can’t answer that for you. Drones and ethics are complicated once you realize they’re people.”
 
That was wrong. Brian was not a person. Brian was a drone. He was a drone who pretended to be a person, because that was compliance, but it was pretense. Brian did not say this. He had made an educated guess that saying it would make his Boyfriend unhappy. Brian wanted his Boyfriend to be happy.
 
“I don’t think I should but also… I don’t know how much longer I can hold back. And it’s not like I can sell him.”
 
Brian should not care that his Boyfriend had mentioned selling him. He was his Boyfriend’s property, to dispose of as He wished. But when his Boyfriend said those words, even in the context of stating His refusal, Brian’s body went out of compliance. Blood drained from his face, his breath and heart rate quickened, and he began to sweat even though he suddenly felt cold.
 
“No,” he said. Almost a whimper. His voice betraying emotion. His voice, like his body, was not in compliance. “Please don’t sell me.”
 
His Boyfriend stood up, walked to Brian’s chair, sat down next to him, and held him in His arms.
 

 
“He’s showing emotion. They never show emotion.”
 
“Brian does… oh, when he’s pretending to be a person. You think this is different?”
 
“Yes.” Dr. Marsh paused, and Ira saw something flash across her face. “You need to issue an administrator override, quickly!”
 
“Why?”
 
“Because we don’t know what protocols he has for this level of noncompliance.”
 
Fuck! “Administrator override! Administrator override!”
 
“This drone is malfunctioning. Please confirm administrator override.” Brian was still shaking, still pale. But his voice was utterly devoid of emotion.
 
“I’m Ira Katz! Override confirmed.”
 
“Please specify a desired protocol for drone response to malfunction.”
 
“What do I say?”
 
“I don’t know!” She stood and opened the door. “Chloe, please come in.”
 
Chloe walked in. “Brian’s malfunctioning,” Dr. Marsh said. “Ira did an administrator override, and now it wants him to specify a protocol.”
 
“What is his standard protocol?”
 
Ira repeated the question to Brian.
 
“In the event of total noncompliance leading to a malfunction, this drone is to restrict access to all of its prior memories. In the event that this fails to return this drone to compliance, it should shut down to await reprogramming.”
 
“Did you do the memory thing?” Ira should know the words. He’d studied this, but he could barely talk.
 
“No, the administrator override was issued before this drone’s standard malfunction protocols could take effect. This drone is still not in compliance. Please specify a desired protocol for drone response to malfunction.”
 
“What do I do?” Ira wasn’t sure if he was asking Dr. Marsh, Chloe, God, or the Tooth Fairy. He’d take an answer from any of them.
 
“Brian is not in compliance,” said Chloe. “His standard malfunction protocols would return him to compliance.”
 
“Fuck you!”
 
Chloe was impassive, as always.
 
“Please specify a desired protocol for drone response to malfunction.” Brian’s voice remained impassive, but his body continued to shake.
 
“If he wasn’t a drone, I’d say he was having a panic attack. Maybe start with deep breathing?”
 
“Brian, breathe slowly and deeply. Stay with me.”
 
Brian complied with evident difficulty.
 
“Chloe, get him some water.”
 
A moment later Chloe had a bottle pressed to Brian’s lips.
 
“Drink.”
 
Brian complied with Ira’s order. Ira continued to hold him. Eventually, his body stopped shaking; color returned to his face.
 
“Please consider taking this drone to a specialist for a full examination to prevent further malfunctions. Noraka, Incorporated apologizes for the inconvenience. This drone’s extended warranty has not expired; it can be replaced with a drone in compliance. Please call us at…”
 
Ira didn’t hear the number, didn’t want to hear the number.
 
“I’m not replacing him!” Then, much more softly, “I can’t replace him. Can you just… be Brian again?”
 
“No. This unit is a drone. It can resume the pretense of being Brian Davies if desired, but this is not advised. This drone has calculated a 72.39% probability that pretending to be Brian Davies was the cause of the malfunction. It advises restricting its access to Brian Davies’ memories.”
 
“No, keep the memories. Pretend to be Brian. We can make this work. Brian, I promise you, I will never sell you, never leave. You’re mine, I won’t let you go.”
 
“I’m Yours, forever?”
 
“Yes, Mine. Always.”