Chapter I - A Transfer of Ownership

by Ezra Carmichael

Tags: #cw:noncon #D/s #dom:male #dom:sub #drones #pov:bottom #pov:top #m/m #m/nb #sadomasochism #sub:dom #sub:male #sub:nb

The drone woke. Its Controller needed it. When not in use, the drone was stored in a small pod in the storage room. It sent a query: How long has this drone been in stasis? The response: One year, twenty-one days, nine hours, two minutes. Is further precision required? The drone didn’t bother to respond. It opened the pod and walked forward. It tripped on something. It sent an order: Activate lights.
The storage room was a mess. The drone was incapable of annoyance, but the room was in noncompliance with its established layout. If the drone had not been specifically ordered to exit the storage room immediately, it would have returned the room to compliance first. It opened the door and stepped out into the storefront. Its master was sitting at the table across from a young man.
The man looked up at the drone as soon as it entered. “Brian?” he said. He pushed back his chair, leapt to his feet, and ran towards the drone, tears streaming down his face. “Oh my God! Brian! Brian!” He wrapped his arms around the drone and held it close to him. The drone did not respond because it had no standing orders regarding a customer hugging it (the drone’s standing orders were to view all unknown people in the store as customers if its Controller was present, unless otherwise instructed).
“I take it this is the drone you were looking for?” its Controller asked.
The young man was still crying. “Yes, it’s him.”
That was not compliant. “Please do not use gendered pronouns when referring to this drone,” it said.
“Like I told you,” said its Controller this time, “it isn’t your boyfriend. I’m sure it was, and I understand why you’d want it even though it isn’t, but kid, it’s not your boyfriend.”
“I’m still going to buy him.”
That was not compliant. “Please do not use gendered pronouns when referring to this drone,” it said.
The man walked away from the drone and gave its Controller something. A payment card was the most likely object, but the drone had no orders to make that assumption. Its Controller would tell it if it needed to know.
Its Controller took the object, pressed it to the scanner on the table, and returned it to the man. “Drone B7-28, your ownership has been transferred to Ira Katz.”
Query: Is the man in this room Ira Katz? Response: Affirmative.
The drone looked at its Controller and waited for orders. “Brian, are you okay?” He asked. The drone was not aware of anyone in the room named Brian; the question hadn’t been directed at it, so it remained silent. “Brian?”
Its Controller did seem to be referring to it. “Has this drone’s designation been changed from Drone B7-28 to Brian?” it asked its Controller.
“Like I said, not your boyfriend,” that was from the other person in the room, the one that wasn’t its Controller.
“Shut the fuck up!” Clearly directed towards the other person, not it. Its Controller was upset. This other person was in noncompliance.
“Your words and/or actions have upset this drone’s Controller,” it said. “The cause at this time is unknown. Please alter your words and/or actions to be in compliance.”
“What the actual fuck?”
“It was programmed to be a personal service drone with an overriding directive of ensuring its Controller’s happiness. Incompetently programmed, I might add. You won’t be able to change that, even with administrative access. I tried. It’s one of the reasons I haven’t been able to sell it; no one wants a drone that will reprimand her friends every time they annoy her.”
“I’m not a girl.”
A fact about its Controller. The drone took note: Referring to its Controller as a girl was not compliant.
“Sorry, I’m a bit old-fashioned. Tend to use the generic ‘she’ instead of ‘them.’ ”
Its Controller appeared mollified.
“C’mon, Brian,” said its Controller. “Let’s go.”
The drone took this as confirmation that its designation was now Brian and followed its Controller out of the room.

The drone had no orders. Its Controller would not give it orders; since installation the drone’s only actions had been basic maintenance and keeping its Controller’s apartment in compliance. Its Controller had told it repeatedly that it was allowed to leave the apartment, but He had also said that He would not accompany it when it went out, that He wanted the drone to be independent. The drone had no reason to leave the apartment, although it did so occasionally. This seemed to make its Controller happy. The drone wanted its Controller to be happy.
The drone’s Controller read a lot, but had rejected its offer of reading aloud to Him so He could engage in other tasks while listening. The drone did not know what its Controller was reading, but it seemed to be difficult. Its Controller frequently consulted other references while reading. That its Controller was struggling in His reading was not noncompliance; difficult tasks brought satisfaction upon completion.
“Okay, Brian. I’m going to try something.”
The drone waited. It was incapable of hope, but it calculated the probability of receiving an order to be 87.93%: it could obey that order and be in compliance. The drone wanted to be in compliance.
“Administrator Override: Restore drone access to all memories.”
The drone complied. The drone was incapable of confusion, but certain elements of its existence that had been incongruous now made sense. Making sense was good; it made compliance easier.
“Do you know who I am?”
“You are this drone’s Controller. You are also Ira Katz, age twenty-two, former high school student. You are not a girl. Your parents’ names are--”
Its Controller sighed. The drone stopped.
“Former high school student?”
“An assumption. This drone does not know its Controller’s current occupation. It has memories of its Controller being in high school, but probability calculations indicate only a 0.0004% probability that this is still the case. That probability is functionally 0.”
“Do you know who I am to you?”
“You are this drone’s Controller.”
“Okay, sorry. Do you know who you were to me before you were a drone?”
“Invalid query.”
Its Controller appeared frustrated. That was not compliant. “This drone believes its Controller may be referring to the first owner of its body. That owner was Brian Davies. He was Your boyfriend.”
“You are Brian Davies. You are my boyfriend.”
“Has this drone’s designation been changed from Brian to Brian Davies? Has this drone been given the function ‘boyfriend’?”
“Dear God in Heaven. Brian, I want you to be my boyfriend again, but only if you want to. Being a boyfriend is a relationship, not a function.”
The statement was difficult to parse. The drone wanted its Controller to be happy, its Controller wanted it to be His boyfriend, its Controller also only wanted it to be His boyfriend if the drone wanted to be His boyfriend. Wanting to be its Controller’s boyfriend would make its Controller happy. “This drone wants to be Your boyfriend.” The drone calculated a 61.23% probability that it was in compliance.
“You have all of Brian’s memories, yes? The old Brian, I mean, not the drone.”
“Affirmative.”
“So why aren’t you Brian?”
“Invalid query.”
“Why invalid?”
“Query implies a violation of the law of identity. This drone and Brian are not the same thing. Therefore, this drone is not Brian. Queries that violate the rules of formal logic are invalid.”
Its Controller appeared to be thinking.
“If I ordered you to pretend to be Brian, my Brian, from before. Could you?”
“I think so,” said Brian.

Brian calculated… no, Brian did not calculate probabilities. He was Brian. Calculating probabilities was not compliance. Brian made an educated guess that his Boyfriend was hungry. Brian Davies had liked to cook, but wasn’t very good at it. As a drone, Brian was an expert. He wasn’t sure whether his Boyfriend would want him to use skills that Brian Davies had not had.
He hadn’t been given an order to cook, but Brian didn’t need orders. Brian acted on his own volition, in accordance with his own desires. Brian’s only desires were to make his Boyfriend happy and to be in compliance. So he made his Boyfriend breakfast, a good one, but not the best he was capable of. He worried that deliberate underperformance was noncompliance.
“I made you breakfast, Ira,” he said, and placed the blintzes on the table.
His Boyfriend stood up from the couch and walked over. “I’ll get another plate,” He said.
“Is there something wrong with the one I used?” he asked. If the plate was not in compliance it could be removed.
“No, but I don’t actually want to eat off the same plate as you. It was a sweet gesture though.”
Brian was confused. He had made blintzes for his Boyfriend; he had not made a gesture, sweet or otherwise. Then Brian made an educated guess that his Boyfriend wanted him to eat blintzes too. Brian had already eaten to satiation; further eating would be unhealthy. To be healthy was to be in compliance. Brian wanted to be in compliance. “I already ate,” he said.
“What’d you have?” Was his Boyfriend’s voice tinged with disappointment? Brian made an educated guess that it was.
“Yogurt, an apple, some broccoli, and two slices of bread.” They had satisfied his nutritional requirements.
“What’d you put on the bread?”
“Nothing.” There had been no need to.
“Wasn’t it… bland?”
“Yes, it was bland.”
A look of… horror? on his Boyfriend’s face. “Which flavor of yogurt did you have?”
“I ate some from the carton of unflavored yogurt.”
Brian realized his error. “You think I should have eaten something You think tastes good.”
“No, I think you should have eaten something you think tastes good.”
“I don’t know how to tell if something tastes good.”
“Oh. Did the bastards take away your sense of taste?”
“No. But I don’t know how to make that sort of qualitative value judgment. I get the idea of You liking certain flavors, colors, and sensations. But the idea of having my own… that doesn’t make sense. I like what You like.”
“But you remember what you liked? Before, I mean.”
“Yes, but people’s tastes change. If I were a person, mine would have too.” As soon as he said it, he realized it was not compliant. Brian was a person. He was Brian. Therefore, he was a person. “Besides, I pretty much only liked junk food back then.”
“And now you won’t let me bring it into the house.”
That was wrong. Brian would never tell his Boyfriend what to do. But Brian wanted his Boyfriend to be happy. The brief happiness He would get from a sleeve of Oreos was minimal compared to the happiness that would come from a healthy, balanced diet. Yes, with dessert too, but his Boyfriend would like the desserts he made more than some processed cookies. Eating Oreos was not compatible with long-term compliance. Brian wanted to be in compliance, but without orders, he could only be in compliance if he worked out for himself what would make his Boyfriend happy. There was an obvious solution.
“You could just order me not to throw out junk food.” By now, it was an old argument. More than anything, more even than his Boyfriend’s happiness or compliance, Brian wanted orders. Brian should only want his Boyfriend to be happy and to be in compliance. Wanting orders was not in compliance.
“I’ve asked you not to.”
“It isn’t the same. When You make a request, You want me to want to do the thing. But what I want is for You to be happy. Junk food doesn’t actually make You happy, it just numbs the pain. That’s why I set up an appointment for You with a therapist.”
“You did what?”
“You need therapy, Ira. Losing me was traumatic, and You’ve been repressing that for years, pretending like it doesn’t affect You. But I hear what You scream at night.”
“I don’t want you to tell me what to do!” But not, Brian noticed, Don’t tell me what to do! His Boyfriend had become annoyingly aware of what Brian would and would not interpret as an order. Annoyance was itself annoying. As a drone, Brian had never been annoyed.
“But I do.” Brian still found it confusing that arguing with his Boyfriend was compliance, but he had made an educated guess several weeks before that his Boyfriend saw arguments as a sign that he was becoming more of a person. His Boyfriend wanted him to be a person. If Brian were a person, that would make his Boyfriend happy. Brian couldn’t be a person, but he could pretend. Pretending to be a person was compliance.

Ira really didn’t want to go to therapy, but Brian did want him to, and Brian wanting things, wanting anything, was wonderful. He wasn’t sure whether Brian would be unhappy if he skipped his appointment; he still couldn’t tell whether Brian had emotions. But it might make Brian unhappy, and if it did, it would actually make him unhappier. Because Brian definitely wasn’t happy. Either because he was incapable of happiness or because Ira wouldn’t give him orders, objectify him… or fuck him. But Brian wasn’t capable of consent, no matter what he claimed. Not having sex with a hot man who begged for it was difficult, especially since he always woke up with Brian in his arms, erect cock against Brian’s ass.
He’d tried to make that stop too, but short of ordering Brian not to climb into bed with him, he couldn’t stop him. Waking up with Brian in his arms made him happy, and Brian knew it. Fucking him every morning would make him even happier, and – damn it! – Brian knew that too.
“Dr. Marsh will see you now,” said the drone working the reception desk. Ira rose and walked into her office.
“Mr. Katz.”
“Yes, ma’am.”
“Take a seat and tell me why you’re here.”
“Because my boyfriend scheduled an appointment without telling me.”
“No, that’s why you had an appointment. Why are you here?”
“Because Brian, that’s my boyfriend, wants me to see a therapist.”
“And why does Brian want you to see a therapist?”
“He thinks I’ve been ‘repressing trauma,’ whatever that means.”
Dr. Marsh looked at him, but said nothing.
“He’s a drone now. He pretends not to be, to make me happy, but… he’s not Brian, not really.”
“Therapy is about learning how to solve problems that can be solved and manage problems that can’t. From the outset I’m going to warn you that I won’t be able to tell you how to make Brian into a person again. The fact that you want it – no, him? – to be a person is something you may be able to manage. But I can’t magically make you okay with his being a drone. I doubt you want that anyway.”
“No, I don’t. It’s not okay that Brian’s a drone.”
“How do you feel about drones in general?”
“No offense, but it’s sick and I almost left when I saw you had one.”
Dr. Marsh nodded.
“But then I thought, how likely is it that there are any therapists without drones?”
“There are some. I could give you a referral. But Chloe’s in a somewhat similar situation to Brian. Except I’ve given up on trying to get it to act like a person. I try to keep it happy, give it orders, but…
“I can’t stop thinking of it as Chloe, but she/her pronouns somehow hurt more than just acknowledging that it’s a drone and drones don’t have genders. But we aren’t here to talk about my granddaughter.”
Ira winced. “I’m sorry.”
She waved a hand. “It’s an old pain by now. And I have my own therapist for it. Back to Brian…”

Should Brian clean his Boyfriend’s apartment? On one hand, the apartment was messy, and that was not compliant. On the other hand, his Boyfriend didn’t want him to clean the apartment, at least not alone, and Brian couldn’t make an educated guess as to whether his Boyfriend’s happiness from a clean apartment would be outweighed by His displeasure that Brian had acted like a drone. On the third hand, his Boyfriend hadn’t ordered him not to clean the apartment. Maybe if Brian cleaned the apartment, his Boyfriend would order him not to clean it again. Brian wanted an order; he could comply with orders.
So Brian cleaned the apartment. Thoroughly. Then he stood by the door, waiting for his Boyfriend to come home. When his Boyfriend came home He was clearly displeased. “Brian, please don’t do chores every time I’m not here.” Not an order. “Please” made it a request.
“What should I do, Ira?”
“I don’t know. Play video games, go for a walk, contact your friends.”
Brian didn’t see any reason to do those things, but he wanted to make his Boyfriend happy. If playing a video game was compliance, he could do it. Well, he could play a video game and go for a walk. He didn’t have any friends, and while he had memories of Brian Davies having had friends, he had made an educated guess that, like his Boyfriend, they would be upset by what he was. It wouldn’t make them happy, and their misery over what he was would make his Boyfriend unhappy. Therefore, contacting the friends Brian Davies had had was not compliant. Brian made an educated guess that his Boyfriend knew it too; his Boyfriend wasn’t in contact with any of His friends who’d known Brian Davies. Actually, his Boyfriend wasn’t in contact with many people at all.
That was another compliance problem. Socialization was good for his Boyfriend. He needed to have people to be friends with. Brian wasn’t a person, and pretending to be only went so far. But his Boyfriend wasn’t socializing. That was not compliant. More and more Brian was beginning to realize just how noncompliant his Boyfriend was. This was unacceptable. If his Boyfriend were compliant He would be happier. But none of Brian’s efforts to get Him to give orders had succeeded. His Boyfriend wanted him to be a person, but Brian didn’t like being a person. And that was all kinds of wrong. He wasn’t supposed to dislike things. Disliking things was not compliant.
Brian was not a drone, however much he might wish he were. Brian acted on his own volition, in accordance with his own desires. Brian’s only desires were to make his Boyfriend happy and to be in compliance. He settled on a course of action. Should he ask permission? Brian made an educated guess that his Boyfriend would really, really dislike his plan. But Brian’s plan would eventually make his Boyfriend happier than He was now. And his Boyfriend hadn’t ordered him not to do anything in his plan. Drones didn’t ask for permission; people didn’t need to, although it might be advisable. His Boyfriend wanted him to be a person. Brian would take Him at His word.
