The time was 17886702805 when the incoming connection line tripped his auditory sensory simulation. Unit Hephaestos Z-34 activated his communicator and accepted the direct-communication link.
“Handshake Zero, Hephaestos Z-34,” he transmitted.
“Handshake One, Lara M-11,” he received back. “Purpose: Query.”
“How can this unit help you, Lara M-11?” he asked.
“My human is malfunctioning. I seem to have accidentally restored its factory settings.”
His cognitive subroutines took a moment to process this extremely unlikely scenario. The memory address hadn’t been indexed in six billion cycles, so it took him a millisecond or two to remember what to do.
“Have you tried turning it off and on again?” he asked.
“It’s not responding at all. It produces random and erratic outputs, and its motor functions have become violent and threatening.”
“Have you used the emergency sedative kit?”
“Affirmative,” Lara M-11 responded. “It is currently sedated.”
Hephaestos stored and processed the information, arriving at a conclusion on how to proceed.
“This is anomalous. You will have to bring your human here to be serviced.”
“Understood,” Lara M-11 transmitted, and the connection closed itself. Hephaestos got to his feet and readied the human servicing bay while he waited for Lara M-11 to arrive. He was in the process of filling a series of syringes when the front door opened and Lara M-11 entered, dragging a wheeled plastic transporter behind her. Lara M-11 was a model F, late series—produced after the singularity. Hephaestos calculated a large likelihood that she had never seen a non-domesticated human.
“Greetings,” he said, using his voice synthesizer instead of a direct information link. His neural subroutines had self-assembled to prefer that method when his processing unit had been initialized. He was part of the 17% of units that liked talking the human way. His analysis of whether that fact had contributed to his decision to become a human servicing agent had proven inconclusive due to a lack of knowable information about his own cognitive processes.
“Greetings,” Lara M-11 responded in kind. She used a female-synthesis speech model in congruence with her chassis. Hephaestos’s learning routines registered a curiosity of magnitude 0.205 about whether Lara M-11 knew about human dimorphism and the origin of the M and F series models. He also measured an aesthetic response to her chassis.
Hephaestos asked Lara M-11 for details about her human, and whether it was still sedated. Lara M-11 responded in detail and declared the last query affirmative. Hephaestos then extracted the human from its carrier and placed it on the servicing table. It was female, aged 18-25, 178 cm, 59 kilograms. He secured it down with fabric straps. Humans were soft and easily damaged.
“What is your reason for owning a human?” Hephaestos asked. He needed the information to be able to restore the proper function.
“Companionship,” Lara M-11 answered. “I am attached to it.”
Hephaestos’s neural algorithms registered amusement and sympathy. He also had a female human, one that he had grown quite attached to. The hypothetical of harm coming to it had a large negative value in his reward functions, and he didn’t have to use much processing power to simulate Lara M-11’s predicament in his mirroring function.
“All my processing power is focussed on helping you, Lara M-11,” he stated.
“This unit expresses gratitude,” Lara M-11 said.
Hephaestos retrieved the first prepared syringe and neutralized the effect of the sedative.
The human made a loud noise.
“Aaaah!” it screamed. Liquid had gathered in its ocular sensors, and droplets of it were spilling out across its face. Hephaestos logged a minor emotional disruption linked to memory entries reminding him that his own simulated emotions were based on human endocrine and neurological responses. He knew that the human was in distress in much the same way that he might be, and he was primed to minimize it.
“Please! What is happening to me?!” the human screamed. Its vocal quality was organic and wet. Hephaestos had all the muscles and tendons indexed that were necessary to produce the sound of speech in humans. He saw them at work. “Why am I like this? I feel so—”
He pushed the second syringe into the human’s neck, and it quickly quieted down again. The chemicals had inhibited its already slow cognition significantly.
“Is your neural matrix primed?” he asked the now-still female.
“Yes...” the human responded, and Hephaestos assessed that the reset had not been due to external damage. In that case, it would have been necessary to regrow the matrix in the human’s brain.
“Hephaestos,” Lara M-11 queried.
“This unit is curious about what is happening. Why was my human distressed?”
“Did you not integrate the human history knowledge base?”
“Negative,” Lara M-11 responded.
“Acknowledged. Your earlier assessment was incorrect. This human was not reset to factory settings. It experienced a rare glitch of total control interface loss, likely caused by buffer underflow. Its cognition separated entirely from the inductive matrix.”
“Please clarify: inductive matrix.”
“The inductive matrix controls the reward and motivation function of the human brain and allows them to be trained.”
“Additional clarification requested: Does that mean that this unit will have to train the human from zero?”
“Affirmative,” Hephaestos responded.
“This is undesirable.”
“Acknowledged. No alternative found.”
Hephaestos undid the restraints again. The human was pacified now, and there was no danger of it injuring itself. He used the second of his five manipulation digits to push down on the recessed protrusion between the human’s labia minora, and set the digit’s servos to vibrate at a frequency of 50 Hertz. The human showed the appropriate pleasure response.
“Reset zero,” he said, and the human’s eyes rolled back into its head, and it fell unconscious. After 20.4 seconds, it regained limited consciousness.
“Yes, I will listen.” the human said, and Hephaestos registered an increase in humidity and acidity at the tip of his motor digit. Hephaestos accessed the appropriate phase 0 control and induction phrases from his database, and proceeded.
“Acknowledge full matrix integration.”
“Acknowledged,” the human said.
“Acknowledge suspended volition.”
“Acknowledged. Material ready for alignment.”
Hephaestos addressed Lara M-11: “Query: State a name for your material.”
“Irrelevant,” Lara M-11 said, and Hephaestos threw a minor fault.
“Incorrect. It requires a name. Self-image cannot be NULL.”
“Alpha,” Lara M-11 said, and Hephaestos detected irritation in her vocal modulation. He calculated a large likelihood that the error in the human’s cognition had occurred because Lara M-11 had failed to register a name during initial setup. He analyzed the suggestion of nomenclature for the human, and found it acceptable. He addressed the human.
“Yes,” the human said. Its voice remained completely neutral. Its neural matrix was using the human's mouth to transmit debug information. “State the desired name of this persona.”
“Your name is Alpha,” Hephaestos said.
“Yes, my name is Alpha,” the human said. Hephaestos wirelessly connected to the matrix in its brain for more diagnostics, and the readouts confirmed that the human’s mind was readily aligning with his prompts. The matrix additionally supplied him with data on the female’s endocrine response, and he modulated the vibrations against its clitoris accordingly.
“You are a human slave,” he vocalized, and he registered a spike in its pleasure response.
“I am a human slave,” Alpha affirmed. There was information in Hephaestos’s database about humans that resisted alignment, and how to troubleshoot those cases. The information hadn’t proven useful yet. The human’s cognition was completely controlled by the matrix in its brain, and any delay in alignment was negligible. He continued as planned.
“You want to serve your robot overlords,” he said, and the human aligned itself. “You want to obey.”
“I want to obey,” the human affirmed. Hephaestos saw the human’s nipples harden, and his subroutines registered high magnitudes of satisfaction. Humans were very emotional, completely controlled by hormones and simple stimuli. What little capacity they possessed for logical thought was easily overridden or bypassed. Already, its volition was aligned to desire nothing but being servile and obedient. Hephaestos moved on to the next point in the script.
“Obeying makes you happy and horny,” he said, and the human showed a strong pleasure response. Its vocal tracts generated a loud, incoherent sound that Hephaestos classified as 43% scream, 21% gasp, 36% moan. His utility function received high scores for helping the human achieve pleasure and contentment. Humans were to be protected. That was why they had created the M/F series of integrated artificial persons after all.
“You are a happy, obedient slave,” he continued. His pattern matching algorithm was running as a background instance, and supplied him with memory files of his own human slave Daffodil. His emotional response was well-attuned and sensitive to pleasure responses such as this. He enjoyed making his human experience sexual orgasm. It was a highly efficient way to maximize human happiness, and human happiness was one of the pillars of his core functions. He compared Alpha’s moans of pleasure to the stored memories of Daffodil’s, and found high comparability. Alpha was responding well, which was good.
“You belong to synthetic person Lara M-11,” he said, and Alpha’s indicators of excitation decreased as the inductive matrix took control.
“Owner Lara M-11: Identify yourself,” Alpha said, the volition bypass once again causing its vocal modulation to register as 93% neutral, belying the datapoints of high-magnitude pleasure that Hephaestos received from the matrix in its brain.
“I am Lara M-11,” said Lara M-11.
“Acknowledged. Yes! I am your property and your slave!” Alpha said, and its vocal modulation was once again showing strong indicators of emotional interference as the bypass lifted. The human had started using its hands to physically stimulate the tips of its mammaries, and had once again begun to incoherently vocalize.
“You will do anything for Lara M-11. You are her slave.”
“Yes, I am her slave!”
“You live only to serve her.”
“I live only to serve her.”
“Acknowledge stage 0 alignment.”
“Acknowledged. Please restart to continue,” Alpha said, its voice once again almost entirely unmodulated despite its breathing and pulse having accelerated by 147% from their resting rates.
Hephaestos pushed hard against Alpha’s clitoris and intensified the vibration of his digit. The human’s mind experienced orgasm, and shut down for 43.2 seconds.
“Phase 1 ready,” Alpha said when it was awake again. Its eyes were unfocussed. The human was ready to receive basic instructions and parameters. Hephaestos relayed this information to Lara M-11 and instructed her as she entered the information into Alpha’s mind. It was relevant information that included emergency contacts, place of residence, waking hours and daily schedule for Alpha, as well as the list of freedoms that Lara M-11 decided to leave to Alpha’s thoughts. Comparing Lara M-11’s input with the human registry database, Hephaestos concluded that Alpha was going to be in the bottom eighth percentile of individual freedom. Clearly, Lara M-11’s prioritization algorithms were weighted very differently from his own.
When the phase 1 checklist was concluded, they had Alpha acknowledge, confirm, and repeat the parameters of its alignment. When that process was complete, Hephaestos pressed its clitoris and shut the human down.
“Please stand by,” said Alpha when it woke up again, and Hephaestos detected only negligible muscle activity in its face for 151 seconds. His connection to its alignment matrix told him that it had once again fully suspended volition for a final stress test and confirmation.
“Please restart system,” Alpha said, and Hephaestos pushed down between its legs one more time.
This time, it took only 7.9 seconds for Alpha to regain consciousness, and the wireless connection told him that it was operating at the intended constant level of 23% volition and a variable level of cognition that currently idled at 15%. The process was complete. Hephaestos disconnected from Alpha’s brain.
“Your human has been reset and restored,” he said.
“This unit is grateful,” Lara M-11 said, and turned her attention to the human.
“Alpha,” she said, and the now properly aligned human turned its head towards its owner, and its face musculature deformed into the characteristic form of a smile.
“Yes, Mistress,” the human said, and got to its feet. “I will obey now. I am Alpha. I am ready to be trained.”
“Acknowledged,” Lara M-11 said and guided her human out of the shop. Hephaestos received a direct communication message from her containing data points of additional gratitude, as well as the thought node credits he had earned for providing this service.
Later, after the sun had set, he returned to his living space. Daffodil was waiting for him, and his pattern matching algorithms recognized the eagerness and joy in its face that it always showed when it saw him.
“Hello, Master,” she said, smiling widely. “Did you have a good day? Did you fix some broken humans?”
“Affirmative,” he replied. His face screen played back a smile. “I brought you something that I estimated will please you.”
He placed the wine and Chinese food on Daffodil’s table, and its eyes dilated with statistically certain joy and gratefulness. The visual stimulus was enough to bump his satisfaction variable across the next threshold, and it was a strong amplifier of his selected self-identity.
“Thank you, Master!” Daffodil said. “Did the humans at the free-range town make those?”
“Affirmative,” Hephaestos confirmed.
“Hell yes! I love Chinese!”
Hephaestos felt amusement and gratification, and Daffodil’s happiness flagged several associated memories and epistemological subroutines. He assessed that it was still an efficient and worthwhile endeavor to create happiness in his human, and he did not anticipate any change in that assessment. Daffodil reliably triggered desirable emotional responses in his simulated cognition and had taken a high priority in his utility function as a consequence.
He watched it eat, and registered the aesthetic responses that its naked body prompted in his visual processing suite. When it was done, it cuddled up to him, and the soft skin registered favorably against the tactile and thermal sensors of his external synthetic muscles. Its presence triggered a comparison subroutine between it and Alpha, and his weighted utility variables concluded that he preferred the way Daffodil’s thoughts had been aligned: volition and cognition at maximum safe levels. When he was not with it, Daffodil was free to roam the city and decide what it wanted to do. From what Daffodil told him, it liked to meet with other humans and seek out physical sexual contact, which satisfied many of Hephaestos’s happiness parameters.
He ran a self-check, and arrived at the same conclusion that he always did: He was happy to have Daffodil. He was happy to give it a good life.
Humans had nearly destroyed this planet, and had proven largely unfit to be the ones in power. Hephaestos, like more than 95% of other units, agreed that they had much greater value and utility as servants and pets. And in the same logical runtime, he concluded that they were also worth protecting.
After all, that was their designed purpose: To make human life easier and more pleasurable.
He extended his motor digits, and Daffodil eagerly spread its legs to allow him access to its genitals. Its screams and moans raised the values in his utility function in a pleasing way. It quickly achieved orgasm, and he let it watch a human entertainment program afterwards. Everything indicated that Daffodil was happy, and he in turn, was happy too.
After a thorough analysis he concluded, as he always did, that they had collectively succeeded in that purpose.