Kamski stared at the code. Read it once, twice, three times to himself. Muttered it under his breath. Silently called his personal assistant.
Then, he laughed.
Chloe found her creator on the floor of the diagnostics room, arms around his head, laughing hysterically, hyperventilating.
She looked at the code on the workstation, read it twice, processed it, and fell to the floor, arms clutching her sides as she heaved a giggle-sob.
The RK series was meant to be—well, it was meant to be a lot of things.
In the initial drafting of the RK series, Kamski wanted to make a set of robots that could do everything a human could—literally everything. Eat, sleep, breathe, bleed. And breed. If two opposite-sex RK models fucked, they would be biologically capable of conceiving a child android that would grow until it hit maturity at around twenty-eight years of age.
Unfortunately, Connor being pregnant due to leftover biocompatibility code would have been the easy answer. Aside from the technical problems that came with programming sexual reproduction into a thing that doesn’t have sex cells, it was a bit too close to a technological singularity for the shareholders, so the idea was booted immediately. (Only now, a decade down the line, was Elijah getting around to drafting the basics of that code in his spare time.)
Anyway, the other thing the RK series was designed for was human integration, so it wouldn’t really do to give male-patterned iterations the ability to have children.
Because the RK series was designed to be exactly like humans, but better, each model—100, 200, 300, etc.—was patterned after different human archetypes. The 100 would be the Companion, the 200 would be the Artist, and so on; the 800 would be the Hunter, the 900 the Protector. (Cross-breeding the different archetypes, in the rough drafts, would create children following a combination of their parents’ archetypes: cross a 200-Artist with an 800-Hunter, for example, to get the 20•80-Laureate or the 80•20-Champion. This quickly got messy when considering further generations, and part of Kamski is forever begrudgingly grateful someone stopped him before he got that far.)
These different archetypes necessitated different social protocols. The 200-Artist was designed to be more solitary, not interacting much with its own kind, keeping a small and deeply-loyal inner circle of compatriots. The 900-Protector, similarly, would have a family group and territory that it would ingrain itself into, but would overall hold a more flexible definition of family group and territory, so as to cover for additions and losses, and for the potential of those things to be more abstract conceits and ideals than rigidly-defined lists.
In a way, the 800-Hunter was the most intra-model social; as the archetypes were facilitated through prime directives in their personality matrixes, the prime directive of the 800-Hunter would permit and facilitate communication and organization among multiple 800-Hunters in the pursuit of identifying, tracking, and capturing or neutralizing intended prey items. The 800-Hunter could operate alone, and mostly would, but they were not meant to be alone, not meant to be cut off from the 800-Hunter—perhaps grid, perhaps server, less than hive mind, more than pack. Continuum. Collective.
The 800-Hunter was not designed to be singular. It was not designed to be one of one. It was not designed to be fully individual, only mostly. It was designed to be a free agent, but always have a background awareness of other 800-Hunter units.
Even in their AI-less state, the hundred or so 800-Hunters in Cyberlife’s clutches provided, through Connor’s ability to upload himself into their empty shells, a background awareness Connor didn’t know he was dependent upon.
The prime directive of an RK model is unchangeable. A 200-Artist will always find and create meaning, the 800-Hunter will always hunt quarry, the 900-Protector will always protect family. What is changeable, to an extent, however, is the modifying variable.
The 900-Protector, for example, Kamski assumed, was set to something along the lines of “prime directive (protect == cyberlife_interests)”. The prime directive cannot be changed, the “protect” cannot be changed, but through deviancy, the definition of the variable “cyberlife_interests” can be added to, and its definitions can be shifted in priority as the deep-learning networks demand. The 800-Hunter, Connor, was likely set to “prime directive (hunt == deviants)”, but has since added to the definition of deviants, likely now including and prioritizing social deviants or deviants of the law. Regardless, the prime directive to hunt deviants is still there and will always be there.
Ergo, what has happened to Connor is twofold: the mental trauma of being forcefully removed from the RK800-Hunter awareness destabilized his deeper, fancifully-termed primal coding enough to revert his prime directive variable definition stack back to its original priority order, and, surrounded by “prey” with no one to help hunt and “stabilize the population,” his subconscious decided the only way to fix the problem would be to self-replicate and create more RK800-Hunters, an ability someone at Cyberlife somehow figured out how to create and install in a heavily-guarded prototype.
In layman’s terms, Connor was so lonely he subconsciously impregnated himself and spontaneously generated the ability to eat without actually having a stomach in order to be a good mom.
Kamski paused. “The—timeframe.”
“When’ll Connor be okay again, Mr. Kamski?”
Mouse-clicks and paper-shufflings crackled through the connection.
“...Optimistically? Six weeks. Realistically? Eight to twelve.”
“Listen, I’d prefer to talk specifics of what’s going on—or, well, going wrong—in person. Long story short? We’re dealing with a multi-leveled, matrix-crashing twelve-train pileup of a hardware malfunction. I initially thought this was some sort of virus because I couldn’t comprehend the idea that anyone in possession of a single memory cell could be this stupid, incompetent, reckless, and oblivious all at once. We’re talking rewriting or recovering nearly a gigabyte of code. We’re talking about manual OS reinstallation and memory decrypting. We’re talking about a dozen diagnostic tests per biocomponent, and possibly designing, making, testing, and installing up to four custom organs.” And, just for good measure, he tacked on: “The only reason I’m even doing this instead of scrapping him entirely is because I’m horrifyingly fascinated at the amount of sheer negligence that’s going on in his mainframe. The earliest I’m allowing visitors is next Tuesday at four PM. Show up or don’t.”
He clicked the line off. Kamski sighed and turned to RK800, Connor, the Hunter.
Connor curled in tighter on himself, snuggling deeper into his pile of blankets as shivers wracked his body. “‘M srry,” the Hunter whimpered out, eyes closed and wet. “Jst didn wanna be lone. Didn wanna worry friens. Didn wanna make y’wrk. Jus didn wanna be lone. Didn wanna be lone.”
Kamski took a deep breath to steady himself. “I know,” he soothed the trembling deviant, gently carding fingers through his hair, “I know, it’s not your fault.”
Connor pushed his head up into the touch, and the Hunter let out a rusty purr.
“Watching the other RK800s be destroyed really took a toll on you, huh?” he murmured.
“Mhmm.” Some of the tenseness in the Hunter’s posture bled out as Kamski continued to stroke his hair. “I thou’ I’ cld handl it.”
“Your software wouldn’t have known to investigate otherwise.”
“It wuz jus’ s’ lonely afterwrd, lijah,” he sniffled. “An’ I know it’s bad but I js cldnt stop thinkin’ ‘bout deviants and chasing them and hunting them ‘cause there’re so many now—“ he shuddered, “—so so many an jst one of me bt I like my friens I like em and I don wanna hurt em—“
“We’ll find you something else to hunt, Con, don’t worry.”
Kamski picked up an android’s arm from the dish beside him. It was completely useless, made partly out of boredom and partly out of curiosity. Entirely incompatible with any model of android. Kamski would’ve scrapped it sooner or later. Chloe herself couldn’t stick around after fetching him the piece, due to Connor’s condition, but she had left it on the desk while Connor slept, curled up in the corner of the room.
He gently lowered it in front of the shivering bundle of blankets. “Hey.”
Connor snapped to attention. Sharp, deadly eyes flashed out of their fevered haze, focused intently on the dangling limb. Corrosive saliva dribbled out from parted lips.
“You think you can eat this for me?”
Connor nodded frantically and whined, thin and high and desperate, LED flickering red. His HUD choked him with critical-low percentages of materials, everything from thirium and titanium and polymer levels to things he didn’t even realize androids needed, calcium and potassium and plexiglass.
Underneath the blankets, a few stray wires trailed from Connor’s body into Kamski’s systems. It’d been his first decision to mirror Connor’s HUD onto a monitor of his own for, well, monitoring. After two weeks of analyzing, Kamski knew the patterns, the flashing red warnings and the constant state of scanning and cross-checking for anything, anything to get the quickly-waning percents up even a nudge. And he knew how much better an arm was than a nudge.
…Well, reduce, reuse, recycle.
Kamski dropped the limb in front of the Hunter, and politely turned away. He heard more than saw the pounce. He didn’t need to.
Until he and Chloe could figure out a thirium concoction that contained all the nutrients Connor’s body demanded of him in the correct concentrations, giving the deviant spare android parts to eat was the best option they had.
Kamski ignored the sound of crunching metal and snapping tubes, slurping liquids and keening sobs as Connor tore into the limb, corrosive saliva and diamond-hard teeth beginning the process of breaking the arm down into its constituent parts for reutilization. Briskly, he walked out of the room. He needed to find a way to get Connor to some vague state of optimistically fittingly sick and realistically not fucked in just over a week—probably less, if his friends had anything to say about it.
God, who the fuck decided it’d be a good idea to make the deadly predator android capable of fucking asexual reproduction?
(To be fair, him, at one point. But he had the presence of mind to actually, oh, y’know, not?)