Martha Wells's SecUnit doesn't consider itself to be human at all, but it's endearing to follow its journey as it confronts some very human problems, like why we get up from the couch every day.

"All Systems Red"
  • MedSystem was advising a tranq shot and blah blah blah, but I was clamping one arm on Dr. Bharadwaj’s suit to keep her from bleeding out and supporting her head with the other, and despite everything I only have two hands.
  • I ran my field camera back a little and saw I had gotten stabbed with a tooth, or maybe a cilia. Did I mean a cilia or was that something else? They don’t give murderbots decent education modules on anything except murdering, and even those are the cheap versions.
  • So, I’m awkward with actual humans. It’s not paranoia about my hacked governor module, and it’s not them; it’s me. I know I’m a horrifying murderbot, and they know it, and it makes both of us nervous, which makes me even more nervous. Also, if I’m not in the armor then it’s because I’m wounded and one of my organic parts may fall off and plop on the floor at any moment and no one wants to see that.
  • But she had barely looked at me and I had barely looked at her because again, murderbot + actual human = awkwardness.
  • I hadn’t been listening to myself, basically. I had asked him if he had kids. It was boggling. Maybe I had been watching too much media.
  • She turned to me with one of those abrupt movements that I had taught myself not to react to. “Can the HubSystem be hacked?”
  • “As far as I know, it’s possible,” I said. “But it’s more likely the report was damaged before you received the survey package.”
    Lowest bidder. Trust me on that one.
    There were groans and general complaining about having to pay high prices for shitty equipment. (I don’t take it personally.)
  • I hadn’t looked at the maps yet and I’d barely looked at the survey package. In my defense, we’d been here twenty-two planetary days and I hadn’t had to do anything but stand around watching humans make scans or take samples of dirt, rocks, water, and leaves. The sense of urgency just wasn’t there. Also, you may have noticed, I don’t care.
  • I didn’t turn my helmet toward him because that can be intimidating and it’s especially important for me to resist that urge. “I carefully monitor my own systems.” What else did he think I was going to say? It didn’t matter; I’m not refundable.
  • She took a breath and I knew she was going to tell me to stay here. And I just thought, That’s a bad idea. I couldn’t explain to myself why. It was one of those impulses that comes from my organic parts that the governor is supposed to squash.
  • I thought it was likely that the only supplies we would need for DeltFall was the postmortem kind, but you may have noticed that when I do manage to care, I’m a pessimist.
  • Even with the armor, bits of me were going numb, but I had only taken three projectiles to the right shoulder, four to the left hip. This is how we fight: throw ourselves at each other and see whose parts give out first.
  • Gurathin hesitated. “It’s downloaded seven hundred hours of entertainment programming since we landed. Mostly serials. Mostly something called Sanctuary Moon.” He shook his head, dismissing it. “It’s probably using it to encode data for the company. It can’t be watching it, not in that volume; we’d notice.”
    I snorted. He underestimated me.
    Ratthi said, “The one where the colony’s solicitor killed the terraforming supervisor who was the secondary donor for her implanted baby?”
    Again, I couldn’t help it. I said, “She didn’t kill him, that’s a fucking lie.”
  • “I won’t tell the company, or anyone outside this room, anything about you or the broken module.”
    I sighed, managed to keep most of it internal. Of course she had to say that. What else could she do. I tried to decide whether to believe it or not, or whether it mattered, when I was hit by a wave of I don’t care. And I really didn’t. I said, “Okay.”
  • (“I do think of it as a person,” Gurathin said. “An angry, heavily armed person who has no reason to trust us.”
    “Then stop being mean to it,” Ratthi told him. “That might help.”)
  • I pictured doing that, pictured Arada or Ratthi trapped by rogue SecUnits, and felt my insides twist. I hate having emotions about reality; I’d much rather have them about Sanctuary Moon.
  • Whatever. Bots who are “full citizens” still have to have a human or augmented human guardian appointed, usually their employer; I’d seen it on the news feeds. And the entertainment feed, where the bots were all happy servants or were secretly in love with their guardians. If it showed the bots hanging out watching the entertainment feed all through the day cycle with no one trying to make them talk about their feelings, I would have been a lot more interested.
  • He finally said, “You don’t blame humans for what you were forced to do? For what happened to you?”
    This is why I’m glad I’m not human. They come up with stuff like this. I said, “No. That’s a human thing to do. Constructs aren’t that stupid.”
    What was I supposed to do, kill all humans because the ones in charge of constructs in the company were callous? Granted, I liked the imaginary people on the entertainment feed way more than I liked real ones, but you can’t have one without the other.
  • “You used combat override modules to make the DeltFall SecUnits behave like rogues. If you think a real rogue SecUnit still has to answer your questions, the next few minutes are going to be an education for you.”
  • She looked at me, then looked at the DeltFall unit. “How are you going to explain that?”
    I started shedding armor, every piece that had a PreservationAux logo on it, and leaned over the DeltFall unit as the pieces dropped away. “I’m going to be it and it’s going to be me.”
  • I kept running diagnostics and checking the various available feeds to make sure I wasn’t still in the cubicle, hallucinating. There was a report running on the local station news about DeltFall and GrayCris and the investigation. If I was hallucinating, I think the company wouldn’t have managed to come out of the whole mess as the heroic rescuers of PreservationAux.
"Artificial Condition"
  • When constructs were first developed, they were originally supposed to have a pre-sentient level of intelligence, like the dumber variety of bot. But you can’t put something as dumb as a hauler bot in charge of security for anything without spending even more money for expensive company-employed human supervisors. So they made us smarter. The anxiety and depression were side effects.
  • I was afraid, but that made me irritated enough to show it that what it was doing to me was not exactly new. I sent through the feed, SecUnits don’t sulk. That would trigger punishment from the governor module, and attached some brief recordings from my memory of what exactly that felt like.
    Seconds added up to a minute, then another, then three more. It doesn’t sound like much to humans, but for a conversation between bots, or excuse me, between a bot/human construct and a bot, it was a long time.
    Then it said, I’m sorry I frightened you.
    Okay, well. If you think I trusted that apology, you don’t know Murderbot.
  • “It’s not realistic,” I told it. “It’s not supposed to be realistic. It’s a story, not a documentary. If you complain about that, I’ll stop watching.”
    I will refrain from complaint, it said. (Imagine that in the most sarcastic tone you can, and you’ll have some idea of how it sounded.)
    So we watched Worldhoppers. It didn’t complain about the lack of realism. After three episodes, it got agitated whenever a minor character was killed. When a major character died in the twentieth episode I had to pause seven minutes while it sat there in the feed doing the bot equivalent of staring at a wall, pretending that it had to run diagnostics. Then four episodes later the character came back to life and it was so relieved we had to watch that episode three times before it would go on.
  • Granted, it would have been hard to show realistic SecUnits in visual media, which would involve depicting hours of standing around in brain-numbing boredom, while your nervous clients tried to pretend you weren’t there. But there weren’t any depictions of SecUnits in books, either. I guess you can’t tell a story from the point of view of something that you don’t think has a point of view.
    It said, The depiction is unrealistic.
    (You know, just imagine everything it says in the most sarcastic tone possible.)
    “There’s unrealistic that takes you away from reality and unrealistic that reminds you that everybody’s afraid of you.”
  • “I’m not your crew. I’m not a human. I’m a construct. Constructs and bots can’t trust each other.”
    It was quiet for ten precious seconds, though I could tell from the spike in its feed activity it was doing something. I realized it must be searching its databases, looking for a way to refute my statement. Then it said, Why not?
    I had spent so much time pretending to be patient with humans asking stupid questions. I should have more self-control than this. “Because we both have to follow human orders. A human could tell you to purge my memory. A human could tell me to destroy your systems.”
  • Did I really care what an asshole research transport thought about me?
    I shouldn’t have asked myself that question. I felt a wave of non-caring about to come over me, and I knew I couldn’t let it. If I was going to follow my plan, such as it was, I needed to care. If I let myself not care, then there was no telling where I’d end up.
  • I’m terrible at estimating human ages because it’s not one of the few things I care about. Also most of my experience is with the humans on the entertainment feed, and they aren’t anything like the ones you see in reality. (One of the many reasons I’m not fond of reality.)
  • After PreservationAux, it had occurred to me how different it would be to do my job as an actual member of the group I was protecting. And that was the main reason I was here.
    I phrased it as a question, because pretending you were asking for more information was the best way to try to get the humans to realize they were doing something stupid. “So do you think there’s another reason Tlacey wants you to do this exchange in person, other than … killing you?”
  • It would allow us to communicate once I was down on RaviHyral and let me continue to have access to ART’s knowledge bases and unsolicited opinions. I was used to having a HubSystem and a SecSystem for backup and ART would be taking their place. (Without the part where those two systems were partly designed to rat me out to the company and trigger punishment through the governor module. ART’s freedom to weigh in on everything I did was punishment enough.)
  • In my feed, ART turned down the soundtrack to say, Young humans can be impulsive. The trick is keeping them around long enough to become old humans. This is what my crew tells me and my own observations seem to confirm it.
    I couldn’t argue with the wisdom dispensed by ART’s absent crew. I remembered humans had needs and asked Tapan, “Did you eat?”
  • She offered me one and I told her my augments required me to have a special diet and it wasn’t time for me to eat yet. She accepted that readily. Humans apparently don’t like to discuss catastrophic injuries to digestive systems, so I didn’t need any of the corroborating detail ART had just researched for me.
  • I was seething, but I kept it out of the feed. As I told ART, bots and constructs can’t trust each other, so I don’t know why it made me angry. I wish being a construct made me less irrational than the average human but you may have noticed this is not the case. I said, Your client sent a ComfortUnit to do a SecUnit’s job.
  • Picking up on my reaction, ART said, What does it want?
    To kill all the humans, I answered.
    I could feel ART metaphorically clutch its function. If there were no humans, there would be no crew to protect and no reason to do research and fill its databases. It said, That is irrational.
    I know, I said, if the humans were dead, who would make the media? It was so outrageous, it sounded like something a human would say.
"Rogue Protocol"
  • And there was no way their bond company would have guaranteed the survey without some kind of professional security. That was unrealistic. Heroic SecUnits were unrealistic, too, but like I had told ART, there’s the right kind of unrealistic and the wrong kind of unrealistic.
  • Thinking about the probable fate of Transport’s passengers put me out of the mood, too. I didn’t want to see helpless humans. I’d rather see smart ones rescuing each other.
  • That’s the other problem with human security: they’re allowed to give up.
  • Her gaze went to Wilken’s back again, but on our private channel she said, I’ve never worked with a SecUnit before—I’ve never seen or interacted with a SecUnit before—so please tell me if you need any information or instruction from me.
    I had never had a human ask me how to give me orders before. It was an interesting novelty.
  • Confused at the apparently empty passage, it tried to trace the signal. And I pinged it with a compressed list of drone control keys.
    (That’s not in the stealth module, and it’s not a function of company-supplied SecSystems. I got it from the proprietary data of a company client who worked on countermeasures for combat drones. I had managed to resist deleting it to fill that space up with new serials. I knew someday it would come in handy.)
  • I do make mistakes (I keep a running tally in a special file) and it looked like I had made a big one. I had interpreted all of Wilken’s behavior as being about me, about the discomfort and paranoia associated with a SecUnit suddenly appearing out of nowhere, supposedly sent by another security consultant whose existence implied that the clients didn’t trust her and Gerth. (I know, the “it’s all about me” bit is usually a human thing.) But now it seemed she had been uneasy for a whole other reason.
  • I thought I had gotten good at controlling my expression, but apparently only when I wasn’t feeling actual emotions.
  • “Miki was a bot who had never been abused or lied to or treated with anything but indulgent kindness. It really thought its humans were its friends, because that’s how they treated it.”
"Exit Strategy"
  • (Deleting memories like that doesn’t work. I can delete things from my data storage, but not from the organic parts of my head. The company had purged my memory a few times, including my whole mass murder incident, and the images hung around like ghosts in an endless historical family drama serial.) (I like endless historical family drama serials, but in real life, ghosts are way more annoying.)
  • (Possibly I was overthinking this. I do that; it’s the anxiety that comes with being a part-organic murderbot. The upside was paranoid attention to detail. The downside was also paranoid attention to detail.)
  • But what really helped was that all this coding and working with different systems on the fly had opened up some new neural pathways and processing space. I’d noticed it on Milu, when I’d been handling multiple inputs without any Hub or SecSystem assistance, to the point where I thought my brain was going to implode. Hard work really did make you improve; who knew?
  • Even though the vending was all automated, and I sort of knew what to do based on what I had seen on the entertainment feed, it was still weird. (And by weird I mean an agonizing level of anxiety.)
  • When I put the new clothes on, I had a strange feeling I usually associated with finding a new show on the entertainment feed that looked good. I “liked” these clothes. Maybe I actually liked them enough to remove the quotation marks around “liked.” I don’t like things in general that can’t be downloaded via the entertainment feed.
    Maybe because I’d picked them myself.
  • Corporate political entities are more interested in keeping track of their own humans than anybody else’s. I had seen on the media that travel was easier for non-citizens inside the Corporation Rim than citizens, sub-citizens, and all the other categories each different political entity had to keep track of their humans.
  • This was a stressful trip, right up there with the one where ART introduced itself to me by implying that it might delete my brain and the one where I kept thinking about Miki. And the one with Ayres and the other humans who had sold themselves into contract slavery.
    I guess most of my trips so far had been this stressful.
  • (And I could hook my right hand on the strap, which gave me something to do with that arm. How humans decide what to do with their arms on a second-by-second basis, I still have no idea.)
  • “There’s a ninety-five percent chance,” I told her. The company is like an evil vending machine, you put money in and it does what you want, unless somebody else puts more money in and tells it to stop. GrayCris’ best option at this point was to pour as much money in as possible.
  • “If GrayCris can’t make you disappear, they want to delay you. They’re probably raising the money to buy off the company. The gunship is also here to exert pressure on GrayCris while the company is negotiating with their reps back on Port FreeCommerce. That ransom GrayCris asked for Mensah’s return will probably go straight to the company, as part of the pay-off.”
  • I had been running possible scenarios, partly to drown out the sound of humans making stupid suggestions. (Not that I don’t like that sound; it’s sort of comforting and familiar, in an annoying way.) “It would be tricky,” I said. By tricky I meant I was getting an average of an 85 percent chance of failure and death, and it was only that low because my last diagnostic said my risk assessment module was wonky. (I know, that explains a lot about me.)
  • So the plan wasn’t a clusterfuck, it was just circling the clusterfuck target zone, getting ready to come in for a landing.
  • Disinformation, which is the same as lying but for some reason has a different name, is the top tactic in corporate negotiation/warfare. (There had been a whole episode about it on Sanctuary Moon.)
  • “At Milu I meant to do the same, but I was identified as a SecUnit so I told them I was under the control of an off-site security consultant client.” Impersonated is a weird word, especially in this context. (I just noticed that. Im-person-ated. Weird.)
  • Diving around hauler bots and dodging projectiles, it was hard to come up with a decent argument for free will. I’m not sure it would have worked on me, before my mass murder incident. I didn’t know what I wanted (I still didn’t know what I wanted) and when you’re told what to do every second of your existence, change is terrifying. (I mean, I’d hacked my governor module but kept my day job until PreservationAux.) What do you want?
