“Suicide Bots”, Bentley A. Reese, at Shimmer (http://www.shimmerzine.com/suicide-bots-by-bentley-a-reese/)

I haven’t been able to get Bentley A. Reese’s “Suicide Bots” out of my head since I read it weeks ago. Its prose worked its way under my skin, which is understandable, since the story relies on linguistic recursion to drive its narrative. It’s one of my favorite stories this year about robots, AI, and ethical botmaking.

The narrator is Jones, a robot assembled out of spare parts in a chop shop and given one directive: to rob a bank and give the spoils to the man who ordered these suicide bots made. He and his fellow robots are put together catastrophically poorly, bits of skin and mismatched parts hanging off metal frames. It reminds me of my favorite bits of classic cyberpunk, the assemblage of bodies, the early-model or scrap-heap or improvised technology. But here it’s used to make Jones and his fellow bots, Jane and Tumbler, for a programmed mission to rob a bank and deliver the cash to their creator. To their commissioner, rather, the man who gave the directives for them to be made, who spent no more money or time on these synthetic sentient creatures than bottom-line economics required, who made them futureless. The story’s prose is brutal and tight and economical; it has to be, since the robots’ experiences are abrupt and chaotic and messy and existentially fraught. “Suicide Bots”, like most AI stories, is a meditation on what it means to be conscious; but its focus is on what it means to be made wrong, for a purpose that you did not choose, with a ticking end and an inadequate knowledge of the world and the gut-deep understanding that no one else will be your advocate. These characters cannot pass as human, are not given the tools to navigate our world, and have to construct meaning for themselves within the world.

And it is a world that is fundamentally unfair, not only to the suicide bots but to the humans who inhabit it as well. At the checkpoint to the city (New Chicago, suggesting a partial-but-not-total structural upheaval; enough to mark a paradigm shift but not an outright revolution), a sentry guard bot gives Jones and Jane and Tumbler a list of synthetic entities that are not allowed in. Around them, groups of ragged people mill; Jones searches for, and finds (after a small short-circuit), the word “vagrant”. This sector of the human populace is deeply angry at how robots have replaced cheap labor and disengaged them from the framework of late-stage capitalism. It doesn’t seem entirely removed from some contemporary concerns about machine learning and the future of labor; the story’s themes come up nicely against issues we’re addressing right now about ethics and AI.

One of the core themes of the story which works so well for me is the callousness and pragmatism with which the robots are created. Their creator has given a directive, and isn’t much fussed about the nuances or implications, about what gets lost. There are a lot of fascinating issues with ethical bot-making both in fiction and contemporary reality; I’ve talked before on this blog about Microsoft’s colossal disaster with Tay and the #botALLY community’s exasperated response. Suffice it to say: we do not always consider what we owe our robots, what we create. This story forces us to.

“Suicide Bots” collapses the distance between synthetic and human consciousness by highlighting the strangeness of the constructed characters but not denying them empathy, agency, or desires. Jones, who falls in love with Jane almost immediately, does so not because she represents a well-made robot who can pass according to traditional standards of human beauty for feminine-presenting people, but because of her assemblage, her very strangeness, and because he can. Everything is strange to Jones and Jane, whose sum total knowledge of the world comes from a USB of Wikipedia pages stuck in the back of their skulls, and what they learn by existing. He can’t discern the difference in human reactions, between laughing and crying; he does know he doesn’t want to shoot security guards, but he has no choice in the situation. Repetition is key here: it establishes what Jones knows as he asserts his world, makes it through language. As a word shifts between sentences, he changes, learns, grows. Giving and taking serve as the verbs at the backbone of his interaction with humans, because that’s how he was programmed. Jones’ maker, his father as he later calls him, did not consider the implications of creating an entity after this fashion, who would grapple with what it means to be created thus.

And this is so very human. We make ourselves out of the wreckage; old lives, past hurts, things we’ve internalized for as long as we’ve been conscious, running as subroutines just as Jones has “TAKE EVERYTHING” programmed into him. There is always an element of being made, shaped and formed, and always the question: what can we do? In the face of the inevitable, how can we make ourselves so as to endure as well and as long as we can? There’s a particular moment in the story which touches on this obsession with creating a meaningful existence that both humans and these robots have deep within our programming.

“‘What happens tomorrow?’ Jane asks as I drive. I suck in my lips. I don’t think there is a tomorrow. We aren’t long-term projects, just hazardous grenades thrown into an industrial fire.

Our maker did not make us for our own sake.”

That last sentence reminds me of the opening line of Donne’s first Holy Sonnet: “Thou hast made me, and shall thy work decay?”

Fragile mortal carbon things and suicide robots all ask the same questions, must cope with the terminal. Donne’s answer is turning to a grace larger than himself, the comfort of the divine. Jones’ is loyalty to those one loves, protection and care for those who have loved and cared and witnessed for us in return. I am not at all sure the two concepts are entirely dissimilar.
