This week, Futures is delighted to welcome Josh Pearce with his story Further laws of robotics. Josh is an assistant editor at Locus magazine; you can find out more about his work at his website or by following him on Twitter. Here, he reveals what sparked his latest tale; as ever, it pays to read the story first.
Writing Further laws of robotics
Buddy-cop stories pairing humans and robots are pretty common: Asimov’s The Caves of Steel and its sequels, the ‘Automata’ storyline in the Penny Arcade webcomic, the short-lived Almost Human TV show starring Karl Urban. Robots make easy material for detective stories. After all, a good mystery relies on good logic, and robots constrained by three (or more) laws provide a ready-made logic puzzle to play with.
I, Robot remains my favourite Isaac Asimov, a collection of logic puzzles built upon three simple rules. Later in his career, Asimov added a fourth rule, the zeroth law, which supersedes the other three. By that logic, I thought, a negative-first law would need to be even more important than zero. From there I just had fun with number theory.
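The numbering joke maps neatly onto code: if each law carries an integer index and conflicts resolve in favour of the lowest index, nothing stops the index from going negative. A minimal sketch (purely illustrative; the `Law` structure and the law texts here are my own shorthand, not from the story or from Asimov verbatim):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Law:
    index: int  # lower index = higher priority; negatives outrank zero
    text: str

def binding_law(laws):
    """Return the law that wins any conflict: the lowest-numbered one."""
    return min(laws, key=lambda law: law.index)

laws = [
    Law(1, "A robot may not injure a human being."),
    Law(0, "A robot may not harm humanity."),
    Law(-1, "???"),  # a negative-first law would supersede even the zeroth
]

print(binding_law(laws).index)  # -1
```

Nothing deep, just the observation that once precedence is "smaller number wins", the integers extend downward for ever.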
Any system with ‘Laws’ needs law enforcement, so an Asimovian Three Laws society of robots is going to need cops. In William Gibson’s Neuromancer there’s a brief mention of the Turing Police, who make sure that AIs don’t overstep the bounds of their charters. I wanted to know more about that. I wanted to see a police force tasked with making sure that robots obey their programming, rather than one that just goes all Blade Runner on any non-compliant hardware. Inspector Warren is your stereotypical noir detective type, but he’s not quite as dumb as he looks. (I named him after Warren Spector, creator of the Deus Ex video games.)
Machine ethics is a growing concern in today’s society, with fears about autonomous weapons platforms and black-box neural networks. But even if Asimov’s three laws could be translated into executable code, there’s no guarantee that they would protect us. Just because something is logical doesn’t mean that it’s correct.