A review by silentcat7135
Day Zero by C. Robert Cargill

2.0

So that was disappointing. Look at that cover. That's a great cover. I know the author has no control over the cover design, but, based on the blurb, that cover had me hoping for something along the lines of Calvin and Hobbes vs. the apocalypse, with robots. That's not what the book felt like. I kept running into things that didn't make sense or annoyed me, and I had to slog a bit to finish what should have been a quick, fun read. It didn't help that I have read two excellent books about artificial intelligence and the desire for bot freedom: The Hierarchies by Ros Anderson and The Mechanical by Ian Tregillis, both 5-star reads.

Among the things that bothered me with the book were (includes spoilers):

Spoiler
1. Illogic in world building and design of main character and other bots.
The main character is Pounce, a nannybot designed to look like a fluffy tiger. These nannybots come in three styles: lion, tiger, and bear (oh my!). Unlike the cover, where Pounce is sitting like a tiger, a quadruped, in the book the nannybots, even the ones designed to look like animals, seem to walk around on two legs, which is going to look odd for everything except maybe the bear. If a bot somehow looks okay walking on two legs, it will look less like a tiger, even a fluffy toy tiger, and more like a guy in a tiger furry costume. Worse than that, they have paws but apparently enough dexterity to handle and fire guns, rejig a car into something tank-like (including some manipulation of paper clips to get into tight mechanical spots), and sign to a child in American Sign Language. Really? Without opposable thumbs? Or do they have hands, even though they are called paws at one point? That would again look more like someone in a tiger costume than a tiger.

There is also reference to Asimov's three laws of robotics embedded in their programming, preventing robots from harming humans. Quite early in the book, a domestic bot comes home without groceries, having been damaged by thugs she could not fight back against. Okay. It seems like there would be a lot of expensive damage to robots who couldn't even shove back or give a mild electric shock when being attacked. That would get costly after a while. And in this society, there are no rich people with robot bodyguards? Bodyguards that, of course, couldn't function with Asimov's three laws programmed in. My mind wandered to images of people whaling away on expensive cars with baseball bats while a robot stood by calmly. There is a comment dropped in very late in the book that nannybots can bend the three laws if their charge is being attacked, but it seemed like an attempt to explain away the problem inherent in mentioning Asimov's laws in the first place. Those laws may be famous, but they don't always make sense in practice.

Also, some of the zoo-style nannybots, Pounce included (until this production line was outlawed), have what is called a Mama Bear protocol, which, once activated by parents, turns the nannybot into essentially Jason Bourne. This was activated by saying "Mama Bear." Again, my mind wandered to parents reading their sweet little child a bedtime story, Goldilocks for example, and ending up with a military-grade nannybot. Oops.

A minor thing about the bots that also bothered me was communication between bots. It was carried out in the same spoken, sometimes slangy language that the bots used with humans, presumably designed to put humans at ease in their interactions. And bots would do this why? There are more efficient ways machines can communicate with each other, especially useful after they've been Jason Bourned.

2. Cultural references that feel dated because they are too rooted in the present to make sense in a story set in the future.
Just a few examples: A school named after Ocasio-Cortez. A corrupt business under the name John Barron, Inc., John Barron being a pseudonym used by Donald Trump. And one that got an outright groan when I read it: when Pounce and his child charge, Ezra, meet up with other Bourned nannybots and their charges, one designed to look like a dog (a Chinese knockoff, IP protections not being respected) is named Indiana. Get it? Do ya? "We named the dog Indiana!" Ugh. It's a somewhat dated reference now. Imagine how dated it will be by the time AI nannybots could exist.

3. Clunky infodumps, especially politidumps.
I'm sure it's hard for authors to integrate the necessary world-building information into a story, but there are ways to do it without bringing the story to a screeching halt. Have two characters refer to something in a conversation. Use the first couple of sentences of each chapter to introduce a new detail. This book, especially early on, has infodumps, including what I'm calling politidumps about society, that aren't even integral to the main story and just land with a thud. For example, in the scene where the domestic bot comes home damaged, Pounce is instructed to clean the graffiti off her. The scene starts, then there's a politidump about the apparently awful people who are lazy enough to live on government-supplied basic income (UBI, which is a current political issue) without going out and getting a degree to get a better job, etc. Then Pounce finishes cleaning the graffiti. Whose opinions are these? The only two characters in the scene are the nannybot and the domestic bot. Do they think this? If so, why would a robot have an opinion like that? If it's meant to be an opinion of humans in the story, why is it dumped into a scene with two robots? Is it the author's opinion? If so, why is it just dumped into the book instead of expressed through characters?


Obviously, I couldn't keep my mind from poking holes in the story long enough to enjoy it. I'd recommend reading the two books mentioned at the beginning of this review instead: The Hierarchies by Ros Anderson and The Mechanical by Ian Tregillis, the latter being the first book of a trilogy. I so wanted to add Pounce to the list of awesome AI characters, but this was a miss for me.