dani005 's review for:

3.0

This story, although short, raises a lot of intriguing questions. I believe Chiang deliberately kept it brief to drive his narrative home poignantly, but the brevity also leaves it feeling a little clipped at times, which jarred the flow of ideas. All the same, I felt that many of the philosophical questions around the construct of consciousness, maturity, and even consent were well delivered, even if unnerving at times in their chilling delivery.

The world of artificial intelligence holds a maelstrom of existential and moral dilemmas, which can be a lot to digest at one time. This story integrates an endearing premise: the creation of a sentient being, one that can grow cognitively in much the same way a child or pet would. However, the idyllic idea of a pet that can be "suspended" for convenience carries a dark edge. The story complicates it with a synthetic genome that could theoretically give rise to a being capable of learning to feel emotions, understand languages, and comprehend abstract concepts, a level of comprehension that would constitute sentience on par with our own.

Ethically, this raises the question: would turning a sentient being "off" and "on" at the emotional whim of another, taking away its right of choice, ever be morally justified? And as time passes and these beings experience the world around them, where do you draw the line at which one becomes mature enough to make its own decisions, on par with a person in its independence? Or is it all simply a projection, simulated ideas and thoughts reciprocated back to you? Through their own learning and experience of the world, they are just as exposed to developing disorders of their own, whether from a design flaw in their synthetic genome or something shaped by their environment. Who, then, is responsible for these beings? The people who programmed them? The people who raised them? Or are they simply failed experiments, their emotions and experiences a byproduct, better off deleted?

There was also a subtler theme running through the story: fanaticism. A recurring thread followed those who become so involved in the lives of their "digients" (AI beings) that they forget to live in the world right in front of them. This perspective shifts the characters' priorities. Whether these priorities were unhealthy served as its own moral dilemma, since such a priority rested on whether the person believed the digient was sentient enough to be recognized as an equal to a human being. Choices were made accordingly, and oftentimes it was hard to judge them as morally right or wrong, depending on how you viewed the "personhood" of these digients.

I enjoyed the intriguing questions of this story, although I did feel a lot of it was rushed, and that many questions were proposed with little effort made toward delving into any specific one. Then again, this was a novella published alongside others, as I understand it, and perhaps it was meant to be digested alongside works that experimented with similar ideas. It was an enjoyable read; thought-provoking, if a little chilling in how such beings might be brought into existence at our own hands only to wind up as little more than another opportunity for abuse and an enactment of misguided ethical principles.