A review by rbruehlman
The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World by Max Fisher

5.0

I grew up in a staunchly libertarian household, raised on the ideology that the free market will fix everything. "Personally, I value companies that place shareholder value creation over all else," was an (unsolicited) verbatim text my dad sent me a few months ago.

I used to believe that, too. Why wouldn't I? I've never subscribed to social conservatism anyway; you do you, I'll do me, get an abortion, live in a polygamous commune, be a vegan, be a flat-earther, whatever, I really don't care! I could see the slippery slope of legislating behavior or free speech, because at what point is a political view just one you disagree with, vs. truly wrong and extremist? Who decides? Or what if most of a political post is true, but some of it is blatantly wrong? Is it still valid?

Part and parcel with the libertarianism I grew up with was a belief in personal autonomy, for better and worse. Don't like your job? No one's making you stay; quit. Don't like Facebook? Don't use it. Want to use drugs? That's your bad choice to make. No one makes you do anything, and, accordingly, you're responsible for everything that happens to you, good and bad.

I've changed my ideology a lot over the years, and now realize that humans aren't so hyperlogical or rational. Good people can be well-intentioned but poorly informed and make choices that aren't healthy for them; companies aren't always incentivized to do the right thing; society works better when we look out for one another; government, as inefficient as it is, exists for a reason.

Social media is the perfect example of this struggle, as illustrated by The Chaos Machine. The Chaos Machine argues that social media companies are incentivized to maximize time on the platform, and it turns out that what keeps people engaged is highly emotive, extreme content (even if people dislike it--hate-watching is a thing!). Ergo, extreme content and misinformation are algorithmically prioritized. In an economic system where stasis is death and continued growth is paramount, companies like Facebook and YouTube have a disincentive to dampen engagement. And, meanwhile, people aren't necessarily able to parse good information from bad... so as engagement grows, so does extremism, sowing political polarization and chaos in its wake. Simply put, unchecked libertarian ideals and capitalism do not a healthy combination make.

Broken down...

The toxicity of tech culture

The book starts by exploring the unusual psyche of Silicon Valley, specifically how tech has long had a libertarian, nerds-know-best streak that simultaneously embraces an anything-goes culture and vehemently rejects rules and regulation. You can be a completely benign weirdo in tech... and you can be an asshole, too, and that's accepted as well, because rejecting assholery would require rules that would also threaten the right to be benignly unconventional. And, of course, in tech, intelligence is paramount. Geniuses get shit done. If the goal is to push at the boundaries of what's possible, then assholes who are geniuses get a pass, because, you know, genius.

It's a compelling argument. As a software engineer, I can see the bohemian-but-actually-toxic culture for sure. In tech, it's totally okay to be a jerk. People speak glowingly of the "10x engineer," or the lone engineer who is as productive as 10 engineers combined. The stereotype is somebody who is a little prickly, has no patience for others, hates meetings, gets it all done himself ... but he's really productive! He's a genius. So what if he's an asshole? Those guys don't get fired in tech; their behavior is waved off. He just thinks faster than other people.

Moreover, software engineering has a culture of egoism on multiple levels. Originally, the mathematical, esoteric nature of engineering attracted the eccentric, hyperlogical loner type; think Steve Wozniak, who freely admits in Walter Isaacson's Steve Jobs that he is a socially awkward loner and has always preferred the company of machines because they're easier for him to understand. More recently, a different breed of self-focused engineers has come onto the scene--people who aren't necessarily interested in tech, but do like making money and the clout of working for "Big Tech." They go into tech for money and prestige; everything else is immaterial. Investment banks have been struggling to recruit for a while, because the soulless, slimy finance bros figured out you can make easy money in tech.

All to say, Fisher argues, rationalism is highly prized in tech, more so than concern for other people. People don't go into tech for humanistic pursuits; they go into tech to push at the boundaries of what's possible. Impact on others is a distraction. The field attracts people who aren't necessarily worried about the human condition or their impact on society, and, down the road, it discourages introspection and self-restraint. It's not an argument I can disagree with. I see it.


The internet originally served as a home for misfits, and that helped form the backbone of modern internet culture.

Before the internet was ubiquitous, it was a refuge for people who didn't fit in. I would know. I was a "terminally online" teenager. I didn't fit in and stuck out like a sore thumb at a preppy all-girls' school. Like some of the people Fisher follows in his book, I found refuge on internet forums where I could talk to people with similar interests and feel a little less alone. I found community there. The internet was like a secret club with its own separate culture, and, for once, you were "in" on the joke.

Fisher's analysis of 4chan and the sort of people it attracted is spot on. I was never a 4channer, but I did visit it a few times. It was a chaotic cesspool of social rejects who found community online instead. Most were teens struggling for identity, and many of them were lonely and hurt, misunderstood and rejected by society. For many, that translated into anger against society, and, combined with good old adolescent stupidity, posting extreme, offensive content was both funny and a direct rejection of the broader society that had scorned them. I remember going to 4chan's /b/ once. Someone had spammed child pornography everywhere. Users thought it was funny. I didn't, and I never returned. But that's just the sort of repulsive place 4chan was. Push boundaries. Fuck society. If you're going to be a social outcast, own it, dammit. Anarchy.

As the internet grew, "real life" started blending with online community. 4chan's poisonous, irreverent culture didn't go away; it and its users simply bled into the mainstream internet. Most people have probably never heard of 4chan, much less visited; yet much of early internet culture comes from it. Some fun stuff came out of it, like memes, but a lot of not-great stuff, too.


Echo chambers and Us vs. Them.

One of the beautiful things about the internet is community. LGBTQ people in rural areas can find others like them online, making them feel a little less alone. Autistic kids who can't find anyone else who obsessively catalogues Thomas the Train episodes can find other people who like Thomas the Train just as much online. Et cetera. Its global accessibility means it can provide community for the very niche in a way real life just can't.

However, there's a downside to this. Real life presents you with people who have different views and beliefs from you all the time. On the internet, you can find people who think exactly like you... and if you supplant real life with the internet, it can become very, very, very easy to think everyone thinks just like you. If you're miserable and unhappy, you might not lift one another up; you might simply spiral further together in your communal fatalism and adopt even more extreme viewpoints. See, for reference: incels!


Social media algorithms lead down a rabbit hole of no return

Mind you, echo chambers can be self-made without the help of social media. The crux of The Chaos Machine, however, is that all of it is made worse by social media algorithms.

Firstly, Fisher argues, social media companies have one key goal: they want you to be as engaged as possible with their platform. The more YouTube videos you watch, the more ad revenue YouTube gets. The more engaging content you see on Facebook, the longer you'll spend on their platform. Social media companies are incentivized to figure out how to get you to stay longer and engage more.

It turns out that content that encourages moral outrage and/or stirs up strong emotions is highly engaging--far more than happy or neutral content. It doesn't matter if you even agree with the content--the more incendiary or out-there it is, the more likely you are to engage with it. I'll admit guilty-as-charged; I'm no conspiracy theorist, but I am a silent "hate-reader," someone who will rabidly devour content I vehemently disagree with and that makes me angry ... for reasons? There are subreddits that infuriate me, and yet I will, like a moth to the flame, click on threads that I know will make me upset.

So, what is a company like Reddit to do if it wants me to spend more time on Reddit? Show me all the subreddits that I don't actually like reading (looking at you, r/overemployed and r/cscareerquestions). Facebook, YouTube, and Twitter do the same thing; their algorithms will prioritize content that elicits angry or crying emojis, and deprioritize content that simply gets likes, or no reactions at all. It's little wonder that Twitter constantly seems so angry and upset; the Twitter feed is designed that way! The reasonable posts are buried because they don't encourage engagement. This creates an echo chamber in and of itself: everyone online seems to believe X, so I should believe it, too. In effect, there are two types of echo chambers: ones people seek out themselves, like 4chan, and ones that social media creates through its algorithms.
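
To make the mechanism concrete, here is a minimal sketch of reaction-weighted ranking. It's purely my own illustration--the weights, post data, and function names are invented, not anything from the book or any platform's actual code--but it shows why outrage bait floats to the top of a feed ranked this way:

```python
# Hypothetical illustration only -- not any real platform's ranking code.
# Strong reactions are assumed to predict more time on platform than mild ones,
# so they get heavier weights in the ranking score.

REACTION_WEIGHTS = {
    "angry": 5.0,
    "sad": 4.0,
    "comment": 3.0,
    "share": 3.0,
    "like": 1.0,   # mild approval barely moves the needle
}

def engagement_score(post):
    """Score a post by the weighted sum of the reactions it has drawn."""
    return sum(REACTION_WEIGHTS.get(reaction, 0.0) * count
               for reaction, count in post["reactions"].items())

def rank_feed(posts):
    """Order the feed so the most provocative posts surface first."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    {"id": "measured take", "reactions": {"like": 200}},
    {"id": "outrage bait",  "reactions": {"angry": 80, "comment": 40, "share": 25}},
])
# "outrage bait" scores 595 and outranks "measured take" at 200,
# even though far fewer people actually approved of it.
```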

In a way, though, the algorithms are a finely tuned black box. A new mom doesn't click on anti-vax autism scare content right away, and someone doesn't become a Q-Anon flat earther overnight. The mom might start by looking for help on when to introduce solid foods; then get recommended and watch videos about the importance of feeding babies organic food; then toxins in plastic bottles; then toxins in vaccines. Toxins are scary! Moms care about their children. It catches their eye. Slow but subtle, down the rabbit hole it goes, innocuous at first. People who engage with A often engage with B; people who engage with B are often receptive to C; on and on. Fisher follows the research of various individuals who reverse-engineer YouTube's algorithm and discover how scarily it leads, over and over, to carefully curated, extreme content. Perfectly reasonable people can be suckered in slowly but methodically, their gradual progression down the hole perfectly guided by a well-oiled algorithm. It's not really a conscious choice to become a radical; you become radicalized over time.
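
The "A leads to B leads to C" drift is easy to picture as a chain of co-engagement links. The sketch below is an invented toy (the video titles and the CO_ENGAGEMENT table are my own illustration, not YouTube's actual recommender), but it captures how each individually plausible "watch next" hop can add up to somewhere extreme:

```python
# Toy illustration of recommendation drift -- invented data, not any real system.
# Each hop follows the strongest "viewers of X also watched Y" association.

CO_ENGAGEMENT = {
    "intro to solid foods":      "organic baby food tips",
    "organic baby food tips":    "toxins in plastic bottles",
    "toxins in plastic bottles": "toxins in vaccines",
}

def recommendation_path(start, hops=3):
    """Follow the top 'watch next' association for a fixed number of hops."""
    path, current = [start], start
    for _ in range(hops):
        current = CO_ENGAGEMENT.get(current)
        if current is None:
            break
        path.append(current)
    return path

print(recommendation_path("intro to solid foods"))
# ['intro to solid foods', 'organic baby food tips',
#  'toxins in plastic bottles', 'toxins in vaccines']
```

No single hop looks sinister; it's the accumulation of hops, optimized for engagement at every step, that does the damage.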


There's no disincentive to do anything about it.

Social media companies' algorithms have figured out the type of content that is most engaging to people--highly emotive, extreme content--and it dramatically increases time spent on the platform. This is exactly what social media companies want. Tweaking the algorithm to deprioritize extreme or damaging content would decrease time spent on the platform, and therefore ad revenue. That is fundamentally misaligned with being publicly held; companies are expected to post higher and higher profits every quarter. No sane company is going to voluntarily cut its own revenue, and, accordingly, Google, Facebook, Twitter et al. choose to maintain the status quo. Even when presented with concrete evidence that it had played a pivotal role in Myanmar's political meltdown, Facebook did nothing; there was no incentive for it to act. Instead, Facebook understandably denied responsibility and agency.


It's Pandora's Box. There's no real easy fix for the problem anyway.

Myanmar was easy for Facebook to ignore; globally, no one really cares about Myanmar, and so it was easy to sweep under the rug. It was less easy for social media giants to shrug off their deleterious impact when it came to the United States, particularly in regards to Trump, Q-Anon, and more. Facebook and Twitter were forced to do something.

Moderating content isn't so easy, though. As I mentioned earlier, there's a slippery slope to moderating content: do you censor a post if 90% of it is true and 10% is blatantly wrong? At what point is a viewpoint "moderate-able," vs. just one that is morally repugnant but still deserves to remain posted on its own merit? As it turns out, the most engaging misinformation is often alt-right, and explicitly moderating that is a political minefield. Conservative politicians will (not wrongly) squawk that moderation disproportionately impacts them, and have threatened to sanction and punish Facebook for violating free speech. Obviously, Facebook isn't keen to get fined or sued or otherwise regulated. Doing the "right" thing morally may well be political and financial suicide. So it again has a double incentive to do nothing at all.


Overall, I thought The Chaos Machine presented a compelling argument for how social media companies contort society, have a profound impact on polarization and extremism, have no incentive to be better, and are ultimately deeply unhealthy. If I had to have criticisms of the book, they'd be the following:

There's a definite liberal slant

I am liberal, but I felt this book leaned heavily on alt-right extremism. I would hate for the average liberal to read this book and think, "Aha! It's just making those awful conservatives worse!" The moral outrage echo chamber that social media encourages ABSOLUTELY happens on the left, too. I don't think people on the left are necessarily any better informed than people on the right; the left's moral extremism is also problematic and unhealthy in its own way. Liberals were shocked when Trump won the election, but should they have been? Live in your liberal bubble online, and you'll get a very skewed perception of the world.

TikTok is pretty much never mentioned

TikTok is huge and completely algorithmically driven. Given this book's heavy focus on the cancerous nature of algorithms, TikTok's almost complete omission from the book feels quite strange. My hypothesis for the omission is that the book focuses very heavily on political extremism, and TikTok is more common among people who aren't politically active yet. Really, I think, this book is better summarized as an analysis of how Facebook, YouTube, and Twitter shaped the political landscape up until the early 2020s. It's not a complete analysis of the impact of social media algorithms on society. An entire book could be written on TikTok alone and its deleterious effect on people's mental health, let alone on many, many, many other aspects of society. Still, the fact that the book barely touches on TikTok and its looming, growing impact on politics is, to me, strange. Facebook is rapidly losing relevance in the US, yet this book talks about Facebook as if it were as large an entity as ever.