
Fixing the Big Tech Bad Apples

Netflix’s documentary ‘The Social Dilemma’ describes real problems—but the time has come to start proposing solutions.
October 12, 2020
Actor Skyler Gisondo in one of the fictional scenes in ‘The Social Dilemma.’ (The Social Dilemma, LLC / Netflix)

Remember the last time you woke up and before you even had a chance to go void your bladder you pulled your phone off the nightstand and raced over to Wikipedia to read up on the Franco-Prussian War and the ripple effects it had on Europe and ultimately World War I?

Right—me neither. But why not? Much like Facebook, Wikipedia is a massive internet colossus rife with misinformation and used by billions of people across the globe. It comes in every language and is easily editable by absolutely anyone. Funny story: A friend of mine, in an effort to boost his own professional profile, added himself to the Wikipedia entry for a massive #1 pop hit—as a co-writer of the song. I won’t say what friend or what song, but suffice it to say that this attribution is #FakeNews, along with much else on Wikipedia.

And yet, while there was a lot of kvetching a decade and a half ago about how Wikipedia could pollute public knowledge, and while teachers and librarians still often worry about students overusing or misusing the site, no one seems to have dire concerns about Wikipedia. Its founders—10 bucks says you can’t even think of their names—are not being hauled in front of Congress to answer for their irresponsible contribution to massive global misinformation. No one seems to be holding them accountable for causing any suicides—let alone genocides. In the popular mind, they and their creation aren’t associated with the shattering of social norms and the impending collapse of society.

And that’s great—because Wikipedia plays no part in any of these ills. Indeed, it is one of a huge number of internet tools and services that contribute to a vast improvement of our lives. Remember standing in line—I mean queued up with a bunch of other human beings on a sidewalk—all night in February to buy tickets for AC/DC? I do. (Brrr.)

Sadly, though, there are a few bad apples—and a new Netflix documentary tears these bad apples to their cores. The Social Dilemma features some of my heroes, like Tristan Harris and Jaron Lanier and Roger McNamee, as well as many other escapees from the underground prison bunkers of Google, Facebook, Twitter, Instagram, and the like. Its tale of woe should send a shiver down the spine of any rational viewer.

The film, directed by Jeff Orlowski, is presented in two interwoven threads. The first consists of an off-camera interviewer asking all these tech refugees questions about what precisely is the problem. The second is a dramatized depiction of some of the ills that are being described. The objective here is to give everyone watching a high-level checklist of what they need to worry about and then to illustrate it for them in relatable terms, presumably because the servings of info being offered to the viewers are pretty large and a fictionalized story will make them more easily digestible. That story involves a family trying to go about its life while besieged by the multitude of malicious corporate actors whose primary purpose is to dominate their attention and steer them to . . . well . . . I won’t spoil it for you, you should watch it. It’s a bit hokey but pretty adorable in an ABC Afterschool Special kind of way.

The documentary has been reviewed by writers at many reputable outlets—most effectively by Andrew Sullivan. For present purposes, I want to set aside any discussion of the stylistic cinematic choices and evaluate the substance of the film’s argument.

And on the substance, all the dismissive takes on The Social Dilemma boil down to these two objections: First, this complaining about Big Tech is the same brand of hysteria as that time everyone’s hair went aflame when we invented the bicycle. And second, Facebook is doing nothing more than direct mail advertising—just much more effectively.

The film doesn’t do an adequate job of anticipating and disputing these claims. But they are easy to refute.

The bicycle argument arises because bikes are mentioned in the film: Tristan Harris suggests that there was no noteworthy social upheaval after the bicycle became popular. Historians of technology were quick to point out that he was wrong. To which I’d say: Unlike with the case of the bicycle, with social media we’re not debating what will happen if we unleash the Bad Apples, we’re now cataloging what has happened and predicting, based on a mountain of proof, what will happen next if we allow it to continue. These predictions are far less hypothetical than the nineteenth-century predictions about the potential deleterious social effects of bicycles.

As for the argument that social media is comparable to the direct marketing we’ve all become accustomed to, well, there are aspects of it that are indeed very similar. But those aren’t the aspects that people are generally concerned about. Direct marketing of yesteryear didn’t have any means to microtarget specific people based on a billion personal data points, manipulate them to skip voting, and then lie to the public about having done it. If you think this reads like dystopian sci-fi, I regret to inform you that it’s happening today.

Harris, who is given the role of the lead protagonist in the film, also recently appeared on Sam Harris’s podcast. Right out of the gate, Sam asked him: “So what’s the solution?” Tristan admitted to Sam that there are no easy solutions and the two then spent over an hour discussing the problems arising from social media. In the end, Tristan’s “solutions” largely depend on a kind of consciousness raising: the hope that enough people will see this film (or other films or articles or books or news reports making the same case) to change their own behavior, or at least to act with the understanding that they’re being manipulated and used and addicted and that their privacy is in jeopardy.

Interestingly, Jaron Lanier (the film’s Robin to Tristan Harris’s Batman) seems to have a very similar hope that he articulates towards the end of the film. And Roger McNamee (I guess he’s Alfred?) posits a hope that the users’ interests will finally get the attention and protection they warrant, ahead of all the big monied commercial interests, but neither he nor the filmmakers proffer an explicit suggestion as to how or by whom or by what mechanism this might happen.

As conclusions go, these are underwhelming and unsatisfying. In spite of all the talent and expertise on offer in the film, the ultimate takeaway basically boils down to a hope that viewers will recognize the manipulation and change their own behavior.

I genuinely admire what the people in this film have set out to do and I wholeheartedly agree with them as to the ends, but Harris’s stated hope for the means is that “the people” will wean themselves off these products of their own volition. To which I’d say: Tristan, I don’t know if you’ve met “the people,” but I’m sorry to tell you that no one’s ever lost any money underestimating them—and I count myself squarely among their ranks. Also, climate change called and would like to know when “the people” are coming for their greenhouse gases, because it heard we’re all very eager to take those back, but it has a thing it has to run to soon and is wondering if it should wait. . .

Let’s start with the obvious.

The Bad Apples executives aren’t going to do anything without their arms being twisted by either their customers (the advertisers) or regulators. Yes—“the people” could, in theory, quit Facebook the way they quit MySpace and Friendster, but let’s be honest: the odds of this happening are pretty slim at this point.

I hear you saying “but no one other than my grandmother uses Facebook”—but that’s not true: even if you don’t know anyone who uses Facebook proper, you probably know people who use Instagram or WhatsApp (both owned by Facebook), or at least some other Bad Apples product. And they’re not going to quit.

They also all use Wikipedia, and no one’s concerned—for good reason! Wikipedia isn’t trying to game their attention—rewarding them with likes or public comments and discussions and flamewars and the other temptations of social media.

But why not? If those practices are all essential to user retention—if they’re so engaging and engrossing to the public—how does Wikipedia manage to attract so many users without engaging in them? Wikipedia isn’t a business—it is a nonprofit organization—but does its example suggest that it may be possible to run a successful business without them? This may be the first question people should give serious thought to when calling on Congress to regulate the Bad Apples.

It’s actually not that complicated.

Human beings crave attention and validation. We want to be seen and heard and told that we’re right and that respectable people agree with us. We’ve learned from Hollywood and ESPN that there’s really nothing more wonderful than being famous. The Bad Apples give us tastes of that “fame” a thousand times a day. Wikipedia does not.

So that’s one big piece of the puzzle. If you want addicts to give up the thing they’re addicted to, cutting off their supply goes a long way.

Here we can learn much from the history of Coca-Cola. As you probably know—and if you don’t you can read about it on Wikipedia—Coca-Cola gets its name from one of its former ingredients, cocaine.

Why did Coca-Cola drop cocaine? It was not a matter of law alone. There was a significant societal backlash to its being in a beverage consumed by children. Between the backlash and the law, the highly addictive substance was removed and Coca-Cola promptly went out of business by 1910 and we never heard of the company again. Wait—no—it thrived and became one of the most successful businesses in the history of the world. Mr. Zuckerberg, pay attention!

Similarly, I believe that the Bad Apples can remove those aspects of their wares that are the most addictive as well as add all sorts of (optional) features that allow people to reduce their usage.

Aside from the harm of addiction, there are the arguably much greater harms of misinformation and tribalism and siloing and confirmation bias and the coming apart at the seams of society. (I’ve written here in The Bulwark before about these problems.)

Here again, I believe that there are obvious solutions that would make a massive difference to how people regard the information they receive.

And here again, I have to ask you to take a step back and consider how we used to evaluate information before the advent of social media.

Crazy street people were obviously crazy street people. There was no need to check whether they were “bots,” Russian agitprop actors, trolls, false-flag operators, or provocateurs out to make a buck. Your eyes and your ears (and sometimes your nose) were all the tools you needed to understand that these were not serious opinion holders deserving of your attention or money.

The sad reality now is that the most prosperous, healthiest, least violent, most benevolent, least racist society in the history of humanity is completely deranged with the false narrative that things have never been worse and that the end times are upon us. And indeed—we can easily self-fulfill this prophecy—but if anyone is left to write the history of what happened, it will be through the lens of “these people managed to snatch defeat from the jaws of victory as their instruments of communication drove them all mad and they all clawed each other’s eyes out in a drunken frenzy of totally unwarranted despair.”

We are inundated with misinformation, half-truths, uninformed opinions, and absurdist melodrama from across the globe. And we are being driven mad by this lack of clarity, by never-ending noise fighting for our attention 24 hours a day.

However, the answer is not for the Bad Apples to hire censors and fact-checkers because, as Sam Harris correctly notes, these companies are now run by young people who regard just about anything they disagree with as “racist” and/or “hate speech.” They are abjectly unqualified to do any “moderating” because (sorry kids) they aren’t “moderate.”

But you know who is moderate? Average Jane Internet. She’s sitting around and her eye sockets are rubbed raw from how much eye-rolling she’s doing at all the lunacy on these platforms. But there’s no “BULLSHIT” button. How easy would it be for the Bad Apples to put a “bullshit” button next to the likes? (That’s rhetorical—it’s very easy.) And then add an algorithm to notice when a particular user is posting things that are amassing an unusually high ratio of “bullshits” to “likes” and put a “bullshit” flag on their post or their profile? If this is “social” media then why aren’t more of the tools of our “social” order present?
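The flagging logic sketched above really is simple. Here is a minimal illustration—the `Post` class, function names, and thresholds are all hypothetical, invented purely to show how little machinery the idea requires:

```python
# A hypothetical sketch of the "bullshit ratio" flag described above.
# All names and thresholds here are invented for illustration.

from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    bullshits: int  # count of "bullshit" button presses

def should_flag(post: Post, ratio_threshold: float = 2.0, min_votes: int = 50) -> bool:
    """Flag a post when 'bullshit' votes heavily outweigh likes.

    min_votes guards against flagging posts with too little feedback
    for the ratio to mean anything.
    """
    total = post.likes + post.bullshits
    if total < min_votes:
        return False
    # Avoid division by zero: a post with zero likes and enough
    # "bullshit" votes is always flagged.
    if post.likes == 0:
        return True
    return post.bullshits / post.likes >= ratio_threshold

# A post with 10 likes and 40 "bullshit" votes gets flagged;
# one with 100 likes and 10 "bullshit" votes does not.
```

A real platform would need to harden this against brigading and vote manipulation, but the core mechanic—surfacing the crowd’s eye-roll the same way it surfaces the crowd’s applause—is a few lines of code.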

That’s how society works. It’s not perfect—but it’s a whole hell of a lot better than what the Bad Apples have brought us.

All of the problems enumerated in the film—anonymity, bots, selection bias, addictive rewarding, personalized news from disreputable sources, etc.—have reasonably straightforward, technological solutions. None of those solutions is perfect and none of them is one-size-fits-all, which is why Tristan Harris has no single-sentence answer to the question “What’s the solution?” It will require a lot of tinkering to fine-tune them. But we don’t need to throw the baby out with the bathwater. The Bad Apples, much like Coke, can go on making insane amounts of money, even as their exact practices are adjusted to free us from their ever-tightening embrace. A few of the kinds of suggestions they could start to implement:

  • require that users verify their identities, or at least include instant markers that show when an account is likely a bot or a troll;
  • offer users tools to self-regulate (such as timers that disable the apps during certain parts of the day and a way to easily disable things like “autoplay next video”) and make them the default that must be opted out of;
  • encourage people to use their actual faces and names—making it harder to be anonymous and harder to earn “rewards” if you insist on being anonymous;
  • offer paid tiers of service that eliminate you from the rolls that are tracked and packaged as a “product” to advertisers;
  • add news-source certification on platforms like Facebook to counter the current state of ambiguity as to the credibility of the content—people should know when they’re reading something written by a fly-by-night source of propaganda originating in Russia;
  • make it 100 percent clear when a published story is only popping up because someone paid for it to be there.

This isn’t a white paper, so I’ll stop there, but you get the point: these are just a few of the kind of small but meaningful improvements that are the equivalent of removing the cocaine from the product. The auto industry is figuring out how to have cars without gas, so surely the Bad Apples can figure out how to sell us shoes without ruining civilization.

That’s what The Social Dilemma is missing. An epilogue that says “here are twenty ideas that would fix 80 percent of the Bad Apples’ problems. Write your members of Congress and tell them to pass this message along.”

The time has come to stop wringing our hands and clutching our pearls and put forth proposals for actionable, implementable, workable, technological improvements to these platforms.

Yevgeny Simkin

Yevgeny (Genia) Simkin fled Soviet Russia as a child and has spent his life bouncing from music to comedy to software engineering. You can follow his comedy Twitter feed here. He's also the founder and CEO of The Russian Mob™—an agency specializing in developing SaaS, mobile, AR, VR, and web applications. (No: They won't help you hack a foreign election, so don't bother asking.)