Who Will Take Responsibility for Facebook?

The reckoning is upon Mark Zuckerberg.

Just after the collapse of the World Trade Center in 2001, Leslie E. Robertson, the twin towers’ chief engineer, plunged into a period of moral reckoning.

As a young hotshot in the 1960s, Robertson had defied the engineering establishment to erect the iconic skyscrapers. Now, at age 73, he brooded. Over and over, observers suggested that the arrogant silhouette of the towers was their undoing. Robertson seemed surpassingly sad. He emailed a colleague in verse: “It is hard / But that I had done a bit more … / Had the towers stood up for just one minute longer … / It is hard.” As The Wall Street Journal reported, when asked at a public forum if he wished he had done anything differently, he wept.

But Robertson also conducted a careful audit of his work. The blueprints, the physics, the math came in for close review. He knew he had designed the buildings expressly to brook the impact of an airliner. But flaming jet fuel had brought them down. He concluded that inoculating the towers against those all-consuming fires would not have been feasible. “You could always prepare for more and more extreme events, but there has to be a risk analysis of what’s reasonable,” Robertson told a Newsweek reporter. In his reckoning, Robertson managed to avoid the twin seductions of defensiveness and self-savagery—and took responsibility for his work.

Mark Zuckerberg, an engineer in another key, has also seen his magnum opus breached, with a force that may yet shatter it. Over the past two and a half years, Facebook’s integrity as a place that “helps you connect and share with the people in your life” has been all but laid to waste—as it has served as a clearinghouse for propaganda, disinformation, fake news, and fraudulent accounts. More serious still: Facebook may not just have been vulnerable to information warfare; it may have been complicit.

Zuckerberg, however, has been unaccountably slow to make earnest amends. This week for Yom Kippur, 11 months after the election, he did post what’s known as a Cheshbon Hanefesh—the moral accounting undertaken annually in a spirit of repentance. But his statement seemed pro forma—perhaps even aggrieved. “For the ways my work was used to divide people rather than bring us together, I ask forgiveness and I will work to do better.” Odd phrasing. Was Zuckerberg confessing to his own misconduct? Or was he saying that his work “was used” by bad actors, which suggested that he himself was owed the amends?

But even this pass-agg penance marked a change of course for Zuckerberg personally, if not for Facebook. In November, fresh off the US election, he dismissed as “crazy” the idea that fake news on Facebook had influenced the race. (He disavowed the word when he came in for criticism 10 months later.) When President Obama reportedly urged Zuckerberg to take seriously that Facebook could be exploited by hostile powers intent on undermining democracy, even then Zuckerberg shrugged.

Meanwhile, he whistled in the dark, lighting out on a 50-state walkabout dense with Insta opportunities. It looked for all the world like he was running for president himself. That impression was bolstered when he later hired Joel Benenson, a former campaign adviser to Obama and Hillary Clinton.

As the summer wore on, it became unmistakable that Facebook’s problems ran deeper than fake news. In June, Facebook officials reportedly met with the Senate Intelligence Committee as part of that body’s investigation into Russia’s election interference. In August the BBC released an interview with a member of the Trump campaign saying, “Without Facebook we wouldn’t have won.”

At last, in September, Facebook broke its silence. The company admitted it had received payments for ads placed by organizations “likely operated out of Russia.” These were troll operations with a wide range of phony ads designed to fan the flames of American racism, anti-LGBT sentiment, and fervor for guns—as well as to build opposition to Clinton. Zuckerberg announced that the ads had been turned over to Congress, and he intimated that an internal investigation at Facebook would likely turn up more such ad deals: “We are looking into foreign actors, including additional Russian groups and other former Soviet states, as well as organizations like the campaigns, to further our own understanding of how they used all of our tools.”

The statement sounded more like fact-finding than soul-searching. Zuckerberg seemed to be surveying a different Facebook from the one that allowed possibly Kremlin-backed entities to target people who “like” hate speech with racist propaganda. A Facebook like that would need a gut renovation; Zuckerberg’s Facebook just needed tweaks.

Facebook is indeed a new world order. It determines our digital and real-world behavior in incalculable ways. It does all this without any kind of Magna Carta except a vague hypothesis that connectivity is a given good. And yes, it’s largely unregulated, having styled itself as nothing more than a platform—a Switzerland pose that lets it seem as benign as its bank-blue guardrails, which stand as a kind of cordon sanitaire between Facebook and the rest of the unwashed internet.

In 2006, a college kid talked me off Myspace and onto Facebook by insisting that Facebook was orderly while Myspace was emo and messy. That kid was right. Facebook is not passionate; it’s blandly sentimental. It runs on Mister Rogers stuff: shares and friends and likes. Grandparents and fortysomethings are not spooked by it. Like the animated confetti that speckles Facebook’s anodyne interface, our lives on Facebook—the bios and posts—seem to belong to us and not to the company’s massive statehouse, which looks on indifferently as we coo over pups and newborns. (Or is it a penal colony? In any case, it keeps order.) Facebook just is the internet to huge numbers of people. Voters, in other words.

But that order is an illusion. Nothing about Facebook is intrinsically organized or self-regulating. Its terms of service change fitfully, as do its revenue centers and the ratio of machine learning to principled human stewardship in making its wheels turn. The sheen of placidity is an effect of software created by the same mind that first launched Facemash—a mean-spirited hot-or-not comparison site—but then reinvented it as Facebook, an “online directory,” to prevent anyone from shutting it down. The site was designed to make the libertarian chaos of the web look trustworthy, standing against the interfaces of kooky YouTube and artsy Myspace. Those places were Burning Man. Facebook was Harvard.

Siva Vaidhyanathan, whose book about Facebook, Anti-Social Media, comes out next year, describes Zuckerberg as a bright man who would have done well to finish his education. As Vaidhyanathan told me, “He lacks an appreciation for nuance, complexity, contingency, or even difficulty. He lacks a historical sense of the horrible things that humans are capable of doing to each other and the planet.”

Zuckerberg may just lack the moral framework to recognize the scope of his failures and his culpability. Like Robertson, he was a defiant hotshot when he launched Facebook. Maybe he still is. It’s hard to imagine he will submit to truth and reconciliation, or use Facebook’s humiliation as a chance to reconsider its place in the world. Instead, he will likely keep lawyering up and gun it on denial and optics, as he has during past litigation and conflict.

To be sure, unlike on 9/11, there are no mass casualties; there’s no flaming wreckage. But that may only heighten how important it is for Zuckerberg to take responsibility. Because there are 2 billion of us on Facebook. We’re all inside his tower. And, heaven help us, we have nowhere else to go.

This article will appear in the November issue.