Mark Zuckerberg Answers to Congress For Facebook’s Troubles
Last fall, when Congress called on Facebook to answer for its failures during the 2016 election—including selling ads to Russian propagandists and allowing fake news to flourish on the platform—the social networking giant sent its general counsel, Colin Stretch, leaving lawmakers wanting for face time with the company’s founder. Now, they’ll get just that: Facebook CEO Mark Zuckerberg takes his seat before a joint hearing of the Senate Committee on the Judiciary and the Senate Commerce, Science, and Transportation Committee on Tuesday. He’ll follow that with a Wednesday appearance before the House Energy and Commerce Committee.
Facebook’s fortunes have changed radically since Stretch’s testimony just five months ago, showcasing the fragility of Facebook’s standing in both the markets and public perception. The mere fact that Zuckerberg isn’t sending one of his deputies is proof positive of how quickly things have escalated since November, as is his newly remorseful tone.
“It was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here,” Zuckerberg wrote in prepared remarks shared by the House Committee on Energy and Commerce. “So now we have to go through every part of our relationship with people and make sure we’re taking a broad enough view of our responsibility.”
Zuckerberg’s opening remarks walk through the company’s marquee mistakes, including both enabling Russian interference in the 2016 election and leaking as many as 87 million users’ personal data to the shady political firm Cambridge Analytica. He also comes prepared with a raft of recent improvements Facebook is making to protect user data, make ads more transparent in the future, and prevent bad actors from so easily building huge audiences on the platform.
This time around, Zuckerberg will have to do more than apologize away each screw-up. This time, he’ll have to answer for how 14 years of uninhibited growth have enabled Facebook to play an unprecedented, even dangerous role in democracy—a role Zuckerberg seems only on the cusp of understanding. And those answers may play a crucial role in shaping the laws that regulators at home and abroad place on the embattled tech company and its contemporaries.
Last October, the fact that Facebook sold ads to Russia’s Internet Research Agency felt like the climax, the culmination of two years in which Facebook both aggressively insinuated itself into the American election and remained willfully blind to the negative consequences. In retrospect, that initial revelation about the IRA feels like just the beginning.
It seemed easy enough at the time for Facebook to minimize the size and scope of the mess it made. The company first downplayed the problem by focusing on the $100,000 of ads the IRA purchased from Facebook, a nominal amount compared to the nearly $13 billion in ad revenue Facebook made in the fourth quarter of 2017 alone. But the numbers only grew from there. In his testimony, Stretch revealed that 126 million people had been exposed to Russian propaganda on Facebook. Asked about how many people were reached on Instagram, Stretch ratcheted the figure up another 20 million. As recently as March, the company had still not yet calculated how many people followed Russian trolls on Instagram. And just last week, it announced that it had found and suspended another nearly 300 accounts and pages linked to the IRA across Facebook and Instagram.
Facebook’s public shaming continued shortly after the hearings, when the House Intelligence Committee published some of the ads and other content the Russian trolls shared on both Facebook and Twitter. For most people, it was the first concrete look at both the divisiveness and ugliness of the content that rocked the election.
The hits kept coming. By January, WIRED reported that special counsel Robert Mueller interviewed at least one Facebook employee as part of his ongoing inquiry into Russian interference in the 2016 election. Just a month later, Mueller published a 37-page indictment of 13 individuals associated with the IRA, which laid out exactly how they “conducted operations on social media platforms such as YouTube, Facebook, Instagram, and Twitter.” Not only did they create phony Facebook Pages like Blacktivist and Heart of Texas, but they also sent Facebook messages to then-candidate Donald Trump’s Florida staff, asking for help organizing pro-Trump flash mobs throughout the swing state.
As the news broke, Facebook announced a spate of changes to its political advertising policies, including plans to label political ads as such and create an archive where people can see the ads, who paid for them, and information about how much they cost and who they reached. In early March, at least, it seemed Facebook had the public relations crisis under control.
The Cambridge Analytica Mess
Over St. Patrick’s Day weekend, The New York Times, alongside The Guardian and The Observer, published simultaneous stories reporting that Cambridge Analytica and its British counterpart SCL had accessed 50 million Facebook users’ data without their knowledge or permission. What’s more, Facebook acknowledged that it had known about the violation since 2015. The company tried to preempt the story by suspending both companies, as well as a former SCL employee-turned-whistleblower named Christopher Wylie, and a researcher named Aleksandr Kogan, who gave Cambridge and SCL the data to begin with.
By that Monday, the company’s stock price began to free fall. Suddenly, Facebook was forced to answer not just for its political advertising policies from 2016, but for its entire history of data privacy policies, focusing especially on its Social Graph API, which allowed developers to build apps on top of Facebook—and scrape up reams of data from users’ unwitting friends while they were at it. Facebook phased those capabilities out in 2015, but the Cambridge Analytica scandal revealed that the company had no mechanism in place to ensure developers weren’t sharing and misusing that data.
Right on cue, lawmakers began calling Facebook back to Congress. This time, they wanted Zuckerberg. And yet, for five days after the story broke, the camera-shy billionaire was remarkably silent. In the meantime, more dirt about Cambridge Analytica rose to the surface, thanks to an undercover video from the UK’s Channel 4 that showed Cambridge Analytica’s CEO Alexander Nix discussing the use of extortion and bribery on behalf of clients. Facebook hadn’t just leaked user data to any old data miners; it had leaked it to apparently ignoble ones.
Finally, Zuckerberg spoke out, first in a Facebook post and then in a series of interviews with WIRED and other outlets. He took responsibility for what he called a “breach of trust,” and signaled a new openness to proposed regulations like the Honest Ads Act, which would more tightly regulate digital political ads. And yet, he was still resistant to the idea of testifying before Congress. “If it is ever the case that I am the most informed person at Facebook in the best position to testify, I will happily do that,” he told WIRED. “But the reason why we haven’t done that so far is because there are people at the company whose full jobs are to deal with legal compliance or some of these different things, and they’re just fundamentally more in the details on those things.”
But then came the revelation that, in fact, as many as 87 million users’ data may have been accessed by Cambridge Analytica, and an admission that most of Facebook’s 2.2 billion users have probably had their public profiles scraped thanks to a feature that allowed people to search for Facebook users with their phone numbers and email addresses.
This is the tornado of scandals Zuckerberg will try to address before Congress today. When he takes his seat, he will be answering for a wildly different set of issues than Stretch did in the fall—not just an isolated incident, but a pervasive problem affecting billions of people. It took 14 years of Facebook culture for these issues to fester. Over the last five months, the world finally started to care.