Diamond and Silk Expose Facebook’s Burden of Moderation
Of all the absurdities of Mark Zuckerberg’s more than ten hours of Congressional testimony this week, one moment of theater stands out.
“I’d like to show you, right now, a little picture here,” Missouri Republican Billy Long said to the Facebook CEO. A staffer placed a large photo of two women behind his head. “You recognize these folks?”
“I do,” Zuckerberg said. “I—I believe—is that Diamond and Silk?”
It was. Lynette “Diamond” Hardaway and Rochelle “Silk” Richardson are biological sisters and black conservative internet personalities who became famous before the 2016 presidential election for being vocal supporters—and paid consultants—of Trump’s campaign. They boast a particularly strong following on Facebook, where their audience has ballooned to 1.5 million followers. But in September, the sisters claim, Facebook began limiting the reach of their videos, and earlier this month, it said they were “unsafe.”
That was the crux of Long’s actual question: “What is unsafe about two black women supporting President Donald J. Trump?”
By Thursday afternoon, Facebook’s CEO was likely already familiar with the pair. During both of Zuckerberg’s hearings on Capitol Hill this week, lawmakers including Senator Ted Cruz and Representatives Joe Barton, Fred Upton, Marsha Blackburn, and Richard Hudson all cited the bloggers. For these six lawmakers, the saga of Diamond and Silk is a proxy for an issue that’s enraged conservatives: They believe that Facebook is censoring them by curtailing their reach on the site.
It’s a criticism Zuckerberg has been unable to shake since 2016, when a Gizmodo article revealed Facebook’s mostly-liberal moderators were suppressing conservative news. Since then, the social network has gone to great lengths to ensure that its decisions appear non-partisan.
But to make the platform functional, and useful to its users, Facebook must choose what information it values. “Giving everyone equal amplification—especially stripped of context—will more often lead to confusion rather than ‘more truth,’” says Jared Colton, who teaches about ethics and technology at Utah State University. “If we really are committed to honesty in this digital age, we need to be willing to filter information.”
Therein lies the conundrum of the modern social network. Facebook doesn’t have power over what its users say on the platform, but it has close to complete control over who gets heard. To communicate anything, Facebook can’t communicate everything: The company’s most powerful mechanism is its ability to determine exactly what gets seen in the News Feed. But hush anyone, and it invites criticism from everyone. It’s Facebook’s unwinnable game.
Much of what Diamond and Silk offer is exactly the kind of content Facebook was criticized for over-showing to users during the 2016 presidential election. The sisters’ videos are often sensationalist, one-sided, and riddled with inaccuracies. It’s easy to find troubling moments in their archives. During the lead-up to the election, they pushed conspiracy theories like Marco Rubio’s alleged hidden “gay lifestyle” and sat down for a radio interview with John Friend, an anti-Semite and Holocaust denier.
The sisters observed that their reach was declining following several changes to Facebook’s News Feed. In August, the social network began cracking down on video clickbait, and in January Facebook began prioritizing content from friends and family over posts from brands and media pages, like Diamond and Silk’s. Facebook’s head of News Feed, Adam Mosseri, specifically said users would see “less video,” the sisters’ medium of choice. News publishers, many of which had invested specifically in creating social video for Facebook, have also seen their traffic decline.
In early April, Facebook sent Diamond and Silk a message saying their content had been deemed “unsafe.” Zuckerberg told Congress the message was a mistake. “Our team made an enforcement error. And we have already gotten in touch with them to reverse it,” he told Joe Barton, a congressman from Texas. But the issue exploded, especially after Diamond and Silk repeatedly denied having heard from Facebook—even after the pair’s communications with the platform were released.
“We have communicated directly with Diamond And Silk about this issue,” a Facebook spokesperson said in a statement to WIRED. “The message they received last week was inaccurate and not reflective of the way we communicate with our community and the people who run Pages on our platform. We have provided them with more information about our policies and the tools that are applicable to their Page and look forward to the opportunity to speak with them.” Diamond and Silk did not return a request for comment.
The factors that influence Facebook’s filtering systems are of monumental importance to publishers and creators, yet they’re largely opaque. While conservatives have adopted Facebook censorship as a unique and partisan issue, Facebook has made “enforcement errors” when dealing with liberal groups as well. Training documents unearthed by ProPublica in June of last year encouraged moderators to remove posts attacking protected groups, such as white men, while permitting attacks on subsets defined by unprotected traits, such as black children.
By design, Facebook often feels like a public forum rather than an advertising platform run by a corporation. Even Ted Cruz mistakenly told Zuckerberg that the law mandates Facebook be neutral, which isn’t true. It’s Diamond and Silk’s First Amendment-granted right to speak untruths—and their followers have every right to spread them. Yet it’s Facebook’s role to determine just how much impact its users have, which it will do in whatever way befits its bottom line.
Facebook will never be the free expression forum we want it to be: It’s a private company, with algorithms that move in mysterious, often biased ways. Maybe it’s time we accepted that.