
I wrote the Facebook report Ted Cruz can’t stop talking about. He’s getting it all wrong.



Facebook CEO Mark Zuckerberg spent two days this week getting grilled by Congress in response to the Cambridge Analytica scandal. The hearings were often hampered by their awkward format and strict time constraints. But at least some legislators took the opportunity to ask important questions about how Facebook protects user data and combats misinformation.

Not everyone on Capitol Hill got the memo. Sen. Ted Cruz (R-Tex.), for example, used his allotted five minutes of questioning on Tuesday to press Zuckerberg about a perceived suppression of conservative news on the social network. He even cited a story I wrote for Gizmodo in 2016.

“Does Facebook consider itself a neutral public forum?” Cruz asked.

“We consider ourselves to be a platform for all ideas,” Zuckerberg responded.

Cruz would not relent. He continued with a series of follow-up questions, using the rest of his time to hammer the issue, eventually referencing a story I wrote about Facebook’s “Trending” news curators.

Cruz said:

I will say there are a great many Americans who I think are deeply concerned that Facebook and other tech companies are engaged in a pervasive pattern of bias and political censorship. There have been numerous instances with Facebook. In May of 2016, Gizmodo reported that Facebook had purposely and routinely suppressed conservative stories from trending news, including stories about CPAC, including stories about Mitt Romney, including stories about the Lois Lerner IRS scandal, including stories about Glenn Beck.

As Slate’s Will Oremus pointed out, it wasn’t the first time Ted Cruz used my story as ammunition to attack Silicon Valley’s biggest companies. The Republican Senator also referenced the story during a separate congressional hearing earlier this month that included representatives from Google, Facebook, and Twitter. Senator Cruz appears to have been deeply moved by the article (lucky me!) — but he misses the point entirely. 

Cruz is obsessed with the idea that employees at Facebook are shutting down conservative viewpoints. My original reporting for Gizmodo revealed that, at least for a brief period when the Trending module was launched, this was possible. After this was revealed, Facebook fired the journalists working on the Trending team and turned the feature over to algorithms. (And, as it happens, recent numbers show conservative outlets are among the most visible on Facebook.)

Now, the company’s major problems have more to do with transparency than oversight.

At this point, Cruz should forget about human bias and worry more about the automated forces controlling the information we see through our News Feeds. Most of the things we see on Facebook are controlled by algorithms, and those algorithms are inherently infused with bias. In fact, it’s the whole reason they exist in modern technology: to discriminate against certain things and give preference to others. 

The more important questions Senator Cruz should have asked are, “Where do we draw the line?” and “Why should people be asked to blindly trust algorithms that have such a gigantic impact on society?”

Doesn’t Mark Zuckerberg think there should be more transparency in how these algorithms work?

Algorithms are inherently infused with bias. In fact, it’s the whole reason they exist.

Facebook is ultimately a collection of algorithms feeding you content, and thus, a product of its employees’ carefully considered judgement. Every single decision made about how Facebook operates comes with inherent biases — whether the people making the decisions are conscious of them or not. The biggest problem is that, as users, we have very little insight into who is making those decisions and why.

The implications of each decision can have a profound impact on society. Facebook itself has even admitted it can affect the mood of its users (the company conducted a secret mood study in 2014), and admitted it can affect real world decision-making (the company conducted a secret voter-turnout study in 2012) simply by changing what people see in the News Feed.

But when the company decides to change any part of the algorithms that control the network — which it does regularly, by its own admission — it is left to police itself. There is no third-party reviewer, no ethics board, and as we’ve seen, when the company is left to self-regulate, it can lead to major problems.

My belief is that Facebook should make its algorithms open to the public. The code, or what is essentially a weighted math and logic problem, should be available for people to look at, much like a piece of open-source software. The people using Facebook deserve to have a better understanding of why they’re being exposed to different types of content.
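The “weighted math and logic problem” framing can be made concrete with a toy sketch. Everything below is hypothetical — the feature names, the weights, and the scoring function are invented for illustration and are not Facebook’s actual ranking signals. The point is only that a handful of numeric weights, chosen by people, determine what surfaces first in a feed.

```python
# Hypothetical sketch of feed ranking as a weighted scoring problem.
# Feature names and weights are invented for illustration only; they
# are not Facebook's real ranking signals.

def rank_feed(posts, weights):
    """Score each post as a weighted sum of its features; highest score first."""
    def score(post):
        return sum(weights[f] * post.get(f, 0.0) for f in weights)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "a", "friend_affinity": 0.9, "recency": 0.2, "engagement": 0.1},
    {"id": "b", "friend_affinity": 0.1, "recency": 0.9, "engagement": 0.8},
]

# The choice of weights *is* the editorial judgment: change them,
# and what everyone sees first changes with them.
weights = {"friend_affinity": 0.5, "recency": 0.2, "engagement": 0.3}

ranked = rank_feed(posts, weights)
print([p["id"] for p in ranked])  # → ['a', 'b']
```

Publishing even this level of detail — which features are scored and how they are weighted — would let outsiders reason about why one post outranks another, without exposing any user data.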

Right now, we’re being asked to interact with a powerful black box that has a tremendous amount of control over how we think and act. This ultimately is why Ted Cruz missed the point during the tech hearings in Washington: The problem is Facebook’s hidden algorithms, not a supposed political bias.

My belief is that Facebook should make its algorithm open to the public

We can’t take Facebook’s word for how its platform operates. Until Gizmodo published its series of reports on Facebook’s Trending section, the company claimed that it was sorted by computers and algorithms (Gizmodo highlighted this point extensively in one of the stories about how the team operated). 

Ultimately, stories from other publications about how Trending worked were corrected, and Facebook eventually relented and admitted that people were infusing the algorithm with personal judgements on a daily (even hourly) basis. The same thing happened in the recent Cambridge Analytica scandal. It wasn’t until The Guardian published its explosive story that Facebook finally admitted wrongdoing. Until that point, the company had tried to intimidate and threaten the publication to prevent the story from being published. The same happened in 2016 with Gizmodo.

So the point that Cruz missed is this: Facebook is a company that has earned a multi-billion-dollar valuation by asking people to lead more transparent lives and share more on the network. But the company itself doesn’t play by the same rules. Rather than being radically transparent, it instead obfuscates its operations and even some of its privacy controls. All of this, presumably, to grow its user base — as “The Boz Memo” recently revealed.

Facebook is anything but transparent. Its privacy settings are notoriously hard to navigate. The company has bullied and intimidated journalists in an attempt to hide the truth. And the company has straight up lied about how parts of the platform operate, most evidently in the case of the Trending team. So the question is not about whether Facebook has a hidden liberal bias — it’s about why the company isn’t more transparent with the algorithms that have been proven to have a significant impact on society.





Anith Gopal