What Facebook must do to prepare its 3,000 new moderators for the trauma they’ll face – A N I T H



Mark Zuckerberg did the right thing when he announced Wednesday that Facebook plans to hire 3,000 people to review reports of objectionable content, including live video that might contain horrible scenes of murder and suicide. 

The company may have wanted to confine Facebook Live to viral quirkiness like Chewbacca mom and a folk artist covering Tears for Fears on the hammered dulcimer, or to serious streams of election results and people recording and broadcasting injustice. The sad reality, however, is that people have used the technology to share grisly footage with little public value and a high potential for collective trauma. Last week, for example, a man in Thailand broadcast himself killing his infant daughter and then taking his own life. 

While Zuckerberg hopes to protect the Facebook community from harm, his announcement raises big questions about how the company intends to shield its new hires from the emotional and mental torment of sorting through videos that might make most of us cringe or weep. 

This is an intense, high-stakes challenge, and getting it right means putting numerous policies and practices in place that address everything from hiring strategies to mental health benefits to stigma reduction to workplace culture, says David Ballard, an expert in creating psychologically healthy workplaces and an assistant executive director for Organizational Excellence at the American Psychological Association. 

“This is new territory,” Ballard says of trying to safeguard the mental health of people paid to monitor live video for violence. “It’s essentially vicarious traumatization. They’re not in the life-threatening situation themselves, but they’re viewing a situation that’s overwhelming, extreme, and upsetting.”

Facebook isn’t saying much about its policies so far. A spokesperson for the company wouldn’t comment except to say that every person reviewing content on the platform is provided psychological support and wellness resources. There is also a program specifically designed to aid employees who review potentially traumatic content, and it is evaluated annually. 

The company has also not commented on whether the new hires will be contractors or employees. The difference between the two experiences can be significant; contractors may be seen as replaceable and be treated as such, whereas employees enjoy perks, benefits, and a more supportive workplace culture.


What we do know from previous reporting is that people who moderate online content, including abusive language and still images, are often poorly paid and experience negative psychological consequences. But much of the research on what it’s like to experience vicarious trauma is on first responders who put out fires and triage medical emergencies. These studies don’t provide an easy parallel to the situation that Facebook now faces, where its reviewers will identify and remove live videos of people hurting themselves or others. 

Ballard says the job description for this role must be abundantly clear about expectations so that it's the right fit both for Facebook and the prospective employee. Emergency responders, for example, regularly undergo a fitness-for-duty evaluation, which determines whether an applicant is physically and psychologically capable of doing the job. Ballard recommends that Facebook, if it hasn't already, work with an expert in identifying candidates for high-stress or emotionally taxing jobs. 

“Finding a good fit for individuals who can function in that environment, exposed to that type of content, is really going to be important,” says Ballard.

It would probably be a mistake, he adds, to hire low-level or entry-level applicants who have little professional experience. Likewise, when new employees start their job, they should receive comprehensive training on what responses to trauma look like and how to deal with them, particularly using company-provided resources. 

Managers need similar training as well, with a focus on how to talk to employees about trauma, identify signs of emotional turmoil or mental illness, and refer people to support. 

While Facebook provides psychological support and wellness resources, Ballard specifically recommends easy access to an employee assistance program, mental health benefits that are on par with the physical health benefits, workplace education around mental health issues, and stress management programs. 

Companies should aim to have these in place regardless of whether their employees are trying to prevent live video of suicide or murder from reaching millions of people, but Ballard says such measures are particularly important in light of the work that Facebook reviewers will perform: “In an environment like this, it will be even more critical.” 


This is hard enough to execute in a company with thousands of people, and it gets even trickier, says Ballard, if any of the new reviewers will be contractors working remotely or in other countries. That introduces complex considerations about how to gauge someone's well-being and how to address cultural differences when it comes to mental health. It also raises serious questions about whether a contracting firm can create a workplace environment that mirrors what Facebook employees experience. 

Finally, Ballard says, Facebook will have to consistently evaluate how its employees are coping with the unique pressure of this particular job — and whether or not its support is making a positive difference. It sounds like the company is already doing that; its latest move brings that effort to a bigger scale. 

Basically, in committing itself to protecting users from traumatic live video, Facebook has just signed up for one of the hardest tasks a social media company could take on. It's essential work but, as you might have guessed, there's at least one more obstacle. 

“The nature of content will evolve over time,” says Ballard. “Even if they nail it now and get all the pieces in place, it’s not going to remain static.”  

So even if Facebook won’t comment in depth on its current efforts, expect to have this conversation for a long time to come. 




Anith Gopal