Facebook is useless without our input: Content moderators call for safer offices

According to more than 200 Facebook staff, the social media firm is forcing content moderators to return to the office during the pandemic after the ‘failure’ of its attempt to rely solely on automated systems.

The staff made the allegation on Wednesday in an open letter to Mark Zuckerberg and Sheryl Sandberg, Facebook’s CEO and Chief Operating Officer, respectively, as well as the heads of CPL and Accenture, two firms to which Facebook subcontracts content moderation.

They wrote, ‘Facebook cannot operate without our work. Your algorithms are ineffective at detecting satire and cannot differentiate between journalism and misinformation. They cannot respond quickly enough to child abuse or attempted suicide. We can.’

The letter was drafted by Foxglove, a litigation firm that represents the content moderators, and was signed by both contractors and full-time Facebook staff.

Facebook informed its staff in August that they could continue working from home until July 2021. The open letter alleges that, by forcing content moderators back to the offices, the company is putting their lives at risk. It follows a report by The Intercept that a staff member at a Facebook content moderation facility in Austin, Texas tested positive for Covid-19 in October, a few days after returning to the office.

The staff urged the firm and its outsourcing partners to improve safety and working conditions. Their demands include hazard pay for moderators who must return to the office. They also want Facebook to hire all of its moderators directly, to let those who live with high-risk individuals work from home until further notice, and to provide better health care and mental-health support. Additionally, they argued that Facebook should ‘increase’ the amount of moderation work that can be carried out from home.

Facebook depends on more than 15,000 people worldwide to review posts for violations of its rules on content such as violence, hate speech, and nudity, and to decide which posts should be pulled down and which should stay up. A significant number of these moderators do not work for Facebook directly; they are contractors hired by third parties such as CPL and Accenture.

Drew Pusateri, a Facebook spokesman, said in a statement that most content moderators are still working from home during the pandemic and that ‘Facebook has provided optimum health guidance to facilitate safety when working in the office.’

‘We are grateful for the precious work that content reviewers perform, and their safety and health is a top priority to us,’ he said, adding that moderators have access to health care and ‘personal health resources’ from the time they are hired.

Accenture said in a statement, ‘We are asking our staff to return to offices gradually, and only when there is urgent work to be done and we are confident that we have implemented the right safety measures, in line with local rules. Examples include greatly reduced building occupancy, maximum social distancing, daily office cleaning, masks, personal transportation, and other precautions.’

The firm is also trying to find alternative arrangements for vulnerable staff and for those who live with a vulnerable person.

CPL declined to comment. When the coronavirus began spreading at the start of this year, Facebook, like other tech firms, sent most of its staff home, including contractors.

The firm says that this step pushed it to rely more and more on automated systems to catch rule-breaking content, and Zuckerberg has admitted that those systems are not foolproof. Addressing reporters in March, he said, ‘Our effectiveness may go down in the near term as we get accustomed to this.’

Staff who signed the open letter said that the pandemic had exposed the shortcomings of the automated systems. ‘Crucial information got lost in the maw of the Facebook filter, while dangerous content such as self-harm stayed up,’ they wrote. ‘Facebook’s algorithms are years away from achieving the level of sophistication needed to moderate content automatically. They may never get there.’

Facebook has made major investments in machine learning and artificial intelligence to review content. The firm has been open about its desire to let automated systems perform more tasks over time.

Its main focus is preventing dangerous and false content from spreading on its platform. But it has drawn heavy criticism for its treatment of contractors, who say they are overworked in ways that harm their mental well-being.

In Wednesday’s letter, the workers said, ‘The time has come for you to acknowledge this and to take our work seriously.’
