Facebook is an essential social network for keeping in touch with your friends wherever you are. It is accessible online or as a mobile application for Android and iOS.
Downloads: 230
Release Date: 18/12/2024
Author: Facebook Inc.
License: Free license
Categories: Internet – Communication
Operating System: Android, Online service, Windows 10/11, iOS (iPhone / iPad)
Every day, extremely violent graphic content circulates on Facebook. Moderators find themselves dealing with images of murder, suicide, sexual abuse of minors and terrorism. Among them, 140 are suffering from post-traumatic stress disorder, as explained by Dr Ian Kanania, head of mental health services at Nairobi’s Kenyatta National Hospital.
Moderators who encounter violent content
Faced with this alarming assessment of moderators’ mental health, legal action was taken against Meta and its service provider Samasource Kenya. Daily exposure to violent content generally produced anxiety disorders and major depressive disorders in the moderators.
Dr Ian Kanania also found severe or very severe post-traumatic stress symptoms in 81% of cases, persisting for at least a year after the moderator left the job. At least 40 moderators developed dependencies on alcohol, drugs (cannabis, cocaine, amphetamines) or sleeping pills.
Moderators’ exposure to images and videos of necrophilia, bestiality or suicide causes extreme physical reactions: fainting, vomiting, screaming and fleeing the workstation. Those responsible for removing material from terrorist groups live in fear of being surveilled and targeted.
Others reported marital breakdown, decreased sex drive and family isolation. Even more alarmingly, four moderators developed trypophobia, an aversion to repeated patterns of tiny holes, after seeing photos of rotting corpses.
Very difficult working conditions
Working conditions at Meta’s subcontractor facility in Kenya are very difficult. Moderators from several African countries work in a cold space for 8 to 10 hours a day, under bright lights and with constant monitoring of their activities. Between 2019 and 2023, they were paid eight times less than their American colleagues who moderated content from the African continent.
The companies are being sued on several grounds: intentional infliction of psychological harm, unfair employment practices, human trafficking, modern slavery and unfair dismissal.
Foxglove, a British organization, is supporting the legal action. Its founder Martha Dark explains: “The evidence is incontrovertible: Facebook moderation is a dangerous job that inflicts lifelong traumatic stress on nearly everyone who does it. In Kenya, 100 percent of the hundreds of former moderators tested were left traumatized.”
As for Meta, the company says it takes moderator safety seriously. Contracts with subcontractors outline requirements for consultation, training, on-site support and access to private health care. The social network says it offers above-market salaries and uses techniques such as blurring, sound suppression and monochrome rendering to limit exposure to violent content.