Meta sued by content moderators for psychological trauma caused by exposure to violent and graphic content 

Over 20% of the people employed by CCC Barcelona Digital Services, a company contracted by Meta to monitor content on Facebook and Instagram, are currently on sick leave due to psychological trauma. 

The only hiring requirement for content moderators in Barcelona was proficiency in the relevant language. On the surface, this appeared to be an attractive job opportunity, offering a monthly salary of up to €2,400 in exchange for reviewing 300 to 500 videos each day.

However, for many of those who accepted these positions, the experience turned out to be profoundly distressing.

The moderators tasked with evaluating material posted on these social networks encountered distressing imagery depicting the darkest aspects of human behavior, including videos of murders, dismemberments, sexual assaults, and live suicides.

These workers have strongly criticized the working conditions imposed by CCC Barcelona Digital Services, which have exposed them to severe mental health challenges such as post-traumatic stress disorder, obsessive-compulsive disorder, and depression.

Employees have taken their grievances to the High Court in Ireland, where Meta maintains its European headquarters. One of their main complaints is the lack of access to psychological support.

Furthermore, the content moderators claim they were given no mental health assessment before being hired, screening that could have identified candidates unsuited to such a role because of pre-existing conditions.

Many of the affected workers are still receiving psychological treatment years after leaving the job. Lawyers have raised two significant concerns: Meta’s policy of requiring employees to watch videos in full in order to provide detailed justifications for removal, and the requirement to review the same footage repeatedly when multiple users report it.

They argue that if the grounds for removal are apparent within the first ten seconds, there is no need to watch the entire video, let alone to watch it again for every new report of the same content.

According to Diane Treanor, a lawyer with Coleman Legal who is representing the content moderators, “We have clients from Ireland, Poland, Germany and Spain. Some of them realised the job was hurting them after a few weeks, others claim they didn’t realise the impact until their family and friends told them their personalities had changed.

When watching a video, many of these people collapsed, they couldn’t go on. The psychologist would listen to them and then tell them that what they were doing was extremely important for society, that they had to imagine that what they were seeing was not real but a film, and that they should go back to work.”

Facebook claims that there are “technical solutions to limit exposure to graphic material as much as possible”.

The company added: “We take the support of content reviewers seriously and require all companies we work with to provide 24/7 on-site support from trained practitioners. People who review content on Facebook and Instagram can adjust the content review tool so that graphic content appears completely blurred, in black and white, or without sound.”

