Meta faces third lawsuit in Kenya as moderators claim illegal sacking and blacklisting


Meta is facing its third lawsuit in Kenya, as former content moderators claim they were illegally sacked and blacklisted by the social media giant. The moderators, who were employed by outsourcing firm Samasource, allege that they were dismissed without notice and were not paid their full dues.

The lawsuit, which was filed in a Kenyan court last week, seeks compensation for the affected moderators, who claim they were subjected to poor working conditions and unfair treatment while working for Samasource.

The moderators allege that they were expected to review up to 1,000 pieces of content per day, including graphic images and videos, without proper training or support. They also claim that they were subjected to verbal abuse and were not given adequate breaks or time off.

The moderators further claim that Meta blacklisted them, preventing them from finding work in the industry. They allege that Meta falsely accused them of leaking confidential information and sharing user data, despite there being no evidence to support these claims.

This is not the first time that Meta has faced legal action in Kenya. In 2019, the company was sued by a Kenyan activist over allegations that it had failed to protect users’ data and had facilitated the spread of hate speech on its platform.

In response to the latest lawsuit, Meta has stated that it takes the allegations seriously and is committed to ensuring that all of its moderators are treated fairly and with respect. The company added that it has taken steps to improve working conditions for content moderators, including providing them with better training and support.

The lawsuit is likely to be closely watched by human rights advocates and industry experts, who have raised concerns about the working conditions faced by content moderators and the impact of their work on their mental health.

Content moderation has become an increasingly important issue for social media companies, as they seek to address concerns about the spread of hate speech, misinformation, and other harmful content on their platforms. However, the work can be emotionally taxing and can take a toll on the mental health of moderators, who are often exposed to graphic and disturbing content.

The ongoing legal challenges faced by Meta in Kenya highlight the need for social media companies to take a more proactive approach to ensuring that their content moderators are treated fairly and with respect. This includes providing them with adequate training, support, and resources, as well as implementing measures to protect their mental health and wellbeing.

Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.
