UK details requirements to protect children from ‘toxic algorithms’

The UK is calling on search and social media companies to “tame toxic algorithms” that recommend harmful content to children, or risk billions in fines. On Wednesday, the UK’s media regulator Ofcom outlined more than 40 proposed requirements for tech giants under its Online Safety Act rules, including robust age checks and content moderation measures aimed at better protecting minors online under the incoming digital safety rules.

“Our proposed codes firmly place the responsibility for keeping children safer on tech firms,” said Ofcom chief executive Melanie Dawes. “They will need to tame aggressive algorithms that push harmful content to children in their personalized feeds and introduce age-checks so children get an experience that’s right for their age.”

Specifically, Ofcom wants to prevent children from encountering content related to eating disorders, self-harm, suicide, and pornography, as well as any material judged violent, hateful, or abusive. Platforms will also have to protect children from online bullying and promotions of dangerous online challenges, and allow them to leave negative feedback on content they don’t want to see so they can better curate their feeds.

Bottom line: platforms will soon have to block content deemed harmful in the UK even if it means “preventing children from accessing the entire site or app,” says Ofcom.

The Online Safety Act allows Ofcom to impose fines of up to £18 million (around $22.4 million) or 10 percent of a company’s global revenue, whichever is greater. That means large companies like Meta, Google, and TikTok risk paying substantial sums. Ofcom warns that companies that don’t comply can “expect to face enforcement action.”

Companies have until July 17th to respond to Ofcom’s proposals before the codes are presented to Parliament. The regulator is set to release a final version in spring 2025, after which platforms will have three months to comply.

