UK details requirements to protect children from ‘toxic algorithms’

The UK is calling on search and social media companies to “tame toxic algorithms” that recommend harmful content to children, or risk billions in fines. On Wednesday, the UK’s media regulator Ofcom outlined over 40 proposed requirements for tech giants under its Online Safety Act rules, including robust age-checks and content moderation measures aimed at better protecting minors online.

“Our proposed codes firmly place the responsibility for keeping children safer on tech firms,” said Ofcom chief executive Melanie Dawes. “They will need to tame aggressive algorithms that push harmful content to children in their personalized feeds and introduce age-checks so children get an experience that’s right for their age.”

Specifically, Ofcom wants to prevent children from encountering content related to things like eating disorders, self-harm, suicide, pornography, and any material judged violent, hateful, or abusive. Platforms also have to protect children from online bullying and promotions for dangerous online challenges, and allow them to leave negative feedback on content they don’t want to see so they can better curate their feeds.

Bottom line: platforms will soon have to block content deemed harmful in the UK even if it means “preventing children from accessing the entire site or app,” says Ofcom.

The Online Safety Act allows Ofcom to impose fines of up to £18 million (around $22.4 million) or 10 percent of a company’s global revenue, whichever figure is greater. That means large companies like Meta, Google, and TikTok risk paying substantial sums. Ofcom warns that companies that don’t comply can “expect to face enforcement action.”

Companies have until July 17th to respond to Ofcom’s proposals before the codes are presented to Parliament. The regulator is set to release a final version in spring 2025, after which platforms will have three months to comply.



Source link

Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.
