EU opens child safety probes of Facebook and Instagram, citing addictive design concerns


Facebook and Instagram are under formal investigation in the European Union over child protection concerns, the Commission announced Thursday. The proceedings follow a raft of requests for information sent to parent entity Meta since the bloc’s online governance regime, the Digital Services Act (DSA), began applying to the company last August.

The development could be significant, as the formal proceedings unlock additional investigatory powers for EU enforcers, such as the ability to conduct office inspections or apply interim measures. Penalties for any confirmed breaches of the DSA could reach up to 6% of Meta’s global annual turnover.

Meta’s two social networks are designated as very large online platforms (VLOPs) under the DSA. This means the company faces an extra set of rules — overseen by the EU directly — requiring it to assess and mitigate systemic risks on Facebook and Instagram, including in areas like minors’ mental health.

In a briefing with journalists, senior Commission officials said they suspect Meta of failing to properly assess and mitigate risks affecting children.

They particularly highlighted concerns about addictive design on its social networks, and what they referred to as a “rabbit hole effect”, where a minor watching one video may be pushed to view more similar content as a result of the platforms’ algorithmic content recommendation engines.

Commission officials gave examples of content promoting depression or an unhealthy body image as the types of material that could have negative impacts on minors’ mental health.

They are also concerned that the age assurance methods Meta uses may be too easy for kids to circumvent.

“One of the underlying questions of all of these grievances is how can we be sure who accesses the service and how effective are the age gates — particularly for avoiding that underage users access the service,” said a senior Commission official briefing press today on background. “This is part of our investigation now to check the effectiveness of the measures that Meta has put in place in this regard as well.”

In all, the EU suspects Meta of infringing DSA Articles 28, 34, and 35. The Commission will now carry out an in-depth investigation of the two platforms’ approach to child protection.

Meta has been contacted for a response.

The EU opened a similar probe into addictive design concerns on the video-sharing social network TikTok last month.

The Commission has also already opened two DSA investigations into Meta’s social networks: last month it said it would investigate separate concerns related to Facebook’s and Instagram’s approach to election integrity.


Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.
