EU Commission Criticizes Meta's Child Protection Compliance

Key Takeaways
- EU claims Meta underestimates child safety risks on platforms
- Meta fails to meet Digital Services Act obligations
- Increased scrutiny may reduce child access to social media
The European Commission has officially concluded that Meta underestimates the risks children face on its platforms. The finding is a prelude to potential penalties under the Digital Services Act (DSA), as Meta reportedly fails to ensure adequate protections for underage users despite the significant number of children accessing its services. Currently, around 12% of users on Meta's social media platforms are children, raising concerns about the effectiveness of the company's safety measures.
The finding underscores a broader shift in the regulatory landscape for online child safety. The Commission emphasizes that user agreements must translate into concrete protection measures rather than serve as mere formalities. This scrutiny could compel Meta to implement more stringent safeguards, potentially reducing children's access to its platforms in line with EU regulations, while also drawing attention to wider questions of data sovereignty and user accountability in the digital age.