UK Regulators Demand Stricter Age Controls on Social Media
Key Points
- UK regulators urge social media platforms to submit plans for stronger age restrictions by April 30.
- Concerns raised over children's exposure to harmful content online.
- Non-compliant companies could face significant fines.
In a push for child safety online, the UK's media and privacy regulators, including Ofcom and the Information Commissioner's Office (ICO), are demanding that major social media platforms, such as Meta, TikTok, Snap, and YouTube, strengthen their age verification processes. As part of the latest phase of Britain's Online Safety Act, these companies must present plans by April 30 to improve safety measures, particularly for algorithmic feeds that can expose children to dangerous and addictive content.
This regulatory pressure could reshape how social media platforms operate in the UK. With potential fines of up to 10% of global revenue for non-compliance, companies have a strong incentive to adopt measures restricting children under 16 from accessing their platforms. The move reflects urgent concerns about algorithmic safety and paves the way for stricter enforcement against companies that neglect minors' safety in the digital environment.