Meta introduces enhanced safety measures for teen users on Facebook and Messenger, with parental permission required to live stream or turn off image protections.
Meta Expands Teen Accounts to Facebook and Messenger, with Restrictions in Place
The expansion of Teen Accounts from Instagram to Facebook and Messenger marks a significant step towards providing a safer experience for young users. The system involves putting younger teens on the platforms into more restricted settings by default, with parental permission required to live stream or turn off image protections for messages.
Teen accounts have become a significant aspect of online life, with many social media platforms offering features tailored to users under 18.
These accounts typically come with restrictions and parental controls designed to protect minors' safety and well-being.
With surveys suggesting that around 70% of teens aged 13 to 17 use social media, such accounts are an increasingly important part of today's digital landscape.
A New Era of Safety Features
Meta has introduced the changes in response to growing concerns about children and teenagers receiving unwanted nude or sexual images, or feeling pressured to share them in potential sextortion scams. The company claims that Teen Accounts have fundamentally changed the experience for teens on Instagram, but campaigners argue it is unclear what difference the accounts have actually made.
Parental Control and Accountability

The expanded rollout of Teen Accounts begins in the UK, US, Australia, and Canada from Tuesday. Companies whose services are popular with children have faced pressure to introduce parental controls or safety mechanisms to protect young users. In the UK, they also face legal requirements to prevent children from encountering harmful and illegal content on their platforms.
Parental control refers to the methods parents or guardians use to regulate a child's access to digital content, online activities, and screen time.
This can include setting limits on website visits, monitoring social media usage, and controlling exposure to mature themes in games and films.
One survey suggests that around 70% of parents use parental control software to monitor their child's online activity.
Effective parental control strategies are essential for promoting healthy digital habits and protecting children from online risks.
The new system will notify under-18s on Facebook and Messenger via in-app notifications that their account is becoming a Teen Account. Younger teens will also need parental consent to go live on Instagram or to turn off nudity protection, which blurs suspected nude images in direct messages. Meta says it has moved at least 54 million teens globally into Teen Accounts since they were introduced in September, with 97% of 13- to 15-year-olds keeping the built-in restrictions.
A Step in the Right Direction?
While some campaigners argue that Teen Accounts do little to prevent harm, others see them as a step in the right direction. Drew Benvie, chief executive of social media consultancy Battenhall, says Meta is fighting for the safest experience for teens, rather than simply chasing highly active user bases.
However, concerns remain over the company’s overall protections for young people from online harms and its data-driven practices. Prof Sonia Livingstone, director of the Digital Futures for Children centre, notes that while the expansion of Teen Accounts may be a welcome move, questions still need to be answered about Meta’s accountability for its effects on young people.
A Growing Desire for Age-Appropriate Social Media
There is a growing desire from parents and children for age-appropriate social media. The expansion of Teen Accounts may be a response to this demand, but it also highlights the need for greater transparency and accountability from companies like Meta. As the debate around online safety continues, one thing is clear: providing a safe experience for young users requires ongoing effort and commitment from tech companies.