Teenagers’ Livestreaming on Instagram Blocked Due to Safety Concerns
Meta is expanding its safety measures for teenagers on Instagram, introducing a block on livestreaming by under-16s. The social media company is also extending its under-18 safeguards to the Facebook and Messenger platforms.
Launched in 2010 by Kevin Systrom and Mike Krieger, Instagram is a photo- and video-sharing social networking platform.
With over 1 billion active users, it has become one of the most popular social media platforms worldwide.
Instagram's distinctive features, such as filters and hashtags, have made it a favourite among users.
The platform allows users to share moments from their lives, connect with others, and discover new content.
Acquired by Facebook in 2012, Instagram continues to evolve with new features and updates.
The changes come as part of an effort to protect children from harm and require tech platforms to shield under-18s from damaging material such as suicide and self-harm-related content. According to Meta, 54 million under-18s use Instagram, with more than 90% of 13- to 15-year-olds keeping their default restrictions.
New Safety Features for Teenagers on Facebook and Messenger
The new features will be rolled out initially in the US, UK, Australia, and Canada. On Facebook and Messenger, users under the age of 16 will need parental permission to change their safety settings, while 16- and 17-year-olds will be able to change them independently.

The expansion is part of a broader effort by Meta to enhance safety on its platforms. The company’s then president of global affairs, Nick Clegg, previously stated that the aim was to ‘shift the balance in favour of parents’ when it came to using parental controls.
Meta builds technologies that help people connect, find communities and grow businesses.
The company's primary products include Facebook, Instagram, Threads and WhatsApp, in addition to other products and services.
With more than 2 billion monthly active users across its platforms, Meta is one of the world's most widely used technology services.
Its technologies aim to give people the power to build community and bring the world closer together.
UK Implements Online Safety Act
The announcement comes as the UK implements the Online Safety Act, which requires sites and apps within its scope to take steps to prevent the appearance of illegal content. The act also contains provisions for protecting children from harm and shielding under-18s from damaging material.
The Online Safety Act is a UK legislative framework designed to protect internet users from online harm.
It aims to regulate social media platforms and online services to ensure user safety.
Key provisions include mandatory age verification for users under 18, stricter moderation policies, and penalties for non-compliance.
The act also requires online services to report 'terrorist content' and child exploitation material to authorities.
By balancing free speech with protection from harm, the Online Safety Act seeks to create a safer digital environment.
Reports have suggested that the act could be watered down as part of a UK-US trade deal, prompting protests from child safety groups. However, Meta’s decision to expand its safety measures on Instagram is seen as a positive step towards protecting young users online.
- theguardian.com | Meta blocks livestreaming by teenagers on Instagram