Instagram has announced the global rollout of enhanced content restrictions for teen accounts as part of efforts to improve online safety.
The restrictions, initially introduced in select countries including Australia, Canada, the United Kingdom, and the United States, limit exposure to content involving violence, nudity, drug use, and other sensitive themes.
The platform has also introduced a “Limited Content” setting, which applies stricter filters to posts and comments for younger users.
The move comes amid growing regulatory pressure on parent company Meta following legal challenges in the United States related to the impact of social media on teenagers.
Meta said the updated guidelines are intended to align with age-appropriate standards, while acknowledging that no system can completely eliminate exposure to inappropriate content.
The company has also introduced additional safety measures in recent months, including parental notifications for sensitive searches and expanded controls for AI features.