Instagram is rolling out new safety measures to shield teenagers from harmful content, headlined by a PG-13 content filter that will be applied automatically to all users under 18.
Under the new policy, teens will only see content appropriate for a PG-13 movie rating by default—blocking posts with extreme violence, sexual nudity, or graphic drug use. They will not be able to change this setting without parental approval.
Instagram is also introducing a “Limited Content” filter, which restricts underage users from viewing or commenting on posts that fall under stricter moderation. Additional rules will limit chats between teens and AI bots, following similar teen protections introduced earlier by chatbot companies such as OpenAI and Character.AI.
The company is blocking teen access to age-inappropriate accounts, removing such profiles from recommendations, and preventing interaction even if teens follow them. Instagram is also testing parental tools that allow adults to flag inappropriate content for review.
The new rules are rolling out in the U.S., U.K., Australia, and Canada, with global expansion planned for next year.














