Instagram is implementing new restrictions to protect underage users: teens under 18 will primarily see content in line with PG-13 ratings, screening out themes such as extreme violence and drug use. These settings cannot be changed without parental approval.
A stricter 'Limited Content' filter will prevent teens from seeing, or posting comments on, restricted posts. Instagram will also place tighter limits on AI chatbot interactions with teens, following similar moves by OpenAI and Character.AI, both of which face lawsuits over alleged harm to users.
Instagram is expanding its teen safety tools across accounts, DMs, search, and content, blocking age-inappropriate accounts and posts from being recommended to or viewed by teenagers. The platform is also testing a system that lets parents flag inappropriate content.
These changes are rolling out in the U.S., U.K., Australia, and Canada, with a global rollout planned for next year. They build on Meta's existing restrictions on content related to eating disorders and self-harm.