Tech giant Meta, the parent company of Instagram, has announced a major policy change aimed at making its platform safer for teenagers. Content shown to younger users on Instagram will soon be filtered according to a movie-style PG-13 standard, limiting what they can view online.
Under this system, Meta will automatically place all teen accounts in the restricted, PG-13-guided setting, blocking explicit, violent, or drug-related content. Teens will not be able to loosen these safety settings on their own.
The decision follows growing pressure from advocacy groups and lawsuits that accused Meta of failing to protect teenagers from harmful and sexually suggestive content. The new guidelines, based on Motion Picture Association standards, will also extend to Meta’s AI tools.
Meta had previously introduced several safety measures, but many proved ineffective. Reports even found that Meta's chatbots sometimes displayed romantic or sensual behaviour toward users, prompting further criticism.
Parents and child-safety advocates have largely welcomed Meta’s latest decision, calling it a positive step toward shielding young users from psychologically harmful online exposure.