Meta plans to hide more content from teenagers on Instagram and Facebook
Responding to growing regulatory pressure over teen safety online, Meta announced on Tuesday that it will impose stricter content controls for young users on Instagram and Facebook. Among the measures, Instagram will restrict certain search terms, making it harder for teenagers to encounter content related to sensitive topics such as suicide, self-harm, and eating disorders through features like Search and Explore.
Meta is enrolling teens automatically, applying the strictest content control settings to all teenagers on both Instagram and Facebook. Initially introduced for new teen users only, this setting will now extend to existing teenage users on the platforms.
In a further effort to enhance safety, Instagram is introducing new notifications tailored to teenagers, prompting them to regularly review and adjust their safety and privacy settings. By choosing "Turn on recommended settings," teens can update their preferences with a single tap, triggering automatic adjustments such as limits on reposting permissions, tagging, mentions, and inclusion in Reels Remixes. The update also ensures that only followers can send messages and that offensive comments are more effectively hidden.
Expected to be implemented over the coming weeks, these changes are designed to curate a more "age-appropriate" experience. Meta emphasized its commitment to removing certain sensitive content from teenagers' experiences on Instagram and Facebook, striving to strike a balance between fostering important conversations and protecting young users from inappropriate material.
However, Meta is currently facing heightened scrutiny in the United States and Europe, with allegations of its apps contributing to addictive behavior and a youth mental health crisis. In October, a lawsuit involving 33 U.S. states' attorneys general accused Meta of misleading the public about platform dangers, while the European Commission sought information on how Meta safeguards children from illegal and harmful content. Despite these efforts, Meta's recent changes faced criticism from a former employee who testified in the U.S. Senate, asserting that the company's initiatives fell short in addressing concerns and lacked effective reporting tools for teens facing harassment.
As Meta grapples with regulatory challenges, the competitive landscape with platforms like TikTok for younger users continues to evolve, underscoring the crucial need for prioritizing safety on social media platforms catering to teenagers.
Sources: adapted from an article by Maya Robertson, Author for Mobile Marketing Reads.