Instagram, Facebook Introduce Measures to Conceal Harmful Content from Teen Users
- Cameron Palmer
- January 10, 2024
- Technology
Meta announced on Tuesday that it would hide more sensitive content from teens on Instagram and Facebook, responding to global regulatory pressure on the social media giant to protect young people from harmful material on its platforms.
According to Meta, the change will make it harder for minors to encounter sensitive content related to suicide, self-harm, and eating disorders when using Instagram features such as search and explore.
All teenagers' accounts on Instagram and Facebook will be placed in the most restrictive content control settings by default, and additional search terms on Instagram will be restricted, Meta said in a blog post.
“We want teens to have safe, age-appropriate experiences on our apps,” the blog post states. “Today, we’re announcing additional protections that are focused on the types of content teens see on Instagram and Facebook.”
Even if a teenager follows an account that posts about sensitive themes, those posts will be removed from the teen's feed, according to Meta's blog. The company said the measures, scheduled to roll out in the coming weeks, will help provide a more "age-appropriate" experience.
“For example, consider someone who posts about their continuing struggle with suicidal ideation. This is an important story that can help to de-stigmatize these concerns, but it is a complicated topic that may not be appropriate for many young people.
“We will now begin to remove this type of content from teens’ Instagram and Facebook experiences,” according to the company’s blog post.
Meta Under Fire for Alleged Inaction on Teen Harassment
Meta is under fire in both the United States and Europe over allegations that its apps are addictive and have contributed to a youth mental health crisis. Attorneys general from 33 US states, including California and New York, sued the company in October, alleging that it repeatedly misled the public about the dangers of its platforms.
In Europe, the European Commission has asked how Meta protects children from illegal and harmful content. The regulatory pressure followed testimony in the US Senate by former Meta employee Arturo Bejar, who said the company was aware of harassment and other harms teens faced on its platforms but failed to act on them.
Bejar urged the firm to make design adjustments to Facebook and Instagram to steer users toward more positive activities and provide better tools for young people to deal with negative experiences. Bejar stated that his own daughter had experienced unwelcome advances on Instagram, which he brought to the attention of the company’s senior leadership. He testified that Meta’s top leadership rejected his pleas.
Children have long been an enticing demographic for businesses, which hope to attract them as consumers at a young age and build brand loyalty early.
Attracting teenagers may also help Meta draw more advertisers. The company has been locked in a tough battle with TikTok for young users in recent years, hoping that those users will stay on its products as they grow older.