Is Meta Doing Enough to Shield Teenagers From Inappropriate Content Online?

By Tom Linder

In an effort to protect teenagers from inappropriate posts, social media giant Meta announced that it is implementing new policies to block content related to suicide, self-harm, and eating disorders from showing up in teens' feeds.

The announcement was similar to ones the company has made in the past, so Illinois Institute of Technology Associate Professor of Psychology Alissa Haedt-Matt remains unconvinced that the parent company of Facebook, Instagram, Threads, and WhatsApp is doing enough to protect teenagers.

"In order to really evaluate whether they're doing enough, we need a lot more transparency on what exactly is happening," says Haedt-Matt, who is the director of Illinois Tech's Dysregulated Eating and Mood Team (formerly the Eating Behaviors Lab). "When I read the policy, I thought they were already doing this; it's a little unclear to me what exactly has changed, if anything, with this new policy. Meta has long said they remove eating disorder content and will blur out self-harm or suicide-related content as well."

The new policy announcement comes on the heels of testimony given in November 2023 by Arturo Bejar, Meta's former director of engineering. While Bejar's testimony primarily addressed unwanted advances received by teenagers on Meta's apps, he also expressed concern about the company's lack of transparency.

So, what can Meta do in an effort to earn back some of the public鈥檚 trust?

"I think they could do a lot more, especially in terms of being transparent about their algorithms, about how they're identifying eating disorder content," says Haedt-Matt. "From my perspective, it's rather insidious content for somebody who's vulnerable to developing an eating disorder, often disguised as health promotion. You know, the 'thinspiration,' 'fitspiration,' some of that is disguised as health promotion, but really looks a lot like eating disorder content.

"There needs to be transparency about what they're finding and how they're using this information to protect vulnerable groups, especially teens."