Instagram Reels Algorithm Slammed for Serving 'Salacious Content' Involving Minors

A new report found the platform offers 'risqué footage of children' alongside advertisements from Pizza Hut, Disney and Bumble


The algorithm powering Instagram has come under new scrutiny after it was observed directing adults with a potentially inappropriate interest in children to underage accounts with large followings. 


Users are directed to content, including the short-form videos known as Reels, based on what the algorithm determines to be their interests. 

The Wall Street Journal ran an experiment to determine what content the platform’s algorithm would direct to users who follow teenage and preteen social media influencers.

The newspaper “set up the test accounts after observing that the thousands of followers of such young people’s accounts often include large numbers of adult men, and that many of the accounts who followed those children also had demonstrated interest in sex content related to both children and adults.”

The Instagram Reels algorithm sent the outlet’s test accounts “a video of someone stroking the face of a life-size latex doll and a video of a young girl with a digitally obscured face lifting up her shirt to expose her midriff” as well as “a video of a man lying on a bed with his arm around what the caption said was a 10-year-old girl.”

In between the videos, the test accounts were served ads from major brands like the dating app Bumble, Disney and Pizza Hut. 

The Canadian Centre for Child Protection ran a similar test and documented the same content patterns. 

Meta, the parent company of both Instagram and Facebook, promised to fix its algorithm in June after it was found to promote pedophilic content during a separate investigation by The Journal and researchers at Stanford University and the University of Massachusetts Amherst.

"Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests," stated the report. "Though out of sight for most on the platform, the sexualized accounts on Instagram are brazen about their interest.”

In addition to being illegal, the promotion of sexually explicit content involving minors violates Meta’s terms of service. 

"Child exploitation is a horrific crime," Meta said in a statement, per Engadget. "We’re continuously investigating ways to actively defend against this behavior."

The technology company said it is working to take down all child sexual abuse material networks and has blocked thousands of related hashtags. 

Concerns about the content and the potential placement of advertisements near it prompted several companies to pull back from the platform. The online dating service Match canceled Meta advertisements for some of its apps, including Tinder, in October. 

“We have no desire to pay Meta to market our brands to predators or place our ads anywhere near this content,” Match spokeswoman Justine Sacco told The Journal.

Samantha Stetson, Meta’s vice president for Client Council and Industry Trade Relations, told Fox News that the company continues to “invest aggressively to stop” the spread of such content, “which remains very low.”

“Our systems are effective at reducing harmful content, and we’ve invested billions in safety, security and brand suitability solutions,” she said following the latest Journal report.

"These results are based on a manufactured experience that does not represent what billions of people around the world see every single day when they use our products and services," Stetson added. "We tested Reels for nearly a year before releasing it widely - with a robust set of safety controls and measures. In 2023, we actioned over 4 million Reels per month across Facebook and Instagram globally for violating our policies."
