
Instagram Algorithm Promotes ‘Vast Pedophile Network,’ WSJ Reports

Instagram’s recommendation algorithms are connecting a “vast pedophile network” that promotes the sale of “child-sex material,” according to a bombshell report from the Wall Street Journal.

“Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests, the Journal and the academic researchers found.”

The Wall Street Journal

Instagram has allowed users to search hashtags such as #pedowhore, #preteensex, and #pedobait to connect users with accounts that sell child-exploitation materials. These accounts would post content “menus” for buyers to choose from, with some accounts going as far as advertising “meet-ups” in person with children.

While some hashtags that are known to be associated with pedophilic material have been taken down by Instagram, other terms were still permitted to be searched after a simple warning to the user that the content may contain images of child sexual abuse.

In many cases, Instagram has permitted users to search for terms that its own algorithms know may be associated with illegal material. In such cases, a pop-up screen for users warned that “These results may contain images of child sexual abuse,” and noted that production and consumption of such material causes “extreme harm” to children. The screen offered two options for users: “Get resources” and “See results anyway.” 

The Wall Street Journal

Researchers from Stanford University worked closely with the Journal to investigate the child exploitation occurring on Instagram. They found that the solicitation of child-sex-abuse material is far more prevalent on Instagram than it is on other platforms such as Twitter.

The Stanford team found 128 accounts offering to sell child-sex-abuse material on Twitter, less than a third the number they found on Instagram, which has a far larger overall user base than Twitter. Twitter didn’t recommend such accounts to the same degree as Instagram, and it took them down far more quickly, the team found.

The Wall Street Journal

Meta, the owner of Instagram, stated that the company has taken down over two dozen pedophile networks along with thousands of hashtags that sexualize children.

The company stated that it is continuously working to prevent its systems from recommending potential child-sex material, but its automated screening can’t detect “new images or efforts to advertise” the sale of such material. This has prompted Alex Stamos, the head of the Stanford Internet Observatory and Meta’s former chief security officer, to call for Meta to “reinvest in human investigators.”

The publishing of child-sex material goes against Meta’s terms of service and is an obvious violation of federal law.

“Child exploitation is a horrific crime,” a Meta spokesperson said in a response to the Journal. “We’re continuously investigating ways to actively defend against this behavior.”

The report from the Journal comes two months after law enforcement agencies warned Meta that its “end-to-end encryption” could prevent them from effectively detecting child-sex material on Meta’s platforms.

Researched and edited by Hayden Cunningham
