Instagram Helps Pedophiles Find Child Pornography and Arrange Meetups with Children (MUST READ)

The Wall Street Journal investigation found that Instagram enabled people to search for hashtags such as '#pedowhore' and '#preteensex,' which connected them to accounts selling child pornography.

Furthermore, many of these accounts claimed to be run by children themselves, with handles like "little slut for you."

Generally, the accounts that sell illicit sexual material don't outright publish it. Instead, they post 'menus' of their content and allow buyers to choose what they want.

Many of these accounts also offer customers the option to pay for meetups with the children.

HOW THIS WAS ALL UNCOVERED:

The researchers set up test accounts to see how quickly Instagram's "suggested for you" feature would recommend accounts selling child sexual content.

Within a short time frame, Instagram's algorithm flooded the test accounts with content that sexualizes children, some of it linking to off-platform content-trading sites.

Using hashtags alone, the Stanford Internet Observatory found 405 sellers of what researchers labeled "self-generated" child-sex material, or accounts purportedly run by children themselves, with some claiming to be as young as 12.

In many cases, Instagram permitted users to search for terms that its own algorithms recognized might be associated with illegal material.

When researchers used certain hashtags to find the illicit material, a pop-up would sometimes appear on the screen, saying, "These results may contain images of child sexual abuse" and noting that the production and consumption of such material cause "extreme harm" to children.

Despite this, the pop-up offered the user two options:

1. "Get resources"
2. "See results anyway"

HOW PEDOPHILES EVADED DETECTION:

Pedophiles on Instagram used an emoji system to talk in code about the illicit content they were facilitating.

For example, a map emoji (🗺️) stood for "MAP," short for "minor-attracted person."

A cheese pizza emoji (🍕) stood for "CP," shorthand for "child porn."

Accounts often identified themselves as "seller" or "s3ller" and indicated the ages of the children being exploited with phrases such as "on Chapter 14" rather than stating an age outright.

INSTAGRAM "CRACKDOWN":

Even when posts were reported, not all of them were taken down. For example, when an image of a scantily clad young girl with a graphically sexual caption was reported, Instagram responded, "Our review team has found that [the account's] post does not go against our Community Guidelines."

Instagram instead recommended that the user hide the account to avoid seeing its content.

Even after Instagram banned certain hashtags associated with child pornography, its AI-driven hashtag suggestions found workarounds.

The suggestion feature would recommend that the user try variations of the banned search, appending words such as "boys" or "CP."

The Stanford team also conducted a similar test on Twitter.

INSTAGRAM VS TWITTER:

While they still found 128 accounts offering to sell child-sex-abuse material (less than a third of the number found on Instagram), they also noted that Twitter's algorithm did not recommend such accounts to the same degree as Instagram's, and that Twitter took them down far more quickly.

@elonmusk just tweeted the Wall Street Journal's article 2 minutes ago, labelling it "extremely concerning."

Unfortunately, as algorithms and AI grow smarter, cases like this are becoming more common.

In 2022, the National Center for Missing & Exploited Children in the U.S. received 31.9 million reports of child pornography, mostly from internet companies, a 47% increase from two years earlier.

How can social media companies, especially Meta, get better at regulating AI in order to prevent disgusting cases such as this one?