On Section 230 and Instagram’s child pornography problem

Guest Post by Alex Berenson

The White House and social media companies care more about censoring views they don’t like than about the facilitation of child porn – and rape – on their platforms

Instagram has a problem with child sexual abuse.

Instagram – and its parent company, Meta Platforms, which also owns Facebook – do not seem to care.

The Wall Street Journal ran a devastating piece today on child pornography and rape networks that Instagram does not merely tolerate but facilitates.

The piece is filled with ugly revelations from top to bottom. Near its end, the reporters note that “Instagram’s [automated] suggestions were helping to rebuild [a pedophile] network that the platform’s own safety staff was in the middle of trying to dismantle.”

Even worse, they reveal that Instagram would allow users to see posts it knew might be harmful or illegal, after a short warning:

(“See results anyway.” Unbelievable but true: this is real.)


I’d call the piece an exposé, except other outlets have published similar reports on Instagram’s enabling of child sexual abuse for years, without any effective response from Instagram, Meta, or Facebook. If anything, the problem appears to have worsened since 2020, when school closures left children prey to abusive adults.

The Journal article makes clear that the problem is not that the users posting this content are sophisticated or technologically savvy. They are not using encryption or even trying very hard to hide the content:

The pedophilic accounts on Instagram mix brazenness with superficial efforts to veil their activity, researchers found. Certain emojis function as a kind of code, such as an image of a map—shorthand for “minor-attracted person”—or one of “cheese pizza,” which shares its initials with “child pornography.”

The users don’t try harder to hide what they’re doing because they can’t – concealment would defeat their purpose of chasing new buyers and users. Instagram’s virtue, for them, is that it is wide open.

But why doesn’t Instagram try harder?

Assuming the answer is not that Meta and Instagram are run by a pedophile cabal – and let’s all hope that’s not the answer – the reason is that they don’t have to. The previous stories generated a day or two of bad press and then vanished.

Meanwhile, the infamous Section 230 of the Communications Decency Act gives social media companies essentially complete immunity for user-generated content.

Even a 2018 law, the Fight Online Sex Trafficking Act (FOSTA) – which, as its name implies, was meant to increase the legal liability the companies face – has hardly pierced Section 230’s legal veil.

Last year, the federal 9th Circuit Court of Appeals dismissed a claim from women who said the bulletin-board site Reddit had allowed images of their sexual abuse as minors to circulate. And on May 30, the Supreme Court declined to hear the case – again refusing to set any limits on Section 230 and the protection it gives the companies.

(Smile for the camera, kiddo!)


This issue incenses me not just because I have three kids but because I know firsthand that social media platforms can move quickly to ban content when it bothers them. Instagram has repeatedly taken down posts of mine that are nothing more than screenshots of my Substack articles reporting on the mRNA Covid vaccines.

But I am far from alone. During Covid, Instagram and Facebook heavily censored anti-lockdown posts. Facebook even banned posts on the lab leak theory until late May 2021.

Instead of putting the same effort into stopping child pornography – and even the use of their networks to arrange real-world physical sexual abuse of minors – Facebook and Instagram appear to be doing the minimum possible, relying on automated systems that match uploaded images against an existing database of known child sexual abuse photos and videos.
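To see why database matching is such a narrow defense, here is a minimal Python sketch of the general approach described above. It is a deliberate simplification, and the names are hypothetical: production systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas this sketch uses an exact cryptographic hash.

```python
import hashlib

# Hypothetical database of fingerprints of already-known abusive images,
# analogous to the industry hash lists maintained with groups like NCMEC.
KNOWN_HASH_DB: set[str] = {
    "placeholder-digest",  # a real list holds millions of entries
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest that identifies this exact file."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_match(image_bytes: bytes) -> bool:
    """True only if the upload re-posts an already-catalogued image."""
    return image_fingerprint(image_bytes) in KNOWN_HASH_DB
```

The structural weakness is visible in the last line: an image that is not already in the database – anything newly produced, or altered enough to change its fingerprint – matches nothing and is never flagged, which is why this kind of matching catches re-uploads rather than new abuse.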

Facebook may have concluded that using human moderators to examine images and hashtags would expose it to legal liability for pornography. Worse, it may have decided that setting strict automated limits would risk making it harder for the bikini models who have some of Instagram’s largest audiences to post new glamour shots.

(21,559 likes. Willow Hand is 24, but you get the point. So does panda_wants_gummibears.)

The great irony here is that Section 230 explicitly allows social media companies to move against sexually abusive content – its subsection (c)(2) allows bans of “obscene” material in “good faith.”

But the companies would rather rely on the broader protections the federal 9th Circuit and other courts have said the law’s subsection (c)(1) gives them. Courts interpret Section 230 as allowing the companies both to censor content and users whenever they like and to avoid any liability for the content they do allow.

They have the best of both worlds, and they use it. Unless the Supreme Court restricts Section 230, it seems that only boycotts – and possibly legal and Congressional investigations of top executives – will cause Meta and Instagram to tighten their rules against pedophiles.

Legal immunity is a hell of a drug.

-----------------------------------------------------
4 Comments
Swrichmond
June 8, 2023 7:32 am

‘or one of “cheese pizza,” which shares its initials with “child pornography’

Comet Pizza

Dangerous Variant
June 8, 2023 9:04 am

It is typical globocapital BS that these companies are both “platforms”, with no obligation to editorialize content, and “media outlets”, with the right to editorialize because free market, depending on the nature of said content.

The same thing is rampant in the banking industry with their straddling of various functions that violate basic regulatory principles except when they don’t because reasons of some smoke and mirrors ‘perspective’.

Even so these platforms are null if people would stop using them. “But I don’t want my daughter to be some weirdo or outcast for not having a smartphone/FB/Insta/Tok etc etc etc” is a problem a lot closer to home. The irony is lost on all these parents complaining about creepy creeps creeping on their kids who are posting all kinds of stuff that would have been inappropriate not that long ago in the real world. But then most of these parents will shill out their kids for likes just the same. The whole “culture” is sick.

WilliamtheResolute
June 8, 2023 10:53 am

All media platforms are CIA honeytraps…these creatures only prosecute those that threaten some aspect of the cabal. I advocate hanging the robot Zuckerberg by his genitals at the castle gate as a warning to all the pedo dirtbags…worked in the medieval era.

WTF
June 8, 2023 4:26 pm

Deleted my facebook account years ago and have never had a twitter or instagram account.