Tucker Talks Taboos After MSM Ignores Instagram Kiddie-Porn Bombshell

Via ZeroHedge

After his first episode topped 100 million views, Tucker Carlson is back with Episode 2, exploring how we, as a population, are controlled (or coerced) directly (through laws) or indirectly (through taboos).

Carlson observes the changing societal taboos in America, suggesting that they are being dictated from above rather than evolving organically, and focuses specifically on the shift in attitudes toward race-based attacks, adultery in politics, and child molestation.

“Let’s say you wanted to control a country,” the former Fox News man begins rather jarringly.

“Well,” he explains, “you’d want to make sure you had the complete obedience of everybody within your borders who was authorized to use deadly force… you’d start with the military… [and other agencies] like the IRS.”

“Controlling the guns would be a top priority for you if you ever wanted to go dictatorial.”

But, Carlson asks, what if you wanted more: not simply to control people’s behavior, “but to control how they think”?

“In that case,” he remarks, “you’d need to take charge of its taboos.”

A taboo is something that, by popular consensus, is not allowed. It is not illegal; it doesn’t need to be.

“Over time, social prohibitions are more powerful and more enduring than laws.”


On Section 230 and Instagram’s child pornography problem

Guest Post by Alex Berenson

The White House and social media companies care more about censoring views they don’t like than the facilitation of child porn – and rape – on their platforms

Instagram has a problem with child sexual abuse.

Instagram – and its parent company, Meta Platforms, which also owns Facebook – do not seem to care.

The Wall Street Journal ran a devastating piece today on child pornography and rape networks that Instagram does not merely tolerate but facilitates.

The piece is filled with ugly revelations from top to bottom. Near its end, the reporters note that “Instagram’s [automated] suggestions were helping to rebuild [a pedophile] network that the platform’s own safety staff was in the middle of trying to dismantle.”

Even worse, they reveal that Instagram would allow users to see posts it knew might be harmful or illegal, after a short warning:
