I’ve been working on a story for a few months that I’m excited to share. It’s about a 21-year-old guy who was radicalized into the far-right, with help from his YouTube recommendations.
Didn’t realize that exposure to the ideas of Nobel Laureate economist Milton Friedman put one in danger of alt-right brainwashing. But there he is, on the front page, in a collage for a story titled The Making of a YouTube Radical.
Today on A1: the story that has ruined my mentions for 24 hours and my YouTube recommendations for much longer than that. Very proud to work at a place that takes internet culture this seriously.
This deep-dive may be the best story I’ve read about how YouTube’s algorithm became a grooming engine for the alt-right and reactionary cultural politics generally.
A crucial story of our times. New technologies are driving profound cultural shifts, but it appears few are as powerful as YouTube’s recommendation engine and its effect on young men.
What those who get radicalized had in common was watching YouTube, says this NYT article. A generation ago what those who got radicalized had in common was reading books. Solution: ban those books? No.
An outstanding, important read. Parents, especially parents of teens, need to read this to understand these patterns. It’s a crisis. People are exploiting listless young men for profit. And it usually starts with lonely teens.
Caleb Cain was a college dropout looking for direction. He was then pulled into YouTube’s far-right universe, watching thousands of videos filled with conspiracy theories, misogyny and racism. "I was brainwashed."
THREAD: This article manages the unholy trio of the Big Media/Big Tech collusion: defamation, de-platforming, and discriminatory gatekeeping. First, it defames. 1/
this piece on a young man's indoctrination into the right fringes via YouTube (and how he was pulled out of it...also via YouTube) is stunning. Includes a full analysis of the guy's YouTube viewing history over a number of years. Read it.
There's a lot of fear-mongering about algorithms in here, but given it was left-wing videos that saved this guy from the alt-right, it would seem the overarching message is actually just "the answer to bad speech is more speech."
The subject of this article grew up in a broken home with an absent father. Now he wants Daddy (big tech) to protect him from mean, scary videos. Like most of his generation he’s lost, and the leftist content he found will leave him lacking.
Massive hit piece on YouTubers in today's New York Times. The latest wave of censorship is just getting started. They smell blood.
YouTube is radicalising young men, and the first step is usually anti-feminism.
The elite media continues its quest to pressure tech companies into more stringently regulating speech, in the name of combating “extremism” and “hate” -- whatever that means
They are coming for everyone not toeing the leftist line. Although not mentioned by name in the article, photos of certain creators are displayed, and one even gets a shout-out. We're entering very strange times.
Read this tremendous piece, which is exactly how it goes: The Making of a YouTube Radical - The New York Times
hello parents! i know things are really hard and you are (understandably) giving kids and teens lots of screen time. but, *please be careful* about what your teens are looking at on youtube. the far right is always recruiting. don't let this be your kid
This may be the most nakedly partisan story the paper has ever published, totally devoid of any journalistic value. Looking forward to breaking it down on today's show.
There’s been some renewed interest in this story because of a poorly designed study that claimed to debunk it. A few points! 
Mind you, this is from an American perspective. But the same engine and dynamics apply to all extremisms in the world. From India to Europe to the Middle East. Wherever there's extremism, YouTube's AI will make sure to put it on steroids.
"The new A.I., known as Reinforce, was a kind of long-term addiction machine. It was designed to maximize users' engagement over time by predicting which recommendations would expand their tastes and get them to watch not just one more video but many more."
Kevin interviewed Caleb about his radicalization. To back up his recollections, Caleb downloaded and sent Kevin his entire YouTube history, a log of more than 12,000 videos dating to 2015:
"The common thread in many of these stories is YouTube and its recommendation algorithm." The Making of a YouTube Radical
Social media has transformed society/civilization in ways we are only beginning to fully appreciate. This piece takes a deep look at one important aspect of the shift, courtesy of YouTube.
There are “countless versions” of Cain’s story: “aimless young man — usually white, frequently interested in video games” seduced into communities propagating misogyny, homophobia, racism and more. This piece is familiar to anyone who understood Gamergate.
YouTube has inadvertently created a dangerous on-ramp to extremism by combining a business model that rewards provocative videos with exposure & advertising dollars, and an algorithm that guides users down paths meant to keep them glued to their screens
can't make this stuff up. The name YouTube gave its internal AI algorithm: "Reinforce."
“There’s a spectrum on YouTube between the calm section and Crazytown. If I’m YouTube and I want you to watch more, I’m always going to steer you toward Crazytown”
"researchers say YouTube has inadvertently created a dangerous on-ramp to extremism by combining…a business model that rewards provocative videos with exposure & ad $, & an algorithm that guides users down…paths meant to keep them glued to their screens"
Do algorithms have politics? Most definitely. At present, the ones at the heart of YouTube promote right-wing, racist, misogynist, openly fascist world views.
YouTube is using AI to manipulate us. They improved their algorithms using reinforcement learning and tricked us into watching... less than 1% more content?
“I just kept falling deeper and deeper into this, and it appealed to me because it made me feel a sense of belonging,” he said. “I was brainwashed.”
this is impressive reporting but, also, i feel like i didn't learn anything new? Seems like it was mostly for people who don't realize that YouTube is one of the most important political and media mediums of our time
I found this very helpful, and very Orwellian.
If you watch one YouTube video, don't bother doing any independent fact-checking or research to verify any of it, and just dive deeper assuming everything is factual as presented, then the problem isn't the content of the video. It is your gullibility.
The Making of a YouTube Radical
Engrossing piece arguing that algorithms (including REINFORCE) play a direct role in increasing radicalization. Is AI exploiting us? Or is AI just helping us exploit each other more efficiently? (The power of automation.) h/t
did you see your image showed up in the nyt hit piece accusing creators of leading people to the radical right?
The Making of a YouTube Radical <--Clearly we need to ban the Internet. Do it for the children!
It's nice to see that the article has mentioned me as one of those scary "radicalizing" agents. Perhaps a right-wing Nazi Jew? You need to scroll to the videos that this individual viewed to see Himmler Saad's "radical" influence.
“We can really lead the users toward a different state, versus recommending content that is familiar.”
This is a freaking brilliant story that, if you read to the end, can maybe give you hope for a way out of this mess.
“YouTube is the place to put out a message,” he said. “But I’ve learned now that you can’t go to YouTube and think that you’re getting some kind of education, because you’re not.” Let's build digital libraries to help all. We're trying.
Each year I notice more and more people seeing YouTube's recommendation algorithm as a "common thread" in extremism and radicalization. Social media engagement software seems to drive a shockingly large amount of modern social change, for better or worse.
A look at YouTube’s “long-term addiction machine.”
The New York Times is officially a propaganda sheet. They think Nobel Laureate economist Milton Friedman is an “alt-right brainwasher”. The very young, ignorant and Socialist writers working there have no idea about anything, including how America works.
Fascinating read on YouTube, algorithms, and radicalization. Love that some philosophers are featured.
This “Caleb Cain radicalized by YouTube” NYTimes story has a few lessons. Main takeaway is, don’t try getting your entire political philosophy from binge watching hours of 5-minute YouTube videos. READ BOOKS. History, philosophy, etc
“Caleb was a college dropout looking for direction. He turned to YouTube.”
Radicalization through YouTube: certainly not especially popular these days, but an important contribution.
Found this interesting, esp. w/regard to how often some people turn to YouTube for socio-political commentary. We've always had ranters. But ranters who used to be confined to the street corner now have large audiences
The Making of a YouTube Radical
The world needs more slow, acoustic, thinking. If only it were more fashionable...
And another point: parents should instil strong values in their kids from a young age. They'd be far less likely to jump from one political ideology to another if they had a solid moral framework to go by. This kid clearly had no idea about anything.
Massively important start, very well-told. Read ALL OF THIS.
#BreadTube 😂: "a new group of YouTubers are trying to build a counterweight to YouTube’s far-right flank. This group calls itself BreadTube, a reference to the left-wing anarchist Peter #Kropotkin’s 1892 book, The Conquest of Bread.”