YouTube Algorithms Don't Turn Unsuspecting Masses Into Extremists, New Study Suggests
Posted on AllSides April 26th, 2022
From The Right
![](https://dev.allsides.com/sites/default/files/styles/news_large_image/public/Screen%20Shot%202022-04-26%20at%208.19.48%20PM.png?itok=AbD7TUXJ)
Illustration: Lex Villena; Victor Koldunov, Leigh Prather, Andrey Burmakin | Dreamstime.com
ANALYSIS
"Over years of reporting on internet culture, I've heard countless versions of [this] story: an aimless young man—usually white, frequently interested in video games—visits YouTube looking for direction or distraction and is seduced by a community of far-right creators," wrote Kevin Roose for The New York Times back in 2019. "Some young men discover far-right videos by accident, while others seek them out. Some travel all the way to neo-Nazism, while others stop at milder forms of bigotry."
Never one to dial back alarmism, The Daily Beast published a 2018 headline calling YouTube's algorithm a "far-right radicalization...