I asked YouTube's CPO about the perception that the recommendations bar pushes users toward more extreme videos. He said the company looked into it and found that while true in some cases, "there are other videos that skew in the opposite direction."
Some confounding answers from YouTube's product chief, who simply does not want to face the reality of what he helped create.
New: I talked with YouTube's chief product officer () about algorithmic recommendations, violent extremism, and the much-discussed rabbit-hole effect.
A few thoughts on the interview with the YouTube exec. So much of the focus here is on YouTube's intent -- "We don't take into account, etc." But that's not really the issue. The issue is outcomes.
YouTube has since shifted its model -- the OKRs even! -- to "responsible growth." New figures here on the millions who see its information panels on conspiracies and take satisfaction surveys. But as this good interview shows, the change is hard to grok.
Here is the link to the NYTimes interview with Neal Mohan. I cannot get over how awful his answers are.
but don't worry, YouTube's chief product officer says that "some of those videos might have the perception of skewing in one direction or, you know, call it more extreme. There are other videos that skew in the opposite direction"