YouTube’s Plot to Silence Conspiracy Theories

In January 2019, YouTube began rolling out the system. That’s when Mark Sargent noticed his flat-earth views take a nose dive. Other types of content were getting down-ranked, too, like moon-landing conspiracies or videos perseverating on chemtrails. Over the next few months, Goodrow and Rohe pushed out more than 30 refinements to the system that they say increased its accuracy. By the summer, YouTube was publicly declaring success: It had reduced by 50 percent the watch time of borderline content that came from recommendations. By December it reported a reduction of 70 percent.
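YouTube hasn't disclosed how the down-ranking works mechanically, but the behavior described here (recommendations demoted according to how "borderline" a classifier judges a video) can be sketched in a few lines of Python. Everything below, from the function names to the penalty weight, is an assumption made for illustration, not YouTube's implementation.

```python
# A minimal, speculative sketch of what "down-ranking" borderline content
# could look like in a recommendation pipeline. YouTube has not published
# its mechanism; every name and number here is an assumption.
def rerank(candidates, borderline_score, penalty=0.9):
    """Demote candidates in proportion to a classifier's borderline score.

    candidates:       list of (video_id, relevance) pairs from the ranker
    borderline_score: maps video_id -> probability the video is borderline
    penalty:          how strongly a certain-borderline video is demoted
    """
    adjusted = [
        (vid, relevance * (1 - penalty * borderline_score(vid)))
        for vid, relevance in candidates
    ]
    return sorted(adjusted, key=lambda pair: pair[1], reverse=True)

# Toy usage with invented scores: the borderline video drops below the other.
scores = {"flat_earth_101": 0.95, "cooking_show": 0.02}.get
print(rerank([("flat_earth_101", 0.8), ("cooking_show", 0.6)], scores))
```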

The company won’t release its internal data, so it’s impossible to confirm the accuracy of its claims. But there are several outside indications that the system has had an effect. One is that consumers and creators of borderline stuff complain that their favorite material is rarely boosted any more. “Wow has anybody else noticed how hard it is to find ‘Conspiracy Theory’ stuff on YouTube lately? And that you easily find videos ‘debunking’ those instead?” one comment noted in February of this year. “Oh yes, youtubes algorithm is smashing it for them,” another replied.

Then there’s the academic research. Berkeley professor Hany Farid and his team found that the frequency with which YouTube recommended conspiracy videos began to fall significantly in early 2019, precisely when YouTube was beginning its updates. By early 2020, his analysis found, those recommendations had fallen 40 percent from their 2018 peak. Farid noticed that some channels weren’t merely reduced; they all but vanished from recommendations. Indeed, before YouTube made its switch, he’d found that 10 channels—including that of David Icke, the British writer who argues that reptilians walk among us—accounted for 20 percent of all conspiracy recommendations (as Farid defines them); afterward, he found that recommendations for those channels “basically went to zero.”
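Farid’s 20 percent figure is a simple concentration measure: the share of all conspiracy recommendations that point at the top 10 channels. A minimal sketch of that computation, with an invented toy log standing in for his crawl data:

```python
# Hypothetical sketch of the concentration measure behind Farid's finding:
# what share of all conspiracy recommendations point at the top-k channels?
# The data and channel names here are invented for illustration.
from collections import Counter

def top_k_share(recommended_channels: list[str], k: int = 10) -> float:
    """Fraction of recommendations that go to the k most-recommended channels."""
    counts = Counter(recommended_channels)
    total = sum(counts.values())
    top_k = sum(count for _, count in counts.most_common(k))
    return top_k / total if total else 0.0

# Example: a toy log of which channel each recommendation pointed to.
log = ["icke", "icke", "chemtrails_tv", "moon_hoax", "icke", "flat_earth_hq"]
print(f"Top-10 share: {top_k_share(log):.0%}")  # 100% on this tiny log
```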

Another study that somewhat backs up YouTube’s claims was conducted by the computer scientist Mark Ledwich and Anna Zaitsev, a postdoctoral scholar and lecturer at Berkeley. They analyzed YouTube recommendations, looking specifically at 816 political channels and categorizing them into ideological groups such as “Partisan Left,” “Libertarian,” and “White Identitarian.” They found that YouTube recommendations now mostly guide viewers of political content toward the mainstream. The channels they grouped under “Social Justice,” on the far left, lost a third of their traffic to mainstream sources like CNN; conspiracy channels and most of those on the reactionary right—like “White Identitarian” and “Religious Conservative”—saw the majority of their traffic slough off to commercial right-wing channels, with Fox News the biggest beneficiary.
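The analysis described here amounts to building a flow matrix between channel categories: for each source category, the fraction of its outbound recommendations that land on each destination category. A rough sketch of that aggregation, with invented labels and edges standing in for the study’s data:

```python
# Hypothetical sketch of the category-to-category flow analysis described
# above. The category labels and edge list are invented; the study's actual
# data pipeline is not reproduced here.
from collections import defaultdict

def flow_shares(edges: list[tuple[str, str]]) -> dict[str, dict[str, float]]:
    """For each source category, the fraction of its recommendations
    that land on each destination category."""
    counts: dict[str, dict[str, int]] = defaultdict(lambda: defaultdict(int))
    for src, dst in edges:
        counts[src][dst] += 1
    return {
        src: {dst: n / sum(dsts.values()) for dst, n in dsts.items()}
        for src, dsts in counts.items()
    }

# Toy edge list: (category of watched channel, category of recommended channel)
edges = [
    ("Conspiracy", "Partisan Right"), ("Conspiracy", "Partisan Right"),
    ("Conspiracy", "Conspiracy"), ("Social Justice", "Mainstream News"),
]
print(flow_shares(edges)["Conspiracy"])
# {'Partisan Right': 0.666..., 'Conspiracy': 0.333...}
```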

If Zaitsev and Ledwich’s analysis of YouTube “mainstreaming” traffic holds up—and it’s certainly a direction that YouTube itself endorses—it would fit into a historic pattern. As law professor Tim Wu noted in his book The Master Switch, new media tend to start out in a Wild West, then clean up, put on a suit, and consolidate in a cautious center. Radio, for example, began as a chaos of small operators proud to say anything, then gradually coagulated into a small number of mammoth networks aimed mostly at pleasing the mainstream.

For critics like Farid, though, YouTube has not gone far enough, quickly enough. “Shame on YouTube,” he told me. “It was only after how many years of this nonsense did they finally respond? After public pressure just got to be so much they couldn’t deal with it.”

Even the executives who set up the new “reduce” system told me it wasn’t perfect. Which makes some critics wonder: Why not just shut down the recommendation system entirely? Micah Schaffer, the former YouTube employee, says, “At some point, if you can’t do this responsibly, you need to not do it.” As another former YouTube employee noted, determined creators are adept at gaming any system YouTube puts up, like “the velociraptor and the fence.”

Still, the system appeared to be working, mostly. It was a real, if modest, improvement. But then the floodgates opened again. As the winter of 2020 turned into a spring of pandemic, a summer of activism, and another norm-shattering election season, it looked as if the recommendation engine might be the least of YouTube’s problems.
