YouTube’s autoplay function is helping convert people into Flat Earthers

The first time I watched a Flat Earther YouTube video, I was entranced.

The theory was that Earth used to be populated by giant trees, and that any change in elevation on the planet is actually just the remains of giant tree stumps. It made so little sense that I needed to know more, so when YouTube automatically started playing another video after the first one ended, I kept watching. Then I sat through a third, and decided I should stop.

Apparently, this is the path that has led many people toward believing flat Earth theories. The Daily Beast writer Kelly Weill reports that during her recent trip to the second annual Flat Earth Conference, held in Denver, Colorado, YouTube was the catalyst for several Flat Earthers: one woman was “converted after three straight days of flat Earth videos,” and a speaker, a self-identified fan of popular conspiracy theorist Alex Jones, specifically mentioned that YouTube’s autoplay function led him to a popular flat Earth video that “woke him up” to the movement. Another attendee, a non-believer accompanying his brother, told Weill that his Flat Earther brother watches too many YouTube videos, which only reinforces his beliefs.

According to YouTube, its recommendation engine was designed for more mainstream categories like beauty tutorials and educational content, and the company recognizes that it might not work as well for topics like news and science. But autoplay can lead to some weird corners of YouTube even when you start with the most innocuous videos, like kids’ content. Last year, writer James Bridle revealed that while playing videos from popular children’s series, YouTube’s autoplay could lead viewers to disturbing knock-off parody videos, like Peppa Pig videos in which she drinks bleach or eats her father. Other videos are computer-generated, he writes, but nonetheless produce images that would disturb young viewers, like characters’ disembodied heads.

It’s a difficult problem to police. Content producers have every incentive to churn out as much content as possible, regardless of quality, in hopes that autoplay (or YouTube search) will lead people to their videos, netting them a bit of advertising revenue. They can also likely put up videos faster than YouTube can take them down. While YouTube has taken down channels that consistently spread misinformation, like Alex Jones’s, the company is focusing instead on promoting high-quality content. In August, it announced new features that give viewers additional context about videos’ sources, especially on topics that attract conspiracy theories.

One example YouTube highlighted in its announcement: it now shows a link to Encyclopaedia Britannica’s entry on the moon landing alongside moon landing videos. The subtext, one assumes, is that this information will also pop up alongside moon-truther conspiracy videos. According to YouTube, the company is looking to expand this feature to more topics soon, including flat Earth videos.

Meanwhile, though, it seems YouTube’s algorithm might accidentally have a sense of humor: when I last pulled up a flat Earth video, it first showed me an ad for airplane travel, an example commonly used to debunk flat Earth theory. On a clear day, from high enough in the air, you can see the curvature of Earth along the horizon, and no one is known to have flown off the edge of Earth into nothingness, at least not yet.
