YouTube And Google Algorithms Promoted Notre Dame Conspiracy Theories

As Notre Dame burned on Monday, the internet was ablaze with conspiracy theories.

Trolls made baseless claims that the flames ravaging Paris’ iconic cathedral were the result of an act of terrorism, and pushed familiar Islamophobic narratives. One vlogger suggested France’s government had ignited the fire as a so-called false flag, and warned that President Emmanuel Macron “cannot be trusted.” With YouTube’s help, his video garnered nearly 50,000 views overnight.

In January, YouTube responded to public pressure by pledging to limit its algorithmic promotion of conspiracy theories. But its algorithm recommended the false-flag video to users who may have been searching for information about Notre Dame, according to AlgoTransparency, an algorithm-watchdog website.

As of Wednesday afternoon, the monetized, 13-minute video had close to 100,000 views, about 5,500 “likes” and just 130 “dislikes.” Commenters have called Macron a fascist, arsonist and Satan, and circulated their own unfounded theories. Some have called for him to be physically harmed.

“We’ve designed our systems to help ensure that content from more authoritative sources is surfaced prominently in search results and watch next recommendations in certain contexts, including when a viewer is watching breaking news related content on YouTube,” a YouTube spokesperson told HuffPost in a statement. “While we’re always open to feedback from the research community on how we can improve our systems, we disagree with the methodology, data and, most importantly, the conclusions made in AlgoTransparency research.”

The spread of disinformation online has had violent and tragic consequences in the recent past. Just last year, a man who embraced anti-Semitic conspiracy theories opened fire at a Pittsburgh synagogue and killed 11 worshippers. In 2016, a man fired a rifle inside a Washington, D.C., pizzeria because an online conspiracy theory had led him to believe that the restaurant was the covert site of a child sex-trafficking ring.

Facebook and Twitter, too, have struggled to contain false information about the fire that engulfed Notre Dame. Fake news accounts impersonating CNN claimed it was deliberately set, BuzzFeed reported.

French officials have determined the blaze was likely an accident. Typing “Notre Dame accident” into YouTube from an incognito browser on Wednesday yielded results including videos with the titles “Notre Dame Cathedral Fire: A) Jewish Lightening B) Muslim Terror C) Yellow Vest Revenge D) Accident” and “Notre Dame Fire Was it Staged or an ‘Accident?’ Hmmm,” as well as a video by conservative personality Glenn Beck, who floated the groundless idea that “Islamic extremism” was to blame.

A YouTube spokesperson said the company has “generally been unable to reproduce the results the Huffington Post have encountered on YouTube.”

The same search on Google, which owns YouTube, looked like this:

[Screenshot of Google search results for "Notre Dame accident" (Screenshot/Google)]

“Search results are dynamic and can change rapidly, particularly in breaking news moments as more news articles are continually being published,” a Google spokesperson told HuffPost in a statement. “In times of breaking news and crisis, we aim to surface high quality results from a variety of authoritative sources. When we identify cases where low quality results are surfacing, we develop scalable solutions that can address a range of queries.”

YouTube has stressed that conspiracy theories represent less than 1% of all content on its site. But with hundreds of hours of footage uploaded to YouTube per minute, that figure still represents an enormous volume of troubling content, much like the other fringe videos YouTube promotes. Its algorithm generates more than 70% of user traffic.

As part of its efforts to counter the proliferation of fake news on its platform, YouTube introduced information panels with links to third-party sources to fact-check misleading video content. But the panels YouTube attached to livestreams of the fire at Notre Dame directed viewers to information about the 9/11 terrorist attacks. YouTube blamed this on an algorithmic error.

“We are deeply saddened by the ongoing fire at the Notre Dame cathedral,” the company told HuffPost on Monday. “These panels are triggered algorithmically and our systems sometimes make the wrong call.”

YouTube’s dominance in the media world lends credibility to the videos it recommends. More than half of adult users say the site is an important source of information, according to a Pew Research Center study. The number of users who go to YouTube for news nearly doubled between 2013 and 2018.

The video giant incentivizes content creators to keep people watching: Those with at least 1,000 subscribers and 4,000 watch-hours in a one-year period can earn money from ads, and sensational content drives more traffic. The Notre Dame false-flag video was interrupted three times by lengthy ads.
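The eligibility rule described above can be expressed as a simple check. This is a hypothetical illustration of the stated thresholds only; the function and parameter names are invented, not part of any YouTube API:

```python
def eligible_for_ads(subscribers: int, watch_hours_past_year: float) -> bool:
    """Hypothetical check mirroring the stated monetization thresholds:
    at least 1,000 subscribers and 4,000 watch-hours within one year."""
    return subscribers >= 1_000 and watch_hours_past_year >= 4_000

# A channel clearing both thresholds qualifies; subscribers alone do not.
print(eligible_for_ads(1_200, 4_500))   # True
print(eligible_for_ads(50_000, 3_000))  # False
```

Note that both conditions must hold: a large subscriber count cannot substitute for watch time, which is the metric the platform rewards.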

“It’s all about maximizing watch time,” Guillaume Chaslot, a former YouTube engineer who helped design the recommendation algorithm, explained to HuffPost last year. “The more watch time you have, the more ads you can show the user.”
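Chaslot's description of the objective can be illustrated with a toy ranking function that recommends whichever videos are predicted to hold viewers longest, with no term for accuracy or trustworthiness. Everything below (titles, predicted minutes, function names) is an invented sketch, not YouTube's actual system:

```python
# Toy watch-time-maximizing recommender: rank candidate videos purely
# by predicted minutes watched. All values are invented for illustration.
candidates = {
    "local news report":      3.5,   # predicted minutes a user will watch
    "official statement":     2.0,
    "sensational conspiracy": 11.0,  # sensational content holds attention longer
}

def recommend(predicted_watch_minutes: dict, k: int = 2) -> list:
    """Return the k candidates with the highest predicted watch time."""
    ranked = sorted(predicted_watch_minutes,
                    key=predicted_watch_minutes.get, reverse=True)
    return ranked[:k]

print(recommend(candidates))  # the conspiracy video ranks first
```

Under an objective like this, the most sensational video wins the top slot simply because it is predicted to keep the viewer on the site longest, which is the dynamic critics say rewards conspiracy content.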
