When the notorious conspiracy theorist Alex Jones was kicked off YouTube and Facebook in 2018, the lesson was supposed to be that deplatforming works. Without access to his millions of followers on mainstream social media, Jones became an online ghost, diminished and shouting his dangerous conspiracy theories to a much smaller audience.
But some people online took a second lesson from the change: conspiracy theories, and the people who promote them, can get a lot of views—and money.
Last week Patrick Bet-David, a popular financial YouTuber with more than 2.2 million subscribers, invited Robert F. Kennedy Jr.—one of the most high-profile proponents of the debunked theory that vaccines cause autism—for a two-hour interview on his channel. At the beginning, Bet-David asked, “Why’d you agree to do a long-form interview?”
The answer was obvious to Kennedy, one of many anti-vaccination leaders trying to make themselves as visible as possible during the covid-19 pandemic. “I’d love to talk to your audience,” he replied.
Kennedy told Bet-David that he believes his own social-media accounts have been unfairly censored; making an appearance on someone else’s popular platform is the next best thing. Bet-David framed the interview as an “exclusive,” enticingly titled “Robert Kennedy Jr. Destroys Big Pharma, Fauci & Pro-Vaccine Movement.” In two days, the video passed half a million views.
As of Wednesday, advertisements served through YouTube's ad service were playing before the video, and Bet-David's merchandise was for sale in a panel below the video's description. Two other interviews, in which anti-vaccine figures aired several debunked claims about coronavirus and vaccines (largely unchallenged by Bet-David), were also showing ads. Bet-David said in an interview that YouTube had limited ads on all three videos, meaning they can generate revenue, but not as much as they would if they were fully monetized.
We asked YouTube for comment on all three videos on Tuesday afternoon. By Thursday morning, one of the three (an interview with anti-vaccine conspiracy theorist Judy Mikovits) had been deleted for violating YouTube’s medical misinformation policies. Before it was deleted, the video had more than 1 million views.
YouTube said the other two videos were borderline, meaning the company decided they didn't violate its rules but would no longer be recommended or show up prominently in search results.
According to YouTube’s own rules, videos containing “medical misinformation” about covid-19 violate its advertiser guidelines, if not the community standards governing what content is allowed on the platform at all. Taken together, the three videos gathered more than 3 million views in less than a week.
YouTube has made some attempts to counterbalance health misinformation over the past year: surfacing authoritative sources in some search results, adding information panels to some videos, and working to remove those that violate its evolving list of policies about such false claims. But the pandemic has heightened the urgency and raised the stakes. As covid-19 took hold, YouTube introduced specific policies prohibiting videos that question the transmission or existence of the disease, promote unsubstantiated cures, or encourage people to ignore official guidance.
Experts have said that social-media platforms’ moves to prioritize reliable information and demonetize, limit the reach of, or outright remove content containing misinformation can help limit its spread. But platforms have struggled with effective enforcement.