Philip DeFranco, a popular YouTuber with over five million subscribers, posted a video titled “YouTube is Shutting Down My Channel and I’m Not Sure What to Do About It” on August 31st, 2016, which caused the sarcastic hashtag “YouTube is Over Party” to trend on social media as people discussed the new restrictions on content being rolled out. YouTubers like myself had noticed our videos were regularly being demonetized, meaning no advertisements were allowed to run on them, if they included certain keywords in the title or description: words like ‘war,’ ‘9/11,’ ‘police shooting,’ ‘ISIS,’ ‘terrorism,’ ‘sex,’ ‘drugs,’ etc. The context didn’t matter; they were automatically demonetized, but you wouldn’t notice unless you looked closely at the analytics, since there was no notification about it.
What brought this to Philip DeFranco’s attention was that YouTube had finally started emailing people when their videos were demonetized instead of just doing it without notice. One’s first thought for getting around this would be to simply avoid using certain keywords in the titles, descriptions, and tags of videos, and that solved the problem, at least for a little while, but YouTube’s system kept getting more sophisticated by the day and now appears to analyze the transcripts of all uploaded videos. In 2009 YouTube began using voice recognition software to create automatic transcripts for videos, and while they’re not 100% accurate, it is eerie to realize that YouTube knows what the people in a video are saying because their servers are now “listening” to every word said in every video.537
A few months after the ‘YouTube is Over’ demonetization scare, the Wall Street Journal targeted YouTube’s biggest channel, PewDiePie, which has over 57 million subscribers, claiming he was making money by posting ‘racist’ and ‘anti-Semitic’ videos. PewDiePie, whose real name is Felix Kjellberg, is a 27-year-old from Sweden who started off as a “gamer” (a person who literally plays video games while other people watch) and later branched out into comedy skits and social commentary; he is a huge star, rivaling many Hollywood A-listers in popularity.
“Disney Severs Ties With YouTube Star PewDiePie After Anti-Semitic Posts,” read the Wall Street Journal’s headline, and the paper boasted that it had asked Disney about videos of his which it claimed included “Anti-Semitic jokes or Nazi imagery.”538 Their story cast him in a false light and gave the impression that he might be racist or anti-Semitic because of some jokes he made in his videos. The Wall Street Journal even put out a video of their own to accompany the story, showing PewDiePie dressed as a soldier sitting in front of his computer, watching an Adolf Hitler speech while smiling and nodding in agreement. What they failed to mention was that this scene came from a skit he shot in response to previous false claims by the mainstream media accusing him of being racist; he made the Hitler video as a joke mocking their ridiculous claims.
This Wall Street Journal article on PewDiePie poured gasoline on what were just smoldering embers, and it blew up into a huge forest fire that would be used as a token example that advertisements for major brands were being shown on YouTube videos that were ‘racist,’ ‘inappropriate,’ or ‘offensive.’ Wired magazine then ran the headline, “PewDiePie Was Always Kinda Racist, But Now He’s a Hero to Nazis,”539 and when they tweeted out the link they added the comment, “White supremacists have a new hero, and his name is PewDiePie.”540 After facing major backlash over their defamatory headline, they later changed it to “PewDiePie’s fall shows the limits of ‘LOL JK.’”541
His original series Scare PewDiePie on YouTube Red (a subscription service similar to Netflix) was immediately canceled, and YouTube pulled his channel from their premium advertiser program, causing a massive drop in his income.542 Major YouTubers rallied behind him in support, including Jewish ones,543 but the war against YouTubers was just beginning.
BuzzFeed, the infamous clickbait bottom feeders of the Internet, published an article titled, “How YouTube Serves As The Content Engine Of The Internet’s Dark Side,” pressuring YouTube to start demonetizing videos about ‘conspiracy theories.’544 The story began, “Everyone knows that Twitter and Facebook spread bad information and hate speech. But YouTube, which pays for conspiracy theories seen by millions, may be even worse.”545
They named one particular conspiracy channel with 150,000 subscribers and said that, “His videos, usually preceded by pre-roll ads for major brands like Quaker Oats and Uber, have been watched almost 18 million times, which is roughly the number of people who tuned in to last year’s season finale of NCIS, the most popular show on television.”546
BuzzFeed continued, “In the aftermath of the 2016 presidential election, the major social platforms, most notably Twitter, Facebook, and Reddit, have been forced to undergo painful, often public reckonings with the role they play in spreading bad information… And yet there is a mammoth social platform, a cornerstone of the modern Internet with more than a billion active users every month, which hosts and even pays for a fathomless stock of bad information, including viral fake news, conspiracy theories, and hate speech of every kind — and it’s been held up to virtually no scrutiny: YouTube.”547
The article goes on to complain about what they called the “conspiracy-industrial complex” on the Internet, “which has become a defining feature of media and politics in the Trump era,” and says it “would be a very small fraction of itself without YouTube.”548
They said the Internet’s biggest “conspiracy-news stars” live on YouTube and named a few channels like Alex Jones, Paul Joseph Watson, and Sargon of Akkad. The writer then reminisces about the good old days of YouTube, but says, “Today, it fills the enormous trough of right-leaning conspiracy and revisionist historical content into which the vast, ravening right-wing social Internet lowers its jaws to drink.”549
“Frequently, the videos consist of little more than screenshots of a Reddit ‘investigation’ laid out chronologically, set to ominous music,” he says. “Other times, they’re very simple, featuring a man in a sparse room speaking directly into his webcam, or a very fast monotone narration over a series of photographs with effects straight out of iMovie.”550
The article goes on to lament, “Sometimes, these videos go hugely viral,” and mentions a few, including one critical of the mass immigration of Muslims into Europe which had been viewed over 4 million times. “That’s roughly as many people as watched the Game of Thrones Season 3 premiere,” it says.551 “So what responsibility, if any, does YouTube bear for the universe of often conspiratorial, sometimes bigoted, frequently incorrect information that it pays its creators to host, and that is now being filtered up to the most powerful person in the world?”552
It concludes by asking, “But morally and ethically, shouldn’t YouTube be asking itself the same hard questions as Facebook and Twitter about the role it plays in a representative democracy? How do those questions change because YouTube is literally paying people to upload bad information?”553