As the 2016 election approached, many media analysts and tech bloggers began to realize that, with so many people relying on Facebook as their primary news aggregator, the site could leverage its power to influence the election. New York Magazine published an article asking, “Could Facebook help prevent President Trump?” which went on to say, “Not through lobbying or donations or political action committees, but simply by exploiting the enormous reach and power of its core products? Could Facebook, a private corporation with over a billion active users, swing an election just by adjusting its News Feed?”405
Paul Brewer, a communications professor at the University of Delaware, said, “Facebook would, like any campaign, want to encourage turnout among the supporters of its preferred candidate, persuade the small number of genuinely uncommitted likely voters, and target apathetic voters who could be convinced to get out to the polls.”406
Josh Wright, the executive director of a behavioral science lab, also admitted, “There’s lots of opportunity, I think, to manipulate based on what they know about people.”407 Wright pointed out how the site could fill people’s news feeds with photos or stories showing a particular candidate engaged in activities that Facebook knows they like in order to use “in-group psychology” to get people to identify with a candidate who shares some of their interests.
We tend to judge someone by what other people we like are saying about them, and so Facebook could highlight statements made by celebrities that people follow, or even our own friends, about a candidate in order to influence our opinion of that person. If you think Facebook wouldn’t engage in this kind of personalized high-tech manipulation, you would be wrong, because they already have.
A secret study Facebook conducted during the 2010 midterm elections, with help from researchers at the University of California, San Diego, investigated what’s called social contagion — the way behaviors and emotions spread from person to person. Facebook included over 60 million of their users in the experiment and found that they could influence people to actually get out and vote simply by showing them that their friends had voted. “Our study suggests that social influence may be the best way to increase voter turnout,” said James Fowler, a UCSD political science professor who conducted the study. “Just as importantly, we show that what happens online matters a lot for the ‘real world.’”408 Their experiment increased voter turnout by an estimated 340,000 people.409
Facebook obviously has a political agenda. They’ve hosted a Q & A for Barack Obama,410 they hung a huge Black Lives Matter banner at their headquarters,411 and Mark Zuckerberg has been very outspoken about his support of illegal immigration,412 gay marriage,413 and other liberal causes. The company conducts internal polls in which employees submit questions and vote on the ones they most want Zuckerberg to answer, and one poll in March of 2016 showed that many employees asked whether the company should be used to help prevent Donald Trump from winning the election.414
UCLA law professor Eugene Volokh told Gizmodo, “Facebook can promote or block any material that it wants. Facebook has the same First Amendment right as the New York Times. They can completely block Trump if they want. They can block him or promote him.”415 Technically, the First Amendment only prevents the U.S. government from suppressing someone’s speech; it places no such restriction on a private corporation like Facebook.
Gizmodo’s report on the political bias of Facebook pointed out, “Most people don’t see Facebook as a media company — an outlet designed to inform us. It doesn’t look like a newspaper, magazine, or news website. But if Facebook decides to tamper with its algorithm — altering what we see — it’s akin to an editor deciding what to run big with on the front page, or what to take a stand on.”416 Whether they are legally allowed to do such a thing is one issue; whether such favoritism and censorship is deceptive and immoral is another.
“If Facebook decided to,” professor Volokh says, “it could gradually remove any pro-Trump stories or media off its site — devastating for a campaign that runs on memes and publicity. Facebook wouldn’t have to disclose it was doing this, and would be protected by the First Amendment.”417
“If Facebook was actively coordinating with the Sanders or Clinton campaign, and suppressing Donald Trump news, it would turn an independent expenditure (protected by the First Amendment) into a campaign contribution because it would be coordinated — and that could be restricted,” he said. “But if they’re just saying, ‘We don’t want Trump material on our site,’ they have every right to do that. It’s protected by the First Amendment.”418
In May of 2016, tech blog Gizmodo confirmed what many had suspected and what was obvious to those with common sense — that Facebook was systematically suppressing news stories from conservative outlets and those which presented a positive conservative message.419 “Facebook workers routinely suppressed news stories of interest to conservative readers from the social network’s influential ‘trending’ news section, according to a former journalist who worked on the project,” reported Gizmodo.420
The whistleblower revealed that the company suppressed stories about CPAC (the Conservative Political Action Conference), Mitt Romney, Rand Paul, and other topics, keeping them out of the trending module even though they would have appeared there organically from so many people posting about them.
It wasn’t just one whistleblower, but several, and they also revealed that employees would manually insert topics into the trending list that they wanted to get more attention. One former employee said that positive stories about Black Lives Matter were often inserted into the trending box to help them go viral when they didn’t organically trend from people posting about them.421
“In other words,” Gizmodo reported, “Facebook’s news section operates like a traditional newsroom, reflecting the biases of its workers and the institutional imperatives of the corporation. Imposing human editorial values onto the lists of topics an algorithm spits out is by no means a bad thing — but it is in stark contrast to the company’s claims that the trending module simply lists ‘topics that have recently become popular on Facebook.’”422
They also called the news section “some of the most powerful real estate on the Internet,” one that helps dictate what hundreds of millions of people are reading. One of the news curators said they used a notebook to document stories that were censored, which included ones about Lois Lerner, the IRS official who targeted conservative groups for extra scrutiny; the Drudge Report; Ted Cruz; Steven Crowder; and more.
A second curator said, “It was absolutely bias. We were doing it subjectively. It just depends on who the curator is and what time of day it is. Every once in a while a Red State or conservative news source would have a story. But we would have to go and find the same story from a more neutral outlet that wasn’t as biased.”423
If a story was on Breitbart, The Washington Examiner, Newsmax or other conservative sites and was going viral and qualified to be included in the trending module, curators would wait until an outlet like CNN or The New York Times covered the story before it would be allowed to show up as a trend. One insider revealed that Facebook injected the latest Black Lives Matter protests into the trending module, giving them special preference to further their cause. The editors also prevented negative stories about Facebook itself from showing up in the trending section.