Well before Sandberg arrived at Facebook with a vision for ramping up its data mining efforts, Jeff Chester had been following Facebook’s growth warily. In November 2007, the privacy rights advocate had been sitting in his tiny office space in Washington, DC, following a live blog of Zuckerberg on his laptop. The Facebook founder, addressing ad executives in New York City, was announcing the company’s latest innovation: a revolutionary program called “Beacon.”21

Chester stiffened. Beacon, Zuckerberg continued, was an ad initiative that took information about a Facebook user’s purchases from other sites—movie tickets on Fandango, a hotel booking on Tripadvisor, a sofa on Overstock—and published it on the News Feeds of his or her friends. The companies were partners in Beacon, eager to be part of a program that gave them data on Facebook users and also promoted their goods and services through recommendations—even if they were involuntary. A purchase made, story read, or recipe reviewed would automatically show up in user feeds. The ideal in advertising was word-of-mouth endorsements, and Facebook was offering a way to provide them on a massive scale: their users would effectively serve as brand evangelists. Beacon erased the line between advertising and “organic” user comments. “Nothing influences a person more than a recommendation from a trusted friend,” Zuckerberg said. Already, Facebook had signed more than forty partners—including CBS, the New York Times, and TheKnot—that were paying Facebook for the privilege of getting their brands in front of the social network’s users.

Chester jumped out of his chair and called his wife, Kathryn Montgomery, a professor of media studies at American University. “This is unbelievable!” he shouted. “You’ve got to hear what this guy from Facebook is saying!” From Chester’s perspective, Facebook’s plan was the logical extension of what advertising had always done: hijack the minds of consumers to persuade them at the checkout counter. Chester, for his part, had been fighting what he saw as manipulation since the 1990s, when he railed against television broadcasters and advertisers for product placement on shows and the promotion of junk food on children’s programs. With the advent of the internet, he had trained his sights on the unregulated world of online advertising. He’d founded a privacy rights group, the Center for Digital Democracy, and in 1998, with the help of Montgomery, he’d successfully pushed for a law protecting kids on the internet, known as the federal Children’s Online Privacy Protection Act.

That the Center for Digital Democracy consisted of just one full-time employee, Chester, and was located in Dupont Circle, far from the center of political activity near Capitol Hill, did not discourage him. The native Brooklynite relished the role of outsider, and with his rumpled slacks, rimless glasses, and disheveled hair, he looked the part, too, a standout in the slick suit-and-tie universe of Washington lobbyists. He scoured about ten technology and advertising industry trade publications every day, looking to uncover unsavory business practices, which he then compiled into individual emails to journalists. His one-liners often proved irresistible. Just a few days before Zuckerberg’s announcement, Chester had been quoted in the New York Times describing behavioral advertising as a “digital data vacuum cleaner on steroids.”

In his mind, the Beacon announcement marked a dangerous new low. Zuckerberg had not asked permission from Facebook account holders to use them as sales agents; Beacon enrolled them automatically. Facebook was widening its data net, exploiting insights about its users in ways that crossed ethical lines. He called privacy groups, fired off a statement to the press, and reached out to his media contacts.

“It was a wakeup call. Beacon had the seeds of all the problems that would come later,” Chester recalled. “Regardless of whether the user wanted to or not, Facebook was going to monetize everyone who was using the social network and turn individuals into advertisers.”

The next day, Facebook users were shocked to see their private lives on display. Suddenly, that Travelocity flight reservation to Florida, last night’s guilty pleasure movie rental, or eBay bid was popping up in the News Feeds of friends, relatives, and coworkers. One woman discovered that her boyfriend had bought her a diamond ring.

The outcry was immediate. On November 20, the public advocacy organization MoveOn.org circulated a petition asking Facebook to shut down the service; within days, it had amassed fifty thousand signatures.22 As the controversy gathered momentum, Coca-Cola and Overstock dropped out of the program, telling reporters they had been misled into believing that Beacon would require user activation. When Facebook responded that the feature could be turned off, users produced contrary evidence. A security researcher at Computer Associates said that after he had opted out of Beacon, he noticed network traffic patterns revealing that recipes he had saved on Epicurious were still being tracked by Facebook.23

The public fury over Beacon was missing the point, Chester thought. Sure, it was embarrassing and invasive to have your shopping activity exposed to acquaintances and family members. But that was a distraction from the real threat: the fact that users were being tracked and monitored by an entirely new entity. Everyone knew to be wary of the government’s reach, but in Chester’s estimation the danger wasn’t what the public or law enforcement knew about you. It was what commercial enterprises and advertisers knew. It was what Facebook knew.

Mere weeks after his appearance in New York, Zuckerberg apologized for springing the new ad tool on users and announced that he would change the setting to make it an opt-in instead of a default program requiring users to opt out. In a blog post, he assured users that Facebook would stop sharing their shopping activity without permission. He acknowledged that the rollout had been botched. “We missed,” he said, “the right balance.”24

The controversy, and Zuckerberg’s weak apology, completely ignored the real privacy abuses. As Chester and Montgomery saw it, personal data should be kept out of advertisers’ view—and Facebook hadn’t done anything to change that. “Facebook said you can choose whom you share information with on the platform,” explained Montgomery, “but behind the platform, where you don’t see what is really happening and how they are making money, they don’t give you a choice about what the platform shares about you with advertisers.”

With Sandberg’s hiring, the company entered a new phase of advertising. She lured big brands like Adidas and Papa John’s Pizza into creating quizzes and fan pages that got users to engage directly with advertisers, and she oversaw the development of ad targeting based on geography and language.25

She also swiftly emerged as the most effective spokesperson for the company, fulfilling one of the key roles the CEO had in mind for his number two. She crafted an entirely new spin on Facebook’s handling of data privacy and repositioned Facebook as a leader on the issue, pointing to how it offered users granular controls over who (the public, friends, selected individuals) could see particular content. Facebook didn’t “share” data with advertisers, she asserted, a talking point the company would repeatedly fall back on, even though some critics argued it was a distinction without a difference. True, Facebook didn’t physically hand over or directly sell data to advertisers. But advertisers were targeting users by age, income, employment, education, and other demographics. As Siva Vaidhyanathan, a professor of media studies at the University of Virginia, pointed out, the evidence was clear that the company’s profits came from data: “Facebook has been duplicitous and irresponsible with our data for years.”