Cox was ideally positioned to translate Zuckerberg’s vision for News Feed for other employees: Zuckerberg wanted people to stay connected to their friends by spending hours every day scrolling through News Feed. He also wanted them to stay connected to the site; the goal was to keep users logging as much active time on the platform as possible—a metric that would come to be known as “sessions.” From an engineering perspective, the News Feed system was by far the most intense and complicated design Facebook had tackled. It took nearly a year to code, but its impact was immeasurable: News Feed would not only change the course of the platform’s history but go on to inspire scores of other tech companies around the world to reimagine what people wanted to see on the internet.

Just after 1 a.m., Pacific Standard Time, on September 5, 2006, employees crowded into one corner of the office to watch News Feed go live. Outside of Facebook employees, and a handful of investors whom Zuckerberg had briefed, no one knew about Facebook’s plans for a major overhaul. Some advisers had pleaded with Zuckerberg to do a soft launch. Zuckerberg ignored them.

Instead, when the designated hour came, people logging into Facebook from across the United States were suddenly shown a prompt informing them that the site was introducing a new feature. There was only one button for them to click, and it read “Awesome.” Once the button was clicked, the old Facebook disappeared forever. Few users bothered to read the accompanying blog post from Sanghvi, which introduced the new features in a cheerful tone. Instead, they dived headfirst into Zuckerberg’s creation. At least one user was not impressed: “News Feed sucks,” read an early post. Zuckerberg and his engineers laughed it off. It would take people some time to get used to the new design, they thought. They decided to call it a night and go home to sleep.

But the morning brought angry users outside Facebook’s offices on Emerson Street, and virtual protests in a Facebook group called “Students Against Facebook News Feed.”17 The group was furious that relationship updates were suddenly being posted in what felt like a public message board. Why did Facebook need to broadcast that a relationship had gone from “Just friends” to “It’s complicated”? they asked. Others were dismayed to see their summer vacation photos shared with the world. Though the feature built on information they had made public on the site, users were just now coming face-to-face with everything Facebook knew about them. The encounter was jarring.

Within forty-eight hours, 7 percent of Facebook users had joined the anti–News Feed group, which was created by a junior at Northwestern University. The company’s investors panicked, with several calling Zuckerberg to ask him to turn off the new feature. The ensuing PR fallout seemed to support the suggestion: privacy advocates rallied against Facebook, decrying the new design as invasive. Protesters demonstrated outside the Palo Alto office, and Zuckerberg was forced to hire Facebook’s first security guard.

And yet, Zuckerberg found comfort in the numbers. Facebook’s data told him that he was right: users were spending more time on the site than ever before. In fact, the Facebook group Students Against Facebook News Feed proved that the News Feed was a hit—users were joining the group because they were seeing it at the top of their News Feed. The more users who joined, the more Facebook’s algorithms pushed it to the top of the feed. It was Facebook’s first experience with the power of News Feed to insert something into the mainstream and create a viral experience for its users.

“When we watched people use it, people were really, really, really using it a lot,” Cox recalled. “There was a ton of engagement, and it was growing.”18 The experience confirmed Cox’s dismissal of the initial public response as the kind of knee-jerk reaction that had accompanied the introduction of all new technologies throughout history. “When you go back and you look at the first radio, or the first time we talked about the telephone and everybody said this is going to invade our privacy to put telephone lines in our houses because now people will call and they’ll know when I’m not home and they’ll go break into my house,” he said. “That’s probably happened a few times, but on balance, telephones are probably good.”

Still, Zuckerberg knew that he had to do something to calm the backlash against the platform. At the end of a day spent fielding calls from friends and investors, he decided to say he was sorry. Just before 11 p.m. on September 5, almost twenty-four hours after News Feed launched, the CEO posted an apology on Facebook titled “Calm Down. Breathe. We Hear You.” The 348-word note set the tone for how Zuckerberg would deal with crises going forward. “We are listening to all your suggestions about how to improve the product; it’s brand new and still evolving,” he wrote, before noting that nothing about users’ privacy settings had changed. (Whether or not that was the case, within weeks Facebook’s engineers would introduce tools allowing users to restrict access to some information.) Facebook wasn’t forcing users to share anything they didn’t want to share. If they weren’t happy with what they’d posted, well . . . they shouldn’t have posted it in the first place. Ultimately, the note read less like an apology than an admonition from an exasperated parent: This food is good for you. Someday you’ll thank me.

The New York Times had published its paper for more than one hundred years under the motto “All the News That’s Fit to Print.” Facebook was publishing its news under a different kind of motto: All the news from your friends that you never knew you wanted.

Almost immediately, the company ran into a problem: it had no editor and no defining principles. Newspapers drew on years of editorial judgment and institutional knowledge to determine what they would publish. The task of deciding what Facebook would and would not allow on its platform fell to a group of employees who had loosely assumed the roles of content moderators, and they sketched out early ideas that essentially boiled down to “If something makes you feel bad in your gut, take it down.” These guidelines were passed along in emails or in shared bits of advice in the office cafeteria. There were lists of previous examples of items Facebook had removed, but without any explanation or context behind those decisions. It was, at best, ad hoc.

This problem extended to advertising. The ads themselves were uninspiring—postage stamp–sized boxes and banners across the site—and the small team that oversaw them generally accepted most submissions. When Director of Monetization Tim Kendall was hired, just before the launch of News Feed, there were no set guidelines dictating acceptable ad content. Nor was there a vetting process in place for Kendall and the ad team that reported to him. They were essentially making it up as they went along. “All policy decisions on content were totally organic and done as a response to problems,” a former employee said.