“Our goal is to build the perfect personalised newspaper for every person in the world,” said Facebook’s Mark Zuckerberg in 2014. This newspaper would “show you the stuff that’s going to be most interesting to you”.
To many, that statement explains perfectly why Facebook is such a terrible source of news. A “fake news” story proclaiming that Pope Francis had endorsed Donald Trump was, according to an analysis from BuzzFeed, the single most successful item of news on Facebook in the three months before the US election. If that’s what the site’s algorithms decide is interesting, it’s far from being a “perfect newspaper”.
It’s no wonder that Zuckerberg found himself on the back foot after Trump’s election. Shortly after the vote, Zuckerberg declared: “I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way . . . is a pretty crazy idea.” His comment was greeted with a scornful response.
I should confess my own biases here. I despise Facebook for all the reasons people usually despise Facebook (privacy, market power, distraction, fake-smile social interactions and the rest). And, as a loyal FT columnist, I need hardly point out that the perfect newspaper is the one you’re reading right now.
But, despite this, I’m going to stand up for Zuckerberg, who recently posted a 5,700-word essay defending social media. What he says in the essay feels like it must be wrong. But the data suggest that he’s right. Fake news can stoke isolated incidents of hatred and violence. But neither fake news nor the algorithmically driven “filter bubble” is a major force in the overall media landscape. Not yet.
“Fake news” is a phrase that has already been debased. A useful definition is that fake news is an entirely fabricated report presenting itself as a news story. This excludes biased reporting, satire and lies from politicians themselves.
At first glance, such hoaxes appear to be ubiquitous on Facebook. The BuzzFeed analysis finds that the five most popular hoax stories were more successful than the five most popular true stories. (This list of true stories includes the New York Post’s “Melania Trump’s Girl-on-Girl Photos From Racy Shoot Revealed”, a reminder that not all mainstream journalism is likely to win a Pulitzer.)
But hoax stories are less significant than this analysis suggests — partly because Facebook is not the main source of news for Americans (that’s still television news), and partly because true reports will generally be covered in some form by dozens of outlets, which will dilute the popularity of any one version. Each hoax, however, is unique. No wonder the most popular hoaxes outperform the most popular true reports.
In January 2017, two economists, Hunt Allcott and Matthew Gentzkow, published research studying exactly how prevalent fake news had been before the election. Their clever method tested people’s recall of fake news, as compared with true news stories and “placebo” stories — fake fake news, invented by the researchers. People didn’t remember many fake news stories, and claimed to remember quite a few placebos. Overall, there just didn’t seem to be enough fake news to swing the election result — unless it was potent stuff indeed, even in small doses.
“The average voter saw one fake news story before the election,” Gentzkow told me. “That number is a very different picture from what you might get from watching the public discussion.”
Of more concern is that Facebook — and its “most interesting to you” algorithm — simply supplies news that panders to each user’s ideological biases. It’s undoubtedly true that we surround ourselves with people who agree with us on social media. But it’s not clear that Facebook’s algorithm is the biggest problem here. Twitter was politically polarised even in the days when it used no algorithm at all. And newspapers have ideological biases too.
One recent study of online news reading was conducted by Seth Flaxman, Sharad Goel and Justin Rao, who had access to browser data from Microsoft, and used it to examine how people consumed news online. They found a mixed picture: social media did seem to push stories that were further from the centre of the political spectrum, but it also exposed people to a greater variety of ideological viewpoints. That makes sense. Reading the same newspaper every day is a filter bubble too.
Gentzkow studied the contrast between online and offline news using data from 2004-2009, working with fellow economist Jesse Shapiro. They found little evidence then that online news consumption was more polarised than traditional media. But things are changing quickly. “My guess is that segregation is noticeably and meaningfully higher than in the past,” Gentzkow says, “but still quite modest.”
This feels like an important moment. Fake news is not prevalent, but it could become so. Filter bubbles are probably no worse than they have been for decades — but that could change rapidly too.
“A lot ultimately hinges on what the motivations of American voters are,” says Gentzkow. “Do people actually care at all about getting the truth and having accurate information?”
He’s hopeful that, deep down, people watch and read the news because they want to learn about the world. But if what voters really want is to be lied to, then Facebook is the least of our problems.
Written for and first published in the Financial Times.