...I encountered thousands of such fake stories last year on social media — and so did American voters, 44 percent of whom use Facebook to get news.
Mark Zuckerberg, Facebook’s chief, believes that it is “a pretty crazy idea” that “fake news on Facebook, which is a very small amount of content, influenced the election in any way.” In holding fast to the claim that his company has little effect on how people make up their minds, Mr. Zuckerberg is doing real damage to American democracy — and to the world.
He is also contradicting Facebook’s own research.
...
In 2012, Facebook researchers again secretly tweaked the News Feed for an experiment: Some people were shown slightly more positive posts, while others were shown slightly more negative posts. Those shown more upbeat posts in turn posted significantly more upbeat posts of their own; those shown more downbeat posts responded in kind. Decades of other research concur that people are influenced by their peers and social networks.
...
These are not easy problems to solve, but there is a lot Facebook could do. When the company decided it wanted to reduce spam, it established a policy that limited its spread. If Facebook had the same kind of zeal about fake news, it could minimize its spread, too.
If anything, Facebook has been moving in the wrong direction. It recently fired its (already too few) editors responsible for weeding out fake news from its trending topics section. Unsurprisingly, the section was then flooded with even more spurious articles.
...
In addition to doing more to weed out lies and false propaganda, Facebook could tweak its algorithm so that it does less to reinforce users’ existing beliefs, and more to present factual information. This may seem difficult, but perhaps the Silicon Valley billionaires who helped create this problem should take it on before setting out to colonize Mars.