There’s plenty of blame to go around, but the list of actors has to start with Facebook. And for all its wonders — reaching nearly 2 billion people each month, driving more traffic and attention to news than anything else on earth — it’s also become a single point of failure for civic information. Our democracy has a lot of problems, but there are few things that could impact it for the better more than Facebook starting to care — really care — about the truthfulness of the news that its users share and take in.

Benton wants Facebook to hire editors who would, among other things, flag fake stories as fake:
As BuzzFeed’s Craig Silverman has documented repeatedly — and as anyone who has spent much time on their relatives’ profile pages can probably attest — Facebook has become a sewer of misinformation. Some of it is driven by ideology, but a lot is driven purely by the economic incentive structure Facebook has created: The fake stuff, when it connects with a Facebook user’s preconceived notions or sense of identity, spreads like wildfire. (And it’s a lot cheaper to make than real news.)
One example: I’m from a small town in south Louisiana. The day before the election, I looked at the Facebook page of the current mayor. Among the items he posted there in the final 48 hours of the campaign: Hillary Clinton Calling for Civil War If Trump Is Elected. Pope Francis Shocks World, Endorses Donald Trump for President. Barack Obama Admits He Was Born in Kenya. FBI Agent Who Was Suspected Of Leaking Hillary’s Corruption Is Dead.
These are not legitimate anti-Hillary stories. (There were plenty of those, to be sure, both on his page and in this election cycle.) These are imaginary, made-up frauds. And yet Facebook has built a platform for the active dispersal of these lies — in part because these lies travel really, really well. (The pope’s “endorsement” has over 868,000 Facebook shares. The Snopes piece noting the story is fake has but 33,000.)
Another idea would be to hire a team of journalists and charge them with separating at least the worst of the fake news from the stream. Not the polemics (from either side) that sometimes twist facts like balloon animals — I’m talking about the outright fakery. Stories known to be false could be downweighted in Facebook’s algorithm, and users trying to share them could get a notice telling them that the story is fake. Sites that publish too much fraudulent material could be downweighted further or kicked out entirely.

I think something like this has to be done, because I don’t think we can survive as a nation if we all come to believe malicious lies about each other.
Would this or other ideas raise freedom of speech or other thorny issues? Sure. This would be easy to screw up — which is, I’m sure, why Facebook threw up its hands at the pushback to a human-edited Trending section and why it positions itself as a neutral connector of its users to content it thinks they will find pleasing. I don’t know what the right solution would be — but I know that getting Mark Zuckerberg to care about the problem is absolutely key to the health of our information ecosystem.