Is it Facebook and Google's responsibility to stop fake news? | The Tylt
A lot of fuss has been made over fake news since the 2016 election, and people are pointing to Google and Facebook as the main culprits. Critics say Facebook and Google should not be the ones who determine what is and isn't considered fake news—that essentially turns both companies into the thought police. Others say Facebook and Google play a huge role in the media ecosystem. Both organizations should acknowledge that role and take the lead in fighting fake news. What do you think?
Is it Facebook and Google's responsibility to stop fake news?
Following the election, Americans had a small panic over fake news, which many pundits said was directly responsible for getting Trump elected. Everyone from Russian intelligence to kids in Moldova to random people in the United States was taking advantage of Facebook, Google and other social platforms to spread disinformation or make money from sensational, factually inaccurate stories.
Susie Cagle highlights the problem Facebook created for itself in Divided and Platformed, a comic made for the Haas Institute which explores fake news, platforms, and the large underlying issues pinning everything together.
Critics say platforms like Facebook, Google and Twitter have become key players in how people communicate and, therefore, how politics works. Critics singled out Facebook specifically for its News Feed, the algorithmically sorted and managed content feed designed to give users exactly what they want. The problems Facebook created and is facing also apply to companies like Google, Reddit and Twitter, which are all struggling to various degrees with misinformation on their platforms.
Facebook continues to pretend it is a neutral bystander despite its obvious and outsized role in politics today.
Something like 170 million people in North America use Facebook every day, a number that’s not only several orders of magnitude larger than even the most optimistic circulation reckonings of major news outlets but also about one-and-a-half times as many people as voted on Tuesday. Forty-four percent of all adults in the United States say they get news from Facebook, and access to an audience of that size would seem to demand some kind of civic responsibility — an obligation to ensure that a group of people more sizable than the American electorate is not being misled. But whether through a failure of resources, of ideology, or of imagination, Facebook has seemed both uninterested in and incapable of even acknowledging that it has become the most efficient distributor of misinformation in human history.
Critics say Facebook is ultimately responsible because it created the conditions for fake news to flourish. Its feed identified and served users content they wanted to see, while actively hiding content they did not. This makes sense for Facebook to do—if it wants to maximize the time people spend on site, why would it give anyone content that might not sit well with them? This dynamic creates echo chambers, placing users with others who think like them—and no one else. What Facebook is doing is nothing new, but it's never happened at such a huge and concentrated scale before.
The filter bubble has been a much discussed concern — for multiple years — but the consequences of algorithmically denuding the spectrum of available opinion, whilst simultaneously cranking open the Overton window along the axis of an individual’s own particular viewpoint, are perhaps becoming increasingly apparent this year, as social divisions seem to loom larger, noisier and uglier than in recent memory — at very least as played out on social media.
Facebook's algorithm is at the heart of how people find and consume information in today's world. The same is true for Google, Twitter and every other social media platform. Because these organizations wield so much power, it is on them to make sure the environments they design and create do not have outsized negative impacts on the reality we live in. Only Facebook and Google have access to the mountains of data their respective platforms generate, and only Facebook and Google are able to change the way information moves on their platforms. Critics say they are the ones responsible for what happens on their sites.
Facebook may want to claim that it is remaining neutral, but that is a false and dangerous stance. The company’s business model, algorithms and policies entrench echo chambers and fuel the spread of misinformation.
Letting this stand is not neutrality; it amplifies the dangerous currents roiling the world. When Facebook is discussed in tomorrow’s history books, it will probably not be about its quarterly earnings reports and stock options.
To its credit, since the election, Facebook has implemented several fixes that it says will do a lot to halt, or at least mitigate, the spread of fake news. One of the biggest new features is a warning label that alerts users when content in their feed is potentially misleading. Rather than vetting content itself, Facebook is partnering with organizations like Snopes, ABC, The Associated Press and Politifact to fact-check content on its platform.
Others say Facebook and Google did not create the fake news problem and should not be the ones tasked with stopping it. First, fake news is nothing new. Fake news existed before Facebook was ever a thing—living on fringe websites and spreading through email chains. Before that, tabloids pushed fake news to consumers in gas stations and supermarkets everywhere. There's been fake news for as long as there's been news. Second, critics say people aren't as stupid as others might think.
And yet, America still held elections over the last few decades with all of these sources of fake news, and did so successfully. Why? Because despite the attempts to paint the U.S. electorate as a bunch of unsophisticated hicks, most adults have no problem distinguishing fake news from the real thing. Voters have more resources than ever to help them consume news responsibly. They don't need Facebook to pre-digest their news and then spoon-feed it to them.
Facebook and Google's attempts to "fix" fake news by changing the way they surface information to consumers are a straightforward effort to privilege some information over other information. It's paternalistic, and it assumes users do not know what's best for themselves or how to think critically. Maybe that's true, maybe it isn't, but do you really want proprietary algorithms deciding what is true and what isn't? It should be on the consumer to seek out better information and to define what "better information" even is.
If consumers want better news, then they need to seek it out. Consumers should not rely on a community-driven news feed for their information, but instead seek out original sources, determine which they can trust, and then verify information before sharing it.
President Trump's supporters see Google's and Facebook's efforts to stop the spread of fake news as censorship of their political views. They say the news organizations responsible for vetting information are just as fake as the fake news they're disputing.
It ultimately comes down to people being uncomfortable with letting Google and Facebook say what is real and what is not. Despite their marketing and buzzwords, these companies do not exist for the greater good. These companies have a fiduciary responsibility to investors, not to the public. That's the bottom line.
Cagle's Divided and Platformed explores this idea as well.
Those who do not trust Facebook, Google and other tech companies say it's on the average person to be critical of what they read and what they share. It's not on these companies to fix it because they'll never truly have the public's interest at heart. Americans don't need Facebook and Google to stop fake news. They need Facebook and Google to give them access to the full picture.