The Media Today

Most Americans think platforms should stop filtering news

August 15, 2018
 


Whether giant social platforms like Google, Facebook, and Twitter should filter (or censor) content on their networks is the subject of much debate, thanks in part to notorious conspiracy theorist and nutritional-supplement peddler Alex Jones of Infowars, who has seen his video diatribes and pages removed by YouTube, Facebook, Spotify, and Pinterest, but not by Twitter. In that context, a look at public attitudes toward such filtering seems particularly timely, and that’s exactly what the Knight Foundation and Gallup provide in a study published this morning.

For the survey, “Major Internet Companies as News Editors,” Knight (which also funds CJR) and Gallup asked more than 2,000 US adults for their opinions on whether the platforms are doing a good job of delivering the news, whether they need to change, and if so, how. The good news is that more than half of those surveyed said they believe internet companies in general help people become better informed about the world around them. The bad news is that about 85 percent feel the platforms aren’t doing enough to stop the spread of misinformation.

What exactly does stopping misinformation mean, in the eyes of users? Platforms removing more content from bad actors like Alex Jones, right? Apparently not. More than 60 percent of those surveyed said they were concerned that removing or excluding content from news feeds gives people “a biased picture of the news” and restricts the expression of certain viewpoints. About 80 percent said internet companies should show all users the same information from the same news organizations—in other words, no filtering whatsoever.

Then there’s the real kicker: Almost 80 percent of those who responded to the survey said they believe internet companies should be regulated like traditional media—although it’s not clear exactly what they meant by this. Newspapers aren’t subject to much regulation of what they can publish, apart from obscenity rules. Broadcasters are regulated and licensed by the FCC, but in general, the mainstream press is free to publish misinformation in much the same way that Facebook is, with one very important difference: News outlets can be sued for libel or defamation, and Facebook can’t.

That’s because major web platforms such as Facebook are protected by Section 230 of the Communications Decency Act, which insulates them from legal liability for things their users post. This is a protection that isn’t available to traditional media outlets (except for user content hosted on their websites). Some critics, including members of Congress, have mused out loud about either watering down the protection Section 230 provides or removing it altogether. It sounds as though many of those surveyed by Knight and Gallup would support such a change, although the web companies argue that losing this protection would make it almost impossible for them to stay in business.

Here’s more on the thorny problem of filtering content on giant web platforms:

  • De-platforming works: In the wake of the “de-platforming” of Alex Jones, one important question is whether doing so has any real impact, or whether it only makes him look like a martyr and plays into conspiracy theories about the platforms being biased against conservatives. According to a piece at Vice Media’s Motherboard site, there is some evidence that kicking someone off the major platforms actually does reduce their reach and influence.
  • How to cover hate: When it comes to toxic groups such as the racist fringes of the conservative movement, what is the right way to cover them without giving them too much attention? CNN’s Brian Stelter wrote about this conundrum in the context of the second Unite the Right rally, held a year after the violence in Charlottesville, and Amanda Darrach wrote for CJR about the same question, in a piece based in part on an interview with Whitney Phillips, a researcher with Data & Society.
  • Deleting conspiracies: Leonard Pozner, whose son was murdered in the Sandy Hook shooting, tells The New York Times how he spends most of his days trying to erase conspiracy theories claiming that his son’s death was a hoax and that he himself is a “crisis actor.” Although he has convinced many platforms to take such content down, Pozner says he has had no success with Automattic, the corporate parent of WordPress.com, which says misinformation doesn’t breach its rules.
  • Be careful what you wish for: David Greene of the Electronic Frontier Foundation says he’s in favor of having a discussion about the platforms removing content, but that there is a lot more to talk about than Alex Jones. The web giants routinely silence the voices of Moroccan atheists, women talking about harassment, Muslim activists, and many other groups, he says, and we should be “extremely careful before rushing to embrace an internet that is moderated by a few private companies.”
  • A global problem: The problem of what to do with a demagogue like Alex Jones may seem unique to the US, but Facebook and other platforms have been fighting similar problems for years in a number of countries, says Max Fisher in The New York Times. In Myanmar, a Buddhist monk has been using the social network to spread a message of hate against the Rohingya people for some time, and other groups have used it for similar purposes in Sri Lanka; both cases have led to real-world violence.

Other notable stories:

  • Brian Feldman, who writes for New York magazine’s Select All site, says he is cheering the news that Facebook reportedly doesn’t care about publishers and doesn’t want to talk about traffic with news companies any more, a signal that it is done sending them clicks. “That old world absolutely sucked and made the internet ecosystem a discernibly worse place,” Feldman writes.
  • Josh Russell works as a programmer and system administrator at Indiana University and has a quiet home life as the father of two children, but in his spare time he fights online trolls, and has helped expose a number of fake accounts and disinformation networks set up by the notorious Russian troll farm, the Internet Research Agency.
  • Justin Ray writes for CJR about David Bishop, the owner and sole reporter of a website called FLA News, who reported that a candidate in a Republican primary did not have the degree she claimed on her campaign website. After initially denying the report and forcing Bishop to take it down, the candidate later admitted she had lied about the degree and shut down her campaign.
  • Everyone is mad about how Twitter, Facebook, and YouTube have distorted the online conversation and failed to act against misinformation, so we should all go back to using Tumblr, one of the original social networks, according to Jeremy Gordon. The only downside? Alex Jones is apparently moving to the network as well.
  • The owner of New York magazine and related websites such as The Cut is considering a sale of the properties, according to a report in The Wall Street Journal. The assets are controlled by a trust set up by the late financier Bruce Wasserstein, who acquired New York in 2003 for $55 million. Wasserstein’s daughter, Pamela, took over running the company in 2016.
  • Dan Gillmor, a veteran technology writer who teaches journalism at Arizona State University in Phoenix, warns journalists that “the war on what you do is escalating,” and says that if media companies and individual reporters want to fight back, they need to cooperate more and devote their time and resources to big stories such as corruption in Washington.
Mathew Ingram was CJR’s longtime chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.