On Wednesday, both Facebook and Twitter took steps to limit the distribution of a news story from a mainstream publication, on the grounds that it was based on hacked emails and of questionable accuracy. Twitter went so far as to prevent users from posting a link to the story, and in some cases from clicking on existing links to it, instead showing them a warning that the story violated the company’s terms of service. Facebook didn’t stop anyone from posting a link to the story, but reduced its reach by tweaking the News Feed algorithm so fewer users would see it.
The story was a New York Post report alleging that Democratic presidential candidate Joe Biden’s son, Hunter, introduced his father to the head of a natural gas company in Ukraine. The source? Emails allegedly retrieved from Hunter Biden’s laptop by a computer repair shop and given to Trump attorney Rudy Giuliani. In Twitter’s case, the company argued that the story breached its policy against distribution of content obtained through hacking, and said documents included with the story also contained an individual’s identifying information, which is against its privacy rules. Facebook, meanwhile, said its position against “hack and leak” operations required it to reduce the distribution of the story while it was being fact-checked by third-party partners.
Unsurprisingly, these moves triggered an avalanche of censorship accusations from conservatives. Sen. Josh Hawley went so far as to argue, in a letter to the Federal Election Commission, that removing the story was a benefit to Biden and therefore amounted to a campaign finance violation, and said the Judiciary Committee would vote on whether to subpoena Twitter CEO Jack Dorsey to explain his actions. Others, including Sen. Ted Cruz, argued that Facebook and Twitter had breached the First Amendment. Rep. Doug Collins said that the blocks were “a grave threat to our democracy.”
Such arguments ignore the fact that Facebook and Twitter are protected by the First Amendment, and also by Section 230 of the Communications Decency Act, which allows them to make content-moderation decisions without penalty. Many of the arguments are also clearly being made in bad faith, and are a variation on the “platforms censor conservatives” canard that has been rattling around Congress for years without a shred of evidence.
At the same time, however, it’s true that the decisions made by the two platforms are problematic. For instance, Twitter’s policy of not allowing users to post “content published without authorization” is extremely vague, and could theoretically block not just questionable stories from the New York Post, but also valuable investigative stories based on leaked content, including the Pentagon Papers and virtually everything from WikiLeaks. (Late Thursday, the company said it had revised its policy, and will now apply labels instead of blocking users from posting links that refer to hacked material.)
The incident also highlights a broader problem with both platforms: a lack of detail about their policies, and about how and when they are applied. Twitter CEO Jack Dorsey admitted that the company didn’t do a good job of explaining itself when it first blocked the Post story, but the follow-up wasn’t much more helpful; while it said the story violated two policies, it didn’t offer much detail about either one. Facebook, meanwhile, has a habit of simply pointing to its algorithm as though that absolves the company of any need to explain itself, and routinely promises things that never come to pass.
“There will be battles for control of the narrative again and again over the coming weeks,” Evelyn Douek, a lecturer at Harvard Law School, told the New York Times. “The way the platforms handled it is not a good harbinger of what’s to come.”
This episode is not only infuriating for those who would like some clarity on the decision-making at these platforms; it also makes it that much easier for bad-faith actors to argue that the companies are doing something unsavory or illegal, which leads to show-trial-style hearings that often amount to a lot of sound and fury, signifying very little. If we are to trust these giant tech corporations to make decisions about what kind of journalism can be shared on their networks, we’re going to need a lot more transparency and a lot less hand waving.