The Media Today

The social media platforms, the ‘Big Lie,’ and the coming elections

September 27, 2022
Meta Mirrors: International Journalism Festival, Hana Joy, 2022


In August, Twitter, Google, TikTok, and Meta, the parent company of Facebook, released statements describing how they intended to handle election-related misinformation on their platforms in advance of the vote. For the most part, it seemed they weren’t planning to change much. Now, with the November 8 midterms drawing closer, Change the Terms, a coalition of about sixty civil rights organizations, says the social platforms have not done nearly enough to stop continued misinformation about “the Big Lie”—that is, the unfounded claim that the 2020 election was somehow fraudulent. “There’s a question of: Are we going to have a democracy?” Jessica González, a Free Press executive involved with the coalition, recently told the Washington Post. “And yet, I don’t think they are taking that question seriously. We can’t keep playing the same games over and over again, because the stakes are really high.”

González and other members of Change the Terms say they have spent months trying to persuade the major platforms to combat election-related disinformation, but their lobbying campaigns have had little or no impact. Naomi Nix reported for the Post last week that coalition members have raised their concerns with platform executives in letters and meetings, but have seen little action as a result. In April, Change the Terms called on the platforms to “Fix the Feed” before the elections, asking the companies to change their algorithms in order to “stop promoting the most incendiary, hateful content”; “protect people equally,” regardless of what language they speak; and share details of their business models and approaches to moderation.

“The ‘big lie’ has become embedded in our political discourse, and it’s become a talking point for election-deniers to preemptively declare that the midterm elections are going to be stolen or filled with voter fraud,” Yosef Getachew, a media and democracy program director at Common Cause, a government watchdog, told the Post in August. “What we’ve seen is that Facebook and Twitter aren’t really doing the best job, or any job, in terms of removing and combating disinformation that’s around the ‘big lie.’” According to an Associated Press report in August, Facebook “quietly curtailed” some of the internal safeguards designed to smother voting misinformation. “They’re not talking about it,” Katie Harbath, a former Facebook policy director who is now CEO of Anchor Change, a technology policy advisory firm, told the AP. “Best-case scenario: They’re still doing a lot behind the scenes. Worst-case scenario: They pull back, and we don’t know how that’s going to manifest itself for the midterms on the platforms.”

Change the Terms first called on the platforms to reduce online hate speech and disinformation following the deadly 2017 neo-Nazi march in Charlottesville, Virginia; since then, the coalition notes, “some technology companies and social media platforms remain hotbeds” of such activity, offering the January 6 Capitol insurrection as a prime example. The coalition has tried to keep up the pressure on the platforms over the past six months to “avoid what is the pitfall that inevitably has happened every election cycle, of their stringing together their efforts late in the game and without the awareness that both hate and disinformation are constants on their platforms,” Nora Benavidez, director of digital justice at Free Press, told the Post.

As Nix notes, the coalition’s pressure on the social media platforms was fueled in part by revelations from Frances Haugen, a former member of Facebook’s civic integrity team who leaked thousands of internal documents last year. Haugen testified before Congress that, shortly after the 2020 election, the company had rolled back many of the election-integrity measures that were designed to stamp out misinformation. An investigation by the Post and ProPublica last year showed that a number of Facebook groups became hotbeds of misinformation about the allegedly fraudulent election in the days and weeks leading up to the attack on the Capitol. Efforts to police such content, the investigation found, “were ineffective and started too late to quell the surge of angry, hateful misinformation coursing through Facebook groups—some of it explicitly calling for violent confrontation with government officials.” (A spokesman for Meta said in a statement to the Post and ProPublica that “the notion that the January 6 insurrection would not have happened but for Facebook is absurd.”)

A recent report showed that misinformation about the election helped create an entire ecosystem of disinformation-peddling social media accounts whose growth the platforms seem to have done little to stop. In May, the Post wrote about how Joe Kent, a Republican congressional candidate, had claimed “rampant voter fraud” in the 2020 election in an ad on Facebook. The ad was reportedly just one of several similar ads that went undetected by Facebook’s internal systems.


YouTube told the Post recently that the company “continuously” enforces its policies and had removed “a number of videos related to the midterms.” TikTok said it supports the Change the Terms coalition because “we share goals of protecting election integrity and combating misinformation.” Facebook declined to comment, instead referring to an August news release listing the ways the company said it planned to promote accurate information about the midterms. Twitter said it would be “vigilantly enforcing” its content policies. Earlier this year, however, Twitter said it had stopped taking steps to limit misinformation about the 2020 election. Elizabeth Busby, a company spokesperson, told CNN at the time that Twitter hadn’t been enforcing its integrity policy related to the election since March 2021. Busby said the policy was designed to be used “during the duration” of an election, and that since the 2020 election was over, it was no longer necessary.

Here’s more on the platforms:

  • Whiffing it: In Protocol’s Policy newsletter, Ben Brody writes that the election-misinformation problem extends well beyond the US. “Take Brazil,” he says. “President Jair Bolsonaro appears to be poised to lose his reelection bid, which he kicked off by preemptively questioning the integrity of the country’s vote.” Facebook has already missed a lot of misinformation in Brazil, critics say. In addition, Brody notes, there are potentially contentious elections elsewhere, including in nations “with civic turmoil or tenuous freedom, such as Turkey, Pakistan, and Myanmar. If we want to fix this, we need to acknowledge the problem is bigger than Big Tech whiffing it on content moderation, especially in the US.”
  • The time of Nick: Nick Clegg, president of global affairs at Meta, said he will be the one to decide whether to reinstate former president Donald Trump’s account in January of next year, according to Politico. Trump was banned from Facebook for two years in the wake of the January 6 attack on the Capitol. At an event in Washington put on by Semafor, the news startup from former Times media reporter Ben Smith, Clegg said that whether to extend Trump’s suspension is “a decision I oversee and I drive,” though he would consult with Mark Zuckerberg, Meta’s CEO. “We’ll talk to the experts, we’ll talk to third parties, we will try to assess what we think the implications will be,” Clegg said.
  • Predator and prey: More than seventy lawsuits have been filed this year against Meta, Snap, TikTok, and Google claiming that adolescents and young adults have suffered anxiety, depression, eating disorders, and sleeplessness as a result of their addiction to social media, Bloomberg reports. In at least seven cases, the plaintiffs are the parents of children who’ve died by suicide. Bloomberg said the cases were likely spurred in part by testimony from Facebook whistleblower Haugen, who said the company knowingly preyed on vulnerable young people to boost profits, and shared an internal study that found some adolescent girls using Instagram suffered from body-image issues.


Other notable stories:

  • The public stock offering of Truth Social, the media platform Donald Trump started after he was banned from Twitter and Facebook in January 2021, could be in trouble, CNBC reported yesterday. Digital World Acquisition Corp., the special-purpose acquisition company that planned to take Truth Social public, has lost $138 million in private financing, according to a recent regulatory filing. Per CNBC, investors said they pulled their funds from Digital World because of legal problems facing the company and Trump, as well as the app’s lackluster performance.
  • British broadcasters say they have been told by Buckingham Palace that they can only save sixty minutes’ worth of TV footage from Queen Elizabeth’s funeral, and that the royal family has a veto over any clips included in that total, The Guardian reported. “Once the process is complete, the vast majority of other footage from ceremonial events will then be taken out of circulation,” the paper wrote. “Any news outlets wishing to use unapproved pieces of footage would have to apply to the royal family on a case-by-case basis.”
  • South Korea’s president, Yoon Suk-yeol, has accused the country’s media of damaging its alliance with the US, after a TV network aired a recording of him apparently swearing about US lawmakers following a session at last week’s United Nations General Assembly in New York. Yoon’s press secretary said the president was referring to the South Korean parliament.
  • Journalists in Guatemala say public officials have used a law designed to stop violence against women to prevent journalists from reporting on certain stories, according to the Los Angeles Times. The paper cited eight recent instances in which journalists received restraining orders for causing “psychological violence” to the subjects of the stories they were reporting on, subjects who in some cases were the female relatives of public officials accused of corruption. The Guatemalan law was passed in 2008 to reduce the country’s high rates of gender-based violence.
  • As protests over the death of Mahsa Amini continue to spread across Iran, the government has responded by shutting down mobile internet services and disrupting social media sites, Wired reports. Social media platforms have been filled with videos of female protesters burning their head coverings or waving them in the air, in defiance of the government’s ban on women showing their hair. (Amini died in police custody after being arrested for allegedly wearing her head covering improperly.)
  • Police in Pakistan said Sunday that they arrested Ayaz Amir, a well-known columnist and TV personality, for his alleged involvement in the death of his daughter-in-law. Amir, who appeared in court in Islamabad on Sunday, is accused of helping his son Shahnawaz, who police say attacked and killed his wife at their home.  
  • Some journalists in Chicago fear that their access to police radio frequencies may be limited after the Chicago Police Department announced that it will switch to digitally encrypted radio channels at year’s end, the Chicago Tribune reported. City officials said the move will prevent outsiders from breaking in with rogue chatter, and that Broadcastify, a live audio platform, will stream the police radio communications online for free, at a thirty-minute delay. Dispatchers can censor information from those broadcasts, however, which has raised transparency concerns among local journalists.


Mathew Ingram was CJR’s longtime chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.