Last week, executives from some of the world’s largest social platforms—Meta, which owns Facebook and Instagram, as well as TikTok, Snap, Discord, and X, formerly known as Twitter—testified at a Senate hearing about children’s safety online. The session featured the sort of grandstanding by senators that often occurs in such hearings, including a bizarre detour into whether Shou Zi Chew, the CEO of TikTok (whose parent company, ByteDance, is based in China), is a member of the Chinese Communist Party (even though Chew is from Singapore). One particularly striking moment came after Josh Hawley, the Republican senator from Missouri, asked Mark Zuckerberg, the CEO of Meta, to apologize to some of the families present at the hearing, including Maurine Molak, whose son David died by suicide at sixteen after he was cyberbullied, and Todd and Mia Minor, whose son Matthew died at twelve after he took part in an online “blackout challenge.” Zuckerberg complied. “No one should have to go through the things that your families have suffered,” he said.
Parents in the viewers’ gallery, including Molak and the Minors, weren’t there just to hear an apology from Zuckerberg or the other executives; they also came to pressure legislators to pass the Kids Online Safety Act, a bill that, according to its supporters, could help prevent the kinds of dangers that their children were exposed to. Molak told NBC News that she and other parents are up against a “billion-dollar lobby campaign” funded by the tech platforms and aimed at blocking the legislation. She added that she and the other parents in attendance are “sick and tired of [tech platforms] deploying all of these people to crush the work that we’re doing.”
KOSA was first proposed in 2022 by Richard Blumenthal, the Democratic senator from Connecticut, and Marsha Blackburn, the Republican senator from Tennessee. Blumenthal has said that he was inspired to craft the legislation after Frances Haugen—a former Facebook staffer turned whistleblower, whose disclosures from inside the company were widely covered in 2021—appeared before Congress that year. As part of her testimony, Haugen submitted internal documents that appeared to show that bosses at Meta knew that Facebook and Instagram were harming teenagers, by encouraging emotionally or physically harmful behavior, but had taken little or no action by way of mitigation. Blumenthal said that KOSA was necessary because Haugen had shown that Facebook knowingly “exploited teens using powerful algorithms that amplified their insecurities.”
As soon as it was introduced, however, the bill faced significant opposition from a number of digital-rights advocates and child-safety organizations, which argued that its approach to protecting children online was both vague and in some ways misdirected. Others said that implementing the bill could do more harm than good—in 2022, more than ninety human-rights and LGBTQ advocacy groups signed a letter arguing that it would “make kids less safe,” and could be “weaponized” to smother the discussion of contentious social topics. Others still said that the bill’s tracking protocols—intended to identify children using social media—amounted to an infringement of privacy, and that its restrictions on certain kinds of speech were in breach of the First Amendment.
Following discussions with KOSA’s most vocal critics, Blumenthal and Blackburn introduced a modified version of the bill that removed some of the more contentious requirements. But it has continued to face an uphill battle. The Electronic Frontier Foundation has argued that despite the modifications, KOSA is still “troubling,” since it would effectively require surveillance of anyone under the age of sixteen, and would put the “tools of censorship” in the hands of state attorneys general while endangering the rights and safety of young people online. According to the EFF, the bill would hold platforms liable if they do not “prevent and mitigate” societal ills such as anxiety, depression, bullying, and suicidal behavior—but deciding which aspects of a service are contributing to these ills would be up to the Federal Trade Commission and attorneys general in each state. The EFF says that this would put the platforms in an impossible situation: lacking clear guidance as to which aspects of their services might violate the law, they would be more likely to censor any discussions that could hypothetically get them into trouble.
(Similar criticisms have dogged the UK’s Online Safety Act, which passed Parliament in September and became law soon after. That law forces any large social platform that hosts user-generated content in the UK to prevent and remove illegal material, including posts relating to terrorism and hate crimes, as well as any “harmful and age-inappropriate” content that might be accessed by children. Platforms that don’t comply can be fined up to the equivalent of twenty-three million US dollars or 10 percent of their global annual revenue, whichever is higher. The Center for Strategic and International Studies has noted that while children’s-advocacy groups have celebrated the law, civil liberties groups and tech companies have said that its provisions on content moderation will limit freedom of expression, since the severe penalties for noncompliance could cause platforms to err on the side of overmoderating.)
Back in the US, KOSA’s critics have included danah boyd, a researcher with Microsoft (who spells her name without capital letters). She has studied children’s online behavior since the days of MySpace, and wrote in a recent blog post that bills such as KOSA “pretend to be focused on helping young people when they’re really anti-tech bills that are using children for political agendas.” According to boyd, laws like KOSA will do little to change the fact that rates of suicidal ideation and suicide among young people are increasing, as are depression and anxiety. Such laws presume that tech is the cause of all young people’s problems, boyd says—and more than that, presume that if tech companies were simply “forced to design better,” they could fix those problems. For boyd, this attitude constitutes a form of “technological solutionism.”
Mike Masnick of Techdirt, meanwhile, has pointed out that there isn’t even a consensus among experts as to whether being online is harmful to children at all. Last fall, a study by the Pew Research Center found that, for a majority of teens, social media was more helpful than harmful, while a report from the American Psychological Association found no causal link between social media and harm. A similar report from the US surgeon general also found no causal link between social media and harms to teens, and an Oxford University study of nearly a million people in more than seventy countries found no evidence that social media leads to psychological harm. A recent report in the Journal of Pediatrics looked at decades of research into young people and mental health; it, too, found no data to support the claim that social media was a part of the problem.
And one of the risks with KOSA, critics say, is that state attorneys general will exploit its vague language around online safety to censor content that they feel is inappropriate for children, including discussions of homosexuality, trans rights, and race. The EFF notes that Blackburn has previously referred to education about racial discrimination as “dangerous for kids.” (At one point, she suggested that KOSA would help “protect minor children from the transgender,” though a spokesperson said that her comments were “taken out of context.”) The conservative Heritage Foundation supports the bill because it believes that censoring LGBTQ content is necessary to protect children. And a number of states have already passed laws limiting public education about the history of race-, gender-, and sexuality-based discrimination. If KOSA passes, the EFF argues, platforms are likely to “preemptively block conversations that discuss these topics.”
Some social platforms and tech companies have been among KOSA’s critics. But on the eve of the Senate hearing last week, Microsoft announced that it supports the proposed legislation. Brad Smith, the company’s president and vice-chairman, said that the bill provides a “reasonable, impactful approach” to address the online safety of children. Masnick argues that Microsoft’s public support for KOSA is “an easy way to cozy up with Congress,” and carries little risk because the company doesn’t own a social network. Linda Yaccarino, the CEO of X, told the committee that her company is also in favor of the bill, as did Evan Spiegel, the CEO of Snap. Some critics believe that they did so for a similar reason—to curry favor with regulators—though unlike Smith, they do, of course, work for social networks. Zuckerberg, for his part, said that he agrees with the “basic spirit” of the bill, but declined to endorse it.
Ari Cohn of TechFreedom, a nonprofit think tank that focuses on technology issues, has argued that one of the biggest challenges for KOSA is likely to derive from the First Amendment: the law places a “duty of care” on the platforms to mitigate the harmful effects of certain kinds of speech, and yet the “overwhelming majority” of this speech is constitutionally protected. The courts, Cohn says, have so far refused to impose “vague, expansive, and inherently unmeetable duties of care” on platforms or other entities that disseminate constitutionally protected expression, because doing so would chill First Amendment activity. That could turn out to be a significant hurdle for KOSA—unless, of course, the courts decide that teenagers don’t have any First Amendment rights, which seems unlikely.
Other notable stories:
- Phyllis Zorn—a reporter at the Marion County Record, the Kansas paper that was raided by police last year after Zorn used a state database to access a public record, touching off a national outcry—is suing the city of Marion and a clutch of current and former officials, claiming that the raid violated her First and Fourth Amendment rights and affected her physical and mental health. Zorn’s lawsuit alleges that Gideon Cody, then the police chief, offered to invest in a competitor to the Record that Zorn would found after he learned that the paper was investigating his past, then put Zorn on his “enemies list” after she laughed at the suggestion. The Kansas City Star’s Katie Moore has more.
- Last week, WLRN, a public radio station in Miami, abruptly canceled Sundial, a popular show, and laid off the team behind it, including Carlos Frías, the host. Yesterday, Frías filed a federal complaint—a precursor to a likely lawsuit—alleging that he was fired after complaining internally about discrimination at the station. Among other things, Frías alleges that an editor said that Sundial was “sounding very Latino” and started tracking the ethnicity of the show’s guests, while a higher-up said that producers needed to consider listeners’ “cultural comfort zones.” Martin Vassolo has more for Axios Miami.
- In yesterday’s newsletter, we noted that two students at Northwestern University in Illinois were facing charges under an obscure law after producing and distributing a parody cover of the student paper that accused the university of complicity in the genocide of Palestinians. Following protests by student groups and the student paper’s own editorial board, however, the paper’s parent company, which had initially taken the matter to the police, backed off; now prosecutors have dropped the charges.
- The Post’s Ashley Fetters Maloy spoke with five top editors of fashion magazines to find out how the role—once associated with jet-setting, luxury offices, and other glamorous perks—has changed. “Some of these perks still exist,” Fetters Maloy writes. “But as the magazine industry has entered the leaner, faster-paced internet era, a diverse and dynamic class of millennial women has risen to the top of fashion media.”
- And Wired’s Kate Knibbs spoke with Nebojša Vujinović Vujo, a Serbian AI entrepreneur who has built a business “snapping up abandoned news outlets and other websites and stuffing them full of algorithmically generated articles.” Vujo “gets why writers are unhappy that their work has been erased and replaced by clickbait,” Knibbs writes, but still argues “that his life has been tougher than that of the average American blogger.”
ICYMI: Dahlia Lithwick on the Colorado case, the election, and the press