All new technologies have their champions and their naysayers, Luddite matching tit for tat with techno-utopian. The arrival of artificial intelligence in newsrooms is no different. Some see its various iterations as tools for reducing grunt work; others see a field full of ethical land mines. Most see a little bit of both.
Artificial intelligence is a boon for processing large amounts of material and for adapting to digitization. Natural language processing can help analyze tweets en masse. The New York Times used machine learning to take its archive of recipes, add structure to each one (by classifying what is an ingredient, what is a step, etc.), and create the NYT Cooking app. The Times also hopes to use automation to create a digital archive of its photo collection, which currently sits in the basement of the Times building.
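As a concrete, if simplified, illustration of the classification step behind a project like NYT Cooking, the sketch below trains a toy model to label recipe lines as ingredients or steps. The training examples here are invented, and the Times's actual pipeline is not public and surely far more sophisticated.

```python
# Illustrative sketch: classify recipe lines as "ingredient" or "step".
# The labeled examples are invented; a production system would rely on a
# much larger corpus and richer features than bag-of-words counts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_lines = [
    "2 cups all-purpose flour",
    "1 tablespoon olive oil",
    "3 cloves garlic, minced",
    "Preheat the oven to 350 degrees.",
    "Whisk the eggs until frothy.",
    "Simmer for 20 minutes, stirring occasionally.",
]
train_labels = ["ingredient", "ingredient", "ingredient", "step", "step", "step"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(train_lines, train_labels)

for line in ["1/2 teaspoon salt", "Bake until golden brown."]:
    print(line, "->", model.predict([line])[0])
```

Run over an entire archive, a classifier like this is what turns decades of flat recipe text into the structured records an app can search and filter.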
Artificial intelligence still hasn't entered most newsrooms, and, as Jonathan Stray has written for CJR, it won't replace journalism so much as augment it. As journalism dips its toes into automated story writing, data scraping, and the like, and as the tools themselves become easier to use and more widely available, now is the time to consider what AI has to offer journalism and to weigh its potential drawbacks.
This week, a group including technologists, journalists, and legal experts gathered at Columbia Journalism School for a conference on the impact of artificial intelligence on journalism. Hosted by the Tow Center for Digital Journalism and the Brown Institute for Media Innovation, the event touched on AI's entrance into reporting and writing, as well as into the relationship between news outlets and their readers. (You can watch the full livestream here.)
Take, for example, the advent of virtual voice-activated assistants such as Amazon's Alexa, Google Home, and the new Apple HomePod, or even the now commonplace Siri on iPhone. These services, said Harvard Berkman Fellow Judith Donath, are anthropomorphized algorithms: They are computer code meant to evoke a human quality. This becomes relevant to the news when these services start delivering us headlines every morning. Because they sound human, they could change our behavior. "You're not worried your newspaper will think you're shallow," Donath said.
The biggest stumbling block for the entrance of AI into newsrooms is transparency. Transparency, a basic journalistic value, is often at odds with artificial intelligence, which usually works behind the scenes. This raises ethical issues when journalists begin using AI to assist in reporting. How transparent can or should they be about the code behind the story? Does explaining technical concepts increase trust, or decrease it? For instance, if a robot writes an article (robots already write many basic sports stories), how should a news outlet disclose that to the reader?

Other questions are more complicated, especially when journalistic ethics confront the proprietary concerns of technology companies. Angela Bassa, director of data science at iRobot, said the choice to be transparent often means sacrificing other benefits, such as profit and scalability. When a data operation gets big, it's harder to make it public simply because of the effort required to anonymize, format, and host the data. On the reporting side, lawyer Amanda Levendowski pointed out, working with AI means collecting vast swathes of data. Journalists have a special responsibility to acquire that data legally and ethically, even when it consists of public social media posts.
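To ground the earlier disclosure question: automated sports recaps are typically template-driven, with structured box-score data slotted into prose patterns, which also makes disclosure straightforward to automate. Below is a minimal, hypothetical sketch of that approach; the game data, field names, and phrasing are all invented for illustration.

```python
# Toy template-based story generator, the general approach behind many
# automated sports recaps. All data and wording here are invented.
game = {
    "home": "Hawks", "away": "Falcons",
    "home_score": 3, "away_score": 1,
    "top_player": "Rivera", "top_stat": "two goals",
}

# Pick a verb from the outcome, then fill a prose template.
verb = "beat" if game["home_score"] > game["away_score"] else "fell to"
story = (
    f"The {game['home']} {verb} the {game['away']} "
    f"{game['home_score']}-{game['away_score']} on Saturday. "
    f"{game['top_player']} led the way with {game['top_stat']}."
)

# One possible answer to the transparency question: append a disclosure
# line to every machine-written story.
story += "\n\nThis story was generated automatically from game data."
print(story)
```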
Personalization of news is another major concern: it poses problems for the establishment of a shared reality. Facebook's all-knowing "algorithm" was blamed after the 2016 election for feeding people the news they wanted to see, rather than a balanced, bipartisan diet. Personalization also challenges the concept of news as a public record. If we each see the same basic story, but tailored to our age, gender, or cultural touchstones, there is no single story to archive, and therefore no single history of a given event.
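At its simplest, personalization is a ranking problem: score each story in a shared pool against a reader profile and surface the best matches. The hypothetical sketch below (story topics, reader interests, and the scoring rule are all invented) shows how two readers can end up with different front pages built from the same pool.

```python
# Toy personalization: rank the same story pool differently per reader.
# Story topics, reader interests, and the scoring rule are invented.
stories = [
    {"headline": "City budget passes", "topics": {"politics"}},
    {"headline": "Local team wins title", "topics": {"sports"}},
    {"headline": "New AI lab opens downtown", "topics": {"tech", "business"}},
]

def rank(stories, interests):
    # Score each story by how many of its topics the reader follows.
    return sorted(stories, key=lambda s: -len(s["topics"] & interests))

reader_a = {"politics"}
reader_b = {"sports", "tech"}
print([s["headline"] for s in rank(stories, reader_a)])
print([s["headline"] for s in rank(stories, reader_b)])
```

Even this crude rule produces two different orderings from one archive, which is exactly the public-record problem: there is no canonical front page to preserve.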
The idea of artificial intelligence entering our lives becomes more approachable when it is broken down into realistic scenarios. But that doesn't make determining what we, as journalists, want our relationship with AI to be any less difficult. As Jon Keegan, a senior research fellow at Tow, said after the event: "Journalists who want to work with AI have a serious responsibility to understand the caveats with using these tools, how they work, and to speak with experts who use this tool in their daily work."