Algorithms. Algorithmic transparency. Algorithmic accountability.
These are terms we have all become accustomed to hearing whenever journalists, industry executives, commentators, or academics contemplate the contemporary media landscape. Less is known about how everyday news consumers understand these terms. It turns out they have a lot to say.
As part of our ongoing research into the relationship between platforms and publishers, the Tow Center for Digital Journalism conducted 13 focus groups with news consumers aged 18 to 65 in four US cities: Bowling Green, Kentucky; Elkhart, Indiana; New York; and San Francisco. Full results from the focus groups are available here.
One of the key areas we interrogated in the study is everyday news consumers' attitudes toward the little black boxes that control which news they receive via social platforms. The specific term "algorithm," deliberately avoided by the moderators, cropped up in only four of our sessions. But while the terminology varied, themes related to algorithms recurred throughout. And the consequences of platforms' secretive approach to how their algorithms function were observable wherever we went.
We came away convinced that the need for algorithmic transparency is more urgent than ever. Some people explicitly say they want to know more. Others, underestimating the reach of algorithms, arguably need to know more.
One participant in Elkhart memorably described the moment she awakened to Facebook's algorithm: "I started watching information about Facebook and found out they were following me enough that they only sent me the stuff that I clicked on. I wasn't getting both sides of the story… They were just following me and giving me sugar when I was really looking for more… They were skewing the news to what I had picked. They personalized it… and that's not why I was there. I was there to get information that was different or a different viewpoint than I was getting, and I'm very mad at Mark Zuckerberg."
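What she describes is, in effect, a feedback loop: each click teaches the system to show more of the same. The sketch below is purely illustrative; the topics, stories, and selection logic are invented for this article and bear no relation to Facebook's actual systems.

```python
from collections import Counter

# Toy feedback loop: every click makes similar stories more likely to surface.
# Topic names and stories are invented; this is not Facebook's actual code.
click_history = Counter()

def record_click(topic: str) -> None:
    """Remember that the user engaged with a story on this topic."""
    click_history[topic] += 1

def pick_stories(candidates: list[tuple[str, str]], n: int = 3) -> list[str]:
    """Return the n headlines whose topics the user has clicked on most."""
    ranked = sorted(candidates, key=lambda c: click_history[c[0]], reverse=True)
    return [headline for _, headline in ranked[:n]]

for _ in range(5):
    record_click("politics_right")  # the user keeps clicking one kind of story
record_click("politics_left")

stories = [
    ("politics_right", "Story A"), ("politics_right", "Story B"),
    ("politics_left", "Story C"), ("science", "Story D"),
]
print(pick_stories(stories))  # ['Story A', 'Story B', 'Story C'] -- the feed narrows
```

Run the loop long enough and the "different viewpoint" she went looking for never makes the cut.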
In Bowling Green, Kentucky, one of the younger participants realized during the discussion that his Instagram feed was now also dictated by an algorithm, leading him to reflect on how little understanding he had about what his feeds present him: "It's just now hitting me that Instagram is not [chronological] either. Like, I'm sorry, but I would like to know why they're telling me that I'm more likely to like this person's pictures other than a different person's pictures… I don't understand exactly how that works, and I would definitely like to."
Platforms often announce major algorithm changes via blog posts, but these tend to be vague and, arguably, under-publicized. Instagram, for example, announced it was introducing an algorithm via a blog post that said users' feeds would "soon be ordered to show the moments we believe you will care about the most," without giving any indication of how it could or would make such decisions.
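Instagram's post gave no detail, but the general shape of engagement-based ranking is well understood. Here is a minimal sketch; the signals and weights are our assumptions for illustration, not anything Instagram has disclosed.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    age_hours: float             # how long ago it was posted
    affinity: float              # how often the viewer interacts with this author (0-1)
    predicted_engagement: float  # a model's guess the viewer will like/comment (0-1)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order posts by a weighted interest score instead of chronologically.
    The signals and weights here are illustrative assumptions."""
    def score(p: Post) -> float:
        recency = 1.0 / (1.0 + p.age_hours)  # newer posts decay less
        return 0.5 * p.predicted_engagement + 0.3 * p.affinity + 0.2 * recency
    return sorted(posts, key=score, reverse=True)

feed = rank_feed([
    Post("brand_account", age_hours=0.5, affinity=0.1, predicted_engagement=0.2),
    Post("close_friend", age_hours=8.0, affinity=0.9, predicted_engagement=0.7),
])
print([p.author for p in feed])  # the older post from a close friend ranks first
```

Even in this toy version, a fresher post loses to an older one the model thinks you "will care about the most," which is exactly the reordering the blog post announced without explaining.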
Some participants underestimated the role or complexity of algorithms by taking platforms at face value. One described Reddit's scoring system as being up to "all users," neglecting the role of the platform's algorithm. He told the group, "If it's something interesting, it's voted up, and if it's not, it's useless." Consequently, "correct stuff [is] usually towards the top." This, of course, is not the case. Reddit, like Facebook and Instagram, is controlled by an algorithm. Indeed, following a change to Reddit's front page algorithm in December 2016, TechCrunch reported:
Yes, it's true: Those numbers on the site aren't just "upvotes minus downvotes" or anything so simple. The blue ball machine gif [that Reddit CTO Christopher Slowe] shared as an indication of how the system works is probably closer to the truth. And he indicated in another comment that there is "some slight fuzzing" to stymie would-be reverse engineers of the algorithm.
The "blue ball machine gif" is truly baffling, yet it was apparently shared to provide a sense of the complexity of the algorithm without giving away any proprietary information.
The image doesn't reveal any trade secrets. But it does show why platform users are confused about how information is served to them, and how far there is to go before any meaningful degree of transparency is achieved.
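To make the contrast concrete: the participant quoted above assumed a simple tally, while Slowe's comments suggest the displayed number is deliberately perturbed. The toy sketch below shows the difference; the noise model is an assumption based only on the phrase "slight fuzzing," since Reddit has not disclosed its method.

```python
import random

def naive_score(upvotes: int, downvotes: int) -> int:
    """What the participant assumed: a simple, transparent tally."""
    return upvotes - downvotes

def fuzzed_score(upvotes: int, downvotes: int) -> int:
    """Hypothetical fuzzing: jitter the displayed number slightly so that
    would-be manipulators can't tell which of their votes counted.
    The noise model is our guess; Reddit has not disclosed its method."""
    jitter = random.randint(-2, 2)  # small noise, re-rolled on each lookup
    return naive_score(upvotes, downvotes) + jitter

print(naive_score(120, 15))   # always 105
print(fuzzed_score(120, 15))  # anywhere from 103 to 107, varying per request
```

If the number you see is never quite the true tally, "voted up because it's interesting" stops being a complete explanation of what reaches the front page.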
Other participants argued that they or their friends held the balance of power, likely overestimating the level of control they have on the platforms. A participant in Elkhart, Indiana, insisted, "It's up to me what I see. What the social networks provide, I don't know, I guess that's also voluntary because you're following who you want to follow… It's not really Twitter's fault. It's not Facebook's fault. They have their faults, but what I see may not be their fault."
In New York, a participant made the case that Facebook friends have more power over what appears than the platform's algorithm: "Your news feed is populated by your friends, and your friends are the ones perpetuating your news feed… As far as Facebook having the control, I really don't agree with that. I think it's definitely your social circle that controls that, at least on Facebook."
Finally, multiple participants described abandoning certain platforms due to a lack of transparency about algorithm changes or how their personal data is used to shape their news feeds (and, indeed, how it is collected and used more generally).
One participant in Bowling Green, Kentucky, went into great detail about Reddit's supposed lack of transparency about changes made to its front page algorithm. These changes, coupled with a more concerted effort to remove material deemed to be offensive, were, in his opinion, tantamount to "filtering" and designed to suppress right-wing perspectives. This, he argued, had been sufficiently egregious to drive him away from Reddit and over to 4chan, a platform he felt was less "filtered."
Participants from separate age groups in New York described a similar rationale for either curtailing their use of Facebook or abandoning it completely. "I live in a very liberal bubble, New York and [a college], so I've been trying to mediate. I don't think Facebook helped that, there's all sorts of stuff about their algorithm and stuff like that, so I don't go to it for news anymore because I don't want to keep spiraling down that sort of very narrow worldview," one said.
Having traveled 9,000 miles around the country and dissected 20 hours' worth of discussion, we saw a pressing need for algorithmic literacy. People have a right to know more about how and why the information served to them is being prioritized.
The problem, of course, is that we can't have algorithmic literacy before we have algorithmic transparency. For that, we are dependent on the tech companies, who have so far skirted what is arguably a straightforward ethical responsibility. The ball is in their court.