
Study: Readers are hungry for news feed transparency

October 24, 2017


Algorithms. Algorithmic transparency. Algorithmic accountability.

These are terms we have all become accustomed to hearing whenever journalists, industry executives, commentators, or academics contemplate the contemporary media landscape. Less is known about how everyday news consumers understand these terms. It turns out they have a lot to say.

As part of our ongoing research into the relationship between platforms and publishers, the Tow Center for Digital Journalism conducted 13 focus groups with news consumers aged 18 to 65 from four cities around the US: Bowling Green, Kentucky; Elkhart, Indiana; New York; and San Francisco. Full results from the focus groups are available here.

One of the key areas we interrogated in the study was everyday news consumers’ attitudes towards the little black boxes that control which news they receive via social platforms. The specific term “algorithm,” deliberately avoided by the moderators, cropped up in only four of our sessions. But while the terminology varied, themes related to algorithms recurred throughout. And the consequences of platforms’ secretive approach to how their algorithms function were observable wherever we went.

We came away convinced that the need for algorithmic transparency is more urgent than ever. Some people explicitly say they want to know more. Others, underestimating the reach of algorithms, arguably need to know more.

One participant in Elkhart memorably described the moment she awakened to Facebook’s algorithm: “I started watching information about Facebook and found out they were following me enough that they only sent me the stuff that I clicked on. I wasn’t getting both sides of the story… They were just following me and giving me sugar when I was really looking for more… They were skewing the news to what I had picked. They personalized it… and that’s not why I was there. I was there to get information that was different or a different viewpoint than I was getting, and I’m very mad at Mark Zuckerberg.”


In Bowling Green, Kentucky, one of the younger participants realized during the discussion that his Instagram feed was now also dictated by an algorithm, leading him to reflect on how little understanding he had about what his feeds present him: “It’s just now hitting me that Instagram is not [chronological] either. Like, I’m sorry, but I would like to know why they’re telling me that I’m more likely to like this person’s pictures other than a different person’s pictures… I don’t understand exactly how that works, and I would definitely like to.”

While platforms often announce major algorithm changes via blog posts, those posts tend to be vague and, arguably, under-publicized. Instagram, for example, announced it was introducing an algorithm via a blog post that said users’ feeds would “soon be ordered to show the moments we believe you will care about the most,” without giving any indication of how it could or would make such decisions.

Some participants underestimated the role or complexity of algorithms by taking platforms at face value. One described Reddit’s scoring system as being in the hands of “all users,” neglecting the role of the platform’s algorithm. He told the group, “If it’s something interesting, it’s voted up, and if it’s not, it’s useless.” Consequently, “correct stuff [is] usually towards the top.” This, of course, is not the case. Reddit, like Facebook and Instagram, is controlled by an algorithm. Indeed, following a change to Reddit’s front page algorithm in December 2016, TechCrunch reported:

Yes, it’s true: Those numbers on the site aren’t just “upvotes minus downvotes” or anything so simple. The blue ball machine gif [that Reddit CTO Christopher Slowe] shared as an indication of how the system works is probably closer to the truth. And he indicated in another comment that there is “some slight fuzzing” to stymie would-be reverse engineers of the algorithm.

The “blue ball machine gif” is truly baffling, yet it was apparently shared to provide a sense of the complexity of the algorithm without giving away any proprietary information.

The image doesn’t reveal any trade secrets. But it does show why platform users are confused about how information is served to them, and how far there is to go for any degree of transparency to be achieved.
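
The gap between the mental model that participant described and what ranking code actually does is easy to illustrate. Below is a minimal Python sketch, not Reddit’s current system: it contrasts a literal “upvotes minus downvotes” score with the time-weighted “hot” formula from the version of Reddit’s ranking code that was open source for years (the epoch constant and 45,000-second weight come from that old code). As the TechCrunch report notes, whatever runs the front page today is more complex still.

```python
from datetime import datetime, timedelta, timezone
from math import log10

# Epoch constant taken from the ranking code Reddit open-sourced years ago.
REDDIT_EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)


def naive_score(ups: int, downs: int) -> int:
    """The model our participant described: upvotes minus downvotes."""
    return ups - downs


def hot(ups: int, downs: int, posted: datetime) -> float:
    """Time-weighted rank from the formerly open-source Reddit code.

    Net votes count logarithmically while recency counts linearly:
    every 45,000 seconds (12.5 hours) of age costs a post as much
    rank as a 10x difference in net votes.
    """
    score = ups - downs
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else (-1 if score < 0 else 0)
    seconds = (posted - REDDIT_EPOCH).total_seconds()
    return round(sign * order + seconds / 45000, 7)


if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    day_ago = now - timedelta(days=1)
    # "Interesting" alone doesn't win: the day-old post has ten times
    # the net votes but still ranks below the fresh one.
    print(naive_score(5000, 500), ">", naive_score(500, 50))  # 4500 > 450
    print(hot(5000, 500, day_ago) < hot(500, 50, now))        # True
```

Even this long-retired formula encodes an editorial judgment, recency over raw popularity, that no participant articulated; and the current algorithm adds “some slight fuzzing” on top, by the platform’s own account.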

Other participants argued that they or their friends held the balance of power, likely overestimating the level of control they have on the platforms. A participant in Elkhart, Indiana, insisted, “It’s up to me what I see. What the social networks provide, I don’t know, I guess that’s also voluntary because you’re following who you want to follow… It’s not really Twitter’s fault. It’s not Facebook’s fault. They have their faults, but what I see may not be their fault.”

In New York, a participant made the case that Facebook friends have more power over what appears than the platform’s algorithm: “Your news feed is populated by your friends, and your friends are the ones perpetuating your news feed… As far as Facebook having the control, I really don’t agree with that. I think it’s definitely your social circle that controls that, at least on Facebook.”

Finally, multiple participants described abandoning certain platforms due to a lack of transparency about algorithm changes or how their personal data is used to shape their news feeds (and, indeed, how it is collected and used more generally).

One participant in Bowling Green, Kentucky, went into great detail about Reddit’s supposed lack of transparency about changes made to its front page algorithm. These changes, coupled with a more concerted effort to remove material deemed to be offensive, were, in his opinion, tantamount to “filtering” and designed to suppress right-wing perspectives. This, he argued, had been sufficiently egregious to drive him away from Reddit and over to 4chan, a platform he felt was less “filtered.”

Participants from separate age groups in New York described similar rationales for either curtailing their use of Facebook or abandoning it completely. “I live in a very liberal bubble, New York and [a college], so I’ve been trying to mediate. I don’t think Facebook helped that, there’s all sorts of stuff about their algorithm and stuff like that, so I don’t go to it for news anymore because I don’t want to keep spiraling down that sort of very narrow worldview,” one said.

Having travelled 9,000 miles around the country and dissected 20 hours’ worth of discussion, we saw a pressing need for algorithmic literacy. People have a right to know more about how and why the information served to them is being prioritized.

The problem, of course, is that we can’t have algorithmic literacy before we have algorithmic transparency. For that, we are dependent on the tech companies, which have so far skirted what is arguably a straightforward ethical responsibility. The ball is in their court.


Pete Brown is the Research Director at the Tow Center for Digital Journalism, and runs the Content Analysis Hub for the Publishers and Platforms project.