David Carr got into the Super Bowl media scrum with a thumbsucker about why so many people watch the Super Bowl these days (as well as the Grammys and other such live-TV spectacles).
The short answer: a desire for community in an age of smartphone “hobbit holes” and individualized media streams.
At a time of atomization in which we all end up down the hobbit holes of our special interests, big live television fulfills a need to have something, anything, in common. You can go on Twitter on any given night to discuss the second episode of the third season of “Girls” with your like-minded pals, but if you want to talk about something that your boss, your mother, your cabdriver and your bartender all have an opinion on, this week it will probably include the words “Peyton Manning” and some cliché about what can happen on any given Sunday.
Similarly, the week before, we were all chatting about the weddings that Queen Latifah presided over at the Grammys, and the week before that, Jacqueline Bisset’s strangely riveting speech at the Golden Globes. Those moments happened at a specific time and place that your DVR may have recorded, but did not really capture.
That may be part of the reason that even as network ratings have dropped 29 percent over the last decade, the Grammys have added six million viewers, the Academy Awards have added three million give or take, and the Golden Globes have managed to hold steady over the same time period, according to the Nielsen Company.
Maybe so, but Carr’s numbers don’t quite add up.
The Grammys drew 28.5 million viewers this year, up 2.3 million from 2004—not 6 million. The Oscars drew 40.3 million last year, up 7.7 million from 2003—not “three million give or take.” The Globes were watched by 19.7 million viewers, down 7.1 million from 2004, not flat.
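The arithmetic behind those corrections is easy to verify. A quick sketch, where the baseline figures for 2003–04 are back-computed from the differences cited above (all numbers in millions of viewers):

```python
# Viewer totals as cited above, in millions; the older baselines are
# derived from the gains/losses stated in the piece, not pulled from Nielsen directly.
grammys_2014, grammys_2004 = 28.5, 26.2  # up 2.3M, not the 6M Carr claims
oscars_2013, oscars_2003 = 40.3, 32.6    # up 7.7M, not "three million give or take"
globes_2014, globes_2004 = 19.7, 26.8    # down 7.1M, not holding steady

print(round(grammys_2014 - grammys_2004, 1))  # 2.3
print(round(oscars_2013 - oscars_2003, 1))    # 7.7
print(round(globes_2014 - globes_2004, 1))    # -7.1
```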
But raw viewer totals aren’t the best way to measure how popular a broadcast has been over many years. The US has grown by about 25 million people since 2004, equivalent to another Texas.
Ratings shares are a better metric, and those tell a somewhat different story, as you can see from the Times's own graphic, which ran next to some of the numbers it contradicts.
Even that’s not the best measure for this question, though. Household share measures the percentage of households watching a given show out of those with their TVs turned on. The old-fashioned Nielsen rating is the percentage watching out of all households that own a TV—whether it’s on or off.
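The difference between the two measures comes down to the denominator. A minimal sketch with invented household counts (none of these figures come from Nielsen; they're purely for illustration):

```python
# Hypothetical illustration of Nielsen rating vs. share (all numbers invented)
tv_households = 100_000_000         # every household that owns a TV, on or off
households_with_tv_on = 60_000_000  # households actually watching something
watching_show = 15_000_000          # households tuned to the broadcast in question

rating = 100 * watching_show / tv_households          # percent of all TV homes
share = 100 * watching_show / households_with_tv_on   # percent of homes with sets on

print(rating)  # 15.0  -> a "15 rating"
print(share)   # 25.0  -> a "25 share"
```

Because share's denominator shrinks on nights when fewer sets are on, the rating is the steadier yardstick for comparing broadcasts across decades.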
The Super Bowl is actually the great exception, with ratings in recent years that exceed those in the 1990s and slightly best the 1980s average.
The Grammys, though, averaged a 21.5 rating in the 1980s and 16.3 in the 1990s. They managed a 16 this year.
And the Oscars averaged 30.8 in the 1980s and 30.2 in the 1990s. They scored a 23 last year. This isn’t news. It was just two years ago, after all, that the Times reported that “Oscars’ Flat TV Ratings Worry Hollywood.”
This isn’t to say that Carr’s overall thesis—that we seek common experience in an atomized media world—is wrong. A 23 rating for the Oscars in an age of essentially unlimited media options is arguably more impressive than a 36.7 in 1974, when most people had just two other channels.
And Carr would have had a much better case if he’d looked at the numbers since smartphones went mass instead of a decade ago. Sometime around 2008-2009, the numbers for the biggest live TV events started moving noticeably upward, as you can see in the NYT’s chart.
It could be a statistical blip, a measurement issue, better marketing, or yes, even “a need to have something, anything, in common.”
Ryan Chittum is a former Wall Street Journal reporter, and deputy editor of The Audit, CJR’s business section. If you see notable business journalism, give him a heads-up at rc2538@columbia.edu. Follow him on Twitter at @ryanchittum.