Archive for the ‘PPM’ Category

In-Car Listening: the Black Hole of Radio Ratings

February 11, 2012

With all the hand-wringing about new in-dash audio alternatives to radio, it’s amazing that we don’t really know how much time people spend listening to radio in the car.  

PPM doesn’t separate in-car listening from other out-of-home listening. And while diaries do break out in-car listening, the recorded “reality” of diary measurement neither matches listener perceptions nor what PPM seems to show, albeit from behind the curtain. 

If you ask them, most North Americans will tell you that at least 50% of the time they spend listening to radio is done in the car. But diaries typically show that just over a third of all listening takes place in the car.

Meanwhile, although PPM generally shows the same overall in-home/out-of-home split as the diary, it actually shows higher out-of-home listening on evenings and weekends than the diary does. It's safe to assume that most of this evening and weekend listening is not happening at work, but rather in the car—during the trip to the grocery store or while you take the kids to soccer—occasions you're not likely to recall a week later when you're filling out a diary. This makes you wonder just how much of the out-of-home tuning that PPM records during the workday is actually at-work listening—and how much represents in-car listening while you're running errands or going to meetings out of the office, which again wouldn't be recalled, or in turn recorded, in a diary.

All of this suggests that we should probably be crediting more radio listening to the car than our ratings systems—or, more specifically, the diary system—have traditionally led us to believe. And, as commuting times grow ever longer, the car becomes an even more important competitive arena for radio listening.

Should PPM-rated stations design and target their programming to the in-car listener, and promote listening to their station in the car—much like Lite Rock stations were built for the workplace and promoted to reach the at-work listener? And just how should radio react when it comes to sizing up the potential impact of Pandora and other new in-car audio alternatives?

We need to add an accurate accounting of in-car listening to the ‘to do’ list of our ratings services—ideally somewhere near the very top.

So, How Many People Really Tune Out When the Spots Come On?

December 19, 2011

The headline sure is encouraging.  According to the Coleman Insights/Arbitron/Media Monitors study, What Happens When the Spots Come On, radio stations hold 93% of their lead-in audience through the average stop set.

But, before you start loading up on long stop sets, there's something of a devil lying in the details here. When they presented the research, Coleman and Arbitron were careful to note that the 93% doesn't refer to how many listeners stay tuned throughout the stop set, but rather to the overall audience levels just before and immediately after the break. The actual proportion of listeners who stay tuned through the average stop set is lower than 93%. The problem is we don't know exactly how much lower.

Radio stations experience a natural audience churn that happens independent of any tune-out triggers.  Listeners get in or out of their car, enter or leave the room where the radio is on, or do any number of other activities that begin or end their listening session and have nothing to do with their reaction to what they just heard.

This natural churn is also captured in the results of the Coleman/Arbitron/Media Monitors study.  As a result, it effectively dilutes the impact of spots on listeners who tuned in at the beginning of the stop set.

Imagine that we have a radio station with ten listeners tuned in via PPM. After the first spot, one of the ten switches to another radio station while another ends their radio listening session altogether (say, by leaving their car). Meanwhile, still another listener turns on their radio for the first time that day. We now have nine listeners, eight of whom were there before the first spot. (That's 90% of the number of listeners we started with, while just 89% of our original listeners who are still listening to radio are still with us.) Once again, after the second spot, one listener starts a listening session with us, another ends their radio listening session altogether, and a third switches to another station. But this time a listener also switches from another station to ours. (Maybe they were running spots too.) All in all, two listeners have jumped on and two have jumped off after the second spot. We still have nine listeners (90% of our starting audience level), but our station now holds just six of the eight original listeners—or 75% of those who are still listening to radio.
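The arithmetic of the ten-listener example can be sketched out like this (hypothetical numbers from the example above, not from the study). The point is to keep two yardsticks separate: the overall audience level—what the 93% figure measures—and true retention among the original listeners who are still using radio at all.

```python
# Hypothetical ten-listener example: overall audience level vs. true retention.
start = 10

# After spot 1: one original switches away, one ends their session,
# and one brand-new listener tunes in.
on_station = 10 - 2 + 1      # 9 listeners on our station now
originals_left = 10 - 2      # 8 of the original 10 are still with us
still_on_radio = 10 - 1      # 9 originals are still listening to radio somewhere

print(on_station / start)                # 0.9  -> the "audience level" view
print(originals_left / still_on_radio)   # ~0.89 -> true retention

# After spot 2: one new listener tunes in, one switches in from another
# station, one original ends their session, one original switches away.
on_station = 9 - 2 + 2       # still 9 listeners; the level looks unchanged
originals_left = 8 - 2       # but only 6 originals remain
still_on_radio = 9 - 1       # 8 originals are still listening to radio

print(on_station / start)                # 0.9  -> level still looks fine
print(originals_left / still_on_radio)   # 0.75 -> true retention keeps falling
```

The level-based measure stays flat at 90% while true retention slides from 89% to 75%—exactly the dilution the churn causes.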

Of course, this is just an example. The amount of natural audience churn and the number of listeners switching between stations during the stop set are not called out in the research. Jaye Albright, who drills into the granular detail of all things PPM for her clients in Canada using BBM's minute-by-minute data, tells me she recently looked at 25-54 switchers to a major-market Country station and found roughly 4-8% "in" and 4-8% "out" in every single minute.
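To see what per-minute churn on that scale implies, here's a rough back-of-the-envelope sketch (illustrative arithmetic only, not Albright's data): if some fraction of the current audience leaves every minute, the share of the original audience still present compounds downward through the break, even while equal "in" churn keeps the overall level looking steady.

```python
def share_of_originals(minutes, out_rate):
    """Fraction of the starting audience still tuned in after `minutes`,
    assuming `out_rate` of the current audience leaves each minute."""
    return (1 - out_rate) ** minutes

# Over a hypothetical three-minute stop set, at the low and high ends
# of the 4-8% per-minute "out" churn mentioned above:
for rate in (0.04, 0.08):
    print(rate, round(share_of_originals(3, rate), 3))
# 0.04 -> 0.885: ~88.5% of the originals remain after three minutes
# 0.08 -> 0.779: ~77.9% remain, even if matching "in" churn holds the
#                overall audience level near 100%
```

In other words, a station could post a headline retention number in the nineties while a fifth of its lead-in audience has actually turned over.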

This is not to say the Coleman/Arbitron/Media Monitors results aren't meaningful. It's good to know that, regardless of tune-out, overall audience levels stay quite high during the average stop set, even if we should also consider the net effect—how much the audience might have grown if we weren't running any spots. It's also helpful to note the differences in audience levels by demo, format and ethnicity—News/Talk, Urban and Hispanic stations, for example, should be taking these results to the bank.

Coleman, Arbitron, and Media Monitors also quite rightly point out that the study says nothing about the impact of high commercial loads on the brand.

The research partners promise further analyses in 2012. Hopefully, we’ll get more insight into how spots and other content affect not just audience levels but actual switching behavior.
