Stop Programming By Consensus

If you've been following Bridge Ratings' research pieces about on-demand music streaming and its value to radio programmers, here is the latest insight we've gained from discussing this important technology with programmers in markets large and small.

Back in the day when there were no radio station monitoring services like BDSradio (from Nielsen) or Mediabase, radio programmers had to rely on their own gut, local research and listener input to determine the best songs to play.

Sometimes, programmer ingenuity provided insight.

If a programmer wanted to hear a respected station in another city, they'd ask the GM for some travel money, get on a plane and spend a few days in that other market manually monitoring the station, logging all songs, promos and clocks.

Returning to their home market, the Program Director would lovingly analyze their notes and determine the application - if any - to their local situation.

With the advent of monitoring services and online streaming, these types of market trips are generally no longer necessary.

Isn't technology great?

In this case, I think not.

Published station and radio format charts are now available to programmers, many of whom depend on these charts to determine song selection and rotations. The published charts do have their value to some program directors.

These format charts are an aggregation of dozens - even hundreds - of stations in different markets. Now that music research has been eliminated from many radio station budgets, the phenomenon of "Consensus Programming" has disrupted radio's ability to properly expose music to its listeners.

Programming by consensus means that programmers all across our great land look to the published charts for their formats and adjust their music categories based on the aggregate.

The resulting playlists may be 100% appropriate for some market situations.

Or more likely - those lists are a general view of radio airplay across fifty states.

The result is that hundreds - maybe thousands - of radio stations are playing song lists that are very similar.

And this is where the wheels come off.

For over two years, Bridge Ratings has been providing on-demand streaming music research to our clients and we have learned at least a couple of important concepts:

1. Programming by consensus results in stations adding songs too late and getting off songs too early in more cases than not.

2. The lifecycle of hit songs - whether current or old - is much longer than we've ever thought.

Here are two examples:

A) The current multi-format smash "I Took A Pill In Ibiza" by Mike Posner has just recently appeared in published charts in the top ten most-played songs on Top 40 radio. It's still trending up. Our streaming research showed that true consumption of that song was in the Top 5 eight weeks ago!

What does this mean? It means that the published charts showed "Ibiza" gradually climbing from outside the top 50 to its current Top 10 status. Radio's listeners were streaming this song multiple times a week long before radio caught on!

B) Country music star Chris Stapleton's song "Fire Away" blasted into the top 20 most on-demand streamed songs right after his ACM Awards windfall on April 3. Yet the song was not even ranked in the top 50 most-played songs by the aggregate of America's Country music stations. Based on personal guidelines, a Country music programmer seeing this may conclude it's too early to play the song and wait to see if it rises high enough to warrant adding it to the playlist.

Meanwhile, Country music fans were streaming the heck out of that song.

If not enough Country stations add "Fire Away", it could very well stall outside the top 40 and never get a rightful place on American broadcast radio.

In our analysis, Bridge Ratings found that in 55% of the cases studied, the aggregate music charts are not representative of true music consumption as observed in week after week of on-demand streaming data.

As digital data becomes more available through streaming data providers and platforms like Shazam, programmers are, indeed, better equipped to see how music fans are consuming music.


Yet, the most accurate method we have found to determine song popularity, longevity and viability, is on-demand streaming data.

The chart to the right compares a recent Pop song's on-demand streaming lifecycle with that of the consensus/published charts over the course of 15 weeks.

Upon release, on-demand streaming for this song vaulted into the top five almost immediately. Its popularity grew as more fans became aware of it through word-of-mouth, broadcast radio and other streaming services.

It sustained this lofty position for the full fifteen weeks.

By comparison, upon its release, radio added the song and it first ranked #78 on the published radio airplay charts. As the chart shows, it took six to seven weeks before the aggregate of radio had pushed the song into the top 10, where it slowly faded after programmers must have considered the song overplayed or burned out.

As this song's progress on the published chart slowed, programmers got off the record or slowed its rotation.  Meanwhile, demand for the song remained extremely hot through on-demand streaming platforms from YouTube to Spotify to Amazon Prime.

We have found that on-demand streaming, where consumers choose what they want to listen to, is more closely aligned with the behavior of radio listening than any other type of music research.

So, if you're a radio programmer reading this...do your listeners a favor and stop programming by consensus.