Post by CountEmUp on Mar 24, 2007 21:08:14 GMT -5
Hello everyone, great to be here. This is my first post, and I want to say how enjoyable it is to read all the comments from other AT40 fans.
I’m curious to tap people’s knowledge about chart methodology, specifically in the 80’s and early 90’s. Everyone here is familiar with the typical song’s chart-life in that time period. Usually a song would debut somewhere in the bottom 30’s, move up anywhere between 5 and 10 notches for a few weeks, maybe enter the top 10, find its peak, then fall back down the chart. Fairly predictable, but the consistency was nice and provided a safety zone, and a standard to measure one song against another.
But my question is, was there really any sound methodology that resulted in that type of chart movement? For instance, say Madonna came out with a hot new song. It would debut, maybe, at #35. Maybe next week it made a big move to #23. Then another big leap to #12, and next week into the top 10 in “only” its fourth week. But, did radio stations really wait an entire month before putting a brand new Madonna song into heavy rotation? That doesn’t make sense.
Even though CHR hits were being pumped out pretty rapidly at the time, I can’t believe there was so much competition for airtime that a hot new record would take that long to climb the chart. A number-one record typically took about 8 weeks to get there.
I don’t even think the record-sales part of the equation matters here, because if you followed the Radio & Records charts during the same era (including CT40), their chart movements were extremely similar to Billboard’s, but were based entirely on airplay.
So, is it really true that radio stations across the country were telling the surveys that the hot new records were ranked #37 at their stations? Perhaps even if they did put a song immediately into heavy rotation, they decided to “play the game” so to speak, and simply report the song near the bottom of their chart?
Or am I just missing something completely here?
I’d love to hear some insight!
Post by jedijake on Mar 25, 2007 11:01:52 GMT -5
I think if you consider every station in every city across the country, it all added up so that new songs were ranked according to the total spins they got nationwide.
However, I noticed that in, say, New York, some songs seemed like they should have been at #1 immediately upon release. I often noticed that, by the time a song actually hit its peak, the airplay had actually DECREASED from my standpoint.
But of course we can only view that from a small vantage point, since we weren't hearing what was happening around the country. It was still better than what is happening these days, with songs sticking around forever.
But if you look at Billboard's Hot 100 (mostly sales now) and the airplay charts, songs are taking MUCH longer to hit the top. A song like "Over My Head" took 20-25 weeks just to reach its peak.
Post by CountEmUp on Mar 26, 2007 21:34:43 GMT -5
Yeah, the way it was then is definitely better than now.
I think one of the reasons songs seemed to hit #1 after their airplay was already dropping is because we were hearing AT40 about a week-and-a-half after the Billboard chart was released (am I correct?), not to mention that the chart reflected the previous week's activity. So yeah, I think the countdown was always a bit behind.
Also I would figure that sales lagged a bit behind airplay, which could keep a song at the top as its airplay started to decline. And that was probably borne out by the fact that songs often hung around AT40 a week or two longer than they did on the R&R chart (and often debuted a week later, too).
You're right that you can't tell what's happening all over the country, but I would assume that top artists like George Michael or Whitney Houston were pretty universal across the country at their prime, yet all their new records would consistently not even hit the top 40 until their second week in the top 100, and take several more weeks to hit the top 10.
I actually liked it that way, but I just couldn't figure out how it was possible.
Post by tacomalo on Mar 27, 2007 0:38:47 GMT -5
I grew up listening to the radio in a smaller market in a midwest college town. In those days before format segmentation, our local stations were often slow to pick up the hottest songs, even if they were from top artists. Our stations played more of a mix of pop along with country rock and regional acts (such as Styx and Kansas before they went national), and a lot less of the dance and R&B songs as they were running up the charts.
In fact, I found that there were a number of songs on AT40 every week that we weren't hearing locally, at least until they reached the upper echelon of the charts.
So I think that, at least up until the early 80s (when the soft rock stations and oldies stations started to bifurcate markets), it did take time for most songs to reach a national audience, so it should be expected that songs that had already been played to death in New York were just hitting their peak out in the heartland.
Post by TomBest on Mar 27, 2007 13:48:05 GMT -5
Interesting discussion. I think there are several factors in play here:
- Dayparting: Reserving established records and recurrents for the daytime hours. Things loosened up after school and in the evenings. New records got that initial play in non-daytime hours, eventually made it into the full rotation, and reaped the chart benefits.
- Where in the album "cycle" the single is. An initial single from a new album by an established artist might jump more quickly on the chart (especially if released ahead of the album). The second single from a new artist might break faster and be more successful than the first (examples: Go-Go's, Village People). Later singles from the album (the 80s was the era of the deep album) would break more slowly if consumers already had the album (George Michael, Janet Jackson).
- Non-radio factors: Songs appearing in movies ("Power of Love"), TV shows ("At This Moment") and, in the 80s, interesting music videos ("Money for Nothing") had a built-in audience by the time they charted.
Then there is the curious case of "We Are the World". On paper the 21-5-2-1 run was spectacular, but in reality we were bombarded with the song from its release; by the time it hit number one, we were tired of it.
Post by CountEmUp on Mar 27, 2007 22:00:56 GMT -5
Great points everyone. I guess I'm thinking too much about how the big cities handled records compared to the smaller towns, and of course the Billboard chart had tons of stations reporting to it. The dayparting makes sense too... hadn't thought of that.
Yeah, chart runs like "We Are the World"'s are what I was thinking should have happened more often, but these answers help explain why they didn't.
Another note about songs that moved up more slowly than others: the slow climb always helped (sometimes significantly) a song's final ranking on the year-end chart, which I always thought was a bit unfair. Sure, longevity should count too, but I always kinda felt like a song that rocketed up to the top shouldn't be penalized for it. Songs like "Bad" rose fast and dropped fast, but if a song like that hit #1, should it really be ranked lower than top-5 hits that dragged their way up the Hot 100 before their top 40 runs?
That raises the question, how did they credit #1 songs on the year-end chart? What if "Bad" had unusually high airplay and sales the few weeks it was #1... did it get the same year-end points for those weeks as any other #1 that might have just barely beat out the #2 song?
We Are The World falls into that same boat; it was #21 or something like that for 1985, but surely it should have ranked higher. I think it was #1 in sales for the year, but if most of those sales were crammed into the weeks it was #1, did a lot of those sales get neutralized?
Post by mstgator on Mar 28, 2007 17:53:31 GMT -5
CountEmUp wrote: "That raises the question, how did they credit #1 songs on the year-end chart? What if 'Bad' had unusually high airplay and sales the few weeks it was #1... did it get the same year-end points for those weeks as any other #1 that might have just barely beat out the #2 song? We Are The World falls into that same boat; it was #21 or something like that for 1985, but surely it should have ranked higher. I think it was #1 in sales for the year, but if most of those sales were crammed into the weeks it was #1, did a lot of those sales get neutralized?"
Pretty much, yes. Back in the pre-SoundScan/BDS days, Billboard's year-end charts were compiled using a complex inverse point system. Rather than the simple #1=100 points, #2=99, etc., the points for each position were basically an average of how many points each position was worth during the entire chart year (okay, that's hard to explain, lol). What it boils down to is that "We Are The World" earned the same number of points for each of its weeks at #1 as any other song that hit #1 in 1985. Since Billboard's charts back then were based on ranked lists rather than actual sales/airplay numbers, there was pretty much a ceiling on how many points any #1 song could earn anyway, regardless of whether it was selling a million more copies a week than a "normal" #1 or #2.
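To make that ceiling effect concrete, here's a minimal Python sketch of the *simple* inverse point system (#1 = 100, #2 = 99, ...) that the post contrasts against Billboard's more complex averaged version. The chart runs and numbers are entirely made up for illustration, not actual chart data:

```python
# Simple inverse point system: every week at a given position is worth the
# same fixed number of points, no matter how big the sales/airplay margin was.

def week_points(position, chart_size=100):
    """Points earned for one week at the given chart position (#1 = 100)."""
    return chart_size - position + 1

def year_end_points(chart_run):
    """Total year-end points for a song's list of weekly positions."""
    return sum(week_points(pos) for pos in chart_run)

# A "We Are the World"-style blockbuster vs. a slow climber (made-up runs):
blockbuster  = [21, 5, 2, 1, 1, 1, 1, 4, 10, 20]    # fast rise, 4 weeks at #1
slow_climber = [38, 30, 24, 18, 13, 9, 6, 3, 2, 2]  # long climb, never hits #1

# Each #1 week is capped at 100 points, however many extra copies it sold,
# so the slow climber's longevity piles up nearly as many total points.
print(year_end_points(blockbuster))
print(year_end_points(slow_climber))
```

Under this scoring, a #1 week that outsold the runner-up by a million copies earns exactly the same 100 points as a #1 that barely edged out #2, which is why a year's runaway best seller could land way down a year-end chart.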
Post by jedijake on Mar 29, 2007 21:39:36 GMT -5
Also, something to keep in mind is this: when YE charts were based on the weekly charts themselves, only chart positions were used to determine a song's position at the end of the year.
Once SoundScan/BDS came in, airplay was monitored digitally. Unfortunately, airplay AFTER a song left the chart was still monitored, so recurrent hits raked in points well after their chart life was over.
Of course some songs stayed on the charts for 40-50 weeks.
Post by CountEmUp on Mar 29, 2007 23:40:00 GMT -5
Good points. I guess that makes sense; they didn't really have raw data, so if the Hot 100 was a ranking based on other rankings, they didn't have a better way to figure the YE chart.
The other thing that always seemed really skewed was the way they treated songs that were on the chart at the end of the calendar year. I thought I read in Billboard once that they used to "freeze" the Hot 100 for the final week of the year and gave double credit to all those songs. Maybe someone can confirm whether that was true, but based on the YE charts, it sure seems so. And while that might make sense in theory, the results didn't seem to bear the theory out, because those songs always seemed to rank way higher than they should have. I can list examples, but I'm sure you know what I'm talking about.
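If the "freeze" rumor is right, the mechanics would be something like this sketch: the final chart of the year is simply repeated, so every song on it earns that week's points a second time. The chart run and the `frozen_last_week` flag are hypothetical, just to show the size of the bonus:

```python
# Hypothetical year-end "freeze": when no new chart is published for the last
# week of the year, the final chart repeats, so each song on it effectively
# gets that week's inverse points counted twice.

def year_end_points(chart_run, chart_size=100, frozen_last_week=False):
    """Sum simple inverse points (#1 = 100) over a song's weekly positions."""
    points = [chart_size - pos + 1 for pos in chart_run]
    if frozen_last_week and points:
        points.append(points[-1])  # the frozen week's chart counts again
    return sum(points)

run = [30, 18, 9, 4]  # made-up run; the song sits at #4 on the frozen chart
normal = year_end_points(run)
frozen = year_end_points(run, frozen_last_week=True)
print(normal, frozen)  # the freeze adds a full extra week's worth of points
```

Under that assumption, a song peaking in late December banks a bonus week that a mid-year hit never sees, which would explain why those year-straddling songs always seemed to rank way higher than their chart runs justified.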
Post by bandit73 on Apr 1, 2007 17:44:59 GMT -5
CountEmUp wrote: "But, did radio stations really wait an entire month before putting a brand new Madonna song into heavy rotation?"
One word: Cincinnati.
The major Hot 100 panelist in Cincinnati in the '80s was WKRQ. It was very rare for WKRQ to add a new song out of the box, even if it was Prince or Madonna. I think WKRQ even refused to add "Sign 'O' The Times" by Prince at all, despite it being a huge national hit. (In light of this, AM soul station WCIN boasted that it was the only station in the area that played this song. Tiny AM top 40 station WCLU played it, but this station was sold and changed formats while the song was still moving up the chart.)
WLS in Chicago was also very slow to add new music. (I think WLS was still a Hot 100 reporter until 1987 or so.) WKRQ and WLS were big stations that had a lot of influence with the Hot 100. There probably weren't any other Hot 100 reporters at the time that were as stodgy, but large markets like Chicago and Cincinnati were much slower to add new music than small markets like Bozeman, Montana, or Gainesville, Georgia.
Post by jedijake on Apr 1, 2007 20:38:30 GMT -5
I am very glad they used the chart methodology they did in the 80's. It was my time to listen to music, from middle school through my first 2 1/2 years of college.
With songs entering and leaving the charts within 10-15 weeks, it made hearing them reminiscent of particular events during those years. Then, when the year-end countdown rolled around, it was like a flashback to various times of the year. It was truly a trip back through the year.
With the charts the way they are now, it would seem to be difficult to really connect songs to times of the year. What song was hot during the spring? What song was hot during the summer? They blend too much from season to season. Also, with songs spending 8-11 weeks at #1 during the late 90's, it would have been difficult to link a #1 song to a month or season.
Some of the music of the middle to late 90's actually sounds better than some of the music of the very late 80's (88/89), but it seems much less nostalgic because of LONG chart runs.