The Myth of Advertising Decay

Sometimes, it seems like omniscience.  With our Ad Impact product, we’re able to measure the impact of advertising exposure on online behavior.  Did an ad lead to more visits, searches, sales?  The answer rests comfortably in Compete’s data.  This gives us an enviable perspective on how (and whether!) specific ads and ad strategies work.

But of course, it takes work and experience to turn data and information into intelligence and insights.  (Omniscience would be deadly boring, if you think about it.)  Though we’ve been doing advertising effectiveness analysis for over a year now, we’re continually learning new things about online advertising.  The newest learning?  The time-honored concept of "advertising decay" is a myth.

Want proof?  Here’s a "pushdown" ad that Banana Republic ran on nytimes.com for one day, on September 10:

[Image: Banana Republic pushdown ad on nytimes.com]

And here’s what happened in the four weeks that followed:

[Chart: Viewthrough rate of exposed consumers]

What caught our attention was the gradually expanding delta between the exposed and control groups: 4.5% through one week, 4.7% through two, reaching 5.3% through three, and holding steady there through four.
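
For readers curious about the mechanics, here's a minimal sketch of how a cumulative viewthrough delta like this can be computed.  The visitor counts and panel sizes below are hypothetical, chosen only so the deltas match the figures quoted above; this is an illustration, not Compete's actual data or methodology.

```python
# A sketch of the cumulative viewthrough-delta calculation.
# Counts and panel sizes are hypothetical, picked so the resulting
# deltas match the 4.5% / 4.7% / 5.3% / 5.3% figures in the post.
exposed_visits = {1: 900, 2: 1040, 3: 1180, 4: 1225}  # cumulative visitors through each week
control_visits = {1: 450, 2: 570, 3: 650, 4: 695}
EXPOSED_N = CONTROL_N = 10_000                         # assumed panel sizes

for week in range(1, 5):
    delta = exposed_visits[week] / EXPOSED_N - control_visits[week] / CONTROL_N
    print(f"Through week {week}: exposed-vs-control delta = {delta:.1%}")
```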

Brand managers expect that the impact of media exposure will decay over time.  As far as we know, they're right, at least when it comes to attitudinal metrics such as brand awareness, message recall, and so on, which are measured through survey responses.  We all forget, and 'decay' is just another word for 'forgetting.'  When certain advertising messages stick for longer than a few weeks (think of jingles from childhood that you still recall), someone has done some truly exceptional work.

Meanwhile, on the behavioral side of ad effectiveness analysis, we'd often seen this expanding delta and had chalked it up to additional exposures.  But this Banana Republic ad ran for just a single day and without targeting, which means that in the weeks that followed, the control and exposed groups were equally likely to have been exposed to other BR marketing, both online and offline.

Conclusion: people may forget which advertisements they’ve seen and heard, but those ads continue to affect their behavior long after they’re forgotten, and sometimes with increasing effect.

Right?

Look again.  The mathematically inclined reader may already sense that something's fishy with that analysis.  It is true that the data reveals ongoing behavioral impact, and that Banana Republic reaped an incremental gain in traffic each successive week.  But the increment actually did decrease over time, perhaps in close correlation with the expected decay in awareness, favorability, and recollection.  This becomes clear when we re-present the data to focus on the net new visitors to BR each week rather than the cumulative total of visitors since campaign launch:

[Chart: Percent of consumers with a first site visit since campaign launch]

By the fourth week, there’s only a tiny surplus (less than 0.1%) of new, ad-exposed visitors compared to new visitors who weren’t exposed.
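
To make that re-presentation concrete, here's the same differencing step as a small sketch, again a hypothetical illustration rather than Compete's pipeline.  Note that the cumulative deltas quoted above are rounded, which blurs the exact week-to-week shape, but the week-four increment clearly falls to nearly nothing.

```python
# Difference the cumulative exposed-vs-control deltas (quoted in the
# post) to recover the net-new surplus contributed by each week.
# Figures are rounded to one decimal place, so the intermediate weeks
# are approximate; the point is that week four adds almost nothing.
cumulative_delta = [0.045, 0.047, 0.053, 0.053]  # through weeks 1-4

previous = 0.0
for week, cum in enumerate(cumulative_delta, start=1):
    print(f"Week {week}: net-new surplus = {cum - previous:+.1%}")
    previous = cum
```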

What does this mean for our advertiser, publisher, and agency clients?  The first graph didn't lie: it's vital for them to use a timespan of weeks, not days, when evaluating advertising impact.  The incremental benefit of the Banana Republic ad took at least three weeks to play out (when the cumulative delta reached 5.3%).  Also: a longer analysis window is essential if you want to understand decay in advertising impact on online behavior… which turns out not to be a myth at all.
