Compared to What?

This month’s blog post looks at the challenge of determining which of a variety of marketing programs were most, and least, effective.

In 2007, I was asked to be a judge for CAPMA’s (Canadian Agencies Practicing Marketing Activation) Promo! Awards, which honour the year’s best marketing activation programs.  I had been involved with CAPMA and the Promo! Awards in previous years, but I hadn’t yet served as a judge.  I happily accepted.

The judging process was thorough and consistent, with very clear criteria for scoring 155 entries across 16 categories.  When judging or evaluating many different programs, it is critical to apply the same process consistently to each one.

As you might have guessed, one of the judging criteria was “Results”.  I remember looking at each entry’s results and noticing how everyone seemed to be using different metrics to prove their program was best.  I also remember feeling unsure about how to turn my opinion of each entry’s results into a score.

Not surprisingly, some of the results were excellent, such as:

  • Program delivered a 23% conversion rate
  • Downloads increased 1,590% over the previous campaign

I knew these results were good, but how good?  Compared to what?  A 23% conversion rate seems impressive, but what was the average conversion rate of their other programs?  How good is 23%?  How do I score that?  An increase in downloads of 1,590% seems pretty spectacular, I think, unless the previous campaign was a complete disaster.

What’s a better result, impressive or spectacular?  I’d pick spectacular, but I have no idea how to score it relative to impressive.  I know how to score a disaster, but there weren’t any disastrous entries!

I’m being a bit ridiculous to illustrate a point.  We need a way to rate numerical results, especially when comparing different metrics, and words won’t do. We need a way to score them.
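To see why words alone can’t do the job, here’s a quick back-of-the-envelope check (the download figures are entirely hypothetical, not from any actual entry): the same 1,590% lift can describe a modest result or a massive one, depending on the baseline it’s measured against.

```python
# Back-of-the-envelope check (hypothetical figures): the same 1,590% lift
# can describe very different outcomes depending on the baseline.

lift = 15.90  # a 1,590% increase means 15.9x more on top of the original

previous_downloads_small = 20       # a near-disaster of a previous campaign
previous_downloads_large = 50_000   # a genuinely healthy previous campaign

print(round(previous_downloads_small * (1 + lift)))  # ~338 downloads
print(round(previous_downloads_large * (1 + lift)))  # ~845,000 downloads
```

Without the baseline, the percentage on its own tells you very little.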

The following splendid results were easier to understand because they did a better job of answering the “compared to what?” question.

  • Achieved a +6.2% share swing vs. the main competitor
  • Sales increased 24% vs. declining sales for all other competitive brands

Which of these two results do you think is better, and how would you score them out of 10?  To assign our scores, we judges had to use our judgment, which CAPMA properly directed us to apply consistently according to the criteria they gave us.

With 65 judges involved over two rounds of judging, differences in judgment here and there didn’t matter, especially if we each applied our own judgment consistently.  The process worked, and the cream rose to the top, but not many businesses have big enough marketing budgets to justify using a complex process and 65 judges to evaluate the effectiveness of their marketing spending.  Can your business afford 40 judges?  Maybe 15?  How about 1?

Whether you’re the only judge, or one of many, here are some guidelines for judging your company’s marketing programs:

  1. Develop your own standard way to rate your programs.
  2. Apply your standard consistently to all your programs so you’ll have a meaningful way to compare them to each other.
  3. To address the “Compared to what?” question, focus on your objectives.  Think about what you want to achieve with each program and what kind of outcomes would make you happy.
  4. With your objectives in mind, pick as few or as many metrics as you think are relevant to each program.
  5. Score your results against your objectives.

For financial metrics, your objective can be what you budgeted for the period in which your program should influence your customers.  For non-financial metrics, focus on customer behaviour and other business activity that you know is good for your business and leads down the path to profit.
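To make these guidelines a little more concrete, here is a minimal sketch of what a simple, consistent scoring system could look like.  The programs, metrics and objectives below are purely hypothetical, and this is just one way to do it, not a prescribed method:

```python
# A minimal sketch of a consistent scoring system: every metric is scored
# against its own objective, so different programs (and different metrics)
# can be compared on the same 0-10 scale. Assumes "higher is better" metrics.

def score_out_of_ten(actual, objective):
    """Score a result against its objective, capped at 10."""
    if objective <= 0:
        return 0.0
    return round(min(actual / objective, 1.0) * 10, 1)

# Hypothetical programs, each with (actual result, objective) per metric.
programs = {
    "Spring coupon drop": {"conversion rate": (0.23, 0.20),
                           "new loyalty sign-ups": (900, 1_500)},
    "App relaunch":       {"downloads": (3_380, 5_000)},
    "Holiday mailer":     {"repeat purchases": (1_200, 1_000)},
}

for name, metrics in programs.items():
    scores = [score_out_of_ten(actual, objective)
              for actual, objective in metrics.values()]
    overall = sum(scores) / len(scores)
    print(f"{name}: {overall:.1f} / 10")
```

The particular formula matters far less than the consistency: once every program is scored the same way against its own objectives, a 6.8 and an 8.0 actually mean something relative to each other.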

Measuring marketing effectiveness can be complex and challenging, but try to keep it simple.  Focus your scoring on evaluating how effective each program was at meeting its objectives and driving value for the business. If you consistently apply a scoring system that tells you how well each program did at meeting its objectives, then you’ll have a way to compare them to each other.

The next time you’re wondering if your marketing campaign has delivered good results, start by asking “Compared to What?”

About Rick Shea
Rick Shea is President of Optiv8 Consulting, a marketing consultancy that helps small to mid-sized organizations improve their marketing impact and business outcomes through customer insights, strategic discipline and effective content. Copyright ©2007-2023 Optiv8 Consulting, a brand operating under Rick Shea Enterprises Inc. All rights reserved. You may reproduce this article by including this copyright and, if reproducing electronically, including a link to: http://www.optiv8.com/
