I have worked in marketing for over a decade. I have created campaigns across PR, DM, email and the web (though I confess, not advertising). I’m no guru, but I have a reasonable amount of experience. During that experience I have presented the results of the campaigns I have created to many clients: on paper, electronically, face to face, and to their whole boards.
This has often been an uncomfortable experience. Why? Because I have always been aware of the gap between the campaign outputs that I can measure, and the reality of their business. More often than not, campaign outputs are only loosely tied to revenues, and when they are, the evidence is more anecdotal than empirical.
Let me give you some examples.
PR: Coverage Levels
Little more evidence is needed of the panic that measurement creates in PR agencies than the fact that they honestly tried to promote Advertising Value Equivalent (AVE) as a legitimate metric of value. This wildly spurious faux-mathematical system converts a piece of media coverage into an estimated value based on what it would cost to purchase the same number of square centimetres of advertising in the same publication. That in itself is riddled with issues - e.g. how do you account for the difference between a whole article about the client, versus a single (possibly negative) sentence in a bigger article? But worse, the agencies tended to apply a multiplier (up to 12x in my experience) based on the claim that PR coverage is more ‘valuable’ than advertising.
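To make the absurdity concrete, the whole system boils down to a one-line multiplication. Here is a minimal sketch; the function name, rate card price, and coverage size are all invented for illustration, and the 12x multiplier is the one mentioned above:

```python
def ave(coverage_cm2, rate_per_cm2, multiplier=1.0):
    """'Advertising Value Equivalent': coverage area times the publication's
    advertising rate card price, inflated by an arbitrary PR multiplier."""
    return coverage_cm2 * rate_per_cm2 * multiplier

# A hypothetical half-page article (500 cm²) at $12 per cm²:
base = ave(500, 12)            # the raw 'equivalent ad spend'
inflated = ave(500, 12, 12)    # same coverage with a 12x multiplier applied
```

The same piece of coverage is 'worth' twelve times as much depending on which multiplier the agency picks - which is exactly the problem.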
Clearly the output of AVE is wildly subjective and nigh-on random. But that didn’t stop me using it in the absence of anything better (at least when I was desperate). Because if nothing else, it at least put a dollar value on our efforts, and went some way to convincing the client we were worth their retainer.
Email: Opens, Clickthrus, etc.
It’s digital, so it must be easy to measure, right? Wrong. I know I’m not the only person who, when following up an email campaign, has found that the person I’m calling has no recollection of the mail - despite what the report says. I’m sat there staring at a screen telling me they opened it twice and clicked through once in the last 24 hours, and yet they say they’ve never seen it.
Email campaign reports taken alone are too inaccurate to rely on, and even when they are perfect, the outputs need serious interpretation before they are useful. A clickthru does not necessarily mean interest.
Web Analytics: Page Views and Unique Users
Again, how can something digital be so inaccurate? Because it is interfacing with wetware: human beings and their unpredictable behaviour. ‘Unique’ users are usually anything but. Page views are next to meaningless. For marketing purposes, do I really care what browser someone was using?
In summary, marketing metrics are some of the most meaningless out there. Is it too much to ask to be able to measure a campaign by whether or not someone is going to spend some money?