We’ve all become used to the cliché that ‘data is the new oil’. In digital media at least, data is what drives performance, helps us make business cases, proves or disproves hypotheses and ultimately justifies our fees. Having data at our fingertips is not new; we simply have more of it now than we did before. If you’re in the data business, you’ll know the skill is not in collecting or aggregating the data; anyone can do that. The skill lies in the interpretation.

Clients today want ‘actionable insight’ (another over-used cliché in digital media circles), not simply reports. They want data to drive them forward, to feed their planning processes and to optimise performance.

Whenever I do a piece of data analysis, I liken the process to peeling an onion: there is always another layer of information needed to complete the picture and give a more rounded answer to what is going on. The ‘peeling’ can be the most challenging, and yet the most rewarding, practice for a digital marketer. Below are two examples of many where my initial findings from a specific data source changed completely after further interrogation.
1. Using branded search as a measure of intent
I’m often amazed that more brands don’t keep a regular check on their branded search volumes. Branded search is one of the most timely, granular and relevant data sets with which to understand consumer intent, and I’ve unpicked many key insights from this under-utilised ‘tea leaf’. One challenge with this data is that empirical, impression-level data is only available when you are buying your own brand terms on paid search. Furthermore, branded search volumes are only useful when they are ‘always on’, which means maxing out one’s branded paid search budget; that can be a costly enterprise for some brands.

This detail aside, branded search volume is a good indicator of brand awareness, and I’ve often seen presentations that treat a rise in branded search volume as a measure of success. It is a good indicator, but it is best interpreted in conjunction with other data sets. For example, I recently worked with a client whose branded searches were down massively year on year. Initially this caused the client much alarm: such a significant decline was surely the result of failed traditional media campaigns, or some macro trend we hadn’t spotted. When we analysed the site-side data alongside it, we found that the main reason for the drop was the removal of a loyalty scheme which had been driving a negative margin for the business, but a significant volume of navigational traffic. The decline in branded search volume was not the result of declining brand awareness at all; the additional web analytics data provided the much-needed context.
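The cross-check described above can be sketched in a few lines. This is purely illustrative: every figure, segment name and variable here is hypothetical, invented to show the shape of the analysis (comparing a year-on-year drop in branded search against a traffic segment that disappeared from the site-side data), not the client’s actual numbers.

```python
# Hypothetical monthly branded search impressions, e.g. exported
# from a paid search account (all figures invented for illustration).
branded_search = {"last_year": 120_000, "this_year": 78_000}

# Hypothetical site visits by driving segment, from web analytics.
site_visits = {
    "last_year": {"loyalty_scheme": 40_000, "other_navigational": 70_000},
    "this_year": {"loyalty_scheme": 0, "other_navigational": 72_000},
}

def yoy_change(current: float, previous: float) -> float:
    """Year-on-year change as a fraction (negative = decline)."""
    return (current - previous) / previous

search_decline = yoy_change(branded_search["this_year"],
                            branded_search["last_year"])

# How much of the absolute drop in branded search is accounted for
# by the removal of the loyalty-scheme segment?
absolute_drop = branded_search["last_year"] - branded_search["this_year"]
loyalty_drop = (site_visits["last_year"]["loyalty_scheme"]
                - site_visits["this_year"]["loyalty_scheme"])
explained = loyalty_drop / absolute_drop

print(f"Branded search YoY change: {search_decline:.0%}")
print(f"Share of the drop explained by the loyalty scheme: {explained:.0%}")
```

With these invented numbers, most of the decline lines up with the removed loyalty segment, which is the kind of context that turns an alarming headline figure into a benign one.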
2. Using video share rates as a success metric
Anyone who has run campaigns on YouTube will be aware of the whole host of new data that can be used to understand creative performance. Creativity has traditionally been considered impossible to measure outside of the standard qualitative, survey-led data sources. YouTube lets you see the average duration viewers stay with an ad before skipping, providing some very valuable insight into how different creative executions perform.

A recent pre-roll campaign I ran showed a huge disparity between the number of viewers willing to watch the video to its end and the number who decided to share it on a social network. On view rate alone, the campaign appeared to be performing well; on share rate, it looked like it was underperforming. To confuse matters further, the same campaign running on Facebook showed a very high share rate. The same metric can carry a vastly different value across different publishers: the propensity to share and comment is naturally higher on Facebook than on YouTube, and my campaign was testament to that. It also showed me that a video without a strong viral effect can still keep an audience engaged. Share rate is not always the best measure of success.
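The point about the same metric carrying different values on different publishers can be sketched as below. The publisher names are real, but every number is hypothetical, invented only to illustrate why a share rate read in isolation can mislead.

```python
# Hypothetical per-publisher stats for the same creative
# (all figures invented for illustration).
campaign_stats = {
    "youtube":  {"impressions": 500_000, "completed_views": 210_000, "shares": 400},
    "facebook": {"impressions": 300_000, "completed_views": 90_000,  "shares": 4_500},
}

for publisher, s in campaign_stats.items():
    view_rate = s["completed_views"] / s["impressions"]   # stayed to the end
    share_rate = s["shares"] / s["impressions"]           # shared the video
    print(f"{publisher}: view rate {view_rate:.1%}, share rate {share_rate:.3%}")
```

In this invented example the YouTube placement has the stronger view rate while the Facebook placement has the far stronger share rate; judged on share rate alone, the better-watched creative would look like the failure.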
Neither of the examples above is conclusive. In the first, it may well have transpired that the loyalty scheme, despite its negative margin, drove a strong enough halo effect on additional sales across the rest of the site to have made it more valuable than previously thought. (It didn’t, by the way.) There will always be another layer to peel off in the quest for ‘actionable insight’. Data alone cannot do this.
Nathan Levi is a contributor at The Makegood and the Head of Media at VCCP, a media optimisation company.