Marketers are generally great at measuring the overall returns of their campaigns. The more digital natives climb the ranks to head up their function, the more we’re immersed in complex metrics and smarter analytics. But does the same hold true when we’re talking about creative in particular? We recently surveyed 200 senior marketers to see how they were measuring the effectiveness of their creative. Here’s what we found:
Only 28% of marketers test their creative with their target market pre-campaign. The majority preferred to assess it based on its match to the brief, or on internal team and managerial feedback. So why is pre-launch testing not pursued more avidly by these data-driven marketers?
Our thinking is that it takes time and money they’re not willing to invest. Internal pressure to get a campaign live and delivering results combines with a natural human desire to show the world what you’ve made.
Does it matter? Well, yes. The investment in creative is usually a fraction of the overall campaign budget, but by not testing it ahead of launch you put that whole spend at risk. That said, the risk is lessened by a great brief that already leverages customer insight heavily.
Scarily, 40% of the marketers in our survey had no mechanism for measuring creative effectiveness. That’s two in five with no way of knowing whether their creative was doing its job or not. This is very worrying, as it means a large proportion of marketers aren’t viewing creative as a route to better ROI.
If something isn’t worth measuring, surely that’s because it isn’t viewed as integral to the success of the campaign? Not so. When we got to the section about the effect creative has on ROI, the vast majority thought it had the potential to impact campaign effectiveness both positively and negatively.
We also asked whether our respondents A/B tested their creative. Here we got much more encouraging results with 24% always A/B testing and 62% doing it sometimes.
What we weren’t able to discern from our results was why. Is the decision based on the importance of the campaign? The ease of executing an A/B test in the medium? Regardless of what was driving the choice to run an A/B test on creative, we were reassured that in this sphere, at least, some degree of measurement of effectiveness was the norm, and only 14% had never done it.
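For anyone curious what “A/B testing creative” boils down to statistically, here’s a minimal sketch using a standard two-proportion z-test on conversion rates. The impression and conversion counts are made up for illustration, and real platforms typically handle this maths for you; this just shows the underlying check.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does creative variant B convert at a
    significantly different rate from variant A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: variant A converts 120 of 10,000 impressions,
# variant B converts 150 of 10,000
z, p = two_proportion_z(conv_a=120, n_a=10_000, conv_b=150, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these made-up numbers the lift looks promising but doesn’t clear the conventional 0.05 significance bar, which is exactly why running the test before committing the full media spend matters.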
Our conclusion is that there’s a huge amount of potential gain marketers could leverage through better measurement and optimisation of creative. Anyone who’s looking for a step change in campaign effectiveness should certainly make this their first port of call.