Making data-driven decisions
Before setting out to prove accountability, marketers must ensure their campaigns are actually effective, and selecting the right type of data to measure is key to achieving this. In a McKinsey article, Brown et al. (2017) note that organisations that leverage customer insights “outperform peers by 85% in sales growth and more than 25% in gross margin”. In essence, data makes work more effective by revealing actionable insights that inform decisions before, during, and after campaigns go live. Here, evaluation acts as a compass, helping marketers become more strategic by channelling resources to where they matter most.
Types of Decisions Informed by Evaluation
Evaluation can inform two types of decision: prioritisation decisions and initiation decisions.
Data that informs prioritisation decisions allows marketers to optimise campaigns on the fly, prioritising the elements that are performing well and rectifying any that aren’t. This is where shorter-term, immediate feedback and behavioural data such as digital performance metrics prove most useful, as they allow marketers to optimise and course-correct in real time. Conversely, data that guides initiation decisions helps marketers make judgements at the start of a project, when real-time feedback isn’t yet available. It includes benchmarks, previous learnings, and pre-launch testing, all of which can help inform and validate the direction to take.
The Role of Pre-Testing
Pre-testing, used wisely, can be particularly helpful to marketers striving for effectiveness. In a recent Marketing Week article, Ritson (2024) highlights pre-testing as a useful exercise for improving creative work and, in doing so, extending its effectiveness. It shouldn’t be used as the deciding factor in which creative to run, but rather as a way to inform the refinement process that will make the work its hardest.
How Golley Slater Uses Pre-Testing to Make Decisions
At Golley Slater, we frequently commission and conduct creative pre-testing for our clients to refine our creative output and maximise its impact. We typically use qualitative methods to test our creative with its intended audience, but we avoid using these discussions simply to pick a ‘winning’ route based on subjective preference. Instead, we assess how effective each route is at conveying the intended message or eliciting the desired response. What we learn from these discussions then helps us refine our work to make it more effective.
Ultimately, evaluation is instrumental in helping marketers make informed decisions about which path to take and which to avoid, but it relies on measuring the right type of data: data that helps them learn, optimise, and course-correct to make their campaigns as effective as possible.
How you can make data-driven decisions
Next time you consider what to measure to evaluate your campaigns, make sure you:
- Select data that will help you make strategic decisions, not just prove accountability
- Use short-term behavioural metrics to prioritise and optimise live campaigns
- Consult benchmarks, learnings, and pre-tests to guide strategy before launch
- Leverage pre-testing as a refinement tool, not simply an arbiter of creative
To benefit from the learning potential that evaluation offers, it can’t become an afterthought left to the end of a campaign. Our next Effectiveness Insider article will outline the importance of building evaluation into your approach from day one.
References
Brown, B., Kanagasabai, K., Pant, P. & Serpa Pinto, G. (2017) ‘Capturing value from your customer data’, McKinsey, 15 March. Retrieved 18 May 2024. https://www.mckinsey.com/capabilities/quantumblack/our-insights/capturing-value-from-your-customer-data
Ritson, M. (2024) ‘Pre-testing ads is not divisive, it’s a no-brainer’, Marketing Week, 23 April. Retrieved 18 May 2024. https://www.marketingweek.com/ritson-pre-testing-no-brainer/