Does being a number make it true?
17th January 2020 | Posted in Marketing communications
‘Smallpox cases double in three years’, screams a headline.
I’ve made this one up, but we’ve seen plenty of similar stories over the years. Great pulling power, but what’s the real story?
Suppose there was only one case three years ago, and this year there are two. Hardly the cause for panic that the headline suggests.
Interpreting surveys and statistics to create a narrative is a tool used by both the media and marketers. The numbers don’t lie, do they?
Well, they can certainly bend ‘the truth’ to a particular point of view, depending on how they’re interpreted. And you only have to look at all the polls around election time to realise that results depend on the questions asked, the way they are framed, and which answers are published. Not to mention who gets asked the questions and whether they respond honestly.
Statistics and marketing
As marketers and copywriters, I believe we have a responsibility to be honest and transparent with how we report figures.
That’s not as straightforward a thing to say as it sounds.
I’ve been involved in creating surveys and interpreting results for commercial organisations, and I know there are plenty of grey areas here. The client wants to ask what their audience thinks about a topic, because they want to use the results to build a case for buying their products and services. So we ask questions that we hope will give us helpful answers. And we ask enough questions that we can drop from the final report any that don’t really fit our narrative. We’re not lying exactly, are we? I have a few uneasy moments about it, but I don’t lose sleep.
More worrying are the figures we are sometimes presented with as journalists, bloggers and members of the public. Some of the information we’re given is fairly harmless, but some can be used to spread unnecessary fear, doubt and uncertainty.
How can we look behind the numbers?
When I’m sent a press release for my blog, When They Get Older, I will usually ask for the spreadsheet behind the headlines. I need more details to pull out the conclusions that are specifically of interest to my audience. But I also want to see how the numbers were derived.
The questions I would ask about a poll or survey include:
- Who was asked? Were they likely to be biased? Were they people knowledgeable about the subject?
- How many people were asked? I studied statistics as part of my university course, and the term that’s stayed with me is ‘statistically significant’. If a pollster is trying to extrapolate a survey of 20 people to determine the point of view of thousands, that could be misleading.
- Why are the results released as percentages? It’s easier to write an eye-catching headline with a percentage, but if we go back to my opening headline, saying that something has doubled only means anything if you know how many cases there were in the first place.
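Both of the last two points come down to simple arithmetic. Here’s a rough sketch in Python, using the standard formula for a 95% margin of error on a polled proportion; the numbers are purely illustrative, not drawn from any real survey.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A 'doubling' headline: going from 1 case to 2 cases is a 100% rise,
# but the absolute change is a single case.
old, new = 1, 2
pct_change = (new - old) / old * 100  # 100.0

# A poll of 20 people carries a margin of error of roughly 22 percentage
# points either way, so '55% agree' could plausibly mean anything from
# about a third to about three quarters. A poll of 1,000 narrows that
# to around 3 points.
small_poll = margin_of_error(20)
big_poll = margin_of_error(1000)
```

In other words, the same percentage headline can sit on top of wildly different levels of certainty, depending on the base numbers behind it.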
I’m not saying that we should be totally cynical about any statistics we read, or use. But as consumers of information we don’t want to be misled, and as marketers, we do a disservice to our profession if we mislead.
I was inspired to write this blog by another blog created by Response Source, a useful service for journalists. That post is an extensive list of sources of guidance and training on interpreting statistics, along with organisations such as Full Fact that focus on examining claims made in the public eye.