Nine out of ten journalists say that statistics are an important part of their research and reporting. Okay, I made that up, but it seems like the truth. Statistics are little bits of information that easily illustrate the scope of something. They also feel reliable. Maybe that’s because you can check who conducted a study or gave a poll, or maybe it’s because they’re expressed in numbers, which always seem sort of stable and less prone to trickiness than words. But, as this article brings up, while a survey may have been given with the best, most unbiased of intentions, there are some factors that can influence the way people respond. Here are a few to consider:
1. Culture. The article I cited above focuses on how a recent study confirmed that a person’s cultural background likely affects their responses to survey questions. You might wonder how hard it could really be to give your opinion, often anonymously. But if you think about it, some cultures value, say, being peaceful and polite. People from these backgrounds might tend to give less extreme responses on a numeric-scale questionnaire. Other cultures value self-expression or ostentation; people from those backgrounds may give more extreme replies.
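To see why this matters, here’s a minimal sketch of how two groups with the *same* underlying opinion can produce different averages just because of response style. All the numbers (the 1–5 scale, the 0.5 and 1.5 “style” factors) are invented for illustration, not taken from the study:

```python
import statistics

def answer(true_opinion, style):
    """Map a true opinion on a 1-5 scale to a reported answer under a response style."""
    if style == "moderate":
        # Pull answers toward the scale's midpoint (3).
        return 3 + (true_opinion - 3) * 0.5
    elif style == "extreme":
        # Exaggerate distance from the midpoint, clamped to the 1-5 scale.
        return max(1.0, min(5.0, 3 + (true_opinion - 3) * 1.5))
    return float(true_opinion)

# Everyone's true opinion is the same mildly positive 4 out of 5.
opinions = [4] * 10
moderate_mean = statistics.mean(answer(o, "moderate") for o in opinions)
extreme_mean = statistics.mean(answer(o, "extreme") for o in opinions)

print(moderate_mean)  # 3.5 -- looks lukewarm
print(extreme_mean)   # 4.5 -- looks enthusiastic
```

Identical opinions, a full point apart in the results. A pollster comparing the two groups might conclude they disagree when they don’t.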
2. Wording. The Pew Research Center is one of the most famous and trusted polling agencies in America. Not only do they conduct surveys; they also study different aspects of polling. One problem they’ve explored is the importance of wording. Their website lists a number of ways this can factor into inaccurate responses, including a lack of clarity regarding whether or not a question is open, giving or withholding details, and using double negatives.
3. Social desirability bias. “Social desirability” is a concise term for a person’s desire to seem normal and likeable to others. It’s also a major cause of inaccurate survey responses. You’ve probably taken this into account without realizing it when you read statistics about topics like, say, porn-watching frequency. Some, or maybe even most, respondents might have answered truthfully, but it’s likely that at least a few modified their responses to seem more “normal”.
4. Who’s responding? Slate.com’s Will Oremus wrote an intriguing article about this a while back: Even if the results of surveys are accurate, who, exactly, is responding? Pew Research found that only about 10% of the people they called were willing to answer their questions. Not too surprising: after all, many of us don’t have the time or interest. So, Oremus explains, the people who do respond have a certain type of personality that may not reflect the majority of the population. He reports that, according to Pew, in addition to being more likely to have recently volunteered or contacted a local politician, “respondents are somewhat more likely to be smokers, more likely to be on food stamps, and…more likely to be Internet users….” He adds that “telephone surveys also tend to reach fewer urbanites and fewer people who live alone.”
6. What kind of survey? Oremus raises other issues that can affect poll results, including what’s known as “coverage bias.” Imagine, for example, that a survey is given online. That means that people who don’t use or have access to the internet are unlikely to respond. Or, as Oremus points out, if a survey is conducted by mail, a majority of the respondents may be elderly. If a poll is supposed to reflect something about the general population, these are huge impediments to getting accurate information.
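Points 4 and 5 boil down to the same mechanism: if the people who answer aren’t a random slice of the population, the results drift. Here’s a hedged sketch of that drift. The specific rates (30% of people have some trait, and trait-holders answer 15% of the time versus 5% for everyone else) are made up for the example, not Pew’s actual figures:

```python
import random

random.seed(0)  # fixed seed so the simulation is repeatable

# Hypothetical population: 30% have some trait (say, regular internet use).
population = [{"has_trait": random.random() < 0.30} for _ in range(100_000)]

def responds(person):
    # Invented response rates: trait-holders are more willing to answer.
    rate = 0.15 if person["has_trait"] else 0.05
    return random.random() < rate

respondents = [p for p in population if responds(p)]

true_rate = sum(p["has_trait"] for p in population) / len(population)
survey_rate = sum(p["has_trait"] for p in respondents) / len(respondents)

print(f"true rate:   {true_rate:.2f}")    # about 0.30
print(f"survey rate: {survey_rate:.2f}")  # well above 0.30
```

Even though the survey asked a perfectly worded question and everyone answered honestly, the estimate overshoots, simply because the willing respondents weren’t typical. That’s the 10%-pickup-rate problem in miniature.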
Oremus finishes his piece by reassuring us that, in a way, surveys are actually more reliable today than they were in the past. But it’s still important to be wary. The next time you come across some statistics, try to think like Ian Malcolm in Jurassic Park (in the movie, he’s played by Jeff Goldblum, who I’m sure a majority of people found super-sexy in the role): remember that chaos is the rule. It seems human nature finds a way to break through the electric fences of order and numbers every time.