Ron Sellers at Grey Matter Research wrote me recently, saying “The attached report is probably the most important thing I’ve done in the insights industry.” Teaming up with Harmon Research, Grey conducted an online survey and wrote a report called Still More Dirty Little Secrets of Online Panels. They investigated respondents from the largest U.S. panels and found fraud and cheating by panelists in nearly half (46%) of all the completed surveys. Grey/Harmon separated the respondents into two groups, Valid and Bogus, then compared the results. Bogus respondents lied about the items they purchased, the services they used, their brand awareness, their familiarity with companies, their behaviors, their attitudes, and more. Grey/Harmon showed how their investigation identified the cheaters’ patterns and inconsistencies.
Okay, you say, nothing’s perfect, that’s life, what’s the big deal? Okay, I say, what percentage of your interviews are you willing to accept as fraudulent? Most people grudgingly say five or ten percent and are shocked to be told that as many as half of panel surveys are unacceptable. And of course if 46% is the average and the higher-quality providers can hold that down to 20%, then the lower-cost providers are doing even worse than 46%.
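To see what that last inference implies, here is a minimal back-of-the-envelope calculation. The 50/50 volume split between higher-quality and lower-cost providers is my assumption for illustration only, not a figure from the Grey/Harmon report.

```python
# Back-of-the-envelope check: if the industry-wide bogus rate is 46% and the
# better providers hold their own rate to 20%, what must the rest be delivering?
# The 50/50 volume split below is an assumption for illustration only.
overall_bogus_rate = 0.46      # Grey/Harmon industry average
better_provider_rate = 0.20    # hypothetical rate for higher-quality providers
better_provider_share = 0.50   # assumed share of all completes (illustrative)

other_share = 1 - better_provider_share
other_rate = (overall_bogus_rate - better_provider_rate * better_provider_share) / other_share
print(f"Implied bogus rate for the remaining providers: {other_rate:.0%}")  # -> 72%
```

Under that assumed split, the lower-cost half of the market would have to be delivering roughly 72% bogus completes for the overall average to land at 46%.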
The U.S. Census.
Every ten years the U.S. Census Bureau conducts a national mail survey. With access to all kinds of records, including Social Security numbers, IRS filings, driver’s licenses, and more, the Bureau can find almost anyone. With 97% of its target population returning completed surveys, the Census has data on 125 million households. Most surveys’ response rates are lower than 10%, and the best reach only one in four. Even the most casual observer must surely wonder what businesses are missing when so many people don’t complete their surveys.
As the attention span of consumers continues its long decline, so do survey response rates.
Interest in and involvement with the subject of a survey are what all researchers hope for. Typical questionnaires are too long, too detailed, and too boring. They produce survey fatigue, which delivers lower response rates than surveys designed to be short and interesting. Consumers are also inundated with surveys these days, which drives response rates down further.
Two ways the U.S. Census gets its 97% response rate.
- People receiving their surveys are warned that if they fail to complete and return the survey, they are breaking the law and subject to a fine.
- Before imposing a fine, the Census Bureau makes door-to-door visits to every household that fails to return its completed questionnaire and conducts the interview in person. It is safe to say this level of thoroughness is not seen in most business, consumer, and market research.
Early consumer and market surveys used interviewers to gather opinions.
Querying the population at large with clipboards in hand, they knocked on doors and stopped people on the sidewalk. Interviewers would ask a series of questions and record people’s responses on a paper survey with a pencil. Later, when most U.S. households had installed telephones, most surveys were conducted by telephone (also using an interviewer) with a randomly selected sample.
Surveys using interviewers produce better data than self-administered surveys.
Bizfluent.com says personal interviews yield more information about the study subject’s answers while still providing statistical precision. Nonverbal cues such as lack of eye contact, jittery mannerisms, or defensive posturing provide context that cannot be collected from a self-administered questionnaire. Two other advantages of using interviewers are that they can ask for clarification when something is unclear and can probe gently for additional detail. Twenty years ago was the tipping point when more than half of all U.S. adults got online, and surveys followed them there. Online research has made interviewers an endangered species.
The side hustle.
A search for <get paid for taking surveys> took me to dozens of sites like surveyjunkie.com, surveymoneymachines.com, swagbucks.com, and easycashonline.net. Their names are an indication of the promise of big money for your opinions. Sites like paidsurveyinfo.com, millennialmoney.com, and moneycanbuymehappiness.com steer you to what they say are the best paid survey sites, and earn a fee doing it. There are so many sites like this that I quit after reviewing 15 pages of results.
How many people have signed up for these paid survey panels?
A search of panel providers shows many claim to have millions of panelists. The real total is impossible to calculate because panel providers continue to count people who haven’t taken a survey in years as well as those who never took a single survey after signing up. As you might expect, duplication and overlap are huge among cheaters who sign up for as many panels as they can. There is also a great deal of erosion when people who sign up for panels discover it is not the easy money they expected and quit. NerdWallet says they conducted a small experiment in which the surveys they took earned them less than $2 an hour – hardly money machines. To make matters worse, an unknown number of the pay-me-for-surveys sites are scams.
What are the downsides of panels that vendors don’t tell you about?
Each panel provider claims to have the cleanest, purest, highest-quality online research sample money can buy. If the Grey/Harmon finding that 46% of respondents are bogus is to be believed, most of these claims are untrue.
- Cheating is a big problem. Unlike traditional research methods where interviewers can verify the identity of survey takers one-on-one, members of online research panels can usually cheat without getting caught. One way scammers make money is by falsifying their experiences. For example, when panelists are offered a survey for people who have bought a new car in the last six months, many lie so they can qualify, take the interview, and get paid. One-on-one interviewers could probe for detail and reveal the cheaters, but panel vendors don’t because it takes too long, costs too much, and clients almost never ask how the vendor checks panelists’ credentials. The result is that your survey about car buying includes many who haven’t bought a new car recently (if ever) – and you pay for every one of them. Panel providers keep their costs down by setting a low bar, not a high one.
- Panel cheaters speed through surveys. They do this by ignoring the instructions and the questions entirely and clicking buttons as fast as they can. Grey/Harmon found large numbers of cheaters completing surveys in only a few seconds rather than the several minutes it would take to give careful consideration to their answers. A simple completion-time check, sketched after this list, is one way to flag them.
- Even legitimate panelists take lots of surveys. This leads to practice bias (over time, panelists develop insider knowledge of panel processes and adjust their actions and their responses accordingly). The more surveys panelists take, the less they are like real consumers and the more they are like performing seals.
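Here is a minimal sketch of such a completion-time check. The field names, timing data, and thresholds are hypothetical; in practice you would tune them to your own questionnaire length and survey platform.

```python
# Minimal speeder check: flag completes whose duration is implausibly short.
# Field names and thresholds are hypothetical and for illustration only.
from statistics import median

def flag_speeders(completes, floor_seconds=60, median_fraction=0.33):
    """Return the IDs of respondents who finished suspiciously fast.

    completes: list of dicts like {"id": "r001", "seconds": 412}
    floor_seconds: absolute minimum plausible completion time
    median_fraction: flag anyone faster than this fraction of the median
    """
    typical = median(r["seconds"] for r in completes)
    cutoff = max(floor_seconds, typical * median_fraction)
    return [r["id"] for r in completes if r["seconds"] < cutoff]

completes = [
    {"id": "r001", "seconds": 412},
    {"id": "r002", "seconds": 388},
    {"id": "r003", "seconds": 9},    # clicked through in seconds
    {"id": "r004", "seconds": 465},
]
print(flag_speeders(completes))  # -> ['r003']
```

A check like this catches only the most blatant speeders; it does nothing about cheaters who pace themselves while still answering carelessly or dishonestly.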
What are panel vendors doing about cheaters and frauds?
Even the reputable ones have real problems with delivering qualified survey takers, but few take the time to audit their panelists. The problem is pervasive enough that panel vendors have taken to defending their samples. When pressed, they typically acknowledge the problems, then dismiss them because they are salespeople and panels are what they sell. Although not entirely adversarial, vendors and the people who pay them have very different objectives: you want good data and they want to maximize their profit margins.
The American Association for Public Opinion Research put together a task force to review online panels and make recommendations.
They found very few online panels use probability sampling, which is the kind needed to perform most statistical analyses. This is another dirty little secret unknown to research buyers and ignored by sellers. The AAPOR says that because most panels are non-probability samples, “they represent a substantial departure from traditional methods.” Add to this the fundamental problem of coverage error. Most people are shocked to hear that a third of the U.S. adult population does not use the internet on a regular basis. This means roughly 80 million of the adults who buy your products and services cannot be online panelists at all – an inherent and significant source of error that skews the data. A small simulation after the recommendations below shows how that skew creeps in. The AAPOR’s recommendations:
- Do not use non-probability samples when your objective is to accurately determine what’s going on in the population at large.
- When panels are appropriate, choose yours carefully. Standards vary widely across the industry.
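To make coverage error concrete, here is a toy simulation. Every number in it (the population size, the one-third offline share, and the difference in buying behavior between online and offline adults) is an assumption chosen only to show the mechanism, not an estimate from the AAPOR report.

```python
# Toy illustration of coverage error: when a third of adults are offline and
# behave differently, an online-only panel misses them and the estimate skews.
# Every number here is an assumption for illustration, not real data.
import random

random.seed(1)

POPULATION = 100_000
OFFLINE_SHARE = 1 / 3          # adults an online panel can never reach
P_BUY_ONLINE_ADULTS = 0.30     # assumed purchase rate among internet users
P_BUY_OFFLINE_ADULTS = 0.55    # assumed purchase rate among offline adults

people = []
for _ in range(POPULATION):
    offline = random.random() < OFFLINE_SHARE
    p_buy = P_BUY_OFFLINE_ADULTS if offline else P_BUY_ONLINE_ADULTS
    people.append({"offline": offline, "buys": random.random() < p_buy})

true_rate = sum(p["buys"] for p in people) / POPULATION

# An online panel can only sample from the online two-thirds of the population.
online_pool = [p for p in people if not p["offline"]]
panel_sample = random.sample(online_pool, 1_000)
panel_rate = sum(p["buys"] for p in panel_sample) / len(panel_sample)

print(f"True purchase rate in the population: {true_rate:.1%}")
print(f"Rate estimated from an online-only panel: {panel_rate:.1%}")
```

With these made-up numbers, the online-only panel lands around 30% while the true population rate is closer to 38% – no amount of extra sample size fixes it, because the people being missed are systematically different from the people being reached.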
How to make the best of a bad situation.
- Be skeptical, doubt everything you get, and treat your data as unreliable until you can demonstrate that it is trustworthy.
- Take control. Vendors will not volunteer panel shortcomings and when asked will brush aside your concerns. Ask the hard questions and evaluate the answers you get. I suggest asking them to talk a bit about the AAPOR task force.
- DIY only if you have the expertise, the desire, and the time.
As someone who has been writing for years about the decline in the quality of consumer and market research, I applaud what the skeptics are saying.
At the same time, I know they are preaching to the choir. Regular readers know I have long advocated that executives need to ask hard questions about their research. But like Ron and others, I see that cautions and warnings like this strike a chord only with those who are already aware of the problems. Those who pay for their companies’ research need to know how wrongheaded most research is, but they don’t see the importance or just don’t care. Almost none are actively taking measures to improve their research output by raising their standards and practices.
When you get around to wondering how to overcome the pervasive problems with getting high-quality data that will help guide your decision-making, ask me to help. It’s what I do.
Postscript.
Ron and I talked today about how for all their shortcomings, panel providers are not evildoers, but simply practical businesspeople responding to buyers’ demands for cheaper research (regular readers know last year U.S. companies spent $900 billion on marketing while spending less than 1% of that amount on research). This makes it easy to see who’s driving the bus, doesn’t it?
Research has become a mere commodity to be bought at the lowest possible price. When bids are won and lost over a nickel a survey, panel providers are not motivated to sell higher standards to customers who don’t know what those higher standards are and don’t care, either. He added this final note: “One thing I want survey buyers to understand is that research is not a black hole where you hand over the controls to someone else and hope everything works out okay. Either learn everything you need to know about how to get quality respondents or find someone who does and let them take care of things – but make sure it’s not just someone who claims they deliver quality respondents (because they all do) but someone who can prove it.”
The willingness of research buyers to sacrifice quality to save $50 on a survey of 1,000 consumers is a textbook example of being penny-wise and pound-foolish.