If I told you results from my survey said 63.5% of new car buyers want more cupholders and 36.8% want black seats, you might believe me, especially if I showed you a serious-looking report, or better yet, a dynamic slide presentation. If I admitted my study didn’t include new cars, but only USED cars, you would find it less valuable. And if I told you it’s not even all used cars – just those bought from RENTAL CAR FLEETS – you would be even less pleased. And how about if I told you the study did not include all rental car fleets, just Avis and Enterprise, and not all of their cars, but just Ford and Toyota sedans? And what if I finally got to the very bottom and said my results do not report what buyers of used Ford sedans from Avis rental car fleets actually want, but what a handful of salespeople guessed they want? You’d laugh me out of your office. And rightfully so.
But most reports that are unsound don’t admit it, for obvious reasons.
Here is a firsthand example: We began a Global Golf Tourism project by looking at information readily available online. Dozens of sites showed exactly the same findings and conclusions, because every one of them had cut and pasted from a single source. And, as we saw, that same single-source information was being used to make business cases and strategic decisions.
The survey findings, such as they were, purported to be the foundation of a comprehensive report on Golf Tourism. Here’s what we found when we took a closer look.
THERE WERE NO GOLFERS SURVEYED. This is not a misprint.
- Not a single person who had ever taken a golf tour was asked any questions about his or her experiences.
- So who did they survey, if not golfers? They surveyed people who sell and arrange tours.
- Did they ask for facts? No, they asked for opinions and estimates.
- How many did they survey? Only a few dozen.
And by the way, what is the size of the Global Golf Tourism market? 50 million golf tourists spending more than $20 billion a year.
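To put "only a few dozen" in perspective: even if those respondents had been a true random sample of golf tourists (which, remember, they were not), basic sampling arithmetic shows how wide the uncertainty would be. Here is a minimal sketch using the standard 95% confidence formula for a proportion; the sample size of 36 is a hypothetical stand-in for "a few dozen":

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error for a simple random sample of size n,
    at 95% confidence (z = 1.96), for an estimated proportion p."""
    return z * math.sqrt(p * (1 - p) / n)

# "A few dozen" respondents -- say 36, a hypothetical figure for illustration.
print(f"+/- {margin_of_error(36):.1%}")  # prints "+/- 16.3%"
```

So even under ideal conditions, a sample that small can only pin down a percentage to within about 16 points either way. And that band applies only when respondents are drawn randomly from the population of interest; the opinions of a few dozen tour sellers about what golfers want carry no such statistical guarantee at all.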
Most businesspeople don’t know how to tell a well-conducted study with a meaningful sample from a bad one.
We counsel them to have no confidence whatsoever in the “findings” of studies like this. Why? There may be some facts in there somewhere, but there won’t be many, and there’s no way of telling which is which unless we can read and understand the methods section – the blueprint for how the study was planned and conducted: the specifications, tolerances, limits, and so on.