A few weeks ago I asked readers whether it would surprise them to hear that when it comes to research, the top brass almost never ask to see the ingredients label. This earned me a chorus of "amen, brother" from people who have seen firsthand that too few executives ask the critical who, what, when, where, why, and how questions. In other words, I was preaching to the choir: delivering my message to people who already believe it.
Chief executives are caught off guard when I tell them they take leadership bona fides for granted.
I tell them leaders never ask whether their Legal Department is led by someone with the proper credentials, including formally and legally mandated postgraduate training, testing, and licensing. They never ask whether their Finance Department is led by someone who has invested thousands of hours specializing in a field with stiff and exacting standards, earned postgraduate credentials, and been licensed and certified as an expert. They just assume that is the case, and it almost always is. The problem is that most business leaders also assume the people who lead their research functions are equally well credentialed, when they almost never are. And because nine times in ten those in charge of research are not experts, most of their studies ask the wrong people the wrong questions for the wrong reasons.
Most clients tell me their Market Research departments are led by people with marketing degrees, because that makes sense, right? Actually, no. Marketing and Market Research only sound alike. Astrology and Astronomy sound a lot alike too, I tell them, but one is a science and the other is a con game. It is easy to see that Astrologers and Astronomers would be lousy at each other's jobs. Marketers and Market Researchers are lousy at each other's jobs, too, but Marketing runs Research without knowing how, not the other way around.
Remember back when Marketing used to be called Sales, HR was called Personnel, and we had janitors, secretaries, and telephone operators? Marketing departments are still led by sales experts, not by researchers, and don't give me any of that "they took a research course on their way to an MBA." I used to teach research in graduate business school but walked away from it, in part because it can no more be learned in a few hours than law or finance can. The more important reason I stopped is that I realized marketers don't need to learn how to do research any more than they need to learn how to do law or accounting. My focus now is on teaching them a simpler and more useful skill: how to tell the difference between good research and bad.
Every day at universities around the world, huge numbers of marketers are being trained to influence people, while smaller groups of researchers are being trained to understand people. Some clients tell me their company's research is more wisely led by engineers. Agreed, I say: engineers know more about math and science than marketers do, but engineers are interested in things, while researchers are interested in how people use things. If you can't see the difference, you are better off not reading any more of this nonsense.
Sales (oops, Marketing) techniques work well for moving inventory, but they screw up research something awful. Engineering techniques work well for holding exacting machine tolerances while pumping out millions of identical copies, but they screw up questionnaires like you can't even begin to imagine.
If you believe exceptional consumer and market research provides extraordinary opportunities to sell more of your existing products and services, develop successful new ones, and make your customers happy and loyal and turn them into your strongest advocates, there are two things you can do.
My first suggestion is to sit down with the person who is accountable for your consumer research. Tell them you are interested in learning more about the company's research standards and practices, with an eye toward improving the output. Ask them to explain, broadly, how they go about planning your organization's research. Here are some good questions to ask; you will think of others as the conversation unfolds:
- How do we determine what gets studied? What doesn’t get studied?
- What is our procedure for assigning resources to studies?
- What procedure do we use to align resources with risk? With opportunity?
- How do we determine what level of investigation is required?
- How do we ensure our research is linked to the needs of the business?
- How do we calculate the value of our research?
- Do we have an overall plan that integrates all individual studies?
- Which specific studies are on our current research calendar?
- Which are we funding, and why are we doing them?
- Which have our top priority? Why?
- What studies are we not funding and why?
Studies have ingredients (objectives, methods, samples, and more). Each study should be clearly labeled so we can see how much stock to put in its conclusions and recommendations. Labels should clearly define the study's relevance to a specific business problem, the effectiveness of its approach, how the results will be used, and where it fits into the big picture. Look for robust explanations of value vs. cost, risk vs. reward, breadth vs. depth, and more. Ask how they calculate the return on research investment.
My other suggestion is to attend an actual research presentation.
I'm not talking about a slickly packaged dog-and-pony show that includes only one or two research slides. I'm referring to the presentations your research people make to your marketers and/or engineers. Nine out of ten executives tell me they are never invited to sit in, so I urge them to invite themselves. The next time your company has a research presentation, go. (One warning: if you announce your intent beforehand, you will not see what really goes on, and the room will be crowded with managers who would otherwise never attend.)
Ask the presenters to please begin by describing the study’s objectives, methods, and samples before getting into the results. If their explanation includes technical terms and jargon (a bad sign), ask them to please use common language and straight talk. Here are some questions you should be asking:
- What was the business need that triggered this study?
- What did we hope to learn from this study?
- What resources did we assign to this study?
- Who was responsible for each task?
- Why did we choose the methods we did?
- Why did we choose the samples we did? Which groups did we include? Why? Which groups did we not include? Why?
- What were some of the compromises and tradeoffs we made?
- Were we able to use the information as we had intended?
When research is well planned and well executed, the people who did the study are proud of it and delighted that you are interested in how things get done around here. When the people who did the study don't give me good answers to these questions, I don't bother looking at their data, findings, and recommendations. I don't think you should, either.
When it comes to how your competitive intelligence is handled, learn to read the ingredients label.