Dear Kristen, 

We’ve developed three creative concepts for an upcoming campaign to promote an existing product to a new target audience. I suggested to our leadership team that we test the concepts with the audience before we go to market, and they agreed, but I’m concerned about how they think we should conduct the testing and what we’ll learn from it. They think that we have to do lots of focus groups, which will kill our budget. They also want the research results to drive our whole marketing strategy, which seems risky to me. Can you provide a reality check? 

Signed, 

Risky (Research) Business

Dear Risky, 

You’re absolutely right: Creative testing can be high stakes—especially if your team has ambitious expectations and a modest budget. Because market research usually requires a significant investment of time, money, effort, and thought, these projects can be fraught with high hopes and the desire to hold the research (vs. human decision makers) accountable for providing the answers. 

It sounds like you’ve already encountered some misguided ideas about creative testing. Let’s separate four myths about creative testing from reality: 

Myth #1: The only way to test creative concepts effectively is through focus groups. 

Reality: There are lots of different ways to test creative that will give you reliable results. 

At LMD, we’ve done our fair share of focus groups to test creative concepts—and it’s true that seeing and hearing your audience’s reactions is extremely valuable. However, focus groups can be time-consuming and expensive. And, unless you recruit your own, you have to hire participants—which comes with its own risks. Instead, think creatively about other ways to gather audience insights.

For example, LMD needed to get reactions from consumers about three different creative concepts we developed for the Prince George’s County Beautification Initiative. Because we had an aggressive project schedule and a limited budget, we conducted “walk-up” creative testing at two public county events. We displayed all the options for the creative concepts at each event and asked attendees to vote on their favorite by filling out a short questionnaire that served as their ballot. To increase participation, we used their ballots as entries to a gift card drawing. It was a fun and fast way to get feedback—and we got insights from more than 120 participants. 

Myth #2: The creative testing will make the decision for us about which creative concept to use.  

Reality: Your results should guide you to a decision, but they may not make the decision for you. 

You’d like to think (and hope) that once the creative testing is done, you'll know beyond the shadow of a doubt which way you should go creatively. That’s a wonderful goal, but chances are the testing won’t play out that way. In our experience, creative testing results usually show a strong inclination toward a particular concept vs. a landslide winner. Very often we get results that suggest that, with tweaking, any of the concepts could be successful in market. Layer on top of that the fact that participants like to “Frankenstein” concepts together by adding and taking away elements of each concept, plus your organization’s own brand standards and aesthetic preferences, and you’ll have a much more nuanced, complex answer to grapple with. 

LMD recently encountered this when we conducted creative testing for a new campaign for the US Coast Guard. We presented three different concepts in focus groups around the country. Not only did each group tend to like different concepts, but the results also varied by market. To arrive at a final concept, we developed a formula to score each group’s results on a weighted scale. We also considered comments from participants about what they liked about each concept and looked for ways to incorporate some of those elements into the concept that had the highest overall score. 
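For illustration only, here is a minimal sketch of how a weighted scoring approach like the one described above might look. The market names, weights, and ratings are hypothetical assumptions, not LMD’s actual formula or the Coast Guard campaign’s real data.

```python
# Hypothetical sketch of weighted concept scoring across focus-group markets.
# All markets, weights, and ratings below are made up for illustration.

# Relative importance assigned to each market (weights sum to 1.0).
market_weights = {
    "Market A": 0.40,
    "Market B": 0.35,
    "Market C": 0.25,
}

# Average rating each focus group gave each concept (1-10 scale, hypothetical).
group_scores = {
    "Concept 1": {"Market A": 7.2, "Market B": 6.1, "Market C": 8.0},
    "Concept 2": {"Market A": 6.5, "Market B": 7.8, "Market C": 6.9},
    "Concept 3": {"Market A": 8.1, "Market B": 6.4, "Market C": 7.2},
}

def weighted_score(scores: dict, weights: dict) -> float:
    """Combine per-market ratings into one overall score using market weights."""
    return sum(scores[market] * weight for market, weight in weights.items())

if __name__ == "__main__":
    # Print each concept's overall weighted score, highest first.
    ranked = sorted(group_scores.items(),
                    key=lambda item: weighted_score(item[1], market_weights),
                    reverse=True)
    for concept, scores in ranked:
        print(f"{concept}: {weighted_score(scores, market_weights):.2f}")
```

In practice, the qualitative feedback matters as much as the number: the top-scoring concept is a starting point, refined with the elements participants praised in the others.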

Myth #3: Because we don’t have that many opportunities to do audience research, we should ask the participants a bunch of stuff that’s not necessarily related to the creative concepts. 

Reality: Your research should focus on tightly scoped objectives. Anything else you learn is gravy. 

It’s tempting to think, “We have our target audience in the room, so let’s ask them everything we ever wanted to know”—after all, you spent a lot of money on this research effort, right? Be forewarned: This is a bad idea for a variety of reasons. First, you should always lay out a clear goal for your participants when you start your research by telling them, “This is why we asked you to participate, and this is what we want to know from you.” Doing so shows your participants that you respect their time and value their feedback. Also, asking questions that are outside the scope of your objectives can feel distracting and disruptive to participants. 

Finally, a research effort that loses its purpose also loses its value. Asking random questions produces a lot more data to sort through, and the results are likely to be jumbled and lack clarity. You only have so much time and money, so focus on what you absolutely must know. 

Myth #4: We expect to learn mostly new information in the creative testing. 

Reality: Your results will likely reinforce what you already know, substantiate some suspicions, and tell you a few new things. 

I can’t tell you how many times I’ve heard a client say, “But we already knew that!” when I present results from creative testing or other research efforts. Even though your research may not provide a ton of new information, there’s absolutely nothing wrong with getting confirmation from your audience that what you already know (or think you know) is correct. Isn’t it better to confidently put creative in market, implement a marketing plan, or buy media, knowing that the information you have is corroborated by your audience? Getting this validation means that you can defend your decisions because they’re based on fact, not hunches or hearsay. 

Don’t worry—you’ll discover some new things, maybe even some things that surprise you—just don’t expect everything you learn to be revelatory. 

Now that we’ve busted some common myths about creative testing, here are three easy tips to manage your research risk and maximize your investment: 

  1. Keep an open mind. Determine your research objectives (i.e., what you hope to learn) and participants (who you will ask), and then choose the methodology that will help you achieve those objectives and engage those participants most efficiently. 
  2. Have a plan. Once you determine your objectives and participants, concisely document your approach with a plan that details your project goal, research objectives, participants, methodology, total budget and cost breakdown, and time frame with milestones. 
  3. Gain consensus. Present your plan to your organization’s key decision-makers. Answer any questions they may have and take their suggestions and feedback. Revise your plan until you get sign-off from all the staff members who are involved in your project. This will help you avoid disagreements and Monday-morning quarterbacking after it’s too late to change course. 

Considering conducting creative testing before you go to market? LMD can help. 

Kristen Newton
Vice President, Research & Content