DIY Deja Vu

A week doesn’t go by in #MRX land (Market Research in the Twitterverse) without someone bashing the Do-It-Yourself phenomenon of survey production.

This reminds me of my somewhat distant past in graphic design. I was a Marketing Communications Manager for a large paper company at the time, when the advent of “Desktop Publishing” invaded the hallowed halls of commercial design and production.

At that time, Graphic Designers were artists, using pen and paper, specifying fonts and photos, laying out designs with non-photo blue pencil. There were type houses whose sole job was to produce beautiful fonts on sheets of paper for the layout artists to cut and arrange. No WYSIWYG here–typesetting was command code only. Think HTML for the dark ages.

Then a little company named Adobe created PostScript, enabling Aldus to create PageMaker–the first page layout program.

Suddenly anyone could be a ‘designer’ because they had a new tool that enabled them to lay out type and images. Graphic Designers balked at this idea and many resisted the change. But, in time, using a computer to do layouts became de rigueur, and those who didn’t have innate talent or training in design fell by the wayside.

It wasn’t enough to just use the tools anymore…

The same is true today in market research. The advent of tools like Google Surveys, SurveyMonkey and SurveyGizmo (all good products for basic data collection needs) has my industry scoffing at anyone who sends out a “Survey Monkey” to their customers. Companies feel they are engaging with their customers by sending out a survey…except that many of these ‘surveys’ are riddled with biased questions, too many open-ended responses (see this post all about those!), pages and pages of characteristic grids, and many of the other painful survey mistakes we market researchers cringe at.

At some point, as it did for graphic designers, merely using the tool won’t be enough. We just have to be patient and keep educating companies about what ‘good research’ is. Until then, let the teeth-gnashing continue.


The Danger of the Open-Ended Response

As I’ve mentioned in previous posts, I will happily take any survey, especially ones that directly affect me or my family…although I don’t always enjoy the experience.

The survey pictured above is from our local school district’s ‘strategic plan’…yep, full of those dreaded open-ended responses. Lots of vague fill-in-the-blank questions about improvement opportunities, academic excellence and other edu-speak buzzwords.

I hate surveys like these…open-ended questions such as the ones shown here are as enjoyable as trying on bathing suits in the dead of winter when I’m still digesting all of those holiday meals.

Why do I mock these questions like others ridicule pie charts? Because they are useless.

A bold statement, I know, but just think about this:

  1. Someone has to read each and every response.
  2. That person (or someone else) has to organize the responses into manageable categories.
  3. Then, and only then, can the responses be tabulated into something that remotely resembles a list.
  4. And finally, one can draw conclusions and plan a path forward.

Oh, did I mention there are upwards of 40,000 students in our district? You do the math of how much time this is going to take to analyze and report…boggling, eh?
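To see why those four steps add up to so much work, here’s a rough sketch of the categorize-and-tabulate part in Python. The keyword-to-category mapping is entirely hypothetical–in real life a human has to build it, apply it, and argue about the edge cases:

```python
from collections import Counter

# Hypothetical, hand-built mapping from keywords to categories.
# Building this (step 2) is the expensive, human part of the job.
CATEGORY_KEYWORDS = {
    "class size": "Smaller class sizes",
    "teacher": "Teacher quality/support",
    "technology": "More technology",
    "arts": "Arts and music programs",
}

def categorize(response: str) -> str:
    """Bucket one free-text response into a category (step 2)."""
    text = response.lower()
    for keyword, category in CATEGORY_KEYWORDS.items():
        if keyword in text:
            return category
    return "Other"

def tabulate(responses: list[str]) -> Counter:
    """Read every response, categorize it, then count (steps 1-3)."""
    return Counter(categorize(r) for r in responses)

responses = [
    "We need smaller class sizes",
    "Better technology in classrooms",
    "Support our teachers more",
    "More parking at games",
]
print(tabulate(responses))
```

Even this toy version hides the hard part: someone still has to read thousands of responses to discover what the categories should be in the first place.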

Best practice tip: have a few focus groups to ask these kinds of questions and develop a short response set based on the focus group feedback. This will get rid of almost all of steps 1 through 4.

Don’t forget to allow for an “Other” response because you can’t always capture every possible response, but I bet they could have come up with at least 80%–more than enough to offset the costs of the focus groups.

And, you’ll find most folks will be happy to fill in an “Other” open-end, including me.


Not Enough Choices

Sometimes I wonder if I only write blog posts when I see egregious errors by other research firms…maybe because they spur me to action in my otherwise ordinary workdays.

Many of us researchers take surveys to keep up with trends and new ideas, as well as to learn what not to do when designing and fielding our own surveys.

My latest foray into survey-taking was a rather lengthy one about a local grocery store. Early in the survey I was asked where I buy groceries, and I indicated that not only did I shop at the stores listed, but I also shop at a local chain…in fact, I do a significant portion of my shopping at this local store.

But, when it came to allocating my proportion of grocery shopping across the stores, the survey designer failed to include my little local one…even though I had indicated I shopped there in an earlier question. This is what I saw when I tried to allocate my grocery shopping dollars:

Making matters worse, as you can see, the survey folks expected answers to this question to total 100%…kind of hard when you don’t have enough choices! Ugh.

I was very tempted to abandon the survey at this point, as I’m sure others who find themselves in the same dilemma will.

So, tip for today, if you allow an “Other” choice earlier in a survey, make sure you account for that in any subsequent questions that use the same list of choices.
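The fix is mechanical: build the later question’s choice list from the respondent’s earlier answers instead of reusing a static list. A minimal sketch (store and function names are hypothetical):

```python
def allocation_choices(earlier_selections, master_list):
    """Carry every store the respondent selected earlier -- including
    any write-in 'Other' text -- into the allocation question's choices."""
    # Keep the master list's order for stores the respondent picked...
    choices = [store for store in master_list if store in earlier_selections]
    # ...then append write-ins that weren't on the master list.
    choices += [store for store in earlier_selections if store not in master_list]
    return choices

def allocation_valid(percentages):
    """A 100%-total check is only fair once every selected store is listed."""
    return sum(percentages.values()) == 100

master = ["MegaMart", "FreshCo", "ValueGrocer"]   # stores the survey listed
selected = ["MegaMart", "Neighborhood Market"]    # respondent's earlier answers, incl. a write-in
print(allocation_choices(selected, master))
```

With the write-in carried forward, the respondent can actually make the allocation sum to 100%–which is exactly what the grocery survey above failed to let me do.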

We all like having lots of choices…


Why?

I ask this question often when I take surveys. Why are they asking me that question? Why did they word that question so awkwardly? Why did they make me rate 25 characteristics? Why did they try three different pricing models?

But, my all-time favorite ‘why’ moment is this:

[start rant] Why the heck can’t this research company figure out what state I live in from my zip code? Why make me give you both pieces of information? In fact, my zip code offers all kinds of cool, interesting demographic information–city, state, population density, average income…oodles of data. When I enter 98xxx, they should immediately know that I live in Washington State.
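Deriving the state is a trivial lookup on the zip code’s leading digits. A sketch with a tiny illustrative prefix table (a real implementation would cover every USPS prefix; the 98xxx-means-Washington example is from the rant above, the other rows are just illustrative):

```python
# Tiny illustrative table of leading-two-digit ZIP prefixes to states.
# A production lookup would cover all USPS prefixes, not three.
ZIP_PREFIX_TO_STATE = {
    "98": "WA",
    "97": "OR",
    "90": "CA",
}

def state_from_zip(zip_code: str):
    """Derive the state from the first two digits of a ZIP code,
    or return None if the prefix isn't in the table."""
    return ZIP_PREFIX_TO_STATE.get(zip_code[:2])

print(state_from_zip("98052"))  # -> WA
```

Which is to say: the survey software already has everything it needs the moment I type my zip code.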

Some of you research-savvy folks might say that the state question is being used to ‘quota’ (ensure a balanced number of respondents from different states) and that is a very legitimate point. If that is true, then please put the zip code question at the end of the survey with the rest of the demographic questions.

[end rant]

Please write questions that don’t annoy your respondents, who are, quite often, your customers.