Check your emotions at the door…

When soliciting feedback (whether through a survey, a poll, or a discussion), it's a good idea to leave emotions out of the research design if you want useful, objective results.

I do a lot of work, the majority of it pro bono, for our local school district. It's not that I'm totally altruistic, although I like to think I am at times. Really, it's because I want folks to have good, useful, objective feedback so they can make smart decisions about children's education. And smart decisions, whether they're about advanced learning programs or how the PTA should spend generous parental donations, require unbiased feedback.

I’m learning the hard way that the topic of children’s education is up there with money, religion and politics as a minefield of emotions.

So, how can one best approach this emotional battlefield?

First off, please, please don't use "Pros and Cons" to elicit feedback. One person's pros are inevitably someone else's cons… A better choice is to present an unbiased statement and ask people whether they agree or disagree (in market research terminology, a Likert scale). You can then determine whether a topic is a 'pro' or a 'con' by how strongly people agree, or conversely disagree, with your statement.
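To make that concrete, here's a minimal sketch (in Python, with an invented statement and entirely made-up responses) of how agreement with a neutral statement might be scored on a five-point Likert scale:

```python
# Hypothetical example: scoring a five-point Likert item for a neutral statement
# such as "The advanced learning program should remain at its current school."

LIKERT_SCALE = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

# Made-up responses, for illustration only
responses = ["Agree", "Strongly agree", "Disagree", "Agree", "Neither agree nor disagree"]

scores = [LIKERT_SCALE[r] for r in responses]
mean_score = sum(scores) / len(scores)

# A mean well above the midpoint (3) suggests respondents read the statement as a "pro";
# a mean well below it suggests a "con".
print(f"Mean agreement: {mean_score:.2f} (scale midpoint = 3)")
```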

Avoid 'loaded' questions and statements. You know the ones, where the author's inherent bias is already worked into the question… like "Have you always listened to that awful, noisy rock and roll?" Neutrality is your friend… think like Switzerland.

Finally, be sure to listen to all stakeholders in the process. Even though I balked at collaborative survey-writing, getting buy-in from everyone ensured my own biases were kept in check.

So…even if it’s ‘for the kids’, be sure your research design is impartial.


Sometimes You Know the Answer…

Very often in market research, we know the answer to our client’s burning question…or at least we think we do.

In a very real-world example (and one that has been all-consuming for me… hence my 'radio silence' on Twitter), our local school district is trying to find a solution for an overcrowded elementary school that houses three distinct programs. The option of moving all, or part, of one program (Program "A") brought a flurry of emails, petitions, and you-name-it from angry parents advocating to keep Program "A" intact.

The school district, to its credit, offered to send a survey to parents to gather more feedback. But with such an overwhelming response to keep Program A together, the district wondered whether it even needed to send out a survey.

This is where I remind folks that even if you think you know the answer…you need to ask the question.

Not just for the obvious reason of making sure all points of view are accounted for; asking "the question" also gives you an excellent chance to engage with your constituents. They will welcome the opportunity to tell you what they are thinking, because they feel you are listening… and don't we all want to listen to our audience/customers/clients?

In the case of the school district, they did send a survey: three questions, short and sweet.

So, even if you really, truly do know the answer–please take a bit of time and ask the question.



Making Trade-offs

I’ve just finished putting together a survey for one of our local schools. With state budgets in dire shape, school districts will be cutting back funding, leaving parents/PTAs to make up the gap. I could discuss the inequity of this issue until I’m blue in the face, but that really doesn’t have much to do with research.

How to make trade-offs is an interesting topic for researchers. Clients are always interested in how their customers choose Product A over Product B, or which features are most important (oooh, there's that rating/ranking concept again!). In the case of our local school, the question is which programs parents most want the PTA to keep funding.

There are nine programs for parents to prioritize, encompassing everything from drama classes to teaching assistants, with all kinds of things in between: a librarian assistant, playground equipment, instrumental music, and so on. Which ones to fund is not an easy choice for the PTA, nor for parents.

And, keeping the respondents in mind (in this case, public elementary school parents), the survey can't be long, nor can it be very complex (no conjoint analysis here for you hard-core statisticians). With that in mind, I developed a simple ranking and budget allocation model for the survey. I asked parents to rank the nine programs in order of importance, and then gave them a hypothetical budget ($1000) to allocate across those same nine items. In addition, I assigned a dollar value to each item so parents could understand the ramifications of choosing one item over another.

In this case we really needed both questions. Ranking the items lets the PTA understand what is most important to parents if money were no object (or if fundraising wildly exceeds expectations), while having parents allocate hypothetical budget dollars helps bring those priorities back in line with reality.

And, hopefully inspires them to donate more money…
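For the curious, here's a minimal sketch (in Python, using just a handful of the programs and entirely made-up responses) of how the two questions could be tabulated side by side: average rank for the money-is-no-object view, and average dollar allocation for the budget-constrained one.

```python
# Hypothetical tabulation of the two survey questions.
# Program list is abbreviated and all responses are invented for illustration.

programs = ["Drama classes", "Teaching assistants", "Librarian assistant",
            "Playground equipment", "Instrumental music"]

# Each respondent ranks every program (1 = most important)
# and splits a hypothetical $1000 budget across the same programs.
respondents = [
    {"ranks": {"Drama classes": 3, "Teaching assistants": 1, "Librarian assistant": 4,
               "Playground equipment": 5, "Instrumental music": 2},
     "dollars": {"Drama classes": 100, "Teaching assistants": 500, "Librarian assistant": 100,
                 "Playground equipment": 0, "Instrumental music": 300}},
    {"ranks": {"Drama classes": 2, "Teaching assistants": 1, "Librarian assistant": 3,
               "Playground equipment": 4, "Instrumental music": 5},
     "dollars": {"Drama classes": 200, "Teaching assistants": 600, "Librarian assistant": 150,
                 "Playground equipment": 50, "Instrumental music": 0}},
]

for r in respondents:
    # Sanity check: each allocation should use the full hypothetical budget.
    assert sum(r["dollars"].values()) == 1000

for p in programs:
    avg_rank = sum(r["ranks"][p] for r in respondents) / len(respondents)
    avg_dollars = sum(r["dollars"][p] for r in respondents) / len(respondents)
    print(f"{p:22s} avg rank {avg_rank:.1f}   avg allocation ${avg_dollars:.0f}")
```

Comparing the two columns shows where "importance" and "willingness to spend" diverge, which is exactly the gap the PTA needs to see before setting its budget.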
