Survey Design Love.

Confession: I’ve been a survey lover for as long as I can remember.

I realize many people approach them with dread: they become yet another thing one must do, a task, a barrier preventing you from using the app or website or service you want to use. It's the people milling around outside of grocery stores, ready to poll you about political candidates. It's the signs people hang on their doors: no soliciting. All of these negative associations.

What I like about surveys is more the process that goes into building them and how people record and work with the data they receive. There is a lot of psychology and narrative that goes into the preliminary design, and into the coding that comes after we have the data. Not to mention the process of interviewing… Each piece presents new and interesting challenges for the researchers and teams setting out to run a survey.

Second confession I need to make: I just finished my General Assembly course on User Experience Design, and spent a lot more concentrated time than I have in a while thinking through the layers of survey design and data collection. It was truly wonderful and useful to me… and gave me a lot of time to nerd out about surveys.

For my current projects, I am thinking about the best ways to structure and develop surveys that will make the experience less awkward and forced for the data collector and for the person responding to the survey. In many ways, I think some of the weaknesses we encounter in data collection about difficult topics, like informal business, could be addressed through better design/systems thinking.

Some of the typical problems I encounter in this work are:

1) Low response rates, which mean the group that did respond gives me a skewed and unrepresentative data set to use when describing the community I am working with.

2) Questions are sometimes unclear or worded inappropriately given the audience we are working with. I think the value of language and word association often gets overlooked. We need to account for the way the question will be received and also for the types of answers we might be able to get. Are we asking questions clearly, in the right lexicon? Are we interpreting the responses with the weight and meaning they carry in the local ways people talk about business?

3) Information Capture Beyond the Page: often the most important and enlightening pieces of information (the new rabbit holes to go exploring, if you will) will not fit nicely into a question topic or predicted category. Getting researchers who are ready and able to follow these threads of thought, record them, and offer some sort of analysis is… rare and challenging. Some people are truly excellent at this task! They also, however, need to make an effort to collect and protect this information, so that it does not get lost in the coding process.

4) Plain and Simple: the surveys are extremely long and fatiguing for both parties. These surveys will not generate the information you need, and the survey administrator will not stay focused and engaged with the task beyond a few runs. If your respondent sighs, looks at the clock, or shows a noticeable shift toward boredom, the design is off. A wise mentor once told me that I get 10 questions, and often within those 10 questions I can answer 70% of my questions overall.

These are all issues I am attempting to address in my redesign process. I apologize in advance to the friends and family members who are regularly subjected to my guerrilla field testing… but it's for a good cause!

Happy New Year!

