What Conservationists Need to Know About Surveys

This post is the first in a new series entitled “The Basics of the Human Dimensions”, which gives the most basic tips for how to work with social scientists and social questions in marine conservation efforts. Whether you are a stakeholder, a collaborating natural scientist, or both, this series will hopefully make the journey into the human dimensions easier.

Whether someone has emailed you a survey asking for your expert opinion or you would like to poll the audience of your research about their thoughts, the basic survey is often one’s first encounter with social science. These surveys come in a wide variety of forms and levels of quality, and you are right to raise an eyebrow at them to a certain degree, whether out of sheer curiosity or justified skepticism. If you are on the side of wanting to deploy a survey, my best piece of advice is to hire or consult with a survey writer, particularly if you’re not crystal clear on how the questions will relate to the later statistical analysis. But there are some subtler things you should know about dealing with surveys at all stages of the process.

When someone asks you to take a survey

When someone sends you a survey, they’re asking for your opinion as a stakeholder in whatever subject they’re studying. My experience has been that people interpret this role in different ways, and that’s fine, but there are certain things that can’t be changed by the time you have the survey in hand. Any survey writer worth their salt has already field-tested the survey for comprehension among all groups of desired respondents, and has likely weighed every suggested change against design constraints like how long the survey takes to complete. Here are some suggestions to keep that survey writer from wanting to yank their hair out:

– Changing the survey after it’s deployed basically destroys any possibility of statistical analysis later. Don’t suggest changes unless there’s an obvious typo or offensive statement.

– Choose one of the options available unless “other” is offered. Don’t write in “4.5” if you’re torn between rating something 4 and 5. Math gets hard later if you do.

– If you are genuinely confused by a question or see two ways to interpret it, ask how you should answer it. There is always contact information for the writer included, so use it. Relatedly, each word in a survey was carefully chosen, so answer what the question is asking, not what you think it should be asking (or what you think the author meant). Be literal.

– Don’t overthink it. Every survey should be introduced by a time estimate for how long it should take. If you’re taking longer, you’re overthinking it.

– Recognize the difference between “anonymous” and “confidential”. Anonymous surveys collect zero personally identifiable information. Confidential ones might collect names, job titles, and whatnot, and are under additional ethical protections. Believe me, surveyors only write confidential surveys when they need that information for analysis, like to see how different respondents are related to one another. And they’re bound to not share any of that information. If you’re concerned about protections, ask – but don’t demand anonymity if it will invalidate the usefulness of your answers.

When you want to write a survey

Again, if you haven’t done this before, your options all involve the help of others: hire a social scientist, get training, or enlist colleagues. After you’ve assembled your team, there are still a number of considerations that should stay at the top of your mind:

– Consider the statistical analyses that will need to happen and the time available to do them. It might be really fun to get loads of paragraph-long, handwritten responses to questions, but that kind of qualitative data is incredibly hard to deal with quantitatively, and people often skip these kinds of questions. There are also fairly standard ways of asking certain types of things, like asking people to rate or rank something (the Likert scale – you know, the one with “strongly disagree, somewhat disagree, no opinion, somewhat agree, strongly agree”), that are worth using to save you time later.
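To see why standard rating scales save time at the analysis stage, here is a minimal sketch (with hypothetical responses) of how five-point Likert answers are typically coded into numbers before summarizing:

```python
# Minimal sketch: coding five-point Likert responses numerically
# (hypothetical response data; scale labels follow the standard wording).
from statistics import mean, median

SCALE = {
    "strongly disagree": 1,
    "somewhat disagree": 2,
    "no opinion": 3,
    "somewhat agree": 4,
    "strongly agree": 5,
}

responses = ["strongly agree", "somewhat agree", "no opinion",
             "somewhat agree", "strongly disagree"]

scores = [SCALE[r] for r in responses]
print("median:", median(scores))  # Likert data are ordinal, so the median
print("mean:", mean(scores))      # is often safer to report than the mean
```

A write-in answer like “4.5” has no slot in that mapping, which is exactly why the math gets hard later.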

– Which questions are most important for your study? Not everyone will answer every question, so put those questions up front to capture as many respondents as possible.

– Consider the limits of your audience’s patience. Keep the survey short and to the point; provide some background, but avoid walls of text. Test your assumptions about jargon, subject matter, etc., on a test audience to strike that balance. Be upfront about how much you are asking respondents to give. Length isn’t necessarily a killer, so long as people aren’t surprised.

– Some people hate surveys, so you will never get a 100% response rate. In practice, 50% is considered good enough to move forward with analysis (as low as 30% if you have some assurance of randomness in your response pool). You can expect higher if respondents are already in your circle of acquaintances; in that case, you might want to offer a semi-structured interview to those who hate the survey format and code their answers into survey form later.
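Those rough thresholds can be sketched as a quick sanity check (the counts below are hypothetical, and the cutoffs are the rules of thumb from above, not hard statistical rules):

```python
# Quick check of a response rate against the rough thresholds above
# (hypothetical counts of invited contacts vs. completed surveys).
def response_rate(completed: int, invited: int) -> float:
    return completed / invited

rate = response_rate(62, 120)
print(f"response rate: {rate:.0%}")

if rate >= 0.5:
    print("generally considered good enough to analyze")
elif rate >= 0.3:
    print("may be workable if responses are reasonably random")
else:
    print("expect serious concerns about nonresponse")
```

As the comment below notes, a decent rate alone doesn’t rule out a biased respondent pool, so treat the cutoff as a starting point.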

– Be extremely clear about ethics. All surveys are human subjects research, and many institutions have an IRB that must approve the protocol. But be better than the IRB. Be clear about data storage and privacy protections during and after analysis, and be available for questions on the matter.

One comment

  1. Geoffrey Shideler · April 3, 2015

    Nice piece. It’s extremely frustrating to see people create something they call a “survey,” when the same person would (hopefully) be frustrated if similar inappropriate survey methods were used to collect information and extrapolate on fish populations, for example. I would note that response rate isn’t nearly as important as sampling bias, though. Even if 50% of your survey frame answers, if that 50% represents a biased segment of your population (like only soliciting responses from people who feel strongly negative on your topic, because they were much more likely to respond), it is much more important to learn this and account for it.

Comments are closed.