Read time: 5 minutes
Running an effective digital communications program means making constant quick decisions. We’re often forced to make choices in the absence of complete data, relying on instinct and going with our gut. But sometimes, your gut says, “Hey, honestly? You should probably get some actual facts together before you make this big important decision.”
You have a few options when you really want to know how your supporters will react to a moment or a message. You can hire a polling firm to conduct large-scale national surveys and/or focus groups. But those methods for answering your questions aren’t cheap or fast (or at least, not cheap and fast). And if you’re like many of our clients, you often need answers NOW to get your next campaign off the ground or to make your case internally for a new investment or initiative.
If your gut doesn’t quite provide the right statistical validity for the job and hiring a polling firm doesn’t fit your pocketbook or timeline, it’s time to explore online survey tools.
How Does This Work?
Most online survey tools function the same way: you choose an audience based on various demographics and build questions from the platform's supported question types; the platform then solicits responses until your targets or quotas are met and provides raw results, in some cases with crosstabs of demographic data or responses to other questions.
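If your tool only hands back raw results, you can build those crosstabs yourself. Here's a minimal sketch in Python, assuming a hypothetical CSV export with one row per respondent; the file name and column names are made up for illustration, not taken from any specific platform:

```python
import pandas as pd

# Hypothetical export: one row per respondent, with the demographics
# you targeted plus their answer to a question. Column names here
# ("age_group", "answer") are illustrative.
responses = pd.read_csv("survey_export.csv")

# A crosstab breaks out answers by a demographic, which is what the
# fancier platforms hand you automatically.
crosstab = pd.crosstab(
    responses["age_group"],
    responses["answer"],
    normalize="index",  # share of each answer within each age group
)
print(crosstab.round(2))
```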
But once you start diving into specifics, there are some differences between the tools:
Recruitment: Each tool takes a different approach to recruiting survey takers. Most offer an incentive of some kind; some pay the survey takers directly. And some let you upload your own list to target or share a public link to the survey.
Targeting: Audience targeting and analysis also vary by tool. The demographics by which you can narrow your audience pool range in geographic specificity from state to metro area to zip code, and can include race, gender, age, household income, employment status, education level, and more.
Question types: The types of questions you can ask, whether open-ended or ranking multiple items on a scale, are determined by the tool. Some tools offer more question types than others, and most restrict the total number of questions.
Testing the Tools
Earlier this year we used Pollfish to get some data on how the new tax law might impact giving. Luckily, the data helped quell some nerves. The survey let us see whether the new tax law would cause people to decrease their giving and compare responses across different demographics. Virtually none of the groups we surveyed anticipated decreasing their giving in 2018 if they had increased it in 2017. You can read more about our findings here.
Pollfish was useful, but in typical M+R fashion we wanted more data to make sure we were collecting it in the most effective way. We decided to use survey tools to help us answer some burning questions about brand awareness and fundraising. Throughout 2018, we’ve been running the same survey across Pollfish, Survey Monkey, and Google Surveys to track general brand awareness for two nonprofit cohorts over time.
Evaluating the Tools
We ran identical surveys across four platforms: Pollfish, Survey Monkey, Google Surveys, and Amazon Mechanical Turk (mTurk). We also ran that same survey on each platform a second time one week later, all to help us compare the platforms on five main facets:
- Consistency: Did they produce the same or similar results on the same survey a week apart?
- Flexibility: Did the tool allow us to target the audience and ask the types of questions we wanted?
- Cost: How much did targeting or screening an audience with a question increase the cost? Could the same or similar results be achieved with 100 as with 400 participants?
- Ease of use: How easy or hard was it to set up the survey and analyze the results?
- Speed: How quickly did responses come back?
So how’d the tools do?
Consistency: Survey Monkey and Pollfish had the most consistent results from one week to the next. Results were least consistent with mTurk.
Flexibility: Survey Monkey and Pollfish had the most question type options. Amazon mTurk limited question types to four, though very advanced users could create more with coding.
Speed: Amazon mTurk provided responses in the shortest amount of time, followed closely by Pollfish, both within hours. Survey Monkey and Google Surveys provided answers in a matter of days, though Survey Monkey does offer an expedited option for an additional fee.
Cost: For all of the tools, we saw similar results with 100 and 400 participants, meaning we could spend less for the same information. Note: platforms do recommend minimums (around 250 responses) for statistical confidence, which you might prioritize over price; see the quick margin-of-error sketch after this list.
Ease of use: Survey Monkey, Pollfish, and Google Surveys helpfully provided crosstabs by answers to questions; Amazon mTurk provided answers in raw form only.
Bonus: Survey Monkey, Pollfish, and Google Surveys all allow you to attach images and video. Amazon mTurk does not.
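Why do 100 and 400 responses tell a similar story? A quick back-of-the-envelope calculation with the standard margin-of-error formula for a proportion (nothing platform-specific, and it assumes simple random sampling) makes the tradeoff concrete:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 250, 400):
    print(f"n={n}: +/- {margin_of_error(n):.1%}")

# n=100: +/- 9.8%   n=250: +/- 6.2%   n=400: +/- 4.9%
# Halving the margin requires quadrupling the sample, so going from
# 100 to 400 responses only narrows it from ~10 to ~5 points.
```

In other words, if the answers you care about differ by 20 points across groups, the cheaper sample may be plenty; if you need to detect small shifts, the recommended minimums matter.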
At a glance, here are the basics:
| | Google Surveys | Pollfish | Survey Monkey + Audiences | Amazon mTurk |
| --- | --- | --- | --- | --- |
| Flexibility: Screening questions | Yes, up to 4 | Yes, up to 2 | Yes | Yes |
| Flexibility: Targeting options | 4+ factors | 20+ factors | 15+ factors | 8 factors |
| Speed | Days | Hours | Days | Hours |
| Cost | $1-$3 per response | $1-$4 per response | $1+ per response | 50¢ per response + 20% of the amount paid to respondents |
| Ease of use | Crosstabs by answers to questions | Crosstabs by answers to questions and demographics | Crosstabs by answers to questions and some demographics | Raw data only |
Have you done online surveying? We’d love to hear how you’re using it and how your results compare to offline methods! Drop us a line on social @mrcampaigns.