Online Survey Platforms Guidance

What are online survey platforms?

Online survey platforms offer cash, gift cards, or other compensation for survey-takers to participate in market and research surveys.  These sites facilitate the gathering of data from individuals and offer researchers convenient features, such as managing compensation and pre-screening participants based on identity factors shared directly with the sites themselves.  Examples of such sites are MTurk, Prolific, e-Rewards, and Lucid.

How are academic researchers using online survey platforms?

Online survey platforms give researchers access to a large population of willing participants for research studies.  Researchers can create a survey or task, providing potential participants with a title and description of what they would do as participants.  These are called “HITs” (Human Intelligence Tasks) on MTurk, though other sites may use different names.  These descriptions also include the amount of compensation participants will receive and the time the task/survey takes to complete.

Sites such as MTurk, Prolific, and Lucid allow researchers to recruit from their sites and then route participants to an online survey, which may be hosted through other services such as Qualtrics, Survey Monkey, Google Forms, or Survey Gizmo.  The participant is then routed back, sometimes with a “completion code,” to the online survey platform to receive compensation.
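The routing described above can be sketched in a few lines.  This is a minimal illustration, not any platform’s actual API: the return URL and the `cc` query parameter are placeholders, and each platform documents its own required return-URL format.

```python
import secrets
from urllib.parse import urlencode

def make_completion_code() -> str:
    """Generate a short random code the participant carries back to the platform."""
    return secrets.token_hex(4).upper()  # 8 hex characters, e.g. '9F2C41AB'

def completion_redirect(base_url: str, code: str) -> str:
    """Build the redirect URL that returns a finished participant to the
    recruiting platform, with the completion code as a query parameter."""
    return f"{base_url}?{urlencode({'cc': code})}"

# Hypothetical end-of-survey redirect (the domain and path are placeholders):
code = make_completion_code()
url = completion_redirect("https://platform.example.com/submissions/complete", code)
```

The survey software would display or redirect to this URL on its final page; the platform then matches the code against its records before releasing compensation.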

One of the draws of using online survey platforms is the heterogeneity of users, which provides more diversity than a typical college campus population.  With a wide range of users, researchers also have the ability to draw from distinct user pools (such as recruiting a particular number of users across economic ranges, age ranges, or gender identities).  Additionally, these sites allow for pseudo-anonymity between researchers and participants by having a third party (the online survey platform) oversee payments (compensation) to participants.

It is important to note that some platforms are more designed for research than others.  Amazon Mechanical Turk, for instance, is primarily a site for matching people with work tasks and paying workers; it is not designed with an eye toward human subjects research.  It lacks the sophistication and security measures of dedicated survey software tools, and Amazon’s collection of data is subject to Amazon’s Privacy Policy and Terms of Service.  Other sites, such as Prolific and Lucid, are more geared toward human subjects research, but researchers should always note each site’s policies governing data protection and ownership.

What do academic researchers need to consider when submitting a study to the IRB that utilizes online survey platforms?

Recruitment: The title of the study and its description (known as a HIT on MTurk) act as a form of recruitment.  Researchers should include the title and description as part of the Kuali application.  In the description, researchers should include the following information:

  • Researchers should be clear about compensation and bonuses.  It would also be useful for participants to know how long it will take for the researchers to approve their submissions and release that compensation.
  • Clearly and accurately state the time required to complete the task.
  • Participants should be told if they must complete a screener to qualify.  Researchers should make clear whether participants are paid for the time it takes to complete the screener.  One option is to list qualifications for participation in the description.  Another is to create an initial description and “study” for the screener (which pays a nominal amount); individuals eligible for the main study then receive a bonus amount or an invitation to the main study.
  • If any additional software is required to complete the task, this should be stated in the description (e.g., the task requires JavaScript or Inquisit).
  • Researchers should be clear about the type of task participants are being asked to do.  For instance, if the task involves writing, or watching videos, this should be stated in the description.  Also be aware that certain types of tasks, such as writing tasks, elicit higher compensation.
  • If there are specific requirements for how participants should complete the study (e.g., only on laptops, or with working sound), this should be included.
  • The researcher’s name and/or school affiliation should be listed as the requester/host, or somewhere in the description.
  • If applicable: The link to the online survey should be included.

Consent: The first page of the online survey should be the consent document.  The online consent will have all of the elements of a regular consent, but it will not require a signature.  Participants will click either an “I agree” or an “I do not agree” box.  The “I agree” box will take them into the survey; the “I do not agree” box will thank them for their time and take them away from the survey.  For a sample of an online consent form, please see our one-page online survey consent template.

Debrief: If the researchers are using deception or incomplete disclosure (i.e., not stating exactly what the study is about so as not to bias participants’ responses), then it is important to include a debriefing form at the end of the survey.  This debriefing form could be embedded into the last page of the survey and would require participants to answer a final question allowing researchers to use their data (or not) now that they know the true purpose of the study.  For more information on debriefing forms, please see our debriefing process.

Please note: For particularly sensitive topics, the IRB may want to ensure that participants receive a debriefing form even if they do not complete the full study (i.e., they click out before the end of the survey).  This might mean contacting participants (through the online survey platform) and providing them a debriefing form.  If this will happen for your study, include a statement in the description and in the informed consent that the researchers will provide participants with additional information after the study and may contact participants through the messaging function of the online survey platform.  This might mean collecting participants’ worker IDs.  Researchers should state that worker IDs will be collected only for debriefing purposes and that, after debriefing, worker IDs will be deleted and will at no time be linked to survey data.  If there is no way to contact participants, the consent form should have them confirm that they will reach out to the researcher for important additional information if they exit the study before completing it.

Confidentiality: While the goal may have been that online survey platform participants remain anonymous to academic researchers, the reality is that anonymity cannot be guaranteed in any online environment where data is being collected.  For instance, recent research shows that MTurk worker IDs can easily be linked to individuals’ Amazon profiles, including their wish lists and previous product reviews.  This means researchers must be careful in deciding what information to collect from participants.  The default should be that participants’ worker IDs are not collected.  If it is necessary to collect worker IDs (for instance, for payment), then researchers should ensure that worker IDs are kept confidential and secure, are not linked back to survey data, and are deleted as soon as possible after use.
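One way to honor the “not linked back to survey data” point above is to keep worker IDs in a standalone payment roster that shares no key with the response file, so the two can never be re-joined.  A minimal sketch follows; the file names and fields are hypothetical, not a prescribed system.

```python
import csv
import secrets

# Hypothetical sketch: worker IDs go in a payment roster that shares no
# key with the survey responses, so the two files cannot be re-linked.

def record_for_payment(worker_id: str, roster_path: str) -> None:
    """Append the worker ID (and nothing else) to the payment roster."""
    with open(roster_path, "a", newline="") as f:
        csv.writer(f).writerow([worker_id])

def record_response(answers: list, responses_path: str) -> str:
    """Store survey answers under a random response ID that carries no
    link to the worker ID, and return that response ID."""
    response_id = secrets.token_hex(8)  # random; unrelated to the worker
    with open(responses_path, "a", newline="") as f:
        csv.writer(f).writerow([response_id] + answers)
    return response_id
```

Once compensation has been approved and released, the roster file can be deleted outright, leaving only response data that cannot be traced back to any worker ID.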

Additional things to consider with regard to the confidentiality of participants’ identities and data are the online survey platform’s Terms of Service (TOS) and those of the online survey software being used (Qualtrics, Survey Monkey, Google Forms, Survey Gizmo, Zoho Survey, etc.).  Researchers should examine the TOS especially as it relates to the collection of participants’ online behavior and history (through cookies or other tracking systems) and the selling of participants’ data to third parties.  It would also be important for researchers to be aware of any policies or procedures the online survey platforms or survey software companies have in place for when a data breach occurs.  It might also be helpful for researchers to establish their own protocol for responding to a breach of data security; working with IT to solidify this process may be necessary.  This information could then be relayed to participants through the consent.