Advice for Running a Web Survey

Test all the components of your survey. Your survey is more than just the questions.

Take the time to design your survey

Constructing a survey instrument is an art in itself. Numerous small decisions about content, wording, format, and placement must be made, and each can have important consequences for your results. While there is no one perfect way to accomplish this, OIRE is available to offer advice and guidance to help develop your survey. Additionally, the Social Research Methods site is a good resource.

Check the HMC Survey Calendar

OIRE maintains a survey calendar, which shows which surveys are active on campus at what times for students, faculty, and staff. Please contact us and let us know about your survey so we can add it to the calendar. The calendar helps OIRE coordinate survey administration, ensures that HMC is not overburdening students, faculty, or staff with survey requests, and gives your survey a better chance of getting a good response.

Think about the length of your survey instrument

Think about it again. Every additional question reduces the likelihood that respondents will complete the survey. Don’t include items unless you know how you will use the responses. It is good practice to tell respondents in the survey invitation or instructions roughly how many items the survey contains and/or how long it takes on average to complete.

Don’t ask if you know the answer

Basic demographic information (gender, race/ethnicity, class year, major, etc.) is available through OIRE. When appropriate, suppress these questions and OIRE can provide that data for you; OIRE uses the HMC ID as the link between demographic data and survey responses when creating a datafile. HMC also participates in many standard surveys. If the results of these surveys can answer your questions, consider using them rather than running your own independent survey. Not only will you get results faster and avoid asking HMC students questions they have already answered, you will also know that the items have been independently tested and validated, and you will be assured of a good response rate. A list of surveys that HMC participates in, and the types of information available, can be found at Institutional Results by Instrument.

Do not “force” a response to a question

Forcing responses may annoy respondents, causing them to abandon the entire survey, or may provoke complaints that reflect poorly on your survey. More importantly, voluntary participation refers to the exercise of free will in deciding whether to participate in any research activity, including answering a question on a survey. International law, national law, and the codes of conduct of scientific communities protect this right. Sample language from CGU’s Institutional Review Board to include in your instructions: “This survey is entirely voluntary, and individuals may answer as few or as many questions as they wish.” OIRE does not force or request a response to any survey or question. Allow respondents to decide how much information they want to share.

Get feedback on your survey

It’s a good idea to pilot test your survey with a few volunteers before sending it out. Sample language to invite pilot testers: “In advance of this survey going out to a wider audience, we are asking a select group of students/faculty/staff to respond to the survey early in order to get feedback on the questions, and to understand how long the survey takes to answer.” If you make no changes between the pilot test and the full launch, you can retain and use the pilot-test data.

Many of us are concerned with getting as many respondents as possible. While there is no single method that will increase response rates for your survey, research (Dillman, 2007) does indicate several factors that contribute to higher response rates:

  • Perceived importance of the survey (value to the student, the college, perceived legitimacy)
  • Level of interest students/faculty/staff have in the topic
  • Trust that the data will be used and maintained properly
  • Perception of reward for participation
  • Minimizing respondent burden

With these in mind, here are some practical considerations that may serve to help you get the best response to your survey:

Who the survey is “from” matters

Whom will your intended audience respond to? It may be a department chair, a dean, or OIRE. “Office of the Dean of Students” is less effective than “Dean Leslie.” Framing who the survey is from can help you gain more participation and better data.

The “Subject” line of your invitation email matters

Think of your subject line as a headline. It should be:

  1. Short (because people don’t read)
  2. Rich in information (clearly summarize the email)
  3. Frontloaded with keywords (because people scan only the beginning of a list of items, and an email inbox is a list)
  4. Understandable out of context (your email is competing for attention with other emails)
  • Provide the reader with a reason to explore your message further.
  • The words “Help” and “Reminder” negatively affect survey participation rates.
  • General Rule of Thumb: keep the subject line to 50 characters.
  • Provide localization and context. “HMC Core Curriculum Alumni Survey” or “Alumni Attitudes towards Core Curriculum” is more effective than “Alumni Survey.”

The proper “Greeting” in the invitation will increase the likelihood of response

“Dear Alex” is better than “Dear member of the Class of 2021” is better than “Dear HMC student”. “Dear Professor Xu” is better than “Dear Faculty”.

If you are offering incentives, prizes or gifts for participation, check the rules

Any incentive program may require IRB approval. We recommend that you check with the CGU IRB as soon as you have made a decision regarding incentives. Other helpful hints around the use of incentives are:

  • Incentives should be designed in a manner that maintains the voluntary nature of the survey. Please note, if you offer an incentive, your survey is no longer anonymous, as researchers will need to know the identity of those who participated in order to get them their incentive.
  • Survey promotion (emails, flyers, etc.) should not emphasize the incentives to a degree that overshadows the request for survey participation.
  • The amount or value of the incentive should not be so large as to appear coercive, interfere with a student’s financial aid or violate NCAA eligibility rules.
  • If your incentive involves a drawing (i.e., raffle or lottery), you should consult CA state law to determine whether your lottery is legal.
  • If you plan to utilize an incentive for your survey, you must add that information about incentives to the instructions all participants receive before deciding to participate in the survey. This section would describe, in detail, the incentive program including the amount that could be won, an estimate of the odds (if you are utilizing a drawing), and how any drawing or other incentive program would be conducted. If you are conducting a web administration, this information should be included on the “Welcome” page.
  • Alternative considerations include allowing respondents to select their “incentive” (e.g., please select your “thank you” gift: a) $10 Claremont Cash, b) $10 Amazon Gift Card, c) $10 donation to HMC’s Mary S. Binder Prize, or d) I do not wish to receive a gift) or making a donation for every valid survey response (e.g., donating $1.00 per response to a local charity or cause).

Set up your reporting before you launch the survey

You want the best survey possible. Obtaining quality results means doing some work upfront.

  • Enter variable names and values into your datafile.
  • Generate test responses (pilot test the survey).
  • Check and see that reporting is capturing data as you intend.
  • Create an analysis strategy (e.g. list of analyses you will do).
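The first three steps above can be sketched in code. Here is a minimal, hypothetical example in Python: the codebook, the variable names, and the CSV of pilot responses are all illustrative assumptions, not part of any OIRE tooling. The idea is simply to define variable names and allowed values up front, then check that test responses land in your datafile as intended:

```python
import csv
import io

# Hypothetical codebook: variable names mapped to their allowed values.
CODEBOOK = {
    "class_year": {"2025", "2026", "2027", "2028"},
    "satisfaction": {"1", "2", "3", "4", "5"},  # 1 = very dissatisfied ... 5 = very satisfied
}

def validate_responses(csv_text, codebook):
    """Check that every pilot response uses only variables and values
    defined in the codebook; return a list of problems found."""
    problems = []
    reader = csv.DictReader(io.StringIO(csv_text))
    # Flag any column that is not a variable in the codebook.
    for field in reader.fieldnames:
        if field not in codebook:
            problems.append(f"unknown variable: {field}")
    # Flag any cell whose value is outside the allowed set.
    for row_num, row in enumerate(reader, start=2):
        for var, value in row.items():
            allowed = codebook.get(var)
            if allowed is not None and value and value not in allowed:
                problems.append(f"row {row_num}: {var}={value!r} not in codebook")
    return problems

# A couple of generated test responses, one deliberately out of range.
pilot = "class_year,satisfaction\n2026,4\n2027,7\n"
for problem in validate_responses(pilot, CODEBOOK):
    print(problem)
```

Running a check like this against your pilot-test responses, before launch, catches mislabeled variables and out-of-range values while they are still cheap to fix.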

For more helpful information, please consult: