Factors affecting response rates of the web survey: A systematic review

https://doi.org/10.1016/j.chb.2009.10.015

Abstract

The lower response rate in web surveys has been a major concern for survey researchers. The literature has sought to identify a wide variety of factors that affect response rates in web surveys. In this article, we develop a conceptual model of the web survey process and use it to systematically review a wide variety of factors influencing the response rate at the stages of survey development, survey delivery, survey completion, and survey return. Practical suggestions and future research directions on how to increase the response rate are discussed.

Introduction

Survey researchers have been using various modes and methods, such as mail, telephone, and e-mail, to collect data. In the past decade, web surveys, a new mode of conducting surveys via websites, have gained significant popularity (Couper, 2000, Couper et al., 2001). Compared with traditional survey modes, web surveys have several advantages, including shorter transmission time, lower delivery cost, more design options, and less data entry time. However, web surveys often face specific challenges, such as losing participants who do not have Internet access and having low response rates that could lead to biased results (Couper, 2000, Fricker and Schonlau, 2002, Groves, 1989).

Among the various challenges of web surveys, low response rates have become a major concern because they threaten the quality of the web survey (Couper, 2000, Crawford et al., 2001, Dommeyer and Moriarty, 2000). The response rate is generally defined as the number of completed units divided by the number of eligible units in the sample, according to the American Association for Public Opinion Research. It is the most widely used and commonly computed statistic for indicating the quality of surveys. Based on a recent meta-analysis (Manfreda, Bosnjak, Berzelak, Haas, & Vehovar, 2008) of 45 studies examining differences in response rates between web surveys and other survey modes, the response rate of web surveys is estimated to be, on average, approximately 11% lower than that of other survey modes.
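Restating the definition cited above as a formula:

$$\text{Response rate} = \frac{\text{Number of completed units}}{\text{Number of eligible units in the sample}}$$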

To locate the research literature on factors contributing to response rates of web surveys, we used multiple strategies, including computer-based searches of major databases, manual searches of the references included in the identified literature, and direct consultation with web survey experts about their published and unpublished work. Specifically, several major electronic databases were searched, including Academic Search Premier, ERIC, PsycINFO, and MEDLINE. The initial database search was based on combinations of three groups of key words: (a) web, www, Internet, web based, online; (b) questionnaire, survey, data collection; and (c) response rate, return rate. We also examined www.websm.org, the largest website focusing on web survey methodology. The combination of these search strategies yielded over 300 studies that examined factors contributing to response rates in web surveys.
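For illustration only, the three keyword groups above can be combined into a single Boolean query by joining terms with OR within each group and AND across groups. The short Python sketch below shows one way to assemble such a query string; the exact operator and phrase syntax accepted by each database is an assumption and varies by provider, so this is a template rather than the query actually submitted.

```python
# Illustrative sketch: build a Boolean query from the three keyword groups
# described in the text (OR within a group, AND across groups).
# The operator/phrase syntax is assumed generic; real databases differ.

keyword_groups = [
    ["web", "www", "Internet", "web based", "online"],
    ["questionnaire", "survey", "data collection"],
    ["response rate", "return rate"],
]

def build_query(groups):
    """Join terms with OR inside each group, then AND the groups together."""
    def quote(term):
        # Quote multi-word phrases so they are searched as phrases.
        return f'"{term}"' if " " in term else term
    clauses = ["(" + " OR ".join(quote(t) for t in group) + ")" for group in groups]
    return " AND ".join(clauses)

print(build_query(keyword_groups))
# (web OR www OR Internet OR "web based" OR online) AND (questionnaire OR survey
# OR "data collection") AND ("response rate" OR "return rate")
```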

While the web survey literature is extensive (e.g., Couper, 2000, Couper et al., 2004, Dillman, 2000, Dillman, 2007, Manfreda et al., 2008), there exist no theoretical models of the psychological process of web surveys. To best review this extensive research literature, we have decided to develop and use a model of the web survey process as a conceptual framework. This is because, although there exist at least seven published reviews examining a wide variety of factors influencing response rates in either mail surveys (Edwards et al., 2002, Fox et al., 1988, Heberlein and Baumgartner, 1978, Singer, 1978, Yu and Cooper, 1983) or web surveys (Cook et al., 2000, Sheehan, 2001), the theoretical or practical relationship among the various factors in these reviews is not immediately clear (e.g., sequentially reviewing the contributing factors of sampling methods, target population characteristics, methods of contact, questionnaire length, monetary incentives, non-monetary incentives, response facilitators, and appeals). A review without a clear framework to reveal the relationships among the contributing factors might result in at least two problems. First, readers might not develop a systematic knowledge of where the various contributing factors are located in the entire survey process. Second, readers might not gain a systematic knowledge of how to increase response rates during the actual process of conducting web surveys.

In a survey, researchers use various data collection tools, such as paper, telephone, e-mail, the WWW, or mobile phones, to collect data from certain groups of people. Thus, the process of a survey generally involves three key elements: survey researchers (surveyors), survey participants (surveyees), and survey tools (also called survey modes, e.g., mail, telephone, and WWW). With these three key elements, the process of a web survey can be conceptualized as in Fig. 1.

As shown in Fig. 1, the process of a web survey includes four basic steps. The first step is web survey development. It concerns the process in which surveyors design and develop a web survey and upload it to the survey website, similar to the process of developing a mail survey and printing out the hard copies needed for use. The second step is web survey delivery. It concerns the process in which surveyors develop a sampling method, contact potential participants, and deliver the web survey to each surveyee, like the process of mailing and distributing the mail survey to each potential respondent. The third step is web survey completion. It concerns the process in which surveyees receive the survey announcement, log into the survey website, complete and submit the survey, and log out from the website, like the process of finishing a mail survey. The fourth step is web survey return. It concerns the process in which surveyors download the collected web survey data from the website to research computers in certain formats for data analysis, relatively similar to the process of collecting the completed mail surveys.

In the text that follows, we use the model presented above as the conceptual framework to review a wide variety of factors that influence the response rate of a web survey across the four basic steps of the web survey process (i.e., development, delivery, completion, and return). We conclude the paper with a list of suggestions for increasing the response rate of web surveys and a summary of future research directions.

Section snippets

Factors affecting response rates in survey development

There is an extensive literature on the design and development of web surveys, primarily focusing on two major factors influencing the response rate: the content of web questionnaires and the presentation of web questionnaires.

Factors affecting response rates in survey delivery

After the web survey is uploaded to and operating on the survey website successfully, the next phase of the web survey is to deliver the survey to potential respondents. Five major issues have been discussed in the existing literature: sampling methods (who should be surveyed), contact delivery modes (how potential respondents should be informed of the survey), invitation designs (how respondents should be invited), the use of pre-notification and reminders (how various notifications and reminders should be

Participation in web surveys

Various factors affect respondents’ decision of whether to participate in a survey. Adapting Groves’ categorization (Groves et al., 1992), we group the factors influencing participation decisions in web surveys into three categories: society-related factors, respondent-related factors, and design-related factors. The first two categories of factors are discussed here, while the design-related factors are covered in the previous sections.

The social-level

Factors affecting response rates in survey return

The web survey has its unique advantages and disadvantages in collecting all the completed surveys. It is the last important step in ensuring a good response rate. On the one hand, survey return and data entry are fully automated through the web survey software. The completed surveys will not be lost in the process of mail delivery or manual data entry, as may happen with mail surveys. Data downloaded directly from the survey website are normally ready for immediate data analysis in SPSS, SAS, or other

Conclusion

The present review suggests that a wide variety of factors affecting the response rate of the web survey are involved in all four stages of the entire web survey process rather than only in survey delivery or survey . Thus, to increase the average response rate of web surveys, which is approximately 10% lower than that of mail or telephone surveys, web survey researchers should pay attention not only to general factors involved in any survey work but, more importantly, to specific factors uniquely

References (110)

  • Blau, P. M. (1964). Exchange and power in social life.
  • Bosnjak, M., & Tuten, T. L. (2001). Classifying response behaviors in web-based surveys [Electronic Version]. Journal...
  • Bosnjak, M., et al. (2008). Prenotification in Web-based access panel surveys – The influence of mobile text messaging versus e-mail on response rates and sample composition. Social Science Computer Review.
  • Bosnjak, M., et al. (2003). Prepaid and promised incentives in Web surveys. Social Science Computer Review.
  • Bosnjak, M., et al. (2005). Unit (non)response in Web-based access panel surveys: An extended planned-behavior approach. Psychology and Marketing.
  • Cho, H., et al. (1999). Privacy issues in Internet surveys. Social Science Computer Review.
  • Cobanoglu, C., et al. (2003). The effect of incentives in web surveys: Application and ethical considerations. International Journal of Market Research.
  • Cook, C., et al. (2000). A meta-analysis of response rates in Web- or Internet-based surveys. Educational and Psychological Measurement.
  • Couper, M. P. (2000). Web surveys – A review of issues and approaches. Public Opinion Quarterly.
  • Couper, M. P., et al. (2007). Visual context effects in web surveys. Public Opinion Quarterly.
  • Couper, M. P., et al. Computer-assisted interviewing.
  • Couper, M. P., et al. (2004). What they see is what we get – Response options for web surveys. Social Science Computer Review.
  • Couper, M. P., et al. (2001). Web survey design and administration. Public Opinion Quarterly.
  • Crawford, S. D. (2006a). Web survey implementation. University of North...
  • Crawford, S. D. (2006b). Web survey implementation. Paper presented at the Ethnicity, Culture, Race and Aging (ECRA)...
  • Crawford, S. D., et al. (2001). Web surveys: Perceptions of burden. Social Science Computer Review.
  • Crawford, S. D., et al. (2005). Applying Web-based survey design standards. Journal of Prevention and Intervention in the Community.
  • de Heer, W. (1999). International response rates trends: Results of an international survey. Journal of Official Statistics.
  • de Leeuw, E. D. (2005). To mix or not to mix data collection modes in surveys. Journal of Official Statistics.
  • de Leeuw, E. D., et al. Mixed mode surveys: When and why.
  • Dillman, D. A. (1978). Mail and telephone surveys: The total design method for surveys.
  • Dillman, D. A. (2000). Mail and Internet surveys: The tailored design method.
  • Dillman, D. A. (2007). Mail and Internet surveys: The tailored design method. 2007 update with new Internet, visual, and mixed-mode guide.
  • Dillman, D. A., et al. (1995). Influence of an invitation to answer by telephone on response to census questionnaire. Public Opinion Quarterly.
  • Dillman, D. A., Tortora, R., Conradt, J., & Bowker, D. (1998). Influence of plain vs. fancy design on response rates...
  • Dillman, D. A., Tortora, R. D., & Bowker, D. (1998). Principles for constructing web surveys. Paper presented at the...
  • Dillman, D. A., et al. (2005). Achieving usability in establishment surveys through the application of visual design principles. Journal of Official Statistics.
  • Dillman, D. A., et al. (2007). Design effects in the transition to Web-based surveys. American Journal of Preventive Medicine.
  • Dillman, D. A., et al. Administrative issues in mixed mode surveys.
  • Diment, K., et al. (2007). How demographic characteristics affect mode preference in a postal/web mixed-mode survey of Australian researchers. Social Science Computer Review.
  • Dommeyer, C. J., et al. (2000). Comparing two forms of an e-mail survey: Embedded vs. attached. International Journal of Market Research.
  • Edwards, P., et al. (2002). Increasing response rates to postal questionnaires: Systematic review. British Medical Journal.
  • Fox, G., et al. (1988). Mail survey response rate: A meta-analysis of selected techniques for inducing response. Public Opinion Quarterly.
  • Fox, G., et al. (2006). Work–family balance and academic advancement in medical schools. Academic Psychiatry.
  • Fraze, S., et al. (2003). The effects of delivery mode upon survey response rate and perceived attitudes of Texas Agri-Science teachers. Journal of Agricultural Education.
  • Fricker, S., et al. (2005). An experimental comparison of web and telephone surveys. Public Opinion Quarterly.
  • Fricker, R. D., et al. (2002). Advantages and disadvantages of Internet research surveys: Evidence from the literature. Field Methods.
  • Galesic, M., & Bosnjak, M. (2006). Personality traits and participation in an online access panel. Paper presented at...
  • Galesic, M., et al. (2007). What is sexual harassment? It depends on who asks! Framing effects on survey responses. Applied Cognitive Psychology.
  • Goritz, A. S. (2006). Incentives in web studies: Methodological issues and a review. International Journal of Internet Science.