Factors affecting response rates of the web survey: A systematic review
Introduction
Survey researchers have been using various modes and methods, such as mail, telephone, and e-mail, to collect data. In the past decade, web surveys, a new mode of conducting surveys via websites, have gained significant popularity (Couper, 2000, Couper et al., 2001). Compared with traditional survey modes, web surveys have several advantages, including shorter transmission time, lower delivery cost, more design options, and less data entry time. However, web surveys often face specific challenges, such as losing participants who do not have Internet access and having low response rates that could lead to biased results (Couper, 2000, Fricker and Schonlau, 2002, Groves, 1989).
Among various web survey challenges, low response rates have become a major concern threatening the quality of the web survey (Couper, 2000, Crawford et al., 2001, Dommeyer and Moriarty, 2000). According to the American Association for Public Opinion Research, the response rate is generally defined as the number of completed units divided by the number of eligible units in the sample. It is the most widely used and commonly computed statistic indicating the quality of surveys. Based on a recent meta-analysis (Manfreda, Bosnjak, Berzelak, Haas, & Vehovar, 2008) of 45 studies examining differences in response rates between web surveys and other survey modes, the response rate of web surveys is on average approximately 11% lower than that of other survey modes.
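The response-rate definition above is a simple ratio, which can be illustrated with a small calculation. The counts below are hypothetical, invented purely to reproduce an 11-percentage-point gap of the kind the meta-analysis reports:

```python
def response_rate(completed: int, eligible: int) -> float:
    """Response rate as defined in the text: the number of completed
    units divided by the number of eligible units in the sample."""
    if eligible <= 0:
        raise ValueError("eligible must be a positive count")
    return completed / eligible

# Hypothetical counts, for illustration only.
web_rr = response_rate(completed=220, eligible=1000)    # 0.22
other_rr = response_rate(completed=330, eligible=1000)  # 0.33
print(f"web: {web_rr:.0%}, other mode: {other_rr:.0%}, "
      f"difference: {other_rr - web_rr:.0%}")
# prints: web: 22%, other mode: 33%, difference: 11%
```

Note that this is the simplest form of the definition; AAPOR distinguishes several response-rate variants depending on how partial completions and cases of unknown eligibility are counted.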
To locate the research literature on contributing factors to response rates of web surveys, we used multiple strategies, including computer-based searches of major databases, manual searches of references included in the identified literature, and direct consultation with web survey experts for their published and unpublished works. Specifically, several major electronic databases were searched, including Academic Search Premier, ERIC, PsycINFO, and MEDLINE. The initial database search was based on combinations of three groups of key words: (a) web, www, Internet, web based, online; (b) questionnaire, survey, data collection; and (c) response rate, return rate. We also examined www.websm.org, the largest website focusing on web survey methodology. The combination of these search strategies yielded over 300 studies that examined contributing factors of response rates in web surveys.
While the web survey literature is extensive (e.g., Couper, 2000, Couper et al., 2004, Dillman, 2000, Dillman, 2007, Manfreda et al., 2008), there exist no theoretical models of the psychological process of web surveys. To best review this extensive literature, we decided to develop and use a model of the web survey process as a conceptual framework. Although there exist at least seven published reviews examining a wide variety of factors influencing response rates in either mail surveys (Edwards et al., 2002, Fox et al., 1988, Heberlein and Baumgartner, 1978, Singer, 1978, Yu and Cooper, 1983) or web surveys (Cook et al., 2000, Sheehan, 2001), the theoretical or practical relationship among the various factors in these reviews is not immediately clear (e.g., sequentially reviewing the contributing factors of sampling methods, target population characteristics, methods of contact, questionnaire length, monetary incentives, non-monetary incentives, response facilitators, and appeals). A review without a clear framework to reveal the relationships among contributing factors might create at least two problems. First, readers might not develop a systematic knowledge of where various contributing factors are located in the entire survey process. Second, readers might not develop a systematic knowledge of how to increase response rates during the actual process of conducting web surveys.
In a survey, researchers use various data collection tools, such as paper, telephone, e-mail, WWW, or mobile phone, to collect data from certain groups of people. Thus, the process of a survey generally involves three key elements: survey researchers (surveyors), survey participants (surveyees), and survey tools (also called survey modes, e.g., mail, telephone, and WWW). With these three key elements, the process of a web survey can be conceptualized in Fig. 1.
As shown in Fig. 1, the process of a web survey includes four basic steps. The first step is web survey development. It concerns the process in which surveyors design and develop a web survey and upload it to the survey website, similar to the process of developing a mail survey and printing out the hard copies needed for use. The second step is web survey delivery. It concerns the process in which surveyors develop a sampling method, contact potential participants, and deliver the web survey to each surveyee, like the process of mailing and distributing the mail survey to each potential respondent. The third step is web survey completion. It concerns the process in which web surveyees receive the survey announcement, log into the survey website, complete and submit the survey, and log out from the website, like the process of finishing a mail survey. The fourth step is web survey return. It concerns the process in which surveyors download the collected web survey data from the website to research computers in certain formats for data analysis, relatively similar to the process of handing in completed mail surveys.
In the text that follows, we use the model presented above as the conceptual framework to review a wide variety of factors that influence the response rate of a web survey in the four basic steps of the web survey process (i.e., development, delivery, completion, and return). We conclude the paper with a list of suggestions for increasing the response rate of web surveys and a summary of future research directions.
Factors affecting response rates in survey development
There is an extensive literature on the design and development of web surveys, primarily focusing on two major factors influencing the response rate: the content of web questionnaires and the presentation of web questionnaires.
Factors affecting response rates in survey delivery
After the web survey is uploaded to and operated on the survey website successfully, the next phase of the web survey is to deliver the survey to potential respondents. Five major issues have been discussed in the existing literature: sampling methods (who should be surveyed), contact delivery modes (how potential respondents should be informed of the survey), invitation designs (how respondents should be invited), the use of pre-notification and reminders (how various notification and reminders should be
Participation in web surveys
Various factors affect respondents’ decision of whether to participate in a survey. Adapting Groves’ categorization (Groves et al., 1992), we group the contributing factors influencing participation decisions in web surveys into three categories: society-related factors, respondent-related factors, and design-related factors. The first two categories of factors will be discussed here, while the design-related factors were covered in the previous sections.
The social-level
Factors affecting response rates in survey return
The web survey has unique advantages and disadvantages in collecting all the completed surveys. Survey return is the last important step in ensuring a good response rate. On the one hand, survey return and data entry are fully automated through the web survey software. Completed surveys will not be lost in the process of mail delivery or manual data entry, as can happen in mail surveys. Data downloaded directly from the survey website are normally ready for immediate data analysis in SPSS, SAS, or other
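The automated return step described above can be sketched with a short script. This is a hypothetical illustration only: the file contents, respondent IDs, and column names below are invented, standing in for a CSV export downloaded from a survey website.

```python
import csv
import io

# Stand-in for a CSV file exported from the survey website;
# the columns and values are invented for illustration.
exported = io.StringIO(
    "respondent_id,q1,q2\n"
    "1,4,agree\n"
    "2,5,strongly agree\n"
)

# No manual data entry: the download parses directly into records.
responses = list(csv.DictReader(exported))
print(len(responses), "completed responses ready for analysis")
# prints: 2 completed responses ready for analysis
```

In practice one would pass a real file path instead of the in-memory buffer; the point is that, unlike mail surveys, no transcription step sits between return and analysis.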
Conclusion
The present review suggests that a wide variety of factors affecting the response rate of the web survey are involved in all four stages of the entire web survey process, rather than only in one or two steps such as survey delivery. Thus, to increase the average response rate of web surveys, which is approximately 11% lower than that of other survey modes, web survey researchers should pay attention not only to general factors involved in any survey work but more importantly to specific factors uniquely
References (110)
- et al.
Noncoverage and nonresponse in an Internet survey
Social Science Research
(2007)
- et al.
Personalization, authentication and self-disclosure in self-administered Internet surveys
Computers in Human Behavior
(2007)
- et al.
Comparison of web and mail surveys for studying secondary consequences associated with substance use: Evidence for minimal mode effects
Addictive Behaviors
(2006)
A theory of planned behavior
- et al.
Prediction of leisure participation from behavioral, normative, and control beliefs: An application of the theory of planned behavior
Leisure Sciences
(1991)
- Asiu, B. W., Antons, C. M., & Fultz, M. L. (1998). Undergraduate perceptions of survey participation: Improving...
- et al.
Nonresponse in federal household surveys: New measures and new insights
Journal of Official Statistics
(2001)
Response rate in academic studies: A comparative analysis
Human Relations
(1999)
Teleworking: Benefits and pitfalls as perceived by professionals and managers
New Technology, Work and Employment
(2000)
- et al.
The effects of cash, electronic, and paper gift certificates as respondent incentives for a Web-based survey of technologically sophisticated respondents
Social Science Computer Review
(2004)
Exchange and power in social life
Prenotification in Web-based access panel surveys – The influence of mobile text messaging versus e-mail on response rates and sample composition
Social Science Computer Review
Prepaid and promised incentives in Web surveys
Social Science Computer Review
Unit (non)response in Web-based access panel surveys: An extended planned-behavior approach
Psychology and Marketing
Privacy issues in Internet surveys
Social Science Computer Review
The effect of incentives in web surveys: Application and ethical considerations
International Journal of Market Research
A meta-analysis of response rates in Web- or Internet-based surveys
Educational and Psychological Measurement
Web surveys – A review of issues and approaches
Public Opinion Quarterly
Visual context effects in web surveys
Public Opinion Quarterly
Computer-assisted interviewing
What they see is what we get – Response options for web surveys
Social Science Computer Review
Web survey design and administration
Public Opinion Quarterly
Web surveys: Perceptions of burden
Social Science Computer Review
Applying Web-based survey design standards
Journal of Prevention and Intervention in the Community
International response rates trends: Results of an international survey
Journal of Official Statistics
To mix or not to mix data collection modes in surveys
Journal of Official Statistics
Mixed mode surveys: When and why
Mail and telephone surveys: The total design method for surveys
Mail and Internet surveys: The tailored design method
Mail and Internet Surveys: The tailored design method. 2007 update with new Internet, visual, and mixed-mode guide
Influence of an invitation to answer by telephone on response to census questionnaire
Public Opinion Quarterly
Achieving usability in establishment surveys through the application of visual design principles
Journal of Official Statistics
Design effects in the transition to Web-based surveys
American Journal of Preventive Medicine
Administrative issues in mixed mode surveys
How demographic characteristics affect mode preference in a postal/web mixed-mode survey of Australian researchers
Social Science Computer Review
Comparing two forms of an e-mail survey: Embedded vs. attached
International Journal of Market Research
Increasing response rates to postal questionnaires: Systematic review
British Medical Journal
Mail survey response rate: A meta-analysis of selected techniques for inducing response
Public Opinion Quarterly
Work–family balance and academic advancement in medical schools
Academic Psychiatry
The effects of delivery mode upon survey response rate and perceived attitudes of Texas Agri-Science teachers
Journal of Agricultural Education
An experimental comparison of web and telephone surveys
Public Opinion Quarterly
Advantages and disadvantages of Internet research surveys: Evidence from the literature
Field Methods
What is sexual harassment? It depends on who asks! Framing effects on survey responses
Applied Cognitive Psychology
Incentives in web studies: Methodological issues and a review
International Journal of Internet Science