Skipping the preliminary steps (deciding the objective, describing the metadata, and designing the survey), which concern the survey's subject matter and fall beyond the present study's scope, the focus shifts to mode choice. The panel's objective was probability-based web data collection, with the inhabitants of the Lombardy region as the target population. Internet penetration in the population was not high, and an exhaustive list of e-mail addresses did not exist, so a proxy for the population list was needed. The available proxy list contained postal addresses and phone numbers. A probability-based survey was therefore not possible without a preliminary step for selecting a probability-based sample; the solution was a mixed-mode approach covering the part of the population not on the Internet. A contact mode then had to be chosen, given that postal addresses and telephone numbers were available. The sampled units were contacted through a mixed contact mode, partly by telephone and partly by mail. The survey mode was mixed as well: only part of the sampled units had an e-mail address and accepted a web survey, while the others were interviewed by telephone or mail.
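The allocation logic described above can be sketched as a simple rule per sampled unit: contact by phone when a phone number is on the proxy frame, otherwise by mail; interview by web when the unit supplies an e-mail address and accepts, otherwise by the contact mode itself. This is an illustrative sketch only; the field names are invented and do not come from the study.

```python
# Hypothetical sketch of the mixed-contact / mixed-mode allocation:
# contact mode depends on the proxy frame (phone number vs. postal
# address only); survey mode is web when the unit provides an e-mail
# address and accepts, otherwise it falls back to the contact mode.
# Field names ("phone", "email", "accepts_web") are illustrative.

def assign_modes(unit):
    """Return (contact_mode, survey_mode) for one sampled unit."""
    contact = "phone" if unit.get("phone") else "mail"
    if unit.get("email") and unit.get("accepts_web"):
        survey = "web"
    else:
        survey = contact
    return contact, survey

# Invented example units.
sample = [
    {"phone": "02-555", "email": "a@b.it", "accepts_web": True},
    {"phone": "02-556", "email": None, "accepts_web": False},
    {"phone": None, "address": "Via Roma 1", "email": None, "accepts_web": False},
]

for unit in sample:
    print(assign_modes(unit))
# → ('phone', 'web'), ('phone', 'phone'), ('mail', 'mail')
```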
Table 3.2 Responses to the survey by mode and percentage composition

Mode  | Response rate (%) | Composition (%)
Web   | 68.5              | 45.5
Phone | 71.2              | 30.2
Mail  | 53.2              | 24.3
Total | 63.5              | 100.0
The data collected by mode are shown in Table 3.2. The response rate was satisfactory, and the web mode turned out to be the most important component of the mixed-mode approach.
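The two percentage columns of Table 3.2 can be recomputed from absolute counts. The published table reports only percentages, so the counts below are invented such that the ratios reproduce the table's per-mode figures; they are not the study's actual counts.

```python
# Illustrative recomputation of the two percentage columns of
# Table 3.2. All absolute counts below are invented for the example.

respondents = {"Web": 455, "Phone": 302, "Mail": 243}  # invented counts

# Composition (%): each mode's share of all respondents; sums to 100.
total = sum(respondents.values())
composition = {m: round(100 * n / total, 1) for m, n in respondents.items()}
print(composition)  # {'Web': 45.5, 'Phone': 30.2, 'Mail': 24.3}

# Per-mode response rate (%) additionally needs the number of units
# invited in each mode (again invented here).
invited = {"Web": 664, "Phone": 424, "Mail": 457}
rates = {m: round(100 * respondents[m] / invited[m], 1) for m in respondents}
print(rates)  # {'Web': 68.5, 'Phone': 71.2, 'Mail': 53.2}
```

The composition column necessarily sums to 100.0, whereas the response rate column does not, since each entry has its own denominator.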
To address the web component of the mixed-mode approach, the steps in Figure 3.1 were followed.
To show how adaptive design can be used at the data collection monitoring step to increase the response rate in a web survey, and how different error types are interrelated, Bianchi and Biffignandi (2014) applied experimental responsive design strategies, in retrospect, to the recruitment of the mixed-mode panel. Targeting the units contributing the most to nonresponse bias during data collection proved especially useful. Such units were identified through indicators serving as proxy measures for nonresponse bias. The authors adopted three strategies, and their results show that the method is promising in reducing nonresponse bias. When evaluated in the TSE framework, i.e., considering how different errors relate to the adopted responsive strategies, the results are not uniform across variables, but in general there is a reduction of total error.
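The core idea — monitoring a proxy indicator during fieldwork and directing follow-up effort at the units lagging behind — can be sketched in a few lines. This is a minimal illustration, not Bianchi and Biffignandi's actual procedure: it uses the interim response rate within subgroups of a single frame variable as the proxy indicator, and all data are invented.

```python
# Minimal sketch of responsive targeting during data collection:
# compute the interim response rate within subgroups defined by a
# frame variable (a proxy indicator for nonresponse bias) and flag
# the k subgroups lagging furthest behind for extra follow-up effort.
# Not the authors' actual method; data and variable are invented.

def target_subgroups(frame, responded, by, k=2):
    """Return the k subgroups with the lowest interim response rate.

    frame     : list of dicts, one per sampled unit (frame data)
    responded : set of unit ids that have responded so far
    by        : name of the frame variable defining subgroups
    """
    counts, hits = {}, {}
    for unit in frame:
        g = unit[by]
        counts[g] = counts.get(g, 0) + 1
        hits[g] = hits.get(g, 0) + (unit["id"] in responded)
    rates = {g: hits[g] / counts[g] for g in counts}
    return sorted(rates, key=rates.get)[:k]

# Invented example: age class as the frame variable.
frame = [
    {"id": 1, "age_class": "18-34"}, {"id": 2, "age_class": "18-34"},
    {"id": 3, "age_class": "35-54"}, {"id": 4, "age_class": "35-54"},
    {"id": 5, "age_class": "55+"},   {"id": 6, "age_class": "55+"},
]
responded = {1, 2, 3}
print(target_subgroups(frame, responded, by="age_class"))  # ['55+', '35-54']
```

In practice the proxy indicator would combine several frame and paradata variables (e.g., via response propensities), but the monitoring loop has the same shape: recompute the indicator as returns arrive, then reallocate contact effort.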
A flowchart for the web/mobile web survey process of a probability-based survey is proposed and discussed. The flowchart is useful for both practitioners and researchers. Practitioners have a guide to follow for undertaking the survey efficiently, without forgetting or overlooking important decisional steps; disregarding the steps in the flowchart could compromise the survey's quality and increase the amount of error. Because a detailed description of the most important flowchart steps is given in the chapters of this book, surveyors have the opportunity to gain deeper insight into the different issues and techniques.
When analyzing their empirical results and evaluating errors and the risk of errors, researchers can identify the steps or sub‐steps to which the results are related and determine how the decisions on one step (or sub‐step) could affect the results at other steps, thus improving the quality of the survey process.
Metadata: "Data that provides information about other data"; in short, data about data.
Mixed-mode survey: A survey in which various modes of data collection are combined. Modes can be used concurrently (different groups are approached by different modes) or sequentially (nonrespondents of one mode are re-approached by a different mode).
Paradata: Paradata of a web survey are data about the process by which the survey data were collected.
Probability-based panel: A panel whose members are recruited by means of probability sampling.
Exercise 3.1 Selecting the survey mode for a probability-based survey requires knowing:
(a) The sampling frame list of the web population.
(b) The sampling frame list of the whole population (both web and non-web).
(c) Only the sampling frame list of the non-Internet population.
(d) Only the phone numbers of the Internet population.

Exercise 3.2 When does a bias error occur?
(a) When designing the survey.
(b) When choosing the sampling technique.
(c) When drawing the sample.
(d) When estimating the model.

Exercise 3.3 A probability-based web survey can take place:
(a) When the target frame list is available.
(b) When an invitation to participate in a survey appears upon accessing a website.
(c) When an e-mail list of a few people of the target population is available.
(d) When the list of e-mail addresses is available.

Exercise 3.4 The flowchart describes:
(a) The sampling methods.
(b) The steps of actions and decisions in a web survey.
(c) When an e-mail list of a few people of the target population is available.
(d) When the list of e-mail addresses is available.

Exercise 3.5 The total survey error approach considers:
(a) Only sampling errors.
(b) The overall quality of the survey steps.
(c) Only the measurement error.
(d) A non-probability-based survey.

Exercise 3.6 The selection of an inadequate mode affects:
(a) Only the coverage of the sampling frame.
(b) Only the response rate.
(c) The overall quality of the survey.
(d) The response rate and the sampling frame.
1 Bethlehem, J. & Biffignandi, S. (2012), Handbook of Web Surveys, 1st edition. Wiley, Hoboken, NJ.
2 Bianchi, A. & Biffignandi, S. (2014), Responsive Design for Economic Data in Mixed‐Mode Panels. In: Mecatti, F., Conti, P. L., & Ranalli, M. G. (eds.), Contributions to Sampling Statistics. Springer, Berlin, pp. 85–102.
3 Biemer, P., de Leeuw, E., Eckman, S., Edwards, B., Kreuter, F., Lyberg, L. E., Tucker, N., & West, B. T. (eds.). (2017), Total Survey Error in Practice. Wiley, Hoboken, NJ.
4 Callegaro, M. (2013), Paradata in Web Surveys. In: Kreuter, F. (ed.), Improving Surveys with Paradata: Analytic Uses of Process Information. Wiley, Hoboken, NJ, pp. 261–280.
5 Couper, M. & Mavletova, A. (2014), Mobile Web Surveys: Scrolling versus Paging; SMS versus E-mail Invitations. Journal of Survey Statistics and Methodology, 2, pp. 498–518.
6 Dillman, D., Smyth, J., & Christian, L. M. (2014), Internet, Mail and Mixed Mode Surveys. The Tailored Design Method. Wiley, Hoboken, NJ.
7 Groves, R. M. (1989), Survey Errors and Survey Costs. Wiley, New York.
8 Heerwegh, D. (2011), Internet Survey Paradata. In: Das, M., Ester, P., & Kaczmirek, L. (eds.), Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies. Taylor and Francis, Oxford.
9 Jäckle, A., Lynn, P., & Burton, J. (2015), Going Online with a Face‐to‐Face Household Panel: Effects of a Mixed Mode Design on Item and Unit Non‐Response. Survey Research Methods, 9, 1, pp. 57–70.
10 Kreuter, F. (ed.) (2013), Improving Surveys with Paradata: Analytic Uses of Process Information. Wiley, Hoboken, NJ.
11 Olson, K. & Parkhurst, B. (2013), Collecting Paradata for Measurement Error Evaluation. In: Kreuter, F. (ed.), Improving Surveys with Paradata: Analytic Uses of Process Information. Wiley, Hoboken, NJ, pp. 73–95.