Mode effect
Mode effect is a broad term referring to a phenomenon where a particular survey administration mode causes different data to be collected. For example, when asking a question using two different modes, responses to one mode may be significantly and substantially different from responses given in the other mode. Mode effects are a methodological artifact, limiting the ability to compare results from different modes of collection.
Theory
Particular survey modes put respondents into different frames of mind, referred to as a mental "script", which can affect the answers they give. For example:
- Face-to-face surveys prompt a "guest" script. Respondents are more likely to treat face-to-face interviewers graciously and hospitably, leading them to be more agreeable and affecting their answers. Differences between the interviewers administering the survey can also lead to a range of "interviewer effects" on survey results.
- Phone interviews prompt a "solicitor" or "telemarketer" script. Respondents may place less priority on telephone interviews, making them more likely to satisfice in order to finish the interview sooner. Wariness of who may be on the other end of the phone can also lead respondents to provide more socially acceptable answers than would be given in other survey modes.
Users of surveys must consider the potential for mode effects when comparing results from studies in different modes. However, this is difficult, as mode effects can be complex and subject to interactions between respondent demographics, subject matter and mode. Unless the mode effects are formally investigated for the survey instrument, it is difficult to quantify their size, and qualitative judgments by experts familiar with the subject matter and respective modes are required instead.
Social desirability bias
Studies of mode effects are sometimes contradictory, but some general patterns do emerge. For example, social desirability bias tends to be highest for telephone surveys and lowest for web surveys, with modes ranked roughly as follows (most to least affected):
- Telephone surveys
- Face-to-face surveys
- IVR surveys
- Mail surveys
- Web surveys
Differences in questions between modes
Some modes require different question wording from others in order to suit the features of the mode. For example, self-complete forms can use lists of examples or extensive instructions to help respondents answer relatively complex questions. By contrast, in telephone interviews respondents are limited by their working memory and are unlikely to follow a long question with multiple sub-clauses. Similarly, a 'matrix' of questions, commonly found on self-complete forms, cannot easily be read out in a verbal interview; a matrix would generally need to be scripted as a series of individual questions.

Differences in question wording across modes may cause different data to be collected by different modes. However, this is not always the case, and appropriate adaptation of questions to a new mode can yield comparable data. Survey designers should consider the conventions of the mode when adapting questions. For example, while it may be acceptable to require respondents to calculate total figures themselves on a paper form, respondents may perceive the same requirement as burdensome in a web form. This may in turn change their attitude toward the form, altering their behaviour and ultimately changing the data collected.
Identifying and resolving mode effects
Mode effects can be identified by embedding an experiment within the survey, in which a proportion of respondents is randomly allocated to each mode. Differences in results between the modes then identify the mode effect for that particular survey instrument.

Once a mode effect has been quantified, it may be possible to use this information to reprocess existing data and allow comparison between data collected in different modes.
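The embedded-experiment approach can be sketched with a small simulation. All of the specifics here are hypothetical assumptions for illustration: the two modes, the 1–5 attitude scale, and the size of the injected mode effect are invented, not taken from any particular study.

```python
import random
import statistics

def mode_effect_experiment(pop_size=10_000, seed=42):
    """Simulate an embedded mode experiment.

    Respondents are randomly allocated to a web or phone mode; both
    measure the same latent attitude, but the phone mode is assumed
    (hypothetically) to inflate answers by a fixed amount.
    """
    rng = random.Random(seed)
    true_mode_effect = 0.5  # assumed: phone answers run 0.5 points higher
    web, phone = [], []
    for _ in range(pop_size):
        attitude = rng.gauss(3.0, 1.0)  # latent attitude on a 1-5 scale
        if rng.random() < 0.5:          # random allocation to a mode
            web.append(attitude)
        else:
            phone.append(attitude + true_mode_effect)
    # Because allocation was random, the two groups are comparable and
    # the difference in means estimates the mode effect directly.
    return statistics.mean(phone) - statistics.mean(web)

estimate = mode_effect_experiment()
# Reprocessing step: once quantified, the estimate can be subtracted
# from data collected in the affected mode to aid comparison, e.g.
# adjusted_phone = [answer - estimate for answer in phone_data]
```

With a sample this large, the estimated difference recovers the injected effect closely; in a real survey the estimate would carry sampling error and should be reported with a confidence interval.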
Differential coverage between modes
Different administration modes may inherently exclude some parts of the target population. This potentially biases the sample that is taken, and changes the data from what would have been collected using another mode. For example, people without a home phone are excluded from Random Digit Dialling surveys, and people without internet access are unlikely to complete a web survey. This means different samples are taken from the population when using different modes. Unless experiments are specifically designed to investigate differential coverage, mode effects will be confounded by coverage, and significant differences between modes/experimental conditions could have several explanations:
- properties of the mode;
- different 'types' of people responding to the different modes;
- both the mode properties and different 'types' of respondents;
- an interaction, where some respondents are affected by properties of the mode but others aren't.
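The confounding of mode effects with coverage can be illustrated with a toy simulation. The numbers are hypothetical assumptions: a 30% non-covered group with systematically lower scores is invented for illustration, and the phone mode is simplified to reach everyone (in reality it excludes people without phones).

```python
import random
import statistics

def simulate_confounded_comparison(n=20_000, seed=1):
    """Illustrate differential coverage confounding a mode comparison.

    Assumed setup: 30% of the population lacks internet access, and that
    group holds systematically different views (1 point lower on average).
    A web survey only reaches the covered group, so its mean differs from
    the phone survey's even though NO true mode effect is simulated.
    """
    rng = random.Random(seed)
    phone_sample, web_sample = [], []
    for _ in range(n):
        has_internet = rng.random() < 0.7
        attitude = rng.gauss(3.0 if has_internet else 2.0, 1.0)
        phone_sample.append(attitude)   # simplification: phone reaches all
        if has_internet:
            web_sample.append(attitude)  # web misses 30% of the population
    return statistics.mean(web_sample) - statistics.mean(phone_sample)

difference = simulate_confounded_comparison()
# The nonzero difference here is driven entirely by coverage, not by any
# property of the modes themselves.
```

A naive analyst comparing the two samples would "detect" a mode effect of roughly 0.3 points that is in fact pure coverage bias, which is why experiments must be designed to separate the explanations listed above.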