Chapter 15 Questionnaires

This chapter concentrates on surveys of human subjects, but many of the stages described below also apply to studies of animals, plants and inanimate subjects.

15.1 Stages of Questionnaire Development

  • Formulate the survey objectives;

  • Convene focus groups of potential respondents to investigate their experiences and inform the formulation/refinement of the objectives;

  • Ensure the appropriate conceptual framework exists for the data collection. Define key concepts and terms;

  • Decide on classification standards;

  • Identify constraints imposed by the sample design, and identify relevant information available from the survey frame;

  • Decide on the survey mode (e.g. interviewer-administered, phone or self-completion);

  • Develop a list of topic areas, and any relationships or dependencies among them;

  • Develop a list of items to be collected – justify each in relation to the objectives;

  • Develop question wording and, for closed questions, the sets of response options;

  • Develop connecting material, and instructions;

  • Develop the layout of the questionnaire – including instructions, routing and contingency questions. This may mean programming the questionnaire into a software package (a minimal routing sketch appears after this list);

  • Develop/obtain additional materials (including showcards, any measuring devices, envelopes etc.);

  • Design and test the data capture/editing/coding procedures;

  • Test (and revise) the questionnaire using some or all of the following:

    • Desk Check – the designer and colleagues test the questionnaire;
    • Cognitive Test – a small set of respondents (not necessarily from the survey population) is selected to complete the questionnaire and asked to express their reactions and thought processes as they do so;
    • Focus Group – use the questionnaire with members of the survey population, but in an office situation. Discuss with them their reactions to the questionnaire, wording, terms used, flow, instructions, layout etc.;
    • Pilot Test – use the questionnaire in a real setting with people who are in the survey population, and who will give real responses;
    • Dress Rehearsal – use the questionnaire in conjunction with the full sample selection process and full survey operation;
  • Collect the data!
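
As noted above, routing and contingency questions are often programmed directly into interviewing software. The Python sketch below shows one minimal way this might be done; the question IDs, wording and routing rules are invented for illustration and do not correspond to any particular survey package.

```python
# A minimal sketch of skip/routing ("contingency question") logic for a
# computer-assisted questionnaire. Question IDs, wording and routing rules
# are invented for illustration.

QUESTIONS = {
    "Q1": {"text": "Do you own a car? (yes/no)",
           "route": lambda ans: "Q2" if ans == "yes" else "Q3"},
    "Q2": {"text": "What is the make and model of your car?",
           "route": lambda ans: "Q3"},
    "Q3": {"text": "How many times did you go to the cinema last week?",
           "route": lambda ans: None},   # None ends the interview
}

def run_interview():
    """Ask questions in the order dictated by the routing rules."""
    answers = {}
    current = "Q1"
    while current is not None:
        question = QUESTIONS[current]
        answer = input(question["text"] + " ")
        answers[current] = answer
        # Route to the next question based on the (normalised) answer
        current = question["route"](answer.strip().lower())
    return answers

if __name__ == "__main__":
    print(run_interview())
```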

15.2 Format

In general:

  • Start with interesting questions, relevant to the subject at hand, to hook the respondent’s interest and encourage a higher response rate;

    This usually (though not always) means that the demographic questions (‘And now about you …’: age/sex/…) are found at the end;

    [A notable exception: The first questions in the Census are Name, Sex, Address]

  • Don’t put the most important items at the end of the questionnaire;

  • Don’t put the most sensitive or difficult questions first;

  • Group items into logical sections;

  • Keep it short. Don’t ask questions you don’t need to.

For self-completion questionnaires (paper and electronic):

  • The instructions must be especially clear;
  • There should be contact information in case of queries or difficulties;
  • Space items out on the page or screen;
  • Make it clear where the respondent should write, and HOW (e.g. ‘tick the boxes provided to indicate your answer’).

In phone questionnaires:

  • Brevity is vital to ensure the respondent doesn’t hang up;
  • With no visual cues, and no possibility of looking back over the questionnaire, questions must be particularly simple.

In all questionnaires:

  • Provide background information about the purpose of the survey and who is running it;
  • Make it clear what the respondent is agreeing to by providing their information;
  • Make it clear whether or not participation is voluntary (it usually is);
  • Explain that the data of individual respondents will not be released to others, and that no individual respondent will be identifiable when the survey results are published.

15.3 Questions

Questions can usually be divided into two types: open or closed.

Open questions allow the respondents to answer in any way they like. On paper questionnaires a space is left for the respondent to write one or more sentences.

e.g. Is there anything more you’d like to add?

What three things would improve this course?

Open questions are most suitable for qualitative and pilot surveys. They are time-consuming to analyse in large surveys.

Closed questions have a prespecified set of possible responses.

What gender are you?    \(\square\) Male    \(\square\) Female    \(\square\) Another Gender

These come in two important forms:

  • Tick one box only (single response)
  • Tick all that apply (multiple responses)

Electronic questionnaires can enforce these rules – but on paper the respondent can easily break them (e.g. by ticking no boxes at all, or ticking several where only one is allowed).
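
As a sketch of how an electronic questionnaire might enforce these rules, the following Python functions (with invented question and option names) accept a ‘tick one box only’ answer only when exactly one recognised box is ticked, and a ‘tick all that apply’ answer only when every ticked box is a recognised option.

```python
# A minimal sketch of enforcing single-response and multiple-response rules
# in an electronic questionnaire. The option lists are invented for illustration.

GENDER_OPTIONS = {"Male", "Female", "Another Gender"}
RELIGION_OPTIONS = {"No religion", "Christian", "Buddhist", "Hindu",
                    "Muslim", "Jewish", "Other"}

def validate_single_response(ticked, options):
    """'Tick one box only': exactly one recognised option must be ticked."""
    ticked = set(ticked)
    if len(ticked) != 1 or not ticked <= options:
        raise ValueError("Please tick exactly one of the boxes provided.")
    return ticked.pop()

def validate_multiple_response(ticked, options):
    """'Tick all that apply': any number of boxes, all of them recognised."""
    ticked = set(ticked)
    if not ticked <= options:
        raise ValueError(f"Unrecognised responses: {ticked - options}")
    return sorted(ticked)

print(validate_single_response(["Female"], GENDER_OPTIONS))
print(validate_multiple_response(["Christian", "Buddhist"], RELIGION_OPTIONS))
```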

Focus groups, or pilot surveys with many open questions administered to a small sample, can be used to identify the most frequent responses, which are then offered as the options of a closed question in the main survey.

Example. In a focus group one might ask the open question: ‘What is your religion?’ and based on the distribution of responses develop the question:

‘What is your religion?’ (Tick any that apply):

\(\square\)   No religion
\(\square\)   Christian
\(\square\)   Buddhist
\(\square\)   Hindu
\(\square\)   Muslim
\(\square\)   Jewish
\(\square\)   Other: (please specify) ___________________

Optical Character Recognition (OCR) software can take a scanned image of the open ‘Other’ answer and try to convert it automatically to text, though this may still need human intervention.
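
As an illustration, the Python sketch below uses the pytesseract wrapper around the Tesseract OCR engine (assuming both are installed; the file name is invented) to read the text written in a scanned ‘Other’ box. Low-confidence or garbled output would still be routed to a human coder.

```python
# A minimal sketch of OCR on a scanned write-in answer, assuming the
# pytesseract package and the Tesseract engine are installed.
# "other_response_scan.png" is an invented file name standing in for a
# cropped image of the 'Other: please specify' box.

from PIL import Image
import pytesseract

scanned_box = Image.open("other_response_scan.png")
raw_text = pytesseract.image_to_string(scanned_box)
print(raw_text.strip())   # e.g. a written-in religion, coded manually if unclear
```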

Things to consider when designing a closed question:

  • It is normal to put the most frequently chosen categories first;

  • The options provided should in principle be exhaustive – everyone should be able to tick one box;

  • The open option ‘Other’ is usually included to cover cases which are not frequently chosen (even if the surveyors are expecting these);

  • If a question is ‘tick one box only’ then the options should be mutually exclusive – it should really be the case that only one box can apply;

  • The order in which the options are offered affects how people answer;

  • One may choose deliberately to exclude some valid categories, because they apply to too small a part of the population, or will encourage flippant answers.

    Example. The sex question in the census (and in almost all questionnaires) is a good example of this. There are people who do not classify themselves as either Male or Female. Yet no ‘Other’ box is provided. Think about why surveyors do not allow an ‘Other’ response (and do not list other alternatives). (Note that a new question about gender is starting to be used in New Zealand, and it allows the options Male/Female/Another Gender.)

    Example. A lot of people recently started putting ‘Jedi’ as their religion. Why isn’t this listed as an option?

  • Two boxes that may or may not be shown are the Don’t Know and Refused boxes. These are usually present on any questionnaire completed by an interviewer, but are rarely present in a self-completion questionnaire.

    Omitting them from self-completion questionnaires helps to discourage item non-response.

    • Don’t Know – may be used on a paper questionnaire where it is of interest to know if people are undecided or uninformed.

      ‘Which party will you vote for in the next election?’

      The undecideds are an interesting group in this case. But not in the case

      ‘How many bedrooms are there in your house?’

      where we want people to think about this and give a good answer.

    • Refused – may be used when the surveyors feel that the question may be an invasion of privacy for some people. Even having the option of ‘prefer not to say’ may encourage people to respond in one of the other categories.

  • A Not Applicable box should be shown if a respondent is being asked a question that may not apply to him/her.

    It is preferable never to ask such a question (and to avoid it with a skip instruction, as in ‘If Yes please go to Q5, if No please go to Q10’). But in short paper questionnaires this can be unavoidable.

  • Think of the coding and classification system that you’re going to use. But be aware that you can collect data in one way, and then transform and analyse it in another.

    Example. One can ask for a free text response – ‘How many years have you lived at this address?’ – and then convert the number into ranges (<1 year, 1-4 years, 5-9 years, 10+ years).

    Example. One can ask for the make and type of car, but then only analyse it by size (which is derived from the make and type).

    Ask questions that are easy for the respondent to answer, but which still make your analysis possible (a small recoding sketch appears after this list).

  • Likert Scales are closed questions asking for a level of agreement. They are convenient for collecting many opinion items and can be laid out in a grid. A disadvantage of the grid layout is that it encourages some respondents to answer in a single column throughout (e.g. all Excellent or all Awful).

    Consider whether the direction of all statements is the same (e.g. all positive – so that 5 always indicates approval). If some statements run the other way, they need to be reverse-coded before analysis (a reverse-coding sketch appears after this list).

    \(\square\)   Strongly Disagree
    \(\square\)   Disagree
    \(\square\)   Neither agree nor disagree
    \(\square\)   Agree
    \(\square\)   Strongly Agree

    An ODD number of response categories allows respondents to be neutral, whereas an EVEN number forces a response in one direction or other.

    \(\square\)   Strongly Disagree
    \(\square\)   Disagree
    \(\square\)   Agree
    \(\square\)   Strongly Agree

    Notice there are some questions where the centre is the most positive response:

    ‘How was the workload in this course?’

    \(\square\)   Far too little
    \(\square\)   Too little
    \(\square\)   About right
    \(\square\)   A bit too much
    \(\square\)   Far too much

    Five or seven response categories are usual.

  • Attention Checks – Some questionnaires offer rewards/incentives for completion. An unfortunate consequence is that some respondents complete the questionnaire only to get the reward, and do not pay attention to the content of the questions. These ‘donkey responses’ are typified by giving the same response to every question – e.g. always ticking ‘Strongly agree’ – so that the questionnaire can be completed with minimum thought or effort.

    Some questionnaires include attention-checking questions to detect this (a small detection sketch appears after this list).

    For example, we might request a specific response to a multi-choice question:

    ‘To check that you are paying attention, please answer Strongly Disagree to this question’

    \(\square\)   Strongly Disagree
    \(\square\)   Disagree
    \(\square\)   Neutral
    \(\square\)   Agree
    \(\square\)   Strongly Agree

    or request a specific text response in an open question:

    ‘To check that you are paying attention, please enter the number 545 in the text box below’
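
Returning to the point above about collecting data one way and analysing it another, here is a small Python sketch (with invented responses) that recodes free-text ‘years at this address’ answers into the ranges mentioned there.

```python
# A minimal sketch of recoding a numeric free-text response into ranges.
# The responses are invented for illustration.

import pandas as pd

years = pd.Series([0.5, 2, 7, 12, 3, 25], name="years_at_address")

ranges = pd.cut(years,
                bins=[0, 1, 5, 10, float("inf")],
                labels=["<1 year", "1-4 years", "5-9 years", "10+ years"],
                right=False)   # left-closed bins, so exactly 1 year falls in '1-4 years'

print(ranges.value_counts().sort_index())
```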
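
Where the statements in a Likert grid do not all run in the same direction, negatively worded items are usually reverse-coded before the items are combined, as in this Python sketch (item names and data invented).

```python
# A minimal sketch of reverse-coding a negatively worded Likert item on a
# 1-5 scale before combining items. Item names and responses are invented.

import pandas as pd

responses = pd.DataFrame({
    "good_support":      [5, 4, 2, 5],   # positively worded: 5 = approval
    "workload_too_high": [1, 2, 4, 1],   # negatively worded: 5 = disapproval
})

# Reverse-code the negatively worded item: new value = 6 - old value
responses["workload_too_high_r"] = 6 - responses["workload_too_high"]

# After reversal, 5 always indicates approval, so a row mean is meaningful
responses["overall"] = responses[["good_support", "workload_too_high_r"]].mean(axis=1)
print(responses)
```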
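
Finally, continuing the attention-check point, here is a small Python sketch (with invented column names and data) that flags respondents who give the same response to every Likert item, or who fail an explicit attention-check question, for review before analysis.

```python
# A minimal sketch of screening for 'donkey responses': straight-lining across
# Likert items, or failing an attention-check question. Data are invented.

import pandas as pd

likert_items = ["q1", "q2", "q3", "q4"]
df = pd.DataFrame({
    "q1": [5, 3, 5, 2],
    "q2": [5, 4, 5, 2],
    "q3": [5, 2, 5, 3],
    "q4": [5, 4, 5, 1],
    "attention_check": [5, 1, 5, 1],   # the requested answer was 1 (Strongly Disagree)
})

straight_lined = df[likert_items].nunique(axis=1) == 1   # same answer to every item
failed_check = df["attention_check"] != 1                # ignored the instruction

df["flag_for_review"] = straight_lined | failed_check
print(df)
```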

Other issues to consider:

  • The questionnaire needs a title (and possibly a date)

  • Use short sentences (two short sentences are better than one long one);

  • Avoid unnecessary (double) negatives:

    How strongly do you agree with:
    ASK: ‘This course has good support for learning iNZight’
    NOT: ‘This course does not have good support for learning iNZight’

  • Use terms consistently (e.g. refer consistently to household OR dwelling)

  • Use terms that will be understood (avoid technical jargon)

  • Don’t use abbreviations unless you are very sure they will be understood. ‘NZ’ is probably all right in New Zealand (though migrants and visitors won’t be familiar with it); ‘VUW’ would not be appropriate in most situations.

  • Terms should be unambiguous, not vaguely defined;

  • Terms should be free from moral or other overtones – they should not be emotionally loaded.

    Compare questions about ‘terrorists’ and ‘freedom fighters’.

  • Don’t ask leading questions (the respondent is guided to an answer by the question)

  • Don’t ask questions which assume a state of affairs (e.g. don’t assume anything about what the respondent thinks about the world)

  • Ask for only ONE piece of information – i.e. Don’t ask double-barrelled (or worse) questions:

    NOT: Do you think that there should be longer sentences and hard physical labour for those convicted of violent offences?

    Instead:

    1. Do you think there should be longer sentences for those convicted of violent offences?
    2. Do you think hard physical labour should be part of the sentences of those convicted of violent offences?
  • Be clear whether you are seeking facts (‘Do you …’) or opinions (‘Do you think …’) from the respondent.

  • Number the pages

  • Use colour to create contrasts between instructions and questions, and assist with the flow of the questionnaire

  • Ask information that the respondent can provide – and without too much difficulty.

    It may be easier to ask what the respondent recently did, rather than generally does:

    ASK: ‘How many times did you go to the cinema last week?’
    NOT: ‘How many times do you go to the cinema per week?’

    You’ll be averaging over lots of respondents – so it doesn’t matter if one respondent had a big movie week last week; others won’t have. (Just as long as you don’t ask this question after a film festival.)

  • Shorter recall periods lead to more precise answers – it is best not to have a long recall period (e.g. ask about last month rather than last year).

    Recent events will be recalled more accurately than more distant events: this is called recall bias.

  • Respondents often agree with statements, however they are worded. If only one side of an argument is stated, then respondents often support it.

    Use a neutral approach.

    In particular, avoid predisposing questions:

    ASK: ‘Do you think students should be helped with after school activities?’
    NOT: ‘Do you think teachers should help with students’ after school activities?’

    The second question suggests a particular solution to the problem, but it is not the only possible solution. The respondent should be able to agree that students need help without having to agree with the particular solution you are offering.

  • The order questions are asked in makes a difference to the way people respond.

    If you ask a lot of questions about how safe someone feels from criminal violence, and then a question about tougher prison sentences, the pattern of responses will be different than if the prison-sentences question had been asked at the beginning.

  • Don’t ask questions that will embarrass the respondent, or make him/her feel exposed.

    Example. The Māori Language survey asks for self-assessed language ability, rather than actually testing the respondent’s language skill. There are two main reasons: (i) to reduce the time the interview takes, and (ii) not to embarrass a respondent who might feel that this was a test and become nervous and unable to speak naturally.

    Ask questions in a way that doesn’t make the respondent feel stupid if s/he can’t answer:

    ASK: ‘Can you tell me if…?’
    NOT: ‘Do you know if…?’

  • Indirect questions can soften an approach to a difficult subject:

    ASK: ‘What do you think a student should do if…?’
    THEN: ‘What do most students actually do when…?’
    THEN: ‘What would you do if…?’

  • Finish the questionnaire with a thank you statement

15.4 Interviewers

  • What additional information can the interviewer provide if the respondent asks?

    Note that there is a risk that if the respondent and interviewer become too chatty, then the respondent will start to treat the interviewer as a friend, and may succumb to social desirability bias – giving answers s/he thinks the interviewer will want to hear.

  • Can the interviewer use probes – asking further questions if the respondent does not seem to be answering the question in the right way?

    Is the interviewer allowed to improvise in any way, or should the interviewer stick to a script?

    The risk is that some interviewers may be more or less successful at this, and responses to the questionnaire will not be uniform across interviewers.

    The interviewer also may make a mistake when re-explaining a concept in other words, and introduce a misconception.

  • Interviewers need training in:

    • Survey aims, objectives and sample design
    • Concepts related to the survey and what it is trying to measure (e.g. the concept of ethnicity, or general health, or …)
    • Questionnaire content
    • Field Work procedures – making contact, making an appointment, visiting a person in their home, document flow matters etc.
    • Conducting an interview professionally. Keeping their own views separate.

    Training may involve lectures, discussion groups and role playing.

  • Interviewers may need to be of a specific age, sex, ethnicity for some surveys. They may need specific language skills.

  • An important principle is that respondents should give the same answers to the questions, no matter which interviewer conducted the interview.

    Think about how this can be ensured.