2. Today’s Class Objectives
Examine how surveys are used to collect data
Review stages of survey/questionnaire development
Discuss what makes a survey valid
Identify “Common Pitfalls” of survey/questionnaire
design
1/6/2012
3. A Survey is…..
A tool to gather information about people's attitudes, thoughts, and behaviors.
Surveys can be divided into two broad categories:
the questionnaire and the interview.
Definition obtained from socialresearchmethods.net
4. Why Survey?
Image & Rights obtained from cartoonstock.com
5. Stages of Survey Development
Step 1: Formulate Research Question
Step 2: Identify Population and Sample
Step 3: Design Survey
Step 4: Pre-test Survey
Step 5: Administer Survey to Sample
6. Elements of Good Survey/Data
A good survey (and good data) requires:
Valid & reliable measures
Reduced bias & errors
A good response rate
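The response-rate element above is just a ratio of completed surveys to the eligible sample. A minimal sketch (the function name and numbers are illustrative, not from the slides):

```python
def response_rate(completed: int, eligible: int) -> float:
    """Response rate = completed surveys / eligible sample members."""
    if eligible <= 0:
        raise ValueError("eligible sample size must be positive")
    return completed / eligible

# E.g., 100 completed questionnaires from an eligible sample of 400:
print(f"{response_rate(100, 400):.0%}")  # prints "25%"
```

Note that what counts as "eligible" matters: excluding unreachable or ineligible sample members changes the denominator and therefore the reported rate.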
7. Common Pitfalls
1. Keep it short, simple, and specific
2. Use mutually exclusive & exhaustive categories
E.g., "What is your current age?" with choices 14-15, 15-18, 18-19 (the categories overlap at 15 and 18)
3. Consider the phrasing and wording of questions
Culture, leading questions, educational level
4. Avoid double-barreled questions
E.g., "Was your social worker and/or parole officer supportive?"
5. Avoid making assumptions
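Pitfall 2 can be checked mechanically: answer categories are mutually exclusive if no two ranges overlap, and exhaustive if they cover the whole target range without gaps. A minimal sketch using the flawed age brackets from the slide (function and variable names are my own):

```python
def check_categories(brackets, lo, hi):
    """Return (exclusive, exhaustive) for inclusive integer age brackets."""
    brackets = sorted(brackets)
    # Mutually exclusive: each bracket ends before the next one begins.
    exclusive = all(a_hi < b_lo
                    for (_, a_hi), (b_lo, _) in zip(brackets, brackets[1:]))
    # Exhaustive: brackets span [lo, hi] with no gap between neighbors.
    exhaustive = (brackets[0][0] <= lo and brackets[-1][1] >= hi and
                  all(a_hi + 1 >= b_lo
                      for (_, a_hi), (b_lo, _) in zip(brackets, brackets[1:])))
    return exclusive, exhaustive

# The flawed brackets from the slide: 15 and 18 each appear twice.
print(check_categories([(14, 15), (15, 18), (18, 19)], 14, 19))  # (False, True)
# A corrected version with no overlaps and no gaps:
print(check_categories([(14, 15), (16, 18), (19, 19)], 14, 19))  # (True, True)
```

A respondent aged 15 or 18 has two valid answers in the flawed version, which is exactly why the slide flags it.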
8. Questionnaire Evaluation Exercise
Next class you will be the panel of experts evaluating
the validity of an intake survey for the Second Chance
High School Dropout Prevention program. Review the
handout and be prepared to discuss some of the issues
you see with a group of your peers.
10. References
Creswell, J. (2009). Research design: Qualitative, quantitative, and mixed methods approaches. Los Angeles, CA: Sage Publications.
Thyer, B. (2010). The handbook of social work research methods (2nd ed.). Thousand Oaks, CA: Sage Publications.
Van Bennekom, F. Survey design and survey data analysis workshop series. Retrieved 28 Dec. 2011, from <http://www.greatbrook.com/survey_workshop/>
Editor's Notes
Social work background (clinical work, case management, group work). Research background (consumer of research, focus groups, market research, group projects, individual projects).
Before we move to the next slide, by a show of hands, who has seen a survey in the last 3 months? What type did you see or participate in? Phone, mail, electronic, a survey at the bottom of a receipt, a bank survey, a car rental survey, an interview.
It’s one of the least expensive ways to collect a lot of data/information in a short amount of time. There are also several ways to collect the data, which gives the researcher options (mail, phone, etc.). Over the holidays I rented a car and took a flight and what do I find in my inbox a few days later, a survey asking for my feedback on services.
I worked for a dropout prevention program. Our goal was to help youth graduate from high school. In order to do that, we needed to develop a better understanding of why so many students in our school district were dropping out of high school. Because we needed a quick and inexpensive way to collect a lot of data, we decided to administer surveys. Our research question was "What factors lead to students dropping out of school?" (very broad). We knew our research question, and our next question was who to ask (students, parents, teachers, etc.). While we could have used all of those, we decided we had the best access to students. State the difference between the population (students who drop out of school) and the sample (students between the ages of 13-20 in the school district who dropped out of school and were interested in returning).
Instrument validity: are you measuring what you intended to measure? Bias can result from the administration method (phone, mail, online, etc.), bias in the actual instrument, selection of the sample, non-response bias (when only people with extreme views respond), and response bias (bias the respondent brings to the process, e.g., when respondents give certain answers to please the surveyor, or select anything just to complete the survey). Bias affects the validity and reliability of the findings and ultimately skews the data.
After you have done the following:
Made a list of what you want to know
Checked to see if the information is already available
Considered why you would ask certain questions and how you will use the information
Once you understand the types of questions you are going to ask (knowledge, attitude, belief, behavior, attributes), one of the biggest pitfalls is in the wording. Wording questions to gain what is wanted, and also to be understood by all respondents, is a challenging task. In writing questions, consider three things: 1) the people for whom the questionnaire is designed, 2) the purpose of the questionnaire, and 3) how questions should be placed in relationship to each other.
Short, simple, specific: wording should be adapted to the audience, and consider double meanings of words. Specific: a question about youth should clearly state how "youth" is defined. Watch words like regularly, occasionally, often, etc.
Give respondents all the information needed to answer the question. E.g., "Which program are you interested in?" They can't respond if they don't know the choices.
E.g., asking people to indicate race, level of income, socio-economic status.
E.g., "How many times have you been arrested?" Instead, ask "Have you ever been arrested?"
Two questions written together give no opportunity for people to respond in favor of one part or the other.
Types of validity (Type, Description, Purpose):
Face. Description: evaluation of an instrument's appearance by a group of experts and/or potential participants. Purpose: establishing an instrument's ease of use, clarity, and readability.
Content. Description: evaluation of an instrument's representativeness of the topic to be studied by a group of experts. Purpose: establishing an instrument's credibility, accuracy, relevance, and breadth of knowledge regarding the domain.
Criterion. Description: evaluation of an instrument's correlation to another that is deemed unquestionable or identified as the gold standard. Purpose: establishing an instrument's selection over another, or establishing the predictability of the measure for a future criterion.
Construct. Description: evaluation of an instrument's ability to relate to other variables, or the degree to which it follows a pattern predicted by a theory. Purpose: establishing an instrument's ability to evaluate the construct it was developed to measure.