
Page 1: Developing Surveys for the Outcomes Assessment Process Kim Anderson Course Evaluation Subcommittee Chair Summer 2009

Developing Surveys for the Outcomes Assessment Process

Kim Anderson, Course Evaluation Subcommittee Chair

Summer 2009

Page 2:

What is a Survey?

sur·vey (noun, pl. sur·veys):

A detailed inspection or investigation.

A general or comprehensive view.

A gathering of a sample of data or opinions considered to be representative of a whole.

The process of surveying. (From the American Heritage Dictionary)

An assessment instrument that measures a characteristic or attitude ranging across a continuum of values, or that identifies a value or belief on a rating scale. Surveys typically use a sample of data to represent the whole being studied.

Page 3:

“Surveying for Surveying’s Sake” Is Problematic

A survey needs a distinct and practical purpose.

• If the purpose is not distinct, the survey will change frequently.

• If the purpose is not practical, the effort produces fatigue for everyone involved and a poor expenditure of resources:
  – Survey fatigue
  – Bureaucratic fatigue
  – Assessment fatigue
  – Audience fatigue

• Low-quality results

• Indirect vs. direct assessment issues

• "One shot" assessments are less valuable than continuous assessments.

If any of these issues arise and persist, do not use a survey.

Page 4:

Purpose: Preliminary Planning

• You are confronted with a need for information (the questions the survey should answer).

• Be as specific, clear-cut, and unambiguous as possible about the information you need (focus).

• Choose the best possible way to ascertain the desired information.

• Write as few questions as possible to obtain that information.

• Trade-offs exist.

Page 5:

Survey Development

• Step 1: Decide to whom and how the survey will be administered.

• Step 2: Determine the content and wording of each question.

• Step 3: Determine the structure of response to each question.

• Step 4: Establish the sequence of questions.

• Step 5: Design the format and appearance.

Page 6:

Step 1: Decide to whom and how the survey will be administered.

• Sample size
  – General population/all
  – Sample = a portion of the population of interest (scientifically chosen, a reliable projection, or randomly selected)

• Collection of data
  – In person, by mail, e-mail, or phone (paper surveys assume literacy and are time-consuming to manage)
  – Optical mark reader, e.g., answer sheets filled in with a No. 2 pencil (requires a machine to read the sheets)
  – Web-based, e.g., SurveyGizmo (online surveys are convenient, but they assume respondents have access to a computer, are technologically literate, and feel comfortable responding in an electronic format)

• Professional & financial resources available
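The sampling idea above can be sketched in a few lines (a minimal illustration with hypothetical names, using only Python's standard library):

```python
import random

# Hypothetical roster of everyone in the population of interest.
population = [f"student_{i}" for i in range(1, 501)]

# Simple random sample of 50 respondents, drawn without replacement,
# so every member of the population has an equal chance of selection.
random.seed(2009)  # fixed seed only so the draw is reproducible
sample = random.sample(population, k=50)

print(len(sample))       # 50
print(len(set(sample)))  # 50 -- no one is selected twice
```

Drawing without replacement is what makes this a simple random sample; sampling with replacement would let the same respondent be counted twice.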

Page 7:

Step 2: Determine the content and wording of each question.

• Appropriateness based on the purpose

• Eliminate unnecessary questions
  – Is this a double-barreled question that should be split or eliminated?
  – Can the respondent answer this question? (Was the event too long ago, or is the question worded in a way that might sway the answer?)
  – Will the respondent answer this question? (Is it too personal?)

• Appropriate wording
  – Not too vague or confusing
  – Avoid double negatives
  – Avoid unfamiliar terminology (lingo)
  – Avoid loaded terms (sensitive or controversial questions)

Page 8:

Step 3: Determine the structure of response to each question.

• Open-ended: No single definite answer; respondents answer in their own words; takes time and effort to answer; yields quotable material; difficult to analyze; factor in the time and effort needed for data compilation.

• Closed-ended: A finite set of answers to choose from; easy to standardize; the data gathered lend themselves to analysis; more difficult to write (the choices must cover all possible answers).

Page 9:

Closed-Ended Types

• Likert scale: Respondents rate how closely their feelings match a statement on a rating scale

• Multiple choice: pick the best answer(s) from the finite options

• Ordinal: Respondents rank all possible answers in relation to one another

• Categorical: Possible answers are in categories; respondent must fall into exactly one

• Numerical: Answer must be a real number

Page 10:

Closed-Ended Samples

• Likert: Strongly Disagree 1 2 3 4 5 Strongly Agree

• Multiple choice: Why don't you use the school's cafeteria services? (circle one)
  a. It's too expensive.
  b. The food quality is poor.
  c. The location is inconvenient.

• Ordinal: Please write a number between 1 and 5 next to each item below. Put a 1 next to the item that is MOST important to you in selecting an online university course. Put a 5 next to the item that is LEAST important. Please use each number only ONCE.

• Categorical: Place the cursor over your category and click.

• Numerical: How old were you on your last birthday?
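The Likert example above can be coded 1 through 5 and summarized numerically; a minimal sketch with made-up responses (Python standard library only):

```python
import statistics
from collections import Counter

# Hypothetical responses to one Likert item, coded
# 1 = Strongly Disagree ... 5 = Strongly Agree.
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

print(statistics.mean(responses))    # 3.9
print(statistics.median(responses))  # 4.0
print(Counter(responses))            # frequency of each scale point
```

Coding the scale points as numbers is what makes Likert items easy to standardize and analyze once the data come back.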

Page 11:

Responses to Questions: General Suggestions

• Scale point proliferation: Too many points on a rating scale (more than 5) leads to confusion and hairsplitting.

• Order of categories: It is better to list a progression from a lower level to a higher one.

• Category proliferation: Minor distinctions among categories are not useful; keep the list brief.

• “Other”: With a few exceptions, avoid this option

Page 12:

Step 4: Establish the sequence of questions.

• First part = easier questions (gains cooperation)

• Middle series = most important topics

• End of the survey = demographic and other classification questions

• Conclude with a thank you.

Page 13:

Step 5: Design the format and appearance.

• Attractive, clearly printed, and well laid out

• Appealing and simple to complete

• Quality engenders a better response

• The survey represents the program and the college

Page 14:

No Survey is Perfect

• The fallacy of perfection
  – Ask for feedback at each step of the development process.
  – Ask colleagues both in and out of the program or discipline for reactions and suggestions.
  – Beta test.
  – Many GREAT surveys have "crashed and burned" in prior revisions; just be patient.

• Administration
  – Use a cover letter or script to provide consistency.
  – Address protection of confidentiality.

Page 15:

Surveys and Outcomes Assessment

• Survey creation is the beginning of the process.

• Consider the analysis requirements (statistical or otherwise) during survey development. Types of statistical analyses:
  ✓ Descriptive statistics (means, medians, etc.)
  ✓ Correlation analysis
  ✓ Regression and logistic regression
  ✓ ANOVA
  ✓ Graphs: bar charts, boxplots, etc.

• Keep it simple: present basic, descriptive data regularly. More nuanced analysis is possible if there is a need to
  ✓ demonstrate differences (ANOVA, t-test),
  ✓ demonstrate correlation (basic Spearman's correlation),
  ✓ explain causation (regression).

• Consider the resources needed and available.

• Protect responder anonymity and data confidentiality.

• Key findings = a presentation plan to improve services and student learning.

Page 16:

Surveys and SLOs

1. Survey assessments are considered indirect assessments. Therefore, it is best to compare their findings with direct assessments of student learning.

2. Survey assessments can be very useful for observing
   ✓ what students believe they are learning,
   ✓ what alumni feel they have learned,
   ✓ how well employers feel graduates have been prepared.

3. Survey assessments create very useful findings if a program is concerned about the quality of student preparation (i.e., employer, mentor, or work experience surveys).

4. Closed-ended questions are derived from the content knowledge being assessed.

5. Open-ended questions lead to qualitative analysis that can be compared with closed-ended responses.

6. Open-ended survey responses can also be analyzed to detect trends or concerns. Useful information can be gained through the systematic analysis of open-ended questions.
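One simple, systematic way to scan open-ended responses for recurring themes is a word-frequency count (a rough sketch with made-up responses; a real analysis would use a proper coding scheme):

```python
import re
from collections import Counter

# Hypothetical open-ended responses to "What would improve this course?"
responses = [
    "More feedback on assignments would help.",
    "Faster feedback, and clearer assignments.",
    "The online discussions were great.",
]

# Count word frequencies, ignoring case and common filler words.
stopwords = {"the", "and", "on", "a", "this", "would", "were", "more"}
words = re.findall(r"[a-z]+", " ".join(responses).lower())
counts = Counter(w for w in words if w not in stopwords)

print(counts.most_common(3))  # "feedback" and "assignments" surface as themes
```

A frequency count like this only flags candidate themes; the quotable material itself still has to be read and coded by hand.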

Page 17:

Final Thoughts

• Well-crafted surveys are methods of describing opinions, or even describing changes in perceptions and attitudes.

• More work is involved in creating surveys and managing the survey process than is usually anticipated.

• No survey is perfect; it is often best to combine a survey-based assessment with a direct assessment of process or performance.

• That said, survey information can be useful.

• Questions? Thank you.
