OUTCOMES ASSESSMENT: Linking Learning, Assessment and Program Improvement
James O. Carey, Associate Professor Emeritus
School of Information, University of South Florida
ALA Annual Meeting, June 27, 2011


Page 1

OUTCOMES ASSESSMENT:
Linking Learning, Assessment and Program Improvement

James O. Carey
Associate Professor Emeritus
School of Information
University of South Florida

ALA Annual Meeting, June 27, 2011

Page 2

Overview

• What is outcomes assessment?
• Why do outcomes assessment?
• What are the required elements of outcomes assessment?
• How is an outcomes assessment process implemented?
• Practical tips for successful outcomes assessment
• Summary and Conclusions

Page 3

What is Outcomes Assessment?

3 characteristics:
• Identifying desired outcomes
• Assessing progress on outcomes
• Using the results of assessment for improvement

Roughly equivalent to:
• Institutional effectiveness
• Accountability
• Continuous improvement
• Quality assurance
• Formative evaluation

Page 4

What is Outcomes Assessment?

Continuous process . . . instead of an event.

[Diagram: a continuous cycle of NEEDS, PLANNING, PROGRAM, and ASSESSMENT]

Page 5

What is Outcomes Assessment?

• At the institutional level
  – Institutional effectiveness
  – Accountability
• For example:
  – The University's goal is to graduate 80% of entering freshmen in 5 years
  – A three-year assessment indicates a graduation rate of 63% in 5 years and points to financial problems as the primary cause
  – The Office of Student Affairs will create a student services taskforce on alternative paths to financial viability

Page 6

What is Outcomes Assessment?

• At the program level
  – Accountability
  – Outcomes assessment
• For example:
  – A departmental objective is relevant job placement for 85% of graduates within 1 year of graduation
  – Surveys of alumni indicate 70% have found relevant placements in the first year
  – An initiative is planned to contact relevant professional constituencies and develop structured involvement in the program with regional employers

Page 7

What is Outcomes Assessment?

• At the program level
  – Student learning outcomes assessment
• Another example:
  – A library school wants graduates to be able to write an action research plan for a given problem scenario.
  – On the comprehensive exam, 18% of students fail the action research question, and analysis indicates that most took the course with an adjunct
  – The curriculum committee standardizes the syllabus for the research course and the pass rate increases

Page 8

What is Outcomes Assessment?

• At the classroom level
  – Student learning outcomes assessment
• For example:
  – The instructor wants students to learn how to analyze service needs for a specified patron group
  – On the final project, students consistently confuse patron needs with programming alternatives
  – The instructor develops new case studies on analyzing service needs, and performance on the final project improves to an acceptable level

Page 9

What is Outcomes Assessment?

To summarize:
• Has three basic components
• Is part of a whole family of accountability methodologies
• Goes by many names
• Is used in many organizations in both public and private sectors
• Is used at many levels within organizations for multiple accountability purposes

Page 10

Our Purpose Today

• Institutional and programmatic learning outcomes assessment
  – For accreditation
  – Mandated, unit-level accountability
  – Shared responsibility

Rather than

• Course-level outcomes assessment
  – For course improvement
  – Elective, individual accountability
  – Personal responsibility

Page 11

Our Purpose Today

Focus on:
• Program-level accreditation requirements
• Systematic planning and evaluation
• Student learning outcomes assessment

Page 12

Why Do Outcomes Assessment?

• A systematic process of outcomes assessment is currently required for accreditation by:
  – ALA Committee on Accreditation
  – Parallel professional association accreditation (e.g., NCATE)
  – All eight regional higher education accreditation organizations
  – Most state boards of regents and departments of education
• A proven methodology for getting the best results from effort and resources

Page 13

Overview: Elements of Outcomes Assessment

Page 14

Page 15

Student Learning Outcomes

• What are they?
  – Statements describing knowledge and skills that students are expected to master by completion of their program of studies
• Synonyms (sort of):
  – Core competencies
  – Learning objectives
• Where do we get them?
  – Parent institution's mission, goals, and strategic objectives
  – Unit-level mission, goals, and objectives

Page 16

Student Learning Outcomes

• Where do we get them? (continued)
  – Professional standards
    • ALA/COA Standards for Accreditation (2008) (see Standards I and II)
    • ALA Task Force “Core Competencies” (2009)
    • ALA divisions and other library/information professional associations
  – Expert faculty members
  – Syllabi from core courses
  – Program advisory boards, alumni, employers, practitioners, students
  – Exemplary LIS programs
  – Futurists

Page 17

Student Learning Outcomes

• What do they look like?
  – Declarative sentences describing what students will know and be able to do
  – Description of a single skill or set of closely related skills that can be assessed at the same time
  – Performance of the skill(s) should be observable, or result in an observable product
  – Performance of the skill(s) should be measurable; i.e., one can determine when it has been done successfully

Page 18

Student Learning Outcomes

• How many should we have?
  – Don't go overboard!
  – Remember, if you write it you will need to assess it and report on it.
  – Usually 2-5 outcomes for each core content area are sufficient.
• Write outcomes at a high intellectual level (analysis and problem solving) that subsumes multiple sub-skills
• For example:
  “Students use strategic planning processes to guide the direction and progress of an organization”
  vs.
  “Students list the steps in a strategic planning process”

Page 19

Student Learning Outcomes

Bad Ones:
• Students appreciate the value of professional organizations
• Students become familiar with needs assessment for collection development

Good Ones:
• Students join relevant professional organizations (observable and measurable)
• Students plan a simulated needs assessment for a given collection development problem (observable and measurable)

Page 20

Student Learning Outcomes

Bad Ones:
• Students know the scholarly literature in the LIS field
• Students understand principles of fair use and how to apply them

Good Ones:
• Students select scholarly literature appropriate for analyzing a current issue in LIS (observable and measurable)
• Students describe principles of fair use and write policy for applications in an information center (observable and measurable)

Page 21

Student Learning Outcomes

Bad Ones:
• Students find sources of outside funding for libraries and information centers
• Students learn about cataloging tools and bibliographic utilities

Good Ones:
• Students select a source of outside funding and write a proposal for support of a project (higher-level skill)
• ??

Page 22

Student Learning Outcomes

Bad Ones:
• Students list the features of an effective reference interview
• Students describe functional areas within libraries or information centers that offer opportunities for applied research

Good Ones:
• ??
• ??

Page 23

Page 24

Develop Measures of Outcomes

• IF learning outcomes have been written well, logical measures are often implied
• For example:
  Outcome: Students will identify and assess the specific information needs of user groups in the community and use that information to write a collection development policy
  Measure: Write a collection development policy for user groups in the following community scenario

Page 25

Develop Measures of Outcomes

• Characteristics of good measures
  – Valid; that is, actually measures what it claims to measure
  – Reliable; that is, will yield consistent scores
  – Applied uniformly across all students or sampled across students
  – Objective, or require consensus of more than one judge/evaluator/rater

Page 26

Develop Measures of Outcomes

• Two general types of measures:
  – Direct measures (primary data)
    “Describe the selection and configuration of technological resources required to solve the communications problems depicted in the following case study.” (requires a product)
  – Indirect measures (supplemental data)
    “How would you rate your ability to select and configure technological resources to solve communications problems?
      a. Not adequate for an entry-level professional
      b. Adequate for an entry-level professional
      c. Above average for an entry-level professional
      d. Equal to an experienced professional”
    (elicits an opinion)

Page 27

Develop Measures of Outcomes

• Examples of direct measures
  – Comprehensive examination w/rubric
  – Portfolio w/rubric
  – Products from capstone course w/rubric
  – Observation scale from fieldwork or internship
  – Standardized tests (local, state, or national)
  – Common course examinations
  – Licensure examinations
• All examples must conform to characteristics of good measures

Page 28

Develop Measures of Outcomes

• Examples of indirect measures
  – Exit interviews
  – Focus groups with students, alumni, supervisors, and employers
  – Surveys of students, alumni, supervisors, and employers
  – Reviews by advisory boards or councils
  – Case studies of cohort groups

Page 29

Develop Measures of Outcomes

• What about students' grades in classes, seminars, capstone courses, fieldwork, and internships?

NO WAY!

• Cannot satisfy the characteristics of good measures

Page 30

Develop Measures of Outcomes

• Set performance expectations
  – Once measures have been established, set the levels of performance that will be considered acceptable or “passing”
  – This is an internal “gatekeeping” function for student progress
  – For example:
    • Yes or no; right or wrong; pass or fail
    • 80% correct, 90% correct, 95% correct
    • Average rating of 4 on a 5-point scale to pass
    • Students must perform at the 4.5 level on critical criteria #1 and #2, but can pass with an overall rating of 4.0 averaged across all 5 criteria (a minimal scoring sketch of this rule follows below)
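
The compound rule in the last bullet can be written down precisely. The following Python sketch is illustrative only and is not part of the original presentation; the criterion names and thresholds are assumptions, not the author's.

# Hypothetical gatekeeping rule: passing requires at least 4.5 on each
# critical criterion AND an average of at least 4.0 across all criteria.
CRITICAL = {"criterion_1", "criterion_2"}   # assumed critical criteria
CRITICAL_MIN = 4.5
OVERALL_MIN = 4.0

def passes(ratings):
    """ratings: dict mapping each rubric criterion to a 1-5 score."""
    critical_ok = all(ratings[c] >= CRITICAL_MIN for c in CRITICAL)
    overall_ok = sum(ratings.values()) / len(ratings) >= OVERALL_MIN
    return critical_ok and overall_ok

# Example: strong on the critical criteria, adequate overall -> True
print(passes({"criterion_1": 4.5, "criterion_2": 5.0,
              "criterion_3": 3.5, "criterion_4": 3.5, "criterion_5": 4.0}))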

Page 31

Develop Measures of Outcomes

• Practical considerations
  – Not a simple task
  – Requires a level of sophistication in testing and measurement
  – Requires a committee of the willing and/or a layer of administration for:
    • Design and development of direct measures
    • Design and development of indirect measures
    • Formative testing and revision of both direct and indirect measures

Page 32

Page 33

Assess Learning Outcomes

• Developing and carrying out an assessment plan
• Practical considerations
  – Requires a committee of the willing and/or a layer of administration for:
    • Policy, procedures, and calendar for administration of measures
    • Policy, procedures, and calendar for grading
    • Policy, procedures, and calendar for notifying successful students and notifying and managing unsuccessful students
    • Procedures and calendar for recording, summarizing, and reporting results

Page 34

Assess Learning Outcomes

• More practical considerations
  – It is difficult to measure all student learning outcomes with a single instrument in a single event
  – Use multiple measures, for example:
    • Comprehensive exam and products from capstone course
    • Capstone course products and portfolio
    • Portfolio and fieldwork observations
  – Sample across outcomes and students, for example:
    • Measure several outcomes in each comprehensive exam and rotate outcomes across exams each semester (a simple rotation sketch follows below)
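
Purely as an illustration, and not from the slides, a rotation like the one just described can be planned with a simple round-robin over the outcome list. The outcome labels and the number of outcomes sampled per exam are hypothetical.

# Hypothetical round-robin: sample a few outcomes on each comprehensive
# exam so that every outcome gets assessed over successive semesters.
outcomes = ["Outcome A", "Outcome B", "Outcome C", "Outcome D",
            "Outcome E", "Outcome F", "Outcome G"]   # assumed labels
per_exam = 3                                          # assumed sample size

def exam_outcomes(semester_index):
    """Return the outcomes to assess on the exam given in this semester."""
    start = (semester_index * per_exam) % len(outcomes)
    return [outcomes[(start + i) % len(outcomes)] for i in range(per_exam)]

for semester in range(4):
    print("Semester", semester + 1, exam_outcomes(semester))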

Page 35

Page 36

Organize and Interpret Results

• Purposes for organizing and interpreting results
  – Confirm satisfactory performance
  – Detect performance problems
  – Detect faulty assessment instruments and/or procedures
  – Discover opportunities for programmatic expansion, reorganization, additions, cuts, and changes in overall direction
  – Address accountability expectations for the parent institution and for accreditation
  – Inform programmatic improvement

Page 37

Organize and Interpret Results

• Set accountability expectations
• How do we know when the program is meeting its obligations to its students, its institution, and its profession?
• For example:
  – 90% of alumni will report “adequate” or better preparation on 90% of learning outcomes (a small checking sketch follows below)
  – 85% of our students will achieve an average rating of 4.5 or above on a 5-point scale on their capstone projects
  – 95% of students will earn a “pass” on student learning outcome #6 on the comps exam
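
A rough sketch, not part of the original presentation, of how the first expectation could be checked against alumni survey data; the rating labels, outcome names, and numbers are invented for illustration.

# Hypothetical check: do 90% of alumni report "adequate" or better
# preparation on at least 90% of the learning outcomes?
ADEQUATE_OR_BETTER = {"adequate", "above average", "experienced"}  # assumed labels

def expectation_met(survey, alumni_share=0.90, outcome_share=0.90):
    """survey: dict mapping each outcome to the list of alumni ratings received."""
    outcomes_ok = 0
    for ratings in survey.values():
        favorable = sum(r in ADEQUATE_OR_BETTER for r in ratings) / len(ratings)
        if favorable >= alumni_share:
            outcomes_ok += 1
    return outcomes_ok / len(survey) >= outcome_share

# Example with two outcomes: only one reaches the 90% favorable mark,
# so the 90%-of-outcomes expectation is not met -> False
print(expectation_met({
    "Outcome 1": ["adequate", "above average", "adequate", "not adequate"],
    "Outcome 2": ["adequate"] * 10,
}))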

Page 38

Organize and Interpret Results

• Methods for organizing and interpreting results
  – Matrix analysis is most typical
  – Display student learning outcomes by measurement items and fill in results at the intersection, for example (a small aggregation sketch follows after the matrix):

              Measure 1    Measure 2    Measure 3    Etc.
Outcome A     85% pass
Outcome B                  94% pass
Outcome C     63% pass                  72% pass
Etc.
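
A small sketch, not from the presentation, of how such a matrix could be assembled from per-student results; the outcome names, measure names, and data are hypothetical.

# Hypothetical: compute the pass rate at each outcome-by-measure intersection.
from collections import defaultdict

results = [                      # (outcome, measure, passed?) per student attempt
    ("Outcome A", "Measure 1", True),
    ("Outcome A", "Measure 1", False),
    ("Outcome C", "Measure 3", True),
]

cells = defaultdict(list)
for outcome, measure, passed in results:
    cells[(outcome, measure)].append(passed)

for (outcome, measure), flags in sorted(cells.items()):
    rate = 100 * sum(flags) / len(flags)
    print(f"{outcome} x {measure}: {rate:.0f}% pass")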

Page 39

Page 40

Use Results for Improvement

• Remember—no need to improve what is working well!
• The dependent variable in this investigation is student learning
• The independent variables are many!
  – Teaching and learning are parts of a complex system with multiple interacting components
  – Data can point to problems, but cause-and-effect relationships are difficult to establish

Page 41

Simplified Model of Variables in Teaching and Learning
(See Figure 1 in handout.)

[Diagram labels: COURSE CONTENT; LEARNER CHARACTERISTICS; LEARNING ENVIRONMENT; ESSENTIALS FOR LEARNING: MOTIVATION, LEARNING GUIDANCE, ACTIVE STUDENT PARTICIPATION, CONTENT INTEGRATION; ASSESSMENT; STUDENT LEARNING OUTCOMES; IMPROVEMENT]

Page 42

Use Results for Improvement

• Where would one look among all of the variables for opportunities for improving student performance?
  – Begin with assessment data
  – Look for gaps between performance expectations and actual performance
  – Look for gaps between accountability expectations and actual performance
  – Sharpen understanding of performance problems with qualitative data

(See Table 1 in handout.)

Page 43

Practical Tips for Implementation

• Establish authority for outcomes assessment
  A person or committee charged with managing the process and delegating assessment responsibilities
• Establish an annual assessment calendar
  (See Table 2 in handout.)
• Establish a uniform reporting format
  (See Table 3 in handout.)

Page 44

Summary and Conclusions

• For good teachers, learning outcomes assessment is intuitive; good teachers are always improving what they do based on the results of what they have done in the past.
• The challenges:
  – Infuse the logic of that “good teacher” intuition school-wide or department-wide
  – Create sustaining policies and administrative structures
  – “How we do it” instead of “What we do for accreditation”

Page 45

This PowerPoint presentation along with resource links for outcomes assessment will be available at:

HTTP://SHELL.CAS.USF.EDU/~JCAREY/OA/