
1

Principles of and Practices in Assessing Student Learning

Presented by Peggy L. Maki
Simmons College, Graduate School of Library & Information Science
May 2005
PeggyMaki@aol.com

2

Integrating Teaching, Learning, and Assessment

Pedagogy

Curricular design

Instructional design

Educational tools

Educational experiences

Students’ Learning Histories/Styles

3

Integrated Learning….

Cognitive

Affective

Psychomotor

4

Consider the importance of answering the questions under “Questions to Ground Discussion of Teaching, Learning and Assessment.”

5

Major Collaborative Tasks

Articulating learning outcome statements that align with collective educational practices

Mapping outcomes to the curriculum according to a meaningful labeling system that also identifies course-based assessment practices

6

Identifying or designing methods to assess learning—more than one method to capture the dimensions of learning

Developing standards and criteria of judgment to assess student achievement—scoring rubrics—and inter-rater reliability

7

Collecting, scoring, and analyzing results

Interpreting results within the context of other relevant data that you identify before you begin to assess

Implementing agreed-upon changes to improve or advance learning; then re-entering the assessment cycle

8

The Process: 1. Articulating Learning Outcome Statements

Course/Service/Educational Experience Outcome Statements

Institution-level Outcome Statements

Department- or Program-level Outcome Statements

9

What Is an Outcome Statement?

Describes learning desired within a context

Relies on active verbs (create, compose, calculate)

Emerges from our collective intentions over time

10

Can be mapped to curricular and co-curricular practices (ample, multiple and varied opportunities to learn over time)

Can be assessed quantitatively or qualitatively during students’ undergraduate and graduate careers

Is written for a course, program, or institution

11

Example from ACRL

ONE OUTCOME: Student examines and compares information from various sources in order to evaluate reliability, validity, accuracy, timeliness, and point of view or bias.

12

Ways to Articulate Outcomes

Adapt from professional organizations

Derive from mission of institution/program/department/service

Derive from students’ work that demonstrates interdisciplinary thinking, ways of knowing, or problem solving

13

Derive from a faculty-to-faculty interview process

Derive from an exercise focused on listing one or two outcomes “you attend to”

14

Question?

Using the handout, review your outcomes against the criteria for outcome statements to determine whether they meet those criteria

15

The Process: 2. Mapping Learning Outcomes

Curricular maps:
• Reveal how we translate outcomes into educational practices offering students multiple and diverse opportunities to learn

• Help us to identify appropriate times to assess those outcomes

• Identify gaps in learning or opportunities to practice

16

Help students understand our expectations of them

Place ownership of learning on students

Enable them to develop their own maps or learning chronologies

(review sample curricular maps and inventories)

17

Group Work on Mapping

After you have reviewed the two curricular maps and inventories, identify a useful inventory and labeling system you might use to map your outcomes. Your system will enable you to identify assessment methods already in place as well as times to engage in formative and summative assessment.
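For instance, a labeling system can be sketched as a small data structure. The example below is a hypothetical illustration, assuming the common I/R/E convention (Introduced, Reinforced, Emphasized); the course numbers, outcome statements, and gap-finding helper are invented, not taken from the workshop handouts:

# Hypothetical curricular map: outcomes (rows) mapped to courses with
# I/R/E labels; each label can also point to the course-based
# assessment already in place (noted in comments here).
curricular_map = {
    "Evaluates sources for reliability and bias": {
        "LIS 401": "I",   # introduced; formative in-class exercise
        "LIS 415": "R",   # reinforced; annotated bibliography
        "LIS 488": "E",   # emphasized; summative capstone project
    },
    "Composes professional documents for varied audiences": {
        "LIS 401": "I",
        "LIS 450": "R",
        "LIS 488": "E",
    },
}

def gaps(outcome_row, courses):
    """Courses that never address a given outcome: gaps in opportunities to practice."""
    return [c for c in courses if c not in outcome_row]

all_courses = ["LIS 401", "LIS 415", "LIS 450", "LIS 488"]
for outcome, row in curricular_map.items():
    print(outcome, "->", row, "| gaps:", gaps(row, all_courses))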

18

“Every assessment is also based on a set of beliefs about the kinds of tasks or situations that will prompt students to say, do, or create something that demonstrates important knowledge and skills. The tasks to which students are asked to respond on an assessment are not arbitrary.”

National Research Council. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.: National Academy Press, 2001, p. 47.

The Process: 3. Designing or Selecting Assessment Methods

19

Assumptions Underlying Teaching

Actual Practices

Assumptions Underlying Assessment Tasks

Actual Tasks

20

What Tasks Elicit Learning You Desire?

Tasks that require students to select among possible answers (multiple choice test)?

Tasks that require students to construct answers (students’ problem-solving and thinking abilities)?

21

When Do You Seek Evidence?

Formative—along the way? For example, to ascertain progress or development

Summative—at the end? For example, to ascertain mastery level of achievement

22

Some Methods That Provide Direct Evidence

Student work samples, such as reports, theses, other written documents or oral presentations

Collections of student work (e.g., portfolios)

Capstone projects

23

Course-embedded assessment (derive examples; develop an agreed-upon prompt and ask students to respond to it in class or at a designated time)

Observations of student behavior

Internal juried review of student projects—creations, demonstrations

Simulations—virtual or live

24

External juried review of student projects

Externally reviewed internship

Performance on a case study/problem

Performance on a case study accompanied by students’ analysis

25

Representative disciplinary practices

Performance on national licensure examinations or other standardized tests

Locally developed tests

Pre- and post-tests

26

The Process: 4. Developing Criteria and Standards of Judgment

A set of criteria that identifies the expected characteristics of a text and the levels of achievement along those characteristics. Scoring rubrics are criterion-referenced, providing a means to assess the multiple dimensions of student learning.

Are collaboratively designed based on how and what students learn (based on curricular-co-curricular coherence)

27

Are aligned with ways in which students have received feedback (students’ learning histories)

Students use them to develop work and to understand how their work meets standards (can provide a running record of achievement).

28

Raters use them to derive patterns of student achievement to identify strengths and weaknesses
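To make the structure concrete: a rubric is essentially a matrix of criteria (dimensions of learning) by achievement levels. Below is a minimal sketch of that data structure; the criteria, levels, and descriptors are invented for illustration and are not drawn from the workshop’s example rubrics:

# Hypothetical criterion-referenced rubric: each criterion is described
# at four achievement levels (4 = highest).
rubric = {
    "Evaluation of sources": {
        4: "Weighs reliability, validity, accuracy, timeliness, and bias",
        3: "Weighs most of these dimensions, with minor omissions",
        2: "Considers one or two dimensions superficially",
        1: "Accepts sources at face value",
    },
    "Use of evidence": {
        4: "Integrates evidence to support an independent argument",
        3: "Supports claims with relevant evidence",
        2: "Cites evidence loosely connected to claims",
        1: "Makes claims without supporting evidence",
    },
}

def score_report(scores):
    """Map each criterion to the awarded level and its descriptor."""
    return {c: (lvl, rubric[c][lvl]) for c, lvl in scores.items()}

print(score_report({"Evaluation of sources": 3, "Use of evidence": 2}))

Because every score is tied to a written descriptor rather than to other students’ work, the same record can serve both the rater deriving patterns and the student tracking achievement.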

29

Strategies to Develop Scoring Rubrics

Emerging work in professional and disciplinary organizations

Research on learning (from novice to expert)

Student work

Interviews with students

30

Experience observing students’ development

31

Pilot-testing the Scoring Rubric

Apply to student work to assure you have identified all the dimensions with no overlap

Schedule inter-rater reliability times:
- independent scoring
- comparison of scoring
- reconciliation of responses
- repeat cycle
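One way to quantify the comparison step, sketched below: after two raters score the same set of papers independently, percent agreement and Cohen’s kappa (a standard agreement statistic, not one the slides name) show how consistently the rubric is being applied before raters reconcile disagreements. The scores are made up:

from collections import Counter

# Hypothetical independent scores from two raters on the same eight papers.
rater_a = [4, 3, 3, 2, 4, 1, 3, 2]
rater_b = [4, 3, 2, 2, 4, 2, 3, 2]

def percent_agreement(a, b):
    """Fraction of papers on which the raters gave the same level."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement corrected for the agreement expected by chance."""
    n = len(a)
    observed = percent_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (observed - expected) / (1 - expected)

print(f"agreement={percent_agreement(rater_a, rater_b):.2f}, "
      f"kappa={cohens_kappa(rater_a, rater_b):.2f}")

Low values signal that a dimension overlaps another or that a descriptor is ambiguous, which is exactly what the reconciliation discussion and the repeat of the cycle are meant to surface.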

32

The Process: 5. Collecting and Analyzing Results

Determine how and when to collect student responses

Determine who will analyze results (suitable for aggregation and disaggregation according to question you wish to answer)
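As an illustration of aggregation and disaggregation, the sketch below computes an overall mean on one rubric criterion and then breaks it out by cohort; the records, field names, and cohorts are hypothetical:

from collections import defaultdict
from statistics import mean

# Hypothetical scored responses on a single rubric criterion.
results = [
    {"student": "s1", "cohort": "transfer", "score": 3},
    {"student": "s2", "cohort": "native", "score": 4},
    {"student": "s3", "cohort": "transfer", "score": 2},
    {"student": "s4", "cohort": "native", "score": 3},
]

# Aggregate: one number for the program as a whole.
print("overall mean:", mean(r["score"] for r in results))

# Disaggregate: means by cohort, matching the question you set out to answer.
by_cohort = defaultdict(list)
for r in results:
    by_cohort[r["cohort"]].append(r["score"])
for cohort, scores in sorted(by_cohort.items()):
    print(cohort, "mean:", mean(scores))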

33

The Process: 6. Interpreting Results

Seek patterns against criteria and cohorts

Build in program-level discourse

Tell the story that explains the results—triangulate with other data

Determine what you wish to change, revise, or how you want to innovate

34

The Process: 7. Implementing Changes and Re-Entering the Assessment Cycle

Implement agreed-upon changes

Focus on collective effort—what we do and how we do it

Re-assess to determine efficacy of changes

(See report format to guide annual reports)

35

Examples of Changes:

Increased attention to weaving experiences across the program to improve student achievement

Changes in advising based on assessment results

Closer monitoring of student achievement—tracking according to demographics or other meaningful criteria

36

Faculty and staff development to learn how to integrate experiences that contribute to improved student learning

Changes in pedagogy and curricular and co-curricular design

Development of modules to assist learning; use of technology; self-paced learning; supplemental learning

37

[Assessment cycle diagram: Mission/Purposes → Learning Outcomes → Gather Evidence → Interpret Evidence → Enhance teaching/learning; inform institutional decision-making, planning, and budgeting → back to Learning Outcomes. Guiding question at the center: How well do we achieve our outcomes?]

38

“What and how students learn depends to a major extent on how they think they will be assessed.”

John Biggs, Teaching for Quality Learning at University: What the Student Does. Society for Research into Higher Education & Open University Press, 1999, p. 141.

39

List of attachments:
- Questions that ground discussions focused on teaching, learning, and assessment
- Checklist for outcome statements
- Dissemination of learning outcome statements
- Curricular-co-curricular map
- Inventories of educational and assessment practices
- Identification of assessment methods
- Examples of scoring rubrics
- Inventory of direct and indirect methods
- Annual report format

40

Integrating Teaching, Learning, and Assessing

Pedagogy

Curricular design

Instructional design

Educational tools

Educational experiences

Students’ Learning Histories/Styles

41

Integrated Learning….

Cognitive

Affective

Psychomotor

42

The Process: 1. Development of Learning Outcome Statements

Course/Service/Educational Experience Outcome Statements

Institution-level Outcome Statements

Department- or Program-level Outcome Statements

43

What Is an Outcome Statement?

Describes learning desired within a context

Relies on active verbs (create, compose, calculate)

Emerges from our collective intentions over time

44

Can be mapped to curricular and co-curricular practices (ample, multiple and varied opportunities to learn over time)

Can be assessed quantitatively or qualitatively during students’ undergraduate and graduate careers

Is written for a course, program, or institution

45

Distinguishing between Objectives and Outcomes

Objectives state overarching expectations such as—

Students will develop effective oral communication skills.

OR

Students will understand different economic principles.

46

Compare:

Students will write effectively.

to

Students will compose a range of professional documents designed to solve problems for different audiences and purposes.

47

Compare:

Students will write effectively.

to

Students will summarize recent articles on economics and identify underlying economic assumptions.

48

Example from ACRL

ONE OUTCOME: Student examines and compares information from various sources in order to evaluate reliability, validity, accuracy, timeliness, and point of view or bias.

49

Ways to Articulate Outcomes

Adapt from professional organizations

Derive from mission of institution/program/department/service

Derive from students’ work that demonstrates interdisciplinary thinking, ways of knowing, or problem solving

50

Derive from a faculty-to-faculty interview process

Derive from an exercise focused on listing one or two outcomes “you attend to”

51

Mapping Learning Outcomes

Curricular-Co-curricular maps:
• Reveal how we translate outcomes into educational practices offering students multiple and diverse opportunities to learn

• Help us to identify appropriate times to assess those outcomes

• Identify gaps in learning or opportunities to practice

52

Help students understand our expectations of them

Place ownership of learning on students

Enable them to develop their own maps or learning chronologies

53

“Every assessment is also based on a set of beliefs about the kinds of tasks or situations that will prompt students to say, do, or create something that demonstrates important knowledge and skills. The tasks to which students are asked to respond on an assessment are not arbitrary.”

National Research Council. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.: National Academy Press, 2001, p. 47.

The Process: 2. Design or Selection of Assessment Methods

54

Assumptions Underlying Teaching

Actual Practices

Assumptions Underlying Assessment Tasks

Actual Tasks

55

What Tasks Elicit Learning You Desire?

Tasks that require students to select among possible answers (multiple choice test)?

Tasks that require students to construct answers (students’ problem-solving and thinking abilities)?

56

When Do You Seek Evidence?

Formative—along the way? For example, to ascertain progress or development

Summative—at the end? For example, to ascertain mastery level of achievement

57

Some Methods That Provide Direct Evidence

Student work samples, such as lab reports, theses, and other written work

Collections of student work (e.g., portfolios)

Capstone projects (see AAC&U work on these kinds of projects)

58

Course-embedded assessment (derive examples; develop an agreed-upon prompt and ask students to respond to it in class or at a designated time)

Observations of student behavior

Internal juried review of student projects—creations, demonstrations

Simulations—virtual or live

59

External juried review of student projects

Externally reviewed internship

Performance on a case study/problem

Performance on a case study accompanied by students’ analysis

60

Disciplinary practices

Performance on national licensure examinations or other standardized tests

Locally developed tests

Pre- and post-tests

61

The Process: 3. Development of Criteria and Standards of Judgment

A set of criteria that identifies the expected characteristics of a text and the levels of achievement along those characteristics. Scoring rubrics are criterion-referenced, providing a means to assess the multiple dimensions of student learning.

Are collaboratively designed based on how and what students learn (based on curricular-co-curricular coherence)

62

Are aligned with ways in which students have received feedback (students’ learning histories)

Students use them to develop work and to understand how their work meets standards (can provide a running record of achievement).

63

Raters use them to derive patterns of student achievement to identify strengths and weaknesses

64

Development of Scoring Rubrics

Emerging work in professional and disciplinary organizations

Research on learning (from novice to expert)

Student work

Interviews with students

65

Experience observing students’ development

66

Pilot-testing the Scoring Rubric

Apply to student work to assure you have identified all the dimensions with no overlap

Schedule inter-rater reliability times:
- independent scoring
- comparison of scoring
- reconciliation of responses
- repeat cycle

67

The Process: 4. Collection and Analysis of Results

Determine how and when to collect student responses

Determine who will analyze results (suitable for aggregation and disaggregation according to question you wish to answer)

68

The Process: 5. Interpretation of Results

Seek patterns against criteria and cohorts

Build in institutional-level and program-level discourse

Tell the story that explains the results—triangulate with other data

Determine what you wish to change, revise, or how you want to innovate

69

The Process: 6. Implement Changes and Re-Enter the Assessment Cycle

Implement agreed-upon changes

Re-assess to determine efficacy of changes

Focus on collective effort—what we do and how we do it

70

Examples of Changes:

Increased attention to weaving experiences across the institution to improve student achievement

Changes in advising based on assessment results

Closer monitoring of student achievement—tracking

71

Faculty and staff development to learn how to integrate experiences that contribute to improved student learning

Changes in pedagogy and curricular and co-curricular design

Development of modules to assist learning; use of technology; self-paced learning; supplemental learning

72

[Assessment cycle diagram: Mission/Purposes → Learning Outcomes → Gather Evidence → Interpret Evidence → Enhance teaching/learning; inform institutional decision-making, planning, and budgeting → back to Learning Outcomes. Guiding question at the center: How well do we achieve our outcomes?]

73

“What and how students learn depends to a major extent on how they think they will be assessed.”

John Biggs, Teaching for Quality Learning at University: What the Student Does. Society for Research into Higher Education & Open University Press, 1999, p. 141.

74

List of attachments:
- Checklist for outcome statements
- Dissemination of learning outcome statements
- Questions that ground discussions focused on teaching, learning, and assessment
- Inventories of educational and assessment practices
- Curricular maps
- Identification of assessment methods
- Examples of scoring rubrics
- Annual report format

75

1. Developing and Mapping Learning Outcome Statements

Course/Service/Educational Experience Outcome Statements

Institution-level Outcome Statements, including GE

Department- or Program-level Outcome Statements

76

What Is an Outcome Statement?

Describes learning desired within a context

Relies on active verbs (create, compose, calculate)

Emerges from our collective intentions over time

77

Can be mapped to curricular and co-curricular practices (ample, multiple and varied opportunities to learn over time)

Can be assessed quantitatively or qualitatively during students’ undergraduate careers

Is written for a course, program, or institution

78

Distinguishing between Objectives and Outcomes

Objectives state overarching expectations such as—

Students will develop effective oral communication skills.

OR

Students will understand different economic principles.

79

Learning Outcome Statement from ACRL

ONE OUTCOME: Student examines and compares information from various sources in order to evaluate reliability, validity, accuracy, timeliness, and point of view or bias.

80

Quantitatively Literate Graduates, according to the MAA, Should Be Able to:

1. Interpret mathematical models such as formulas, graphs, tables, and schematics, and draw inferences from them.

2. Represent mathematical information symbolically, visually, numerically, and verbally.

3. Use arithmetical, algebraic, geometric, and statistical methods to solve problems.


82

4. Estimate and check answers to mathematical problems in order to determine reasonableness, identify alternatives, and select optimal results.

5. Recognize that mathematical and statistical methods have limits. (http://www.ma.org/pubs/books/qrs.html) The Mathematical Association of America (Quantitative Reasoning for College Graduates: A Complement to the Standards, 1996). See also AMATYC draft, 2006.

83

Ethics—Students should be able to…

Identify and analyze real world ethical problems or dilemmas, and identify those affected by the dilemma.

Describe and analyze the complexity and importance of choices that are available to the decision-makers concerned with this dilemma.

84

Ways to Articulate Outcomes

Adapt from professional organizations

Derive from mission of institution/program/department/service

Derive from students’ upper-level work

85

Derive from an ethnographic process

Derive from an exercise focused on listing one or two outcomes “you attend to”

86

How well do your current outcome statements address the criteria for well-articulated outcome statements? (See attachment.)

87

Mapping Learning Outcomes

Curricular-Co-curricular maps:
• Reveal how we translate outcomes into educational practices offering students multiple and diverse opportunities to learn

• Help us to identify appropriate times to assess those outcomes

• Identify gaps in learning or opportunities to practice

88

Help students understand our expectations of them

Place ownership of learning on students

Enable them to develop their own maps or learning chronologies

89

Inventories of Practice

Provide in-depth information about how students learn along the continuum of their studies

Identify the range of educational practices and assessment experiences that contribute to learning outcomes (See handouts)

90

How will you use maps and inventories?

Discuss with your team how you will go about the process of developing a curricular or curricular-co-curricular map and how you will label people’s entries

Discuss with your team how you might use inventories

91

“Every assessment is also based on a set of beliefs about the kinds of tasks or situations that will prompt students to say, do, or create something that demonstrates important knowledge and skills. The tasks to which students are asked to respond on an assessment are not arbitrary.”

National Research Council. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.: National Academy Press, 2001, p. 47.

2. Designing or Selecting Assessment Methods

92

Assumptions Underlying Teaching

Actual Practices

Assumptions Underlying Assessment Tasks

Actual Tasks

93

What Tasks Elicit Learning You Desire?

Tasks that require students to select among possible answers (multiple choice test)?

Tasks that require students to construct answers (students’ problem-solving and thinking abilities)?

94

Approaches to Learning

Surface Learning

Deep Learning

95

When Do You Seek Evidence?

Formative—along the way? For example, to ascertain progress or development

Summative—at the end? For example, to ascertain mastery level of achievement

96

Direct Methods

Focus on how students represent or demonstrate their learning (meaning making)

Align with students’ learning and assessment experiences

Align with curricular- and co-curricular design verified through mapping

97

Invite collaboration in design (faculty, students, TAs, tutors)

98

Standardized Instruments

Psychometric approach—values quantitative methods of interpretation

History of validity and reliability

Quick and easy adoption and efficient scoring

One possible source of evidence of learning

99

Do Not Usually Provide

Evidence of strategies, processes, ways of knowing, understanding, and behaving that students draw upon to represent learning

Evidence of complex and diverse ways in which humans construct and generate meaning

Highly useful results that relate to pedagogy, curricular design, sets of educational practices

100

Authentic, Performance-based Methods

Focus on integrated learning

Directly align with students’ learning and previous assessment experiences

Provide opportunity for students to generate responses as opposed to selecting responses

Provide opportunity for students to reflect on their performance

101

Do Not Provide

Immediate reliability and validity (unless there has been a history of use)

Usually do not provide easy scoring unless closed-ended questions are used.

102

Some Options

E-Portfolios

Capstone project (mid-point and end-point)

Performances, productions, creations

Visual representations (mind mapping, charting, graphing)

103

Disciplinary or professional practices

Agreed-upon embedded assignments

Selection of assignments students hand in

Simulations (virtual or live)

104

Team-based or collaborative projects

Internships and service projects

Oral examinations

Critical incidents

105

Externally or internally juried review of student projects

Externally reviewed internship

Performance on a case study/problem

Performance on a case study accompanied by students’ analysis

106

Performance on national licensure examinations

Locally developed tests

Standardized tests

Pre- and post-tests

107

Learning Logs or Journals

Magic box—increasingly difficult problems over time

Videotaping

108

Indirect Methods

Surveys (CCSSE, for example)

Interviews

Focus groups

109

Other Sources of Information that Contribute to Your Inference Making

Grades

Participation rates in support services or in the co-curriculum

Course-taking patterns

110

Achievement in Majors

Transcript analyses

Course content analyses

Initial placement

111

Identify Methods to Assess Outcomes

Using the handout, identify both direct and indirect methods you might use to assess several of your outcomes. Determine the kinds of inferences you will be able to make based on each method.

112

3. Developing Standards and Criteria of Judgment

A set of criteria that identifies the expected characteristics of a text and the levels of achievement along those characteristics. Scoring rubrics are criterion-referenced, providing a means to assess the multiple dimensions of student learning.

Are collaboratively designed based on how and what students learn (based on curricular-co-curricular coherence)

113

Are aligned with ways in which students have received feedback (students’ learning histories)

Students use them to develop work and to understand how their work meets standards (can provide a running record of achievement).

114

Raters use them to derive patterns of student achievement to identify strengths and weaknesses

115

4. Collecting and Analyzing Results

Determine how and when to collect student responses

Determine who will analyze results (suitable for aggregation and disaggregation according to question you wish to answer)

116

5. Interpreting Results

Seek patterns against criteria and cohorts

Build in institutional-level and program-level discourse

Tell the story that explains the results—triangulate with other data

Determine what you wish to change, revise, or how you want to innovate

117

6. Implementing Changes and Re-Entering the Assessment Cycle

Implement agreed-upon changes

Re-assess to determine efficacy of changes

Focus on collective effort—what we do and how we do it

118

Examples of Changes:

Increased attention to weaving learning experiences across the institution to improve students’ achievement of outcomes

Changes in advising based on assessment results

Closer monitoring of student achievement—tracking

119

Faculty and staff development to learn how to integrate experiences that contribute to improved student learning

Changes in pedagogy and curricular and co-curricular design

Development of modules to assist learning; use of technology; self-paced learning; supplemental learning

120

[Assessment cycle diagram: Mission/Purposes → Learning Outcomes → Gather Evidence → Interpret Evidence → Enhance teaching/learning; inform institutional decision-making, planning, and budgeting → back to Learning Outcomes. Guiding question at the center: How well do we achieve our outcomes?]

121

List of Attachments:
- Questions to Ground Discussion of Teaching, Learning, and Assessment
- Checklist for Outcome Statements
- Dissemination of outcome statements
- 2 curricular maps
- Inventories of practice
- Identification of assessment methods
- Four examples of scoring rubrics
- Scoring rubric
- Format for annual reporting
