Student Learning Objectives Pilot Test: SLO Learning Goals and Quality Assessment
Aurora Public Schools, Fall 2013

Page 1

Student Learning Objectives Pilot Test

SLO Learning Goals and Quality Assessment

Aurora Public Schools, Fall 2013

Page 2

Introductions

Center for Transforming Learning and Teaching

Catalyzing and co-creating the transformation of learning environments through the use of assessment so that all are engaged in learning and empowered to positively contribute in a global society.

www.ctlt.org

Facilitator/Trainer: Julie Oxenford O’Brian

Coach/Trainer: Mary Beth Romke

[email protected] [email protected]

Page 3

Check-In

Check in with your table group:

How was your experience identifying an SLO Learning Goal?

Were you able to specify associated standards, determine the cognitive complexity, write your goal as an objective statement, and determine if the goal represents the learning needs of your students?

Did you try out engaging your students with success criteria?

Capture remaining questions about identifying SLO Learning Goal(s) on a sticky note

Page 4

Purpose of Session Two

Finalize SLO Learning Goals.

Introduce the roles of Assessment in the SLO Process and the key characteristics of Quality Assessment.

Page 5

SLO Components

Learning Goal (Day One): Learning Goal; Standards Reference; Rationale; Success Criteria

Measures (Day Two): Evidence Sources; Alignment of Evidence; Collection and Scoring

Performance Targets (Days Three and Four): Baseline Data; Performance Groups; Performance Targets; Rationale for Targets

Progress Monitoring (Days Three and Five): Check Points; Progress Monitoring Evidence Sources; Instructional Strategies

SLO Results (Day Five): Student Performance Results; Targets Met; Teacher Performance
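Taken together, these components form one structured record per SLO. Purely as an illustration (the field names below are hypothetical, not the labels on the APS SLO form), a minimal Python sketch of that structure might look like:

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical field names for illustration only; the actual SLO form defines its own labels.
@dataclass
class LearningGoal:
    statement: str                  # the goal written as an objective statement
    standards_reference: List[str]  # associated standards
    rationale: str                  # why this goal, including its DOK justification
    success_criteria: List[str]     # what success looks like for students

@dataclass
class Measures:
    evidence_sources: List[str]     # assessments used to collect evidence
    alignment_of_evidence: str      # how each source aligns with the goal
    collection_and_scoring: str     # when and how evidence is collected and scored

@dataclass
class PerformanceTargets:
    baseline_data: List[str]        # evidence from the start of the interval
    performance_groups: List[str]   # e.g., low / medium / high
    targets: Dict[str, str]         # expected growth, by performance group
    rationale_for_targets: str

@dataclass
class SLO:
    learning_goal: LearningGoal
    measures: Measures
    performance_targets: PerformanceTargets
    progress_monitoring: Dict[str, str] = field(default_factory=dict)  # check point -> evidence source
    results: Dict[str, str] = field(default_factory=dict)              # e.g., targets met, teacher performance
```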

Page 6

Materials

Page 7

Learning Outcomes: Session Two

Engage in learning activity during this session.

Complete follow-up readings and tasks.

Finalize an SLO Learning Goal including writing a rationale for the goal.

Understand the role of assessment in SLOs.

Define assessment and the key components of assessment.

Identify a variety of methods (informal and formal) for collecting data about student learning.

Describe the relationship between the method of assessment used and the information gained.

Identify baseline data sources.

Page 8

Activity: Monitoring Your Learning

Turn to Progress Monitoring (Note Catcher, p. 2-3).

Re-write today’s learning outcomes in language that has meaning for you.

Create a bar graph which describes where you currently believe you are in relationship to each learning target.

Leave the “reflections” column blank for now.

The chart has one row per learning target and these columns: I don’t know what this is; I need more practice; I’ve got it; I could teach someone about it; Reflections.

Example row:
Learning Target: Identify a variety of methods (informal and formal) for collecting data about student learning.
In my words: I can describe how to collect learning data and list several different options.

Page 9

Day Two Agenda

Quality Assessment Practice
Using Assessment for SLOs
Data Collection Methods
Goal Method Match
Finalize SLO Learning Goals
Baseline Data Sources

Page 10

SLO Learning Goal Process (Day One)

1. Identify the “big ideas” for the grade level and content area.

2. Identify learning goals associated with at least one “big idea” that would be achieved across several units and/or that have related goals in prior or subsequent grade levels. These become candidates for the SLO Learning Goal.

3. Determine which standards are associated with each candidate SLO Learning Goal.

4. Prioritize possible Learning Goals based on the learning needs of the student population (identifying two or three top priorities).

5. Determine the cognitive complexity (depth of knowledge) of the priority SLO Learning Goals. Eliminate candidates with a depth of knowledge below 3 for secondary or below 2 for elementary (see the sketch below).

6. Select the SLO Learning Goal.

7. Describe the rationale for your selection.
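Step 5 boils down to a threshold rule: keep only goals at DOK 3 or higher for secondary, DOK 2 or higher for elementary. A toy sketch of that filter, with made-up candidate goals:

```python
# Hypothetical candidate goals with assumed Depth of Knowledge (DOK) ratings.
candidates = [
    {"goal": "Summarize key details of a text", "dok": 2},
    {"goal": "Develop and support an argument with textual evidence", "dok": 3},
]

def meets_dok_floor(dok: int, level: str) -> bool:
    """Keep goals at DOK >= 3 for secondary, DOK >= 2 for elementary."""
    floor = 3 if level == "secondary" else 2
    return dok >= floor

secondary_keepers = [c for c in candidates if meets_dok_floor(c["dok"], "secondary")]
print(secondary_keepers)  # only the DOK 3 goal survives the secondary filter
```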

Page 11

SLO Rubric

A tool for evaluating the quality of SLOs.

Used by:
Teachers in the development of each SLO component.

Supervisors as they vet SLOs with teachers.

District leaders to investigate the quality of SLOs being developed.

The SLO Pilot will try out this “draft” tool.

Page 12

SLO Rubric

Consider your SLO Learning Goal.

Is your SLO Learning Goal consistent with the Learning Goal component definition on the Rubric?

Does it meet the criteria for “acceptable quality”?

Note: you should not have completed a rationale yet.

Take a few minutes to make any needed revisions to meet the “acceptable quality” criteria.

Page 13

Effective Feedback is...

Clear, descriptive, criterion-based, and indicates:
√ how the learning goal differed from that reflected in the quality criteria, and
√ how the receiver of the feedback can move forward (what they might do next to improve).

Page 14

Provide Feedback about SLO Learning Goals

Choose a partner (different grade level and/or content area).

Exchange your SLO Learning Goals with your partner.

Consider:
To what degree does her/his Learning Goal meet the acceptable quality criteria?

Is it clear how the Learning Goal relates to the identified standards?

Is the Learning Goal at an appropriate level of cognitive rigor (DOK level)?

How could the components of the Learning Goal be improved?

Share your feedback with your partner.

Page 15

SLO Learning Goal Rationale

Acceptable Quality Criteria for Rationale:

Clearly explains why the learning goal is an appropriate focus or need for students to learn.

Clearly explains how the learning goal addresses high expectations (DOK no less than 3 for secondary and no less than 2 for elementary).

Page 16

Learning Goal Rationale Outline

Justify that your SLO learning goal is at the right level (it is an educational objective).

State how cognitively complex the goal is (as measured by Depth of Knowledge) and confirm that it is at DOK 3 or above for secondary, or DOK 2 or above for elementary.

Describe the data showing that the learning goal is a need for the identified student population.

Page 17

Practice: Your Rationale

Write a rationale for your SLO Learning Goal.

Capture your rationale:
On the SLO form, or
In the note catcher for today (p. 3).

Page 18

Share your Rationale

Stand up and find someone you haven’t spoken with today.

Share your SLO Learning Goal statement and your rationale.

Provide just-in-time feedback to your partner about his/her rationale.

Make any needed revisions to your rationale.

Page 19

Day Two Agenda

Quality Assessment Practice
Using Assessment for SLOs
Data Collection Methods
Goal Method Match
Finalize SLO Learning Goals
Baseline Data Sources

Page 20

Defining Educational Assessment

What is assessment?

Write your working definition of assessment in your note catcher (p. 4).

Activating peers as resources:
Find a partner

Share your definition

Update your definition (if appropriate)

Page 21

Defining Educational Assessment

Terms used synonymously in education: assessment, educational measurement, and testing.

Educational Assessment is...
A process by which educators use students’ responses to specially created or naturally occurring stimuli to draw inferences about the students’ knowledge and skills.

A process of reasoning from evidence.

Pellegrino, J., Chudowsky, N., and Glaser, R. Eds. (2001). Knowing what students know: The science and design of educational assessment. Washington DC: National Academy Press.

Page 22

Assessment Components

1. The aspect(s) of student learning that are to be assessed (cognition).

2. The tasks used to collect evidence about students’ achievement (observation).

3. The approach used to analyze and interpret the evidence resulting from the tasks (interpretation).

Pellegrino, J., Chudowsky, N., and Glaser, R. Eds. (2001). Knowing what students know: The science and design of educational assessment. Washington DC: National Academy Press.

Page 23

Assessment Triangle

The triangle’s three corners: Cognition, Observation, and Interpretation. (Tools, p. 1)

Pellegrino, J., Chudowsky, N., and Glaser, R. Eds. (2001). Knowing what students know: The science and design of educational assessment. Washington DC: National Academy Press.

Page 24

Assessment Quality

Work with your table group to list three considerations for assessment quality.

Capture in your note catcher (p. 6).

Prepare to share your list. . .

Page 25

Characteristics of Quality Assessment

Accuracy

The assessment instrument measures what it is supposed to measure.

Consistency

Multiple data sources result in the same inferences.

Fairness (bias)

All students can access the materials in the assessment instrument and have the chance to show what they know.

Motivation

Students want to show what they know.

Instructional importance and utility

The use(s) for the results justify the investment of time and effort involved.

(Tools, p. 6)

Page 26

Testing Axioms

Turn to Testing Axioms (Tools, p. 5).

Talk with a partner about the following:
Do you agree/disagree with each axiom?

What are the implications for using externally developed tests for SLOs?

What are the implications for other classroom uses of test results? Grading?

These axioms guide most large-scale assessment development (e.g. TCAP, Interim assessments).

Page 27

Assessment Results ≠ Learning

Assessment results measure learning, but are not direct observations of learning.

All assessment instruments measure only a sample of the learning we care about.

All assessment results include “error” in their measurement of students’ learning.

Increasing assessment quality = reducing the error in our measurement of students’ learning.
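One way to picture the “error” point: if each assessment result is a student’s true level plus some noise, combining several evidence sources shrinks the noise. A small simulation with invented numbers (the slides do not specify any particular error model):

```python
import random

random.seed(1)
true_score = 70.0   # a student's (unobservable) true level on the learning goal
error_sd = 10.0     # assumed measurement error in any single assessment result

def one_result() -> float:
    """A single assessment result = true score + random error."""
    return true_score + random.gauss(0, error_sd)

single = one_result()
triangulated = sum(one_result() for _ in range(3)) / 3  # average of three evidence sources

print(f"single result:        {single:.1f}")
print(f"average of 3 results: {triangulated:.1f}")  # typically closer to the true score of 70
```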

Page 28

Quality Assessment Criteria

Select a partner and turn to the Quality Assessment Criteria.

Individually and silently read the first row of the quality criteria.

Turn to your partner and “say something” about the criteria:
A summary of what you have read.

A connection to something else.

An elaboration or explanation of what you have read.

Silently read the next row of quality criteria. Continue until you have read and “said something” about each of the quality criteria.

Page 29

Day Two Agenda

Quality Assessment Practice
Using Assessment for SLOs
Data Collection Methods
Goal Method Match
Finalize SLO Learning Goals
Baseline Data Sources

Page 30

Assessment in Student Learning Objectives

As part of the SLO Process, we use multiple evidence sources (data collected from a variety of assessment instruments) to “reason from evidence” about:

Student learning in relationship to our Learning Goal at the beginning of the instructional interval (baseline data).

Student progress towards the Learning Goal during the instructional interval (progress monitoring/formative assessment).

Student learning in relationship to our Learning Goal at the end of the instructional interval (summative assessment).

Teacher contribution to Student Learning Growth (aggregation of results across students in the class/course).
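The last bullet is an aggregation step. As a rough sketch only (the pilot’s actual scoring rules are not given in these slides), aggregation could amount to counting the share of students who met the target set for their performance group:

```python
# Hypothetical targets and students; none of these numbers come from the pilot.
targets = {"low": 60, "medium": 75, "high": 88}
students = [
    {"name": "A", "group": "low", "final": 64},
    {"name": "B", "group": "medium", "final": 72},
    {"name": "C", "group": "high", "final": 90},
    {"name": "D", "group": "medium", "final": 79},
]

met = [s for s in students if s["final"] >= targets[s["group"]]]
share_met = len(met) / len(students)
print(f"{share_met:.0%} of students met their group's target")
# How this percentage maps onto a teacher performance rating is defined by the
# district's SLO scoring rules, not by this sketch.
```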

Page 31

Levels of Objectives (Tools, p. 7)

Global objectives: broad scope; take two or more years (often many) to learn; their purpose is to provide vision; example use: planning a multi-year curriculum (e.g., elementary reading).

Educational objectives: moderate scope; take weeks, months, or an academic year to learn; their purpose is to design curriculum; example use: planning units of instruction.

Instructional objectives: narrow scope; take hours or days to learn; their purpose is to prepare lesson plans; example use: planning daily lessons, activities, experiences, and exercises.

SLO Learning Goals sit at the educational level; lesson objectives or targets sit at the instructional level.

A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives, 2001

Page 32

Definitions of Formative Assessment (Tools, p. 9)

“An assessment activity can help learning if it provides information to be used as feedback by teachers, and by their students in assessing themselves and each other, to modify the teaching and learning activities.”
Black, Harrison, Lee, Marshall & Wiliam, 2003

“Formative assessment is a planned process in which assessment-elicited evidence of students’ status is used by teachers to adjust their ongoing instructional procedures or by students to adjust their current learning tactics.”
Popham, 2008

Page 33

Formative Assessment Episode

1. Determine the learning goal/target.

2. Gather/collect information about learning (in relationship to the target(s)).

3. Analyze and interpret the gathered information about learning.

4. Use the learning information to improve teaching and/or learning.

Page 34

Formative Assessment Episodes

Learning Goal/Target → Collecting Learning Information → Analyzing Learning Information → Interpreting Learning Information → Using Learning Information

Oxenford-O’Brian, 2013; Tools, p. 11

Page 35

Summative vs. Formative Assessment (Tools, p. 15)

Summative: ranking/sorting; certifying competence; grading; accountability.

Formative: defining the learning target(s); clarifying targets with learners; collecting data about learning; analyzing and interpreting data about learning; questioning; providing useful feedback; self- and peer-assessment; setting goals and monitoring progress; planning and evaluating instruction; adjusting learning activity.

Page 36

Assessment in the SLO Form

Take out the SLO Form and SLO Component Descriptions.

Where in the form will you capture information about assessment occurring as part of the SLO Process?

Measuring and Scoring – how you will observe and interpret student learning at the end of the instructional interval.

Performance Targets/Baseline Data – how you will observe and interpret student learning at the beginning of the instructional interval.

Progress Monitoring – how you will observe and interpret student learning during the instructional interval (progress towards the Learning Goal(s)).

Results – how student learning results are aggregated into a teacher performance rating.

Page 37

Day Two Agenda

Quality Assessment Practice
Using Assessment for SLOs
Data Collection Methods
Goal Method Match
Finalize SLO Learning Goals
Baseline Data Sources

Page 38

Data Collection Methods

1. How we “collect data” determines our assessment method.

2. Use sticky notes to write down all of the strategies you currently use to collect data about student learning.

3. List as many as you can, capturing one per sticky note.

Page 39

Jigsaw Reading: Collecting Data

Select a partner and assign readings (one per person):

Evidence of Learning (Davies, 2000) – Tools, p. 21.

Assessment, Testing, Measurement and Evaluation (Russell & Airasian, 2012) – Tools, p. 27.

As you read, highlight:
Assessment Methods: descriptions of different categories of data collection techniques or sources of evidence.

Examples of strategies for each data collection method.

Share the descriptions and examples with your partner.

Page 40

Data Collection Strategies

Work as a table group.

Group your data collection strategies (sticky notes) into the following categories of data collection methods:

Observation

Questioning

Student Products

Put similar strategies together.

Page 41

Student Products – additional categories

Not all student products yield the same type of data about learning.

Additional “assessment methods” that can be part of student products include:

Selected Response

Short Constructed Response

Extended Constructed Response

Performance/Demonstration

Portfolio

Page 42

Informal vs. Formal Methods

Informal Assessment Methods:
Collected in the moment

Take less time

May or may not be planned ahead of time

Individual, small group, or full class

May or may not result in documented evidence

Observation and Questioning

Formal Assessment Methods:
Structured

Take more time

Planned in advance

Usually full class

Result in documented evidence of student learning

Student Products

Page 43

Assessment Methods Continuum (Tools, p. 33)

From informal to formal: Observation; Questioning (individual, group, full class); then Student Products: Selected Response, Short Constructed Response, Extended Constructed Response, Demonstration or Performance, and Portfolio.

Moving from the informal end to the formal end of the continuum, both the time required and the complexity of the information gained increase.

Page 44

Organizing Based on the Continuum

Sort your “student product” strategy examples into the additional categories of the continuum. You may need to clarify some of your examples.

If a strategy doesn’t fit into one of the categories, put it in an “other” category.

Turn to the Assessment Methods Continuum (Note Catcher, p. 9-10).

Make notes about assessment methods:
Clarifications about the category

Example strategies

Page 45

Day Two Agenda

Quality Assessment Practice
Using Assessment for SLOs
Data Collection Methods
Goal Method Match
Finalize SLO Learning Goals
Baseline Data Sources

Page 46

Accuracy: Alignment

Are the data we collect providing information about the learning goals we care about?

This is often referred to as “alignment”.

Alignment includes:

To what degree do the assessment tasks/items include the type of thinking/skills included in the learning goal?

To what degree do the assessment tasks include the knowledge/concepts included in the learning goal?

Are the assessment tasks as cognitively complex (DOK) as the learning goal?

Page 47

Accuracy Starts with Learning Goal

Accurate assessment depends on knowing the kind of thinking and the complexity of the thinking that is being asked of students by the learning goal or target.

Clarifying the type of thinking and cognitive complexity of learning goals/targets helps us to better select a method of assessment that measures what we’re looking for.

Page 48

This means...

Deconstruct the learning goals/targets (identifying the skills/type of thinking and the content/knowledge).

Categorize the type of thinking required by the learning goal/target (using Revised Bloom’s Taxonomy).

Establish the cognitive complexity of the learning goal/target (using the Depth of Knowledge Framework).

Remember we already did this!

Page 49

Appropriate Assessment Method(s)

Once we are clear on the thinking and depth of knowledge required by a learning goal/target (deconstructing), we can better determine what assessment methods to use to collect data about student learning in relationship to the goal/target.

Not every assessment method is equally accurate for assessing every type of goal/target.

Page 50

Aligning Learning Goals and Assessment Methods

1. Use the “Learning Goal to Method Match” blank table (Note Catcher, p. 11-12).

2. Fill in why you think each cell represents a “match” or not.

3. Compare your completed table with a partner's completed table.

4. Prepare to share out questions/conflicts.

Page 51

Cognitive Processes vs. Assessment Methods (Tools, p. 35)

Remember
Observation: only if student talk is factual. Questioning: yes, if questions are factual. Selected Response: good for assessing remembering facts. Short Constructed Response: good for assessing remembering facts. Extended Constructed Response: good for assessing conceptual knowledge. Demonstration or Performance: too time consuming, hard to distinguish specific gaps.

Understand
Observation: yes. Questioning: yes, if questions are about understanding. Selected Response: only for factual knowledge. Short Constructed Response: possibly. Extended Constructed Response: yes. Demonstration or Performance: may be difficult to distinguish specific gaps.

Apply
Observation: yes. Questioning: difficult to use for this type of thinking. Selected Response: difficult to use for this type of thinking. Short Constructed Response: possibly, may be difficult. Extended Constructed Response: yes. Demonstration or Performance: yes.

Analyze
Observation: yes. Questioning: yes, if questions are about analysis. Selected Response: possibly. Short Constructed Response: possibly. Extended Constructed Response: yes. Demonstration or Performance: yes.

Evaluate
Observation: yes. Questioning: yes, if questions are about evaluation. Selected Response: possibly. Short Constructed Response: possibly. Extended Constructed Response: yes. Demonstration or Performance: yes.

Create
Observation: yes. Questioning: no. Selected Response: no (only to assess pre-requisite knowledge). Short Constructed Response: no (only to assess pre-requisite knowledge). Extended Constructed Response: only if what is being created is a written product. Demonstration or Performance: best method (in general).
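The table above is, in effect, a lookup from a type of thinking and an assessment method to a suitability judgment. A small sketch that encodes a few of its cells (wording condensed; only a subset of the table is shown) can serve as a quick reference while filling in the Note Catcher:

```python
# A few cells transcribed (and condensed) from the Cognitive Processes vs.
# Assessment Methods table above; extend with the remaining rows as needed.
MATCH = {
    ("remember", "selected_response"): "good for assessing remembering facts",
    ("remember", "performance"): "too time consuming; hard to distinguish gaps",
    ("apply", "selected_response"): "difficult to use for this type of thinking",
    ("apply", "performance"): "yes",
    ("create", "questioning"): "no",
    ("create", "performance"): "best method (in general)",
}

def check_match(thinking: str, method: str) -> str:
    """Look up how well an assessment method fits a type of thinking."""
    return MATCH.get((thinking, method), "not encoded in this sketch")

print(check_match("apply", "selected_response"))
```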

Page 52

Depth of Knowledge vs. Assessment Methods (Tools, p. 36)

DOK 1: Recall and Reproduce
Observation: only if student talk is factual. Questioning: yes, if questions are factual. Selected Response: good for assessing remembering facts. Short Constructed Response: good for assessing remembering facts. Extended Constructed Response: yes, for reproducing procedures. Demonstration or Performance: too time consuming, hard to distinguish specific gaps.

DOK 2: Skills and Concepts
Observation: yes. Questioning: yes, depending on the questions. Selected Response: possibly. Short Constructed Response: possibly. Extended Constructed Response: yes. Demonstration or Performance: good for assessing some skills.

DOK 3: Strategic Thinking/Reasoning
Observation: yes. Questioning: yes, depending on the question. Selected Response: difficult to use for this complexity of thinking. Short Constructed Response: difficult to use for this complexity of thinking. Extended Constructed Response: yes. Demonstration or Performance: yes.

DOK 4: Extended Thinking
Observation: yes. Questioning: yes, depending on the question. Selected Response: difficult to use for this complexity of content. Short Constructed Response: difficult to use for this complexity of thinking. Extended Constructed Response: yes. Demonstration or Performance: yes.

Page 53

Activity: Practice Matching Learning Goals to Assessment Methods

1. Work with your content/grade level group.

2. Take out your SLO Learning Goal(s).

3. Use the “Learning Goal and Assessment Methods Match” table (Note Catcher, p. 11).

4. Identify the assessment method(s) you will use for your SLO Learning Goal(s) and explain why.

Page 54

Day Two Agenda

Quality Assessment Practice
Using Assessment for SLOs
Data Collection Methods
Goal Method Match
Finalize SLO Learning Goals
Baseline Data Sources

Page 55

What is baseline data?

Student learning data collected before or at the beginning of the instructional period.

Measures of student learning that relate to your SLO Learning Goal.

Could include:
TCAP results (by student) from last year for current students.
District interim/benchmark assessment results from the beginning of the year.
Results from other district- or school-wide assessments.
Results from classroom assessments.

Page 56

Why analyze baseline data?

Evaluate how much initial student performance varied at the beginning of the instructional period.

Determine if students can/should be put into more than one group based on their initial performance.

Establish a “baseline” from which student learning growth can be measured for different performance groups.

Not to establish an initial score for every student.

Page 57

What evidence do you have?

Talk with a partner...

What sources of evidence (assessment results) are available about student learning in relationship to your SLO Learning Goal from the beginning of the instructional interval?

How closely does each evidence source align with the SLO Learning Goal?

How formal was the data collection method?

When was data collected?

How was it scored?

How many evidence sources do you have?

Page 58

Triangulation

Page 59

How much baseline data?

Consider all of your evidence sources about student learning in relationship to your SLO Learning Goal, collected before or near the beginning of the instructional period.

Prioritize them.

List your top three evidence sources on the “Baseline Data” chart (Note Catcher, pg. 14).

Describe your level of confidence in your top three evidence sources.

Page 60

Analyzing Baseline Data

For each evidence source (Baseline Data handout):

1. Determine what scores or metrics are provided by the evidence source.

2. Describe the performance of the student population, or the class (e.g., 80% of students were proficient, 15% were partially proficient, and 5% were unsatisfactory).

3. Consider the range of student performance (low to high). Is the variability in student performance enough to form more than one group of students based on their performance? (A small sketch of steps 2-3 follows below.)

4. If yes, describe the performance of the groups of students (2-4 groups).
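For evidence sources that yield numeric scores, steps 2 and 3 reduce to a little arithmetic. A minimal sketch, assuming a single 0-100 score list and invented proficiency cut points:

```python
from statistics import mean

# Hypothetical baseline scores for one evidence source (0-100 scale).
scores = [42, 55, 58, 61, 63, 70, 72, 75, 81, 88]

# Step 2: describe overall performance against an assumed proficiency cut point of 70.
proficient = sum(s >= 70 for s in scores) / len(scores)
print(f"{proficient:.0%} proficient; mean = {mean(scores):.1f}")

# Step 3: look at the spread to decide whether more than one group is warranted.
spread = max(scores) - min(scores)
print(f"range = {spread} points")
if spread > 20:  # a judgment call for this sketch, not a district rule
    print("performance varies enough to consider 2-4 performance groups")
```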

Page 61

Combining Evidence Sources

In your Note Catcher (p. 14):

Identify the number of performance groups you will have.

Identify a label for each (e.g. low, medium, high).

Describe student performance for each evidence source by performance group.

Create a combined description of student performance for each performance group.

Page 62

Assign Students to Performance Groups

The simplest case, one performance group:
Student performance does not vary.
Baseline student performance can be characterized for the student population as a whole.

More than one performance group:
Assign students (by name) to each performance group.
How will you assign students for whom performance was inconsistent across evidence sources?
How will you assign students for whom not all baseline data are available? (One simple approach is sketched below.)
Use the Performance Group Descriptions chart (Note Catcher, p. 16).
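One simple way to handle the two questions above: rescale each evidence source to a common scale, average whatever a student has, apply cut points, and flag students with no baseline data for a classroom pre-assessment. A minimal sketch with invented data and cut points (the pilot does not prescribe this approach):

```python
from statistics import mean

# Hypothetical baseline evidence: each source rescaled to 0-100; None = not available.
baseline = {
    "Ana":  {"tcap": 62, "interim": 70, "classroom": 66},
    "Ben":  {"tcap": 88, "interim": 55, "classroom": None},      # inconsistent sources
    "Cari": {"tcap": None, "interim": None, "classroom": None},  # no baseline data yet
}

def assign_group(sources: dict) -> str:
    """Average the available evidence and map it to a performance group."""
    available = [v for v in sources.values() if v is not None]
    if not available:
        return "needs baseline assessment"  # collect classroom evidence first
    composite = mean(available)             # smooths out inconsistent sources
    if composite < 60:
        return "low"
    if composite < 80:
        return "medium"
    return "high"

for student, sources in baseline.items():
    print(student, "->", assign_group(sources))
```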

Page 63

Before we see you again...

Identify appropriate assessment methods for your SLO Learning Goal example.

Bring at least one example instrument (if available) for the content area for your example learning goal(s).

Page 64

Reflect and Consider your Learning

Return to your Progress Monitoring (Note Catcher).

Did you move to the right in your self-assessment? Add to your graph.

Make any notes about your own learning in the “reflections” column.

Page 65

Give Us Feedback!

Oral: Share one “aha!”

Written: Use sticky notes.
+ Aspects of this session that you liked or that worked for you.
Things you will change in your practice or that you would change about this session.
? Questions that you still have or things we didn’t get to today.
Ideas, aha’s, innovations.

Leave your written feedback on the parking lot.