
Student Learning Objectives Assessment Writing Institute July 16-20, 2012


Page 1: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

STUDENT LEARNING OBJECTIVES ASSESSMENT WRITING INSTITUTE

JULY 16-20, 2012

Monroe 2 – Orleans BOCES

Page 2: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Agenda

• Setting the Context
  – Overview of Reform Agenda
  – What are SLOs?
  – Who Needs an SLO?

• Assessment Development
  – Review of NYS Test Development Process
  – Regional Test Development Process
  – Choosing an Item Format
  – How to Write Multiple Choice Questions
  – How to Write Constructed Response Questions

Page 3: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Agenda (continued)

  – Overview of Assessment Platform (LinkIt!)
  – Item Review Guidelines
  – Test Administration
  – Test Validation

• Review of Format for Subject Area Writing Sessions
  – Set Expectations
  – Overview of Assessment Platform (LinkIt!)

Page 4: Student Learning Objectives Assessment Writing Institute July 16-20, 2012


Regents Reform Agenda

Goal: College and Career Ready Students, supported by Highly Effective Teachers and Highly Effective School Leaders

• Implementing Common Core standards and developing curriculum and assessments aligned to these standards to prepare students for success in college and the workplace

• Building instructional data systems that measure student success and inform teachers and principals how they can improve their practice in real time

• Recruiting, developing, retaining, and rewarding effective teachers and principals

• Turning around the lowest-achieving schools

Page 5: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Components of the New APPR (as of 3/5/12; subject to revision)

• 20-25%: State-provided growth measures or Student Learning Objectives (SLOs)
  – Student Learning Objectives (for State non-tested subjects)

• 15-20%: Locally-selected measures of student growth or achievement
  – State-test based measures if they are different from growth measures
  – List of state-approved 3rd party tests
  – District, regional, or BOCES-developed assessments
  – School-wide measures

• 60%: Other Measures: majority of points from multiple observations (teachers) or visits plus surveys and records (principals)
  – Teacher: Individual/Peer Observation*, Student/Parent Feedback*, Student Work*, Teacher Artifacts*
  – Principal: other options include goals around teacher effectiveness, learning environment, or academic achievement

* Please refer to the Summary of Revised APPR Revisions 2012-13: http://engageny.org/wp-content/uploads/2012/03/nys-evaluation-plans-guidance-memo.pdf
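To make the point allocation concrete, here is a minimal Python sketch of how the three subcomponents could roll up into a 100-point composite, assuming the 20/20/60 split shown above. The function name and ranges are illustrative only; the actual HEDI scoring bands are set by regulation and local agreement.

```python
# Illustrative only: sum the three APPR subcomponents into a 0-100 composite.
# Assumes a 20/20/60 split; districts may use 25/15/60 instead.

def composite_score(growth_points: float,
                    local_points: float,
                    other_points: float) -> float:
    """growth: 0-20, local: 0-20, other (observations etc.): 0-60."""
    assert 0 <= growth_points <= 20, "growth subcomponent is 0-20 points"
    assert 0 <= local_points <= 20, "local subcomponent is 0-20 points"
    assert 0 <= other_points <= 60, "other measures are 0-60 points"
    return growth_points + local_points + other_points

print(composite_score(18, 16, 55))  # -> 89
```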

Page 6: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Key Messages for Student Learning Objectives

• SLOs name what students need to know and be able to do by the end of the year.

• SLOs place student learning at the center of the conversation.

• SLOs are a critical part of all great educators’ practice.

• SLOs are an opportunity to document the impact educators make with students.

Page 7: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Key Messages for SLOs (continued)

• SLOs provide principals with critical information that can be used to differentiate and target professional development, and focus supports for teachers.

• The SLO process encourages collaboration within school buildings.

• School leaders are accountable for ensuring all teachers have SLOs that will support their district and school goals.

Page 8: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

SLO Framework

All SLOs MUST include the following basic components:

• Student Population: Which students are being addressed?

• Learning Content: What is being taught? CCSS/National/State standards? Will this goal apply to all standards applicable to a course or just to specific priority standards?

• Interval of Instructional Time: What is the instructional period covered (if not a year, rationale for semester/quarter/etc.)?

• Evidence: What assessment(s) or student work product(s) will be used to measure this goal?

• Baseline: What is the starting level of learning for students covered by this SLO?

• Target(s): What is the expected outcome (target) by the end of the instructional period?

• HEDI Criteria: How will evaluators determine what range of student performance “meets” the goal (effective) versus “well-below” (ineffective), “below” (developing), and “well-above” (highly effective)?

• Rationale: Why choose this learning content, evidence, and target?
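Since the HEDI criteria component is where most of the judgment lives, here is a minimal sketch, with invented cut points, of how a district might map the share of students meeting an SLO target onto the four HEDI categories; the real thresholds are negotiated locally in each approved APPR plan.

```python
# Illustrative HEDI mapping. The threshold values below are invented for
# the example; actual cut points come from the district's APPR plan.

def hedi_rating(pct_meeting_target: float) -> str:
    """Map the percentage of students meeting the SLO target to a rating."""
    if pct_meeting_target >= 90:
        return "Highly Effective"  # well-above the goal
    elif pct_meeting_target >= 75:
        return "Effective"         # meets the goal
    elif pct_meeting_target >= 60:
        return "Developing"        # below the goal
    else:
        return "Ineffective"       # well-below the goal

print(hedi_rating(82))  # -> Effective
```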

Page 9: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Who needs an SLO?

• Teacher 1: Those who have a State-provided growth measure and are not required to have an SLO.

• Teacher 2: Those who have a State-provided growth measure, and yet are required to have an SLO because less than 50% of their students are covered by the State-provided growth measure.

• Teacher 3: Those who are required to have an SLO and do not have a State-provided growth measure.
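The Teacher 2 case turns on a coverage threshold, so a small sketch can make it concrete. The function below implements the 50% rule as described on this slide, with hypothetical enrollment numbers; NYSED's "Required SLOs: Reference Guide" remains the authoritative source.

```python
# Sketch of the 50% coverage rule. Inputs are hypothetical.

def needs_slo(students_total: int, students_on_state_growth: int) -> bool:
    """True if the teacher must also write an SLO."""
    if students_on_state_growth == 0:
        return True  # Teacher 3: no State-provided growth measure at all
    coverage = students_on_state_growth / students_total
    return coverage < 0.5  # Teacher 2 if True; Teacher 1 if False

# e.g., a 7th grade Math/Science teacher with 60 of 130 students covered
# by the State growth measure in Math:
print(needs_slo(130, 60))  # -> True (coverage is about 46%)
```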

Page 10: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Required SLOs: Reference Guide

Please see the “Required SLOs: Reference Guide” for NYSED’s rules for teachers who have SLOs for State Growth.

Page 11: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Test Your Knowledge: State Provided Growth Measure or SLO?

For each teacher below, decide: State-provided growth measure or SLO?

• 5th Grade Common Branch Teacher

• 8th Grade ELA Teacher

• Elementary Art Teacher: two 2nd grade Art sections with 20 students each; two 4th grade Art sections with 25 students each; one 5th grade Art section with 30 students

• 7th Grade Math and Science Teacher: two 7th grade Math sections with 30 students each; two 7th grade Science sections with 25 students each; one Advanced 7th grade Science section with 20 students

• High School CTE Teacher: 150 students across 5 sections of Agricultural Science (all use the same final assessment)

• 8th Grade Science Teacher: one 8th grade Science section with 30 students; four 8th grade Advanced Science sections with 28 students each

Page 12: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Test Your Knowledge: Answers

• 5th Grade Common Branch Teacher: State-provided growth (SGP/VA)

• 8th Grade ELA Teacher: State-provided growth (SGP/VA)

• Elementary Art Teacher: SLO
  – 1 SLO for 4th grade Art sections
  – 1 SLO for 2nd grade Art sections

• 7th Grade Math and Science Teacher: SLO
  – 1 SLO for 7th grade Math (will receive State-provided growth SGP)
  – 1 SLO for 7th grade Science

• High School CTE Teacher: SLO
  – 1 SLO for Agricultural Science sections

• 8th Grade Science Teacher: SLO
  – 1 SLO for 8th grade Advanced Science sections

Page 13: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Test Development: NYS Process

Page 14: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Regional Test Development Process

• Review of Test Specifications: item formats (MC/CR), numbers, and item coding strategy (aligned to NY CCLS & Depth of Knowledge, e.g., Bloom)
• Write items aligned to Test Specs
• Review items: item coding; item structure; style; & bias/sensitivity review
• Accept/Reject/Revise items
• Input items into LinkIt

Page 15: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Regional Test Development Process (continued)

• Design Test Forms
• Create Test Form Item Map (the combination of Test Specifications and Item Maps creates the formal Test Blueprint)
• Select/Design Rubrics for CR Items
• Create Uniform Administration Protocols
• Collect Data for Validation
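Because the blueprint is just the test specifications cross-referenced with the item map, a simple consistency check can catch gaps early. The sketch below uses invented standard codes and counts; it only illustrates the idea of verifying that every standard in the spec is represented by the planned number of items.

```python
# Hypothetical blueprint check: does the item map satisfy the spec?
from collections import Counter

# Test specification: standard code -> number of items required
spec = {"RL.8.1": 4, "RL.8.2": 3, "RI.8.4": 3}

# Item map: one entry per item on the form, tagged with its standard
item_map = ["RL.8.1", "RL.8.1", "RL.8.2", "RI.8.4", "RL.8.1",
            "RL.8.2", "RI.8.4", "RL.8.1", "RL.8.2", "RI.8.4"]

counts = Counter(item_map)
for standard, required in spec.items():
    actual = counts.get(standard, 0)
    status = "OK" if actual == required else f"MISMATCH (have {actual})"
    print(f"{standard}: need {required} -> {status}")
```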

Page 16: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Choosing An Item Format

• The most efficient and reliable way to measure knowledge is with MC formats.

• The most direct way to measure a skill is via performance, but many mental skills can be tested via MC with a high degree of proximity (the statistical relation between CR and MC items of an isolated skill). If the skill is critical to the ultimate interpretation, CR is preferable to MC.

Page 17: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Choosing An Item Format

• When measuring a fluid ability or intelligence, the complexity of such human traits favors CR item formats of complex nature (high-inference) (Haladyna, 1999).

Page 18: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Choosing An Item Format: Conclusions about Criterion Measurement

Criterion: Conclusion about MC & CR

• Declarative Knowledge: Most MC formats provide the same information as essay, short-answer, or completion formats.

• Critical Thinking: MC formats involving vignettes or scenarios provide a good basis for forms of critical thinking. The MC format has good fidelity to the more realistic open-ended behavior elicited by CR.

(Haladyna, 1999)

Page 19: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Choosing An Item Format: Conclusions about Criterion Measurement

Criterion: Conclusion about MC & CR

• Problem Solving Ability: MC item sets provide a good basis for testing problem solving. However, more research is still needed.

• Creative Thinking Ability: The MC format is limited here.

• School Abilities (e.g., writing, reading, & mathematics): Performance has the highest fidelity to criterion for these school abilities. MC is good for measuring foundational aspects of fluid abilities, such as declarative knowledge or knowledge of skills.

(Haladyna, 1999)

Page 20: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Best Practices for Designing and Grading Assessments

• Read Best Practices for Designing and Grading Exams (2005) and share three things you learned with your table group.

Page 21: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Conventional Multiple Choice Questions

Three Parts: stem, correct answer, and distractors (Haladyna, 1999).

● Stem: Stimulus for the response; it should provide a complete idea of the problem to be solved in selecting the right answer. The stem can also be phrased in a partial-sentence format. Whether the stem appears as a question or a partial sentence, it can also present a problem that has several right answers with one option clearly being the best of the right answers.

Page 22: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Conventional Multiple Choice Questions

Three Parts: stem, correct answer, and distractors

• Correct Answer: the one and only right answer; it can be a word, phrase, or sentence.

• Distractors: the wrong answers. Each distractor must be plausible to test-takers who have not yet learned the content the item is supposed to measure. To those who have learned the content, the distractors are clearly wrong choices. Distractors should resemble the correct choice in grammatical form, style, and length. Subtle or blatant clues that give away the correct choice should be avoided.
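A minimal sketch of this three-part anatomy as a data structure may help when drafting items in bulk; the class and field names below are invented for illustration and are not the LinkIt! item schema.

```python
# Hypothetical representation of a conventional MC item: stem, key, distractors.
from dataclasses import dataclass, field

@dataclass
class MCItem:
    stem: str                                         # question or partial sentence
    key: str                                          # the one and only right answer
    distractors: list = field(default_factory=list)   # plausible wrong answers

    def options(self) -> list:
        """All answer choices: key plus distractors (shuffle before use)."""
        return [self.key] + self.distractors

item = MCItem(
    stem="Which process do plants use to convert sunlight into chemical energy?",
    key="Photosynthesis",
    distractors=["Respiration", "Transpiration", "Fermentation"],
)
print(item.options())
```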

Page 23: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Conventional Multiple Choice Questions

Design: Review MC question types on State Assessments and other large-scale assessments in use, e.g., AP. Review question formats for various grades and subjects. Also review the Common Core Sample Questions at http://www.p12.nysed.gov/apda/common-core-sample-questions/

• Review the Haladyna handout, Guidelines for MC Item Writing, and discuss with a partner.

• Read How Can We Construct Good Multiple-Choice Items? (Cheung & Bucat, 2002) and review with a partner.

Page 24: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Steps to Writing Multiple Choice Questions

• Using the content objectives, narrow the focus of the potential question.
  – Is it about the main idea? (easier)
  – Is it regarding a significant detail? (more difficult)
  – Is it inferential? (very challenging)

• Write the stem (question). It should be:
  – A complete concept
  – Clear and concise
  – Reflective of the main idea or a significant detail
  – Focused so that each item assesses only one standard

Page 25: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Steps to Writing Multiple Choice Questions (continued)

• The stem should not:
  – Be obvious or answerable with common prior knowledge
  – Be dependent on one word
  – Be written in the negative
  – Include “all of the above,” “none of the above,” or “a or b”

Page 26: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Steps to Writing Multiple Choice Questions (continued)

• For multiple choice responses (see the sketch below):
  – Develop 4 responses
  – Write the correct answer first (the key)
  – Scan the text for possible distractors and develop three wrong answers
  – All responses should be parallel in construction
  – Responses should be equal in length (or two shorter, two longer)
  – Responses should be phrased positively
  – Responses should be mutually exclusive
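Several of these rules are mechanical enough to check automatically. The sketch below is an illustrative linter, not an official tool, covering the four-response count, mutual exclusivity, and roughly comparable option lengths.

```python
# Illustrative checks for a drafted MC option set.

def check_mc_options(key: str, distractors: list) -> list:
    """Return a list of rule violations for key + distractors."""
    problems = []
    options = [key] + distractors
    if len(options) != 4:
        problems.append("item should have exactly 4 responses")
    if len(set(options)) != len(options):
        problems.append("options are not mutually exclusive (duplicates)")
    lengths = [len(o) for o in options]
    if max(lengths) > 2 * min(lengths):
        problems.append("option lengths vary widely; keep them comparable")
    return problems

print(check_mc_options("Photosynthesis",
                       ["Respiration", "Transpiration", "Fermentation"]))
# -> [] (no violations)
```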

Page 27: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

High & Low-Inference Constructed Response Questions

Key Components of CR Items:
• Task: a specific item, problem, question, prompt, or assignment
• Response: any kind of performance to be evaluated, including short/extended answer, essay, presentation, & demonstration
• Rubric: the scoring criteria used to evaluate responses
• Scorers: the people who evaluate responses

(ETS, 2005)

Page 28: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

High & Low-Inference Constructed Response Questions

• Type of Behavior Measured: high-inference items usually measure abstract behavior, the kind most valued in education; low-inference items usually measure concrete behavior.

• Ease of Construction: high-inference item design is complex, involving a command or question, conditions for performance, and rubrics; low-inference design is less involved, with a command or question, conditions for performance, and a simple mechanism.

• Cost of Scoring: high-inference scoring involves training and is expensive; low-inference scoring is not as complex, but costs can still be high.

Page 29: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

High & Low-Inference Constructed Response Questions

• Reliability: for high-inference items, reliability can be a problem due to inter-rater reliability; low-inference results can be very reliable due to the concrete nature of the items.

• Objectivity: high-inference scoring can be subjective; low-inference scoring is more objective.

• Bias (Systematic Error): high-inference scoring carries possible threats to validity from over- or under-rating; low-inference scoring seldom yields biased observations.

(Haladyna, 1999)
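Because inter-rater reliability is the main risk for high-inference items, it is worth computing during scorer training. The sketch below, using invented rubric scores and only the standard library, computes exact percent agreement and Cohen's kappa (which corrects for chance agreement) for two raters.

```python
# Illustrative inter-rater agreement for two scorers of CR responses.
from collections import Counter

rater_a = [3, 2, 4, 1, 3, 2, 4, 3, 2, 1]  # hypothetical rubric scores
rater_b = [3, 2, 3, 1, 3, 2, 4, 2, 2, 1]

n = len(rater_a)
agree = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement: product of each rater's marginal proportions per score
ca, cb = Counter(rater_a), Counter(rater_b)
p_chance = sum((ca[s] / n) * (cb[s] / n) for s in set(rater_a) | set(rater_b))

kappa = (agree - p_chance) / (1 - p_chance)
print(f"percent agreement = {agree:.2f}, Cohen's kappa = {kappa:.2f}")
# -> percent agreement = 0.80, Cohen's kappa = 0.73
```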

Page 30: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Steps to Writing Constructed Response Questions

• For short response questions:
  – Measure a targeted reasoning skill
  – Specify the task clearly
  – Make sure the question can be answered in the allotted time
  – Avoid choices among several questions
  – Measure higher-order thinking skills (upper levels of Bloom’s Taxonomy)

Page 31: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Assessment Platform: LinkIt!

Page 32: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

What is LinkIt?

• LinkIt! is an assessment and data management platform
• It is the tool we will use to design and store our regional assessments
• An NWEA item bank is available for ELA, Math, Science, and Social Studies

Page 33: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Process for LinkIt

• Item Banks and Test Banks are already organized by grade level/content area/course.

• One person from each course will have access to input the assessment questions during the institute. (Training will be provided)

• Assessments will be reviewed prior to giving districts access.

Page 34: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Item Review Guidelines

• Requires content experts (including General Education, Special Education, and ELL expertise)

• Item Coding: Content (aligned to the correct Learning Standard; see the Common Core Exemplars) & Cognition (e.g., Bloom’s Taxonomy; Webb’s Depth of Knowledge)
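As an illustration of what such coding might look like in practice, here is a hypothetical item-metadata record capturing both dimensions; the class, field names, and codes are invented, not a LinkIt! or NYSED schema.

```python
# Invented item-coding record: content standard plus cognitive level.
from dataclasses import dataclass

@dataclass
class ItemCoding:
    item_id: str
    standard: str      # content: e.g., a CCLS code such as "RL.8.1"
    bloom_level: str   # cognition: e.g., "Analyze"
    dok_level: int     # cognition: Webb's Depth of Knowledge, 1-4

coding = ItemCoding(item_id="ELA8-017", standard="RL.8.1",
                    bloom_level="Analyze", dok_level=3)
assert 1 <= coding.dok_level <= 4, "Webb's DOK levels run from 1 to 4"
print(coding)
```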

Page 35: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Bloom’s Taxonomy

Page 36: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Bloom’s Taxonomy

Page 37: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Webb’s Depth of Knowledge

• Level 1 (Recall): recall of a fact, information, or procedure

• Level 2 (Skill/Concept): use of information or conceptual knowledge, two or more steps, etc.

• Level 3 (Strategic Thinking): requires reasoning, developing a plan or a sequence of steps; some complexity; more than one possible answer

• Level 4 (Extended Thinking): requires an investigation, time to think, and processing multiple conditions of the problem

(www.wcer.wisc.edu)

Page 38: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Webb’s Depth of Knowledge

Page 39: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Webb’s Depth of Knowledge

Page 40: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Item Review Guidelines

● Item Structure (MC: stem, correct answer, & plausible distractors; CR: a clearly identified task, content and verb, e.g., analyze, discuss, & rubrics)

● Editing (conventions of Standard Written English)

● Bias/Sensitivity Review: Joint Standards 7.4

Page 41: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Item Review Guidelines

Joint Standards 7.4: “Test developers should strive to identify and eliminate language, symbols, words, phrases, and content that are generally regarded as offensive by members of racial, ethnic, gender, or other groups, except when judged to be necessary for adequate representation of the domain.”

Page 42: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Item Review Guidelines

Joint Standards 7.4: Two Issues

• Inadvertent use of language that, unknown to the test developer, has a different meaning or connotation in one subgroup than in others.

• Settings in which sensitive material is essential for validity; e.g., history tests may include material on slavery or the Nazis, and life sciences may test on evolution.

Page 43: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Test Administration

Need to create Administration Manuals to establish & document testing protocols.

• Recommendation: use the SED guides as the primary reference; review guides for AP and other large-scale assessment programs as templates.

• Link To SED Manuals: http://www.p12.nysed.gov/apda/manuals/

• Link To AP: http://professionals.collegeboard.com/testing/ap/test-day/instructions

Page 44: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Test Validation: Data Collection & Analysis

• Answer Sheet Design & Scanning Procedures (work with BOCES & the RIC)

• Depth of data collection: student demographics, scores, and item-level data, if possible

• Recommendation: collect data to parallel Title I disaggregation

• Evaluate/Revise

• Generate trend data

• Review against other data to verify & audit rigor

• Create Local Technical Manuals
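If item-level data can be collected, two classical statistics are a natural starting point for validation: item difficulty (the p-value, or proportion answering correctly) and item discrimination (the point-biserial correlation between an item and the total score). The sketch below uses invented response data and the simple uncorrected point-biserial; it needs Python 3.10+ for statistics.correlation.

```python
# Illustrative item analysis on invented 0/1 response data.
import statistics

# Rows: students; columns: items (1 = correct, 0 = incorrect)
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]

totals = [sum(row) for row in responses]  # each student's total score

for j in range(len(responses[0])):
    item = [row[j] for row in responses]
    p_value = statistics.mean(item)              # difficulty
    r_pb = statistics.correlation(item, totals)  # discrimination
    print(f"item {j + 1}: p = {p_value:.2f}, point-biserial = {r_pb:.2f}")
```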

Page 45: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Let’s Get Started! How to Write Test Specifications

• Identify the standards to be addressed (be sure to include the Common Core shifts and standards).

• For courses with a NYS exam (past or present), review the NYSED percentages tested for each standard identified. For all others, determine percentages as a group. (A sketch of turning percentages into question counts follows below.)
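The sketch below, with invented standards and weights, shows the arithmetic of converting per-standard percentages into question counts for a fixed-length test; in practice the percentages come from NYSED's published data or group consensus, as described above.

```python
# Hypothetical allocation of questions to standards from percentage weights.
weights = {"RL.8.1": 0.40, "RL.8.2": 0.35, "RI.8.4": 0.25}
total_questions = 20

assert abs(sum(weights.values()) - 1.0) < 1e-9, "percentages must total 100%"

allocation = {std: round(w * total_questions) for std, w in weights.items()}
print(allocation)  # -> {'RL.8.1': 8, 'RL.8.2': 7, 'RI.8.4': 5}
```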

Page 46: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Let’s Get Started! How to Write Test Specifications

• Determine the types of questions for the exam and the number of questions per item format.

• Pre-assessments should be only 40 minutes total in length (they can be administered over multiple sessions if needed).

Page 47: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

Process and Procedures for Writing

In Subject Area Groups:
• Review and/or develop, and finalize, test specifications
• Determine the process for writing items
• Make sure the items you generate are aligned to NYS standards (CCLS and shifts)
• Review items per the guidelines
• Develop the Test Blueprint
• Select/Design Rubrics for CR Items
• Create Uniform Administration Protocols

(See Slides 18-19)

Page 48: Student Learning Objectives Assessment Writing Institute July 16-20, 2012

QUESTIONS?