District Determined Measures Diman Regional Vocational School
Dr. Deborah Brady
Goals for Today: By the end of this session,
• You will understand what needs to be done and be able to explain it to your colleagues
• You will have tools to begin to do that work in your district
Please email questions or confusions at any time during this process: [email protected]. Materials from this presentation: http://tinyurl.com/lumqld7
The Steps Necessary to Get Ready for June Report and After
Developing and Piloting Assessments
• Adapting present assessments
• Creating new assessments
• Writing to text for HS

Assessing Quality and Rigor
• Alignment of content
• Rigorous and appropriate expectations
• Approval of assessments

Piloting
• Security
• Calibration of standards and of assessors
• Rubric quality
• Analysis of results: High-Moderate-Low growth

June Report
• 2 DDMs per educator

2015 Full Implementation
• Directions for teachers
• Directions for students
• Organizing for the actual assessments
• Storing and tracking the information

Interpreting the Results: Student Impact
• Data storage
• Data analysis
• L-M-H growth
• Roster verification
• Data team time
Living Likert
Take a magic marker. Review all 6 stages. After considering each one, go to your “stage.” Consider:
What are your school’s barriers? What are your district’s strengths?
The DESE Requirements: Purpose, timeline, requirements, direct and indirect assessments
District Determined Measures
DEFINITION: DDMs are defined as “measures of student learning, growth, and achievement related to the Curriculum Frameworks, that are comparable across grade or subject level district-wide”
TYPES OF MEASURES
• Portfolio assessments
• Approved commercial assessments
• District-developed pre- and post- unit and course assessments
• Capstone projects
Timeline for Piloting and Full Implementation
2013-2014: District-wide training, development of assessments, and piloting. June 2014 report: all educators in the district have 2 DDMs to be implemented fully in SY2015.
2014-2015: All DDMs are implemented; scores are divided into High, Moderate, and Low and stored locally.
2015-2016: Second-year data is collected, and all educators receive an impact rating, based on 2 years of data for two DDMs, that is sent to DESE.
District Determined Measures Regulations
Every educator will need data from at least 2 different measures
Trends must be measured over a period of at least 2 years
One measure must be taken from state-wide testing data such as MCAS if available (grades 4-8 ELA and math SGP for classroom educators)
One measure must not be MCAS; it must be a District Determined Measure, which can include local assessments and normed assessments (Galileo, DRA, MAP, SAT)
NEW! Determining Educator Impact on Each DDM
Evaluator and educator meet. Evaluator determines whether students demonstrated high, moderate, or low growth on each DDM.
Evaluator shares the resulting designations of student growth with educator.
Educators confirm rosters. Students must be on the roster by 10/1 and remain on the roster through the last day of testing, and must be present for 90% of instructional time.
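To make those roster rules concrete, here is a minimal Python sketch; the record fields and function name are illustrative assumptions, not a DESE data format.

```python
from datetime import date

def counts_for_impact_rating(enrolled_on: date, left_on: date | None,
                             days_present: int, instructional_days: int,
                             roster_cutoff: date, last_test_day: date) -> bool:
    """Apply the two roster rules above: on the roster by the cutoff (10/1)
    and through the last day of testing, and present for at least 90% of
    instructional time. Field names are hypothetical."""
    stayed = left_on is None or left_on >= last_test_day
    on_roster = enrolled_on <= roster_cutoff and stayed
    attended = days_present / instructional_days >= 0.90
    return on_roster and attended

# Example: enrolled September 2, never left, present 171 of 180 days.
print(counts_for_impact_rating(date(2014, 9, 2), None, 171, 180,
                               date(2014, 10, 1), date(2015, 6, 5)))  # True
```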
Performance & Impact Ratings
Performance Rating: Ratings are obtained through data collected from observations, walk-throughs, and artifacts. Levels: Exemplary, Proficient, Needs Improvement, Unsatisfactory.
Impact Rating: Ratings are based on trends and patterns in student learning, growth, and achievement over a period of at least 2 years, with data gathered from DDMs and state-wide testing. Levels: High, Moderate, Low.
NEW! Determining a Student Impact Rating
Introduces the application of professional judgment to determine the Student Impact Rating.
• The evaluator assigns the rating using professional judgment.
• The evaluator considers designations of high, moderate, or low student growth from at least two measures in each of at least two years.
• If the rating is low, the evaluator meets with the educator to discuss it; if the rating is moderate or high, the evaluator and educator decide whether a meeting is necessary.
Student Impact Rating Determines Plan Duration for PST
(not future employment)
Summative Rating (rows) vs. Rating of Impact on Student Learning (columns: Low, Moderate, High):
• Exemplary: Low = 1-yr Self-Directed Growth Plan; Moderate or High = 2-yr Self-Directed Growth Plan
• Proficient: Low = 1-yr Self-Directed Growth Plan; Moderate or High = 2-yr Self-Directed Growth Plan
• Needs Improvement: Directed Growth Plan (any impact rating)
• Unsatisfactory: Improvement Plan (any impact rating)
Massachusetts Department of Elementary and Secondary Education
NEW! Intersection of Ratings: Reinforces the independent nature of the two ratings.
Exemplary or Proficient matched with Moderate or High = 2-Year Self-Directed Growth Plan
Exemplary/Moderate and Exemplary/High = recognition and rewards, including leadership roles, promotions, additional compensation, public commendation, and other acknowledgements.
Proficient/Moderate and Proficient/High = eligible for additional roles, responsibilities, and compensation.
Exemplary or Proficient matched with Low = 1-Year Self-Directed Growth Plan. The evaluator’s supervisor confirms the rating; the educator and evaluator analyze the discrepancy, which may affect Educator Plan goals.
Student Impact Rating informs the self-assessment and goal setting processes.
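Read as a lookup, the plan-duration matrix above is small enough to encode directly. A sketch (the labels are the slides’; the function itself is illustrative):

```python
# Plan duration from the intersection of the two ratings, per the matrix above.
def plan_for(summative: str, impact: str) -> str:
    if summative == "Unsatisfactory":
        return "Improvement Plan"
    if summative == "Needs Improvement":
        return "Directed Growth Plan"
    # Exemplary or Proficient:
    if impact == "Low":
        return "1-yr Self-Directed Growth Plan"  # supervisor confirms the rating
    return "2-yr Self-Directed Growth Plan"      # Moderate or High impact

print(plan_for("Proficient", "High"))  # 2-yr Self-Directed Growth Plan
```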
Indirect Measures Indirect measures of student learning, growth, or achievement provide information about students from means other than student work.
These measures may include student record information (e.g., grades, attendance or tardiness records, or other data related to student growth or achievement such as high school graduation or college enrollment rates).
To be considered for use as DDMs, a link (relationship) between indirect measures and student growth or achievement must be established.
ESE recommends that at least one of the measures used to determine each educator’s student impact rating be a direct measure.
Indirect Measure Examples: Consider the SST process for a team:
• High school SST team example
• RTI team example
• High school guidance example
• Subgroups of students can be studied (school psychologist group example)
• Social-emotional growth is appropriate (autistic/behavioral program example)
• Number of times each student says hello to a non-classroom adult on his or her way to gym or class (direct)
• Number of days (or classes) a student with school anxiety participates
• Assess level of participation in a class (direct)
• Improve applications to college
• IEP goals can be used as long as they are measuring growth (academic or social-emotional)
Turn and Talk: Time to debrief and review the “rules”
Using the 6-phase overview, what are your priorities?
[Six-phase overview, repeated: Developing and Piloting Assessments; Assessing Quality and Rigor; Piloting; June Report; 2015 Full Implementation; Interpreting the Results: Student Impact]
Assessment Quality Requirements and Definitions from DESE
Alignment, Rigor, Comparability, “Substantial,” Modifications
What are the requirements?
1. Is the measure aligned to content?
• Does it assess what is most important for students to learn and be able to do?
• Does it assess what the educators intend to teach?
Bottom line: the “substantial” content of the course; at least 2 standards (ELA: reading/writing; math: a unit exam). Not necessarily a “final” exam (unless it is a high-quality exam).
2. Is the measure informative?
• Do the results of the measure inform educators about curriculum, instruction, and practice?
• Does it provide valuable information to educators about their students?
• Does it provide valuable information to schools and districts about their educators?
Bottom line: time to analyze is essential.
Two Considerations for Local DDMs
1. Comparable across schools. Example: teachers with the same job (e.g., all 5th grade teachers).
• Where possible, measures are identical; identical measures are easier to compare.
• Do identical measures provide meaningful information about all students?
Exceptions: when might assessments not be identical?
• Different content (different sections of Algebra I)
• Differences in untested skills (reading and writing on a math test for ELL students)
• Other accommodations (fewer questions for students who need more time)
NOTE: Roster verification and group size will be considerations by DESE.
Five Considerations (DESE)
1. Measure growth
2. Employ a common administration procedure
3. Use a common scoring process
4. Translate these assessments to an Impact Rating (High-Moderate-Low)
5. Assure comparability of assessments within the school (rigor, validity).
Two Forms to Adapt to Your School’s Standards
Handout: DDM Proposal form; simple Excel list for the June report
MEPID | Last Name | First Name | Grade/Dept.  | DDM1                     | DDM2                                                   | DDM3 (optional)
      | Jones     | Brigit     | ELA 9        | Writing for College      | ELA common assessment                                  | ELA writing DDM
      | Smith     | Marion     | 9-12 library | Library Search Tools DDM | Indirect: increase teachers who do research in library |
      | Watson    | Elspeth    | Physics      | Physics midterm or final | Physics: interpreting data                             |
      | Holmes    | Bill       | Carpentry    | Skills USA test          | Departmental Safety DDM                                |
      |           |            | Guidance     | Indirect: increase college applications |                                        |
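The Excel list itself is just a flat table, so a plain CSV works for the June report. A sketch using rows from the sample above (the source shows no MEPIDs, so that column is left blank rather than invented):

```python
import csv

# Illustrative rows mirroring the sample list above.
rows = [
    {"MEPID": "", "Last Name": "Jones", "First Name": "Brigit",
     "Grade/Dept.": "ELA 9", "DDM1": "Writing for College",
     "DDM2": "ELA common assessment", "DDM3": "ELA writing DDM"},
    {"MEPID": "", "Last Name": "Smith", "First Name": "Marion",
     "Grade/Dept.": "9-12 library", "DDM1": "Library Search Tools DDM",
     "DDM2": "Indirect: increase teachers who do research in library",
     "DDM3": ""},
]

with open("june_report.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
```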
June Report Form (Not Yet Released): Educators Linked with DDMs
Handout sample. Check all items that are completed. Each field gives a definition and space for your answers.

Source of DDM (locally developed / standardized test): Are you developing the assessment as a department or team, or is your school/district purchasing an assessment? (These first four categories can be used for this year’s June report: educator, grade/department, DDM name, source of DDM.)

Course: What is the title of the course that this DDM will be given in?

Possible educators who will use this DDM: Courses and teachers may change, but who at this time will probably teach this course?

Grade(s) of DDM: The grade level(s) that this assessment will cover.

Alignment to state and/or district standards: At least 2 standards must be assessed to make this a “substantial” assessment. For indirect measures: 1) What are the substantial, important, essential areas that you are assessing? 2) How does this indirect measure connect with student growth? Please list the two (or more) standards using standards language.

Rigor: Check the levels of Bloom’s taxonomy that are assessed; more than one level can be assessed. The original Bloom’s term is listed first and the revised (all-verb) term second; note that in the revised taxonomy, Creating is the highest level, above Evaluating: Knowledge/Remembering, Comprehension/Understanding, Application/Applying, Analysis/Analyzing, Synthesis/Creating, Evaluation/Evaluating.

Type(s) of questions: Multiple choice, fill-in, short answer (recall items from the content area); multiple choice, fill-in, short answer (text-dependent questions); open response (short answer); essay (long response), of type narrative, informational text, or argument with claims and proof; one text is read; two texts are read; performance assessment (CEPA); other (fill in). Indicate the percentage of the assessment for each question type, for example, multiple choice = 50%; 2 open responses = 50% (25% each).

Duration of assessment: Assessments can take place in a class period or over a period of days.

When assessment(s) will take place (for next year’s scheduling and implementation): Provide an approximate month or window, for example, end of first trimester, or September. Provide multiple dates if the assessment is a pre/post or is administered more than once.

Components of the assessment that are completed so far: Directions to the teacher for administering; directions to students; graphic organizers (optional); the assessment; scoring guide; rubric; security; calibration protocol if this assessment has a rubric.

Rubric (not yet / does not apply): How was the rubric created? For example, adapted from DESE’s CEPA rubric, or developed by the middle school science department. Please include the rubric (even in draft form). Begin with CEPA, PARCC, or MCAS.
Turn and Talk: If you took an inventory of assessments, what are your next steps?
Calculating Growth Scores
Defining growth, measuring growth, calculating growth for a classroom, for a district
Sample student GROWTH SCORES from the MCAS (scaled score / student growth percentile): 244 / 25 SGP; 230 / 35 SGP; 225 / 92 SGP
TEACHER GROWTH SCORES are developed from student growth scores
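A teacher-level score here is typically the median of the students’ SGPs, banded into High-Moderate-Low. A minimal sketch; the 35/65 thresholds below are illustrative placeholders, not a quoted DESE rule:

```python
from statistics import median

# Student growth percentiles for one teacher's roster (illustrative values).
sgps = [25, 35, 92, 48, 51, 60, 71]

median_sgp = median(sgps)  # 51

# Illustrative bands only; the actual thresholds are set by DESE/the district.
if median_sgp < 35:
    impact = "Low"
elif median_sgp <= 65:
    impact = "Moderate"
else:
    impact = "High"
print(median_sgp, impact)  # 51 Moderate
```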
Approaches to Measuring Student Growth
Pre-test/post-test; repeated measures; holistic evaluation; post-test only
Pre/Post Test
Description: The same or similar assessments administered at the beginning and at the end of the course or year.
Example: A grade 10 ELA writing assessment aligned to College and Career Readiness Standards, given at the beginning and end of the year with the passages changed.
Measuring growth: The difference between pre- and post-test.
Considerations: Do all students have an equal chance of demonstrating growth?
Pre-Post Analysis: Cut Scores for L-M-H Growth (one “mock” classroom)

Pre | Post | Diff | % growth (diff/pre) | % growth, sorted | Diff, sorted | Notes
20  | 35   | 15   | 75%                 | 20%              | 5            | Cut score: LOW growth (bottom 20%)
25  | 30   | 5    | 20%                 | 42%              | 15           |
30  | 50   | 20   | 67%                 | 42%              | 20           |
35  | 60   | 25   | 42%                 | 50%              | 25           | Moderate growth
35  | 60   | 25   | 42%                 | 60%              | 25           | median teacher score
40  | 70   | 35   | 87%                 | 62%              | 25           | median teacher score
40  | 65   | 25   | 62%                 | 67%              | 25           |
50  | 75   | 25   | 50%                 | 70%              | 30           |
50  | 80   | 30   | 60%                 | 75%              | 35           | Cut score: top 20%?
50  | 85   | 35   | 70%                 | 87%              | 35           | HIGH growth
Determining Growth with Pre- and Post-Assessments
• Cut scores need to be locally determined for local assessments.
• Standardized assessments use the “Body of the Work” protocol, which translates easily to local assessments.
• First determine the difference between pre- and post-scores for all students in a grade or course.
• Then determine what low, moderate, and high growth are (local cut scores): the top and bottom 10% to begin as a test case, with a Body of the Work check.
• Then all scores are reapportioned to each teacher; the MEDIAN score for each teacher determines that teacher’s growth score.
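A Python sketch of that process, using the mock classroom above with its bottom/top 20% cut points (the percentile indexing is deliberately rough; real cut scores would come from the local Body of the Work review):

```python
from statistics import median

# Pre/post pairs from the mock classroom above.
scores = [(20, 35), (25, 30), (30, 50), (35, 60), (35, 60),
          (40, 70), (40, 65), (50, 75), (50, 80), (50, 85)]

gains = sorted(post - pre for pre, post in scores)
low_cut = gains[int(len(gains) * 0.20)]   # bottom 20% of gains -> Low
high_cut = gains[int(len(gains) * 0.80)]  # top 20% of gains -> High

def band(gain: float) -> str:
    if gain < low_cut:
        return "Low"
    return "High" if gain >= high_cut else "Moderate"

# A teacher's growth score is the MEDIAN of his or her students' gains.
teacher_median = median(post - pre for pre, post in scores)
print(teacher_median, band(teacher_median))  # 25.0 -> Moderate
```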
Further Measures Beyond Pre- and Post-Tests
Repeated measures, holistic rubrics, post-test only
Repeated Measures
Description: Multiple assessments given throughout the year. Examples: running records, attendance, the mile run.
Measuring growth: Graphically, with approaches ranging from the sophisticated to the simple.
Considerations: Less pressure on each administration; authentic tasks.
Repeated Measures Example Running Record
[Chart: Running Record Error Rate. X-axis: date of administration, every nine days from 9/24/2012 to 4/28/2013; y-axis: number of errors, 0 to 70. Trend lines mark low, moderate, and high growth.]
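One simple way to reduce a repeated-measures series like this running record to a single growth number is a per-student least-squares trend; this is an illustration of the “graphical” approach, not a prescribed method:

```python
# Fit a least-squares trend line to one student's error counts and use its
# slope as the growth summary (falling errors = growth in reading accuracy).
def trend_slope(values: list[float]) -> float:
    n = len(values)
    mean_x = (n - 1) / 2
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

errors = [62, 58, 55, 49, 44, 40, 33, 29]  # illustrative error counts
print(trend_slope(errors))                 # about -4.8 errors per administration
```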
Holistic
Description: Assess growth across student work collected throughout the year.
Example: Tennessee Arts Growth Measure System.
Measuring growth: A growth rubric (see example).
Considerations: An option for multifaceted performance assessments; rating can be challenging and time-consuming.
Holistic Example
Level 1: No improvement in the level of detail (one is true):
• No new details across versions
• New details are added, but not included in future versions
• A few new details are added that are not relevant, accurate, or meaningful

Level 2: Modest improvement in the level of detail (one is true):
• There are a few details included across all versions
• Many added details are included, but they are not included consistently, or none are improved or elaborated upon
• There are many added details, but several are not relevant, accurate, or meaningful

Level 3: Considerable improvement in the level of detail (all are true):
• There are many examples of added details across all versions
• There is at least one example of a detail that is improved or elaborated in future versions
• Details are consistently included in future versions
• The added details reflect relevant and meaningful additions

Level 4: Outstanding improvement in the level of detail (all are true):
• On average, there are multiple details added across every version
• There are multiple examples of details that build on and elaborate previous versions
• The added details reflect the most relevant and meaningful additions

Example taken from Austin, a first grader from Anser Charter School in Boise, Idaho. Used with permission from Expeditionary Learning. Learn more about this and other examples at http://elschools.org/student-work/butterfly-drafts
Post-Test Only
Description: A single assessment, or data that is paired with other information.
Example: An AP exam.
Measuring growth, where possible: Use a baseline; assume an equal beginning; or use your own historical data.
Considerations: May be the only option for some indirect measures. What is the quality of the baseline information?
Post-Test Only: A Challenge to Tabulate Growth
Portfolios, capstone projects, AP scores. If you have local histories, you can use them. For example, for a C student in pre-Calculus, an AP score of 1 might be low growth, a 2 or 3 moderate, and a 4 or 5 high. This is a local determination of growth.
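That pre-Calculus example can be written down as a local lookup table. A sketch encoding only the “C student” row given above; every other cell is a local decision:

```python
# Growth band for a student whose prior course grade was a C, keyed by AP
# score, per the example above. Other prior grades would get their own rows.
GROWTH_FOR_C_STUDENT = {1: "Low", 2: "Moderate", 3: "Moderate",
                        4: "High", 5: "High"}

def ap_growth(prior_grade: str, ap_score: int) -> str:
    if prior_grade == "C":
        return GROWTH_FOR_C_STUDENT[ap_score]
    raise ValueError("cut scores for other prior grades are a local decision")

print(ap_growth("C", 3))  # Moderate
```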
Turn and Talk
Discuss the calculations, security, storage, and fairness of determining local cut scores.
Tools That May Be Helpful: What is important? What does a good assessment look like?
Core Curriculum Objectives (http://www.doe.mass.edu/edeval/ddm/example/)
# Objective
1 Students analyze how specific details and events develop or advance a theme, characterization, or plot of a grade 9 literary text, and they support their analysis with strong and thorough textual evidence that includes inferences drawn from the text.
2 Students analyze how the structure, syntax, diction, and connotative or figurative meanings of words and phrases inform the central idea or theme of a grade 9 literary text, and they support their analysis with strong and thorough textual evidence that includes inferences drawn from the text.
3 Students analyze how specific details, concepts, or events interact to develop or advance a central idea of a grade 9 informational text, and they support their analysis with strong and thorough textual evidence that includes inferences drawn from the text.
4 Students analyze how cumulative word choice, rhetoric, syntax, diction, and the technical, connotative, or figurative meanings of words and phrases support the central idea or author’s purpose of a grade 9 informational text.
5 Students produce clear and coherent writing to craft an argument, in which the development, organization, and style are appropriate to their task, purpose, and audience, using such techniques as the following:
introducing precise claim(s), distinguishing the claim(s) from alternate or opposing claims, and creating an organization that establishes clear relationships among claim(s), counterclaims, reasons, and evidence;
developing claim(s) and counterclaims fairly, supplying evidence for each while pointing out the strengths and limitations of both in a manner that anticipates the audience’s knowledge level and concerns;
using words, phrases, and clauses to link the major sections of the text, create cohesion, and clarify the relationships between claim(s) and reasons, between reasons and evidence, and between claim(s) and counterclaims;
establishing and maintaining a formal style and objective tone while attending to the norms and conventions of the discipline in which they are writing;
providing a concluding statement or section that follows from and supports the argument presented; and
demonstrating command of the conventions of Standard English.
ELA-Literacy, English 9-12: https://wested.app.box.com/s/pt3e203fcjfg9z8r02si
Assessment: Hudson High School Portfolio Assessment for English Language Arts and Social Studies
Publisher Website/Sample
Designed to be a measure of student growth over time in high school ELA and social science courses. Student selects work samples to include and uploads them to electronic site. Includes guiding questions for students and scoring criteria. Scoring rubric for portfolio that can be adapted for use in all high school ELA and social science courses. Generalized grading criteria for a portfolio. Could be aligned to a number of CCOs, depending on specification of assignments.
Traditional Assessment               | Non-Traditional Assessment       | Administration/Scoring
Traditional end-of-grade assessment  | Pre/post or repeated measures    | Paper/pencil
Traditional end-of-course assessment | Performance task rubric          | Computer supported
Selected response                    | Portfolio or work sample rubric  | Computer adaptive
Short constructed response           | Project-based rubric             | Machine scored
Writing prompt/essay                 | Observation rubric or checklist  | Scored locally
Other:                               |                                  | Scored off-site
Click this link to see the blueprint of each assessment: http://www.doe.mass.edu/edeval/ddm/example/
http://www.workforcereadysystem.org/index.shtml
Take a sample test: You can take a ten-question sample test once you find the BLUEPRINT for your exam. To take a sample test: select Blueprint, then select “EXPERIENCE IT.” You will have to register.
1) Each assessment provides instructions for giving the assessment.
2) The online assessments may have videos and may require students to put things in sequence.
3) The questions are not just multiple choice.
Customer Service: Demonstrate professional development skills in a simulated customer service or employment situation. Examples may include:
• Job interview
• Customer service scenario
• Communications
• Decision making, problem solving, and/or critical thinking
• Safety
This could be a common assessment for areas without a specific assessment.
Additional Testing Examples
Massachusetts Model Curriculum Units and Curriculum Embedded Performance Assessment examples: www.doe.mass.edu (you must sign up for them)
Other juried sites that may be helpful:
• Most core areas: Engage New York, www.engageny.org
• ELA: Achieve, www.achievethecore.org
• Math, ELA, and literacy: PARCC Online, www.parcconline.org
• Rubrics for writing by grade level and writing type: http://www.doe.k12.de.us/aab/English_Language_Arts/writing_rubrics.shtml
MA Model Curricula and Rubrics: CEPAs (scored on levels 1-6)

Topic development (the writing and artwork identify the habitat and provide details):
1. Little topic/idea development, organization, and/or details; little or no awareness of audience and/or task
2. Limited or weak topic/idea development, organization, and/or details; limited awareness of audience and/or task
3. Rudimentary topic/idea development and/or organization; basic supporting details; simplistic language
4. Moderate topic/idea development and organization; adequate, relevant details; some variety in language
5. Full topic/idea development; logical organization; strong details; appropriate use of language
6. Rich topic/idea development; careful and/or subtle organization; effective/rich use of language

Evidence and content accuracy (writing includes academic vocabulary and characteristics of the animal or habitat, with details):
1. Little or no evidence is included and/or content is inaccurate
2. Use of evidence and content is limited or weak
3. Use of evidence and content is included but is basic and simplistic
4. Use of evidence and accurate content is relevant and adequate
5. Use of evidence and accurate content is logical and appropriate
6. A sophisticated selection and inclusion of evidence and accurate content contribute to an outstanding submission

Artwork (identifies special characteristics of the animal or habitat, to an appropriate level of detail):
1. Artwork does not contribute to the content of the exhibit
2. Artwork demonstrates a limited connection to the content (describing a habitat)
3. Artwork is basically connected to the content and contributes to the overall understanding
4. Artwork is connected to the content of the exhibit and contributes to its quality
5. Artwork contributes to the overall content of the exhibit and provides details
6. Artwork adds greatly to the content of the exhibit, providing new insights or understandings
ELA, math, social studies/history, science, the arts, technology. DOK = Depth of Knowledge: http://www.stancoe.org/SCOE/iss/common_core/overview/overview_depth_of_knowledge.htm
Protocols to Use Locally for Inter-Rater Reliability; Looking at Student Work
Rhode Island Calibration Protocol for Scoring Student Work on the wiki
Another brief one-hour training handout for assessing student work and developing local rubrics is also posted (HML Protocol).
Next Steps
• Develop pilot assessments for SY 2014.
• Assess results; use results to help plan for full implementation in 2015.
• Develop a plan for all educators to have two DDMs: MCAS growth, purchased, or local.
• Develop a district process for assessing the quality of assessments (DESE Quality Tool or the attachment on the last two pages).
• Develop an internal process for cut scores and for determining low, average, and high growth of students.
• Track/organize information for the June report: educators/DDMs.
• Plan for 2015 administration for all educators: tracking, scheduling, storing year 1 scores, storing year 2 scores.
“Don’t let perfection get in the way of good.”
Potential as a Transformative Process: when Curriculum, Instruction, or Assessment is changed...
Elmore, Instructional Rounds, and “the task predicts performance”
Team Planning Time
[The six-phase overview diagram from the opening slide repeats here.]
Sample DDMs: Good, Not-So-Good, and Problematical
Demonstrating Growth (when accuracy of computation may be a concern)
Essay Prompt from Text
Read a primary source about Muhammad based on his wife’s memories of her husband. Essay: Identify and describe Muhammad’s most admirable quality based on this excerpt. Select someone from your life who has this quality. Identify who they are and describe how they demonstrate this trait.
What’s wrong with this prompt?
Science Open Response from Text
Again, from a textbook. Is this acceptable?
Scoring Guides from Text
Lou Vee Air Car built to specs (50 points); propeller spins freely (60 points); distance car travels: 1 m = 70, 2 m = 80, 3 m = 90, 4 m = 100 points.
Best distance (10, 8, 5 points); best car (10, 8, 5 points); best all-time distance across all classes (+5).
235 points total.
A scoring guide from a textbook for building a Lou Vee Air Car. Is it good enough to ensure inter-rater reliability?
Technology/Media Rubric: A multi-criteria rubric for technology. What is good, bad, problematical?
PE Rubric in Progress: Grade 2 overhand throw. Looks good?
Music: Teacher and Student Instructions
World Language Scoring Guide and Rubric
World LanguageMiddle School