
Sound Assessment Design


DESCRIPTION

Common Formative Assessment. Sound Assessment Design. Draft: 3/30/2013.

Page 1: Sound Assessment Design

Professional Development to Practice

The contents of this presentation were developed under a grant from the US Department of Education to the Missouri Department of Elementary and Secondary Education (#H323A120018). However, these contents do not necessarily represent the policy of the US Department of Education, and you should not assume endorsement by the Federal Government.


Sound Assessment Design

Common Formative Assessment

Page 2: Sound Assessment Design


Welcome and Introductions

Please take a moment to introduce (or reintroduce) yourself to the group by sharing your name, district, and position.

Our trainers for the day are….

Page 3: Sound Assessment Design


KEY: Core Training Modules | Follow-up Training Modules | Precursors to Training

Once teams determine an EP to focus on, they can choose one or multiple of these focused modules. Each of the EP modules in this section will include implementation guidance with tools and troubleshooting, and using data to determine effectiveness.

[This slide is a flowchart (September 2013) of the Missouri Collaborative Work training sequence; its module boxes are listed below.]

Collaborative Work Training
- Getting Started: Introduction to Missouri Collaborative Work; use the Getting Started Guide to determine starting point and scope of learning
- Focus Areas
- Follow-up to Training
- Wrap Up Activity

Collaborative Data Teams (CDT)
- Overview and Purpose; Collaborative Teams
- Foundational Processes: Agendas, Communication, Norms, Roles
- Advanced Processes: Consensus, Collaborative Skills, Protocols
- Activity: Wrap Up/Overview of Next Steps
- Follow-Up Based on Data: Coaching and Revisiting PD

Data-Based Decision Making (DBDM)
- Overview and Purpose of DBDM
- Data Team Process Steps Sequence and Examples:
  1. Collect and Chart Data
  2. Analyze and Prioritize
  3. SMART Goal
  4. Instructional Decision Making
  5. Determine Results Indicators
  6. Ongoing Monitoring
- Activity: Wrap Up/Overview of Next Steps

Common Formative Assessment (CFA)
- Overview and Purpose of CFA
- Developing Meaningful Learning Targets
- Quality Assessment Design
- Selected Response Items, Constructed Response Items, Performance Events
- Activity: Wrap Up/Overview of Next Steps

Effective Teaching/Learning Practices (EP)
- Overview and Purpose of EP
- Spaced versus Massed, Feedback, Assessment Capable Learners, Reciprocal Teaching
- Activity: Wrap Up/Overview of Next Steps

School-Based Implementation Coaching
- Overview and Purpose of Coaching for supporting school-wide implementation
- Critical skills of coaching
- Coaching in Practice
- Activity: Wrap Up/Overview of Next Steps
- Follow-Up Based on Data: Coaching and Revisiting PD

Page 4: Sound Assessment Design


[This slide highlights the CFA strand of the flowchart:]

Common Formative Assessment (CFA)
- Overview and Purpose of CFA
- Developing Meaningful Learning Targets
- Quality Assessment Design

Page 5: Sound Assessment Design


Learner Objectives of the Formative Assessment Series

Understand the clear purposes of assessment by clarifying:
- Why they are assessing
- Who will use the results of assessment data
- What they will do with the assessment data

Develop clear and meaningful learning targets to guide instruction and student learning.

Construct quality assessment instruments which are of sound design and measure pre-determined learning targets.

Page 6: Sound Assessment Design


Outcomes for the Day

As a result of today's training you will…
- develop a better understanding of the components and characteristics of a quality formative assessment.
- develop a better understanding of various types of assessment items and the pros and cons of each type.
- continue using a backwards design approach and a template form to write a formative assessment.
- evaluate your formative assessment for quality.

Page 7: Sound Assessment Design


Module 3: Quality Assessment Design

Essential Questions:
1. What decisions drive the type of assessment items to use in common formative assessments?
2. What are the essential components needed to create a quality formative assessment?
3. What are the characteristics of quality selected-response, constructed-response and performance tasks?

Page 8: Sound Assessment Design


Session at a Glance
- Introductions/Objectives/Outcomes/Norms
- Brief Review of Assessment Principles and Reflection
- Selected Response Items
- Constructed Response Items
- Performance Tasks
- Continue with Assessment Development Process
- Evaluate Your Test
- Using Data to Inform Test Writing Skills
- Implementation Steps, Roadblocks, and Supports
- Additional Learning
- Closure

Page 9: Sound Assessment Design


Norms

- Begin and end on time
- Be an engaged participant
- Be an active listener – open to new ideas
- Use notes for side bar conversations
- Use electronics respectfully

Page 10: Sound Assessment Design


Accurate Assessment, Effectively Used

- WHY ASSESS? What's the purpose? Who will use the results?
- ASSESS WHAT? What are the learning targets? Are they clear? Are they good?
- ASSESS HOW? What method? Written well? Sampled how? Avoid bias how?
- COMMUNICATE HOW? How is information managed? How is it reported?

Students are users too!
- Students track progress and communicate!
- Be sure students understand targets too!
- Students can participate in the process too!

Source: Adapted from Classroom Assessment for Student Learning: Doing it Right-Using it Well, by R.J. Stiggins, J. Arter, J. Chappuis, & S. Chappuis, 2004, Portland, OR.

Page 11: Sound Assessment Design


Accurate Assessment

ASSESS HOW? What method? Written well? Sampled how? Avoid bias how?

Source: Adapted from Classroom Assessment for Student Learning: Doing it Right-Using it Well, by R.J. Stiggins, J. Arter, J. Chappuis, & S. Chappuis, 2004, Portland, OR.

Page 12: Sound Assessment Design


Read and Reflect with a shoulder partner…

Every educator must understand the principles of sound assessment and must be able to apply those principles as a matter of routine in doing their work. Accurate assessment is not possible unless and until educators are given the opportunity to become assessment literate. (They) must understand student achievement expectations and how to transform those expectations into accurate assessment exercises and scoring procedures. (NEA, 2003)

Common Formative Assessments, Larry Ainsworth & Donald Viegut, 2006, Corwin Press, p. 53

Page 13: Sound Assessment Design


What is Assessment Literacy?

“The ability to understand the different purposes and types of assessment in order to select the most appropriate type of assessment to meet a specific purpose.” (Larry Ainsworth, 2006)

Page 14: Sound Assessment Design


What evidence do we need that students have met our stated purpose(s)?

“Fruitful assessment often poses the question ‘what is an array of ways I can offer students to demonstrate their understanding and skills?’ In this way, assessment becomes a part of teaching for success and a way to extend rather than merely measure learning.”

Quote by Carol Ann Tomlinson, 1995, taken from Common Formative Assessment by Ainsworth and Viegut

Page 15: Sound Assessment Design


Video Clip

http://www.youtube.com/watch?v=_CqgnZhb--Q

Page 16: Sound Assessment Design


Assessment of Missouri Learning Standards

The knowledge, skills and processes specified in Missouri's Learning Standards (MLS) for Mathematics and English Language Arts will be measured by the Smarter Balanced Assessment Consortium (SBAC) using a variety of test item types: selected response, constructed response, and performance tasks.

Sample SBAC items may be viewed on the website: http://www.smarterbalanced.org/sample-items-and-performance-tasks

Page 17: Sound Assessment Design


Let's define…
- Selected-response assessments
- Constructed-response assessments
- Performance assessments

Pull out this template and pair up with someone. Using your current level of understanding, create a definition and identify the benefits and drawbacks of each type of assessment.

Page 18: Sound Assessment Design


Selected-Response Assessments…

Require students to select one response from a provided list

Types include multiple-choice, true-false, and matching

Also include short answer/fill-in-the-blank items (when a listing of answer choices is provided)

Assess the student’s knowledge of factual information, main concepts, and basic skills

Page 19: Sound Assessment Design


Benefits of Selected-Response Items

- Can be scored quickly
- Can be scored objectively as correct or incorrect
- Cover a wide range of content

Page 20: Sound Assessment Design


Drawbacks to Selected-Response Items

- Tend to promote memorization of factual information rather than higher-level understanding
- Inappropriate for some purposes (performance, writing, and creative thinking)
- Lack of student writing in most cases, unless part of assessment design (Haladyna, 1997, pp. 65-66)

Page 21: Sound Assessment Design


Five Roadblocks to Effective Item Writing

1. Unclear directions
2. Ambiguous statements
3. Unintentional clues
4. Complex phrasing
5. Difficult vocabulary

(Popham, 2003b, p. 64)

Page 22: Sound Assessment Design


Key Points for Writing SR Items
- Choose a selected-response format(s) that aligns to the standard being measured
- Make sure the items will produce the needed evidence to determine mastery of the standard
- Include the vocabulary of the standard selected for assessment (as appropriate)
- Make sure the test question(s) require the same level of rigor as that of the standard
- Write each stem first, then write distractors

Page 23: Sound Assessment Design

Example of a Selected Response Item

Many experts will tell you that television is bad for you. Yet this is an exaggeration. Many television programs today are specifically geared towards improving physical fitness, making people smarter, or teaching them important things about the world. The days of limited programming with little interaction are gone. Public television and other stations have shows about science, history, and technical topics.

Which sentence should be added to the paragraph to state the author's main claim?
A. Watching television makes a person healthy.
B. Watching television can be a sign of intelligence.
C. Television can be a positive influence on people.
D. Television has more varied programs than ever before.


Page 24: Sound Assessment Design


Example of Another Selected Response Item

The Smarter Balanced Assessment Consortium (SBAC) has multiple-choice items that cue students to select more than one answer.

Page 25: Sound Assessment Design


A third example of a Selected Response Item

Use the illustration and your knowledge of social studies to answer the following question.

5. What colonial claim about the Boston Massacre is supported by this illustration?
a. Most American colonists in Boston were killed.
b. British soldiers fired on unarmed colonists.
c. There were more soldiers than civilians at the Boston Massacre.
d. Colonists were better equipped for war than British soldiers were.

Page 26: Sound Assessment Design


Constructed-Response Items…

Require students to organize and use knowledge and skills to answer a question or complete a task

Types include short-answer; open response; extended response; essays

More likely to reveal whether or not students understand and can apply what they are learning.

May utilize performance criteria (rubrics) to evaluate degree of student proficiency.

Page 27: Sound Assessment Design


Benefits of Constructed-Response Items

- Responses will contribute to valid inferences about student understanding better than those derived from selected-response items.
- Measure higher levels of cognitive processes.
- Allow for diversity in student responses or solution processes.
- Provide a better picture of students' reasoning processes.
- Promote the use of evidence to support claims and ideas.

Page 28: Sound Assessment Design


Drawbacks of Constructed-Response

- Take longer to score
- Can have errors in design
- Dependent on student writing proficiency
- A challenge to score consistently and objectively
- Must have clear rubrics for scoring criteria so scoring is not subjective

Page 29: Sound Assessment Design


Video Clip

http://www.youtube.com/watch?v=gp7W6wV-obs

Page 30: Sound Assessment Design


Key Points to Writing Constructed Response Items

Items should be open-ended and require students to create a response

Students must demonstrate an integrated understanding of the “unwrapped” concepts and skills

Items must match the level of rigor of the “unwrapped” standards

A proficient answer reflects the understanding of higher-order instructional objectives

Constructed-response items MUST be accompanied by scoring guides.

Page 31: Sound Assessment Design


Example of Constructed Response Item

The table shows the price of different quantities of medium-sized apples at Tom’s Corner Grocery Store. What is the least amount of money needed to buy exactly 20 medium-sized apples if the bags must be sold intact and there is no tax charged? Be sure to show all of your work.

Number of Apples   Bag of 1   Bag of 6   Bag of 12
Total Price        $0.30      $1.20      $2.10
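The item above has a single least-cost answer, which can be verified by brute-force enumeration of bag combinations. This sketch is not part of the original slide; it simply checks the arithmetic:

```python
from itertools import product

# Bag sizes and prices from the item: 1 apple for $0.30, 6 for $1.20, 12 for $2.10.
prices = {1: 0.30, 6: 1.20, 12: 2.10}

best = None  # (cost, (singles, bags_of_6, bags_of_12))
# Enumerate bag counts; 20 apples caps singles at 20, 6-bags at 3, 12-bags at 1.
for n1, n6, n12 in product(range(21), range(4), range(2)):
    if n1 + 6 * n6 + 12 * n12 == 20:
        cost = n1 * prices[1] + n6 * prices[6] + n12 * prices[12]
        if best is None or cost < best[0]:
            best = (cost, (n1, n6, n12))

cost, (n1, n6, n12) = best
print(f"${cost:.2f} with {n1} single(s), {n6} bag(s) of 6, {n12} bag(s) of 12")
```

Enumeration confirms the combination students should find: one bag of 12, one bag of 6, and two singles, for $3.90.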

Page 32: Sound Assessment Design


Another example of Constructed Response Item

Scenario: Your friend is using the computer to type his one-page report for history. It is just two lines over one page and he doesn’t know how to make it fit on one page.

Question: Using the proper computer terminology, describe to your friend two valid solutions for making his report fit on one page without deleting any of the content.

Page 33: Sound Assessment Design


A third example of Constructed Response Item

Stimulus: Information on Sally's Experimental Design

Evaluate Sally’s experimental design. Identify two things Sally could have done differently to make her results more valid. Give reasoning for each one of your suggestions.

Page 34: Sound Assessment Design


Performance Tasks…
- Require students to construct a response, create a product, or perform a demonstration
- Are open-ended and usually allow for diversity in responses or solution processes
- Are evaluated using scoring criteria given to students in advance of the performance
- Are highly engaging for students
- Promote critical thinking and/or problem solving
- Promote peer and self-assessment

Page 35: Sound Assessment Design


Benefits of Performance Tasks
- Have the ability to assess multiple learning targets and application of knowledge
- Are highly engaging for students
- Promote critical thinking and/or problem solving
- Promote peer and self-assessment
- Offer multiple opportunities for students to revise work using scoring guide feedback

Page 36: Sound Assessment Design


Drawbacks of Performance Tasks
- Rubrics are more involved and take longer to develop
- Performances take longer to score
- Can have error in evaluative design
- Success is often dependent on factors other than those targeted for assessment (e.g., writing ability, verbal skills, physical abilities)
- A challenge to score objectively and consistently

Page 37: Sound Assessment Design


Key Points to Writing a Performance Task
- Student performance should be recorded on a checklist or scoring rubric.
- Contains a written prompt that cues the student to perform some type of task that requires a demonstration, presentation, or product creation.
- Shows connections by measuring learning targets across strands or content areas.
- Models what application of learning looks like in life beyond the classroom.
- Should measure mastery of multiple learning targets and higher-level cognitive processes.
- May be completed in one or more sittings or over time.

Page 38: Sound Assessment Design


Example of Performance Task

Part I: During the U.S. Civil War, quilts became a popular item for women to make. You will write an informative essay summarizing the history and purposes of Civil War quilts. To gain the information needed to write your essay, you will watch a video and read two articles about quilts that were made during the Civil War. Take notes, because you may want to refer back to them while writing your essay.

Part II: Your class is planning a field trip to a history museum. To help prepare for what you will see, write an informative essay about Civil War quilts. In your essay, discuss the history of the quilts, including the reasons people made them during the Civil War, and explain how the quilts were made. Include evidence from the sources in Part I to support the information in your essay. A rubric is provided showing how your essay will be scored.

Page 39: Sound Assessment Design


Another example of Performance Task

Page 40: Sound Assessment Design


A third example of Performance Task

Page 41: Sound Assessment Design


Reflection Time

Think about the content, skills, and processes you teach in your classroom and answer the three questions below.
- What content information and simple skills do you teach in your classroom that would lend themselves well to being assessed by using SR items?
- What concepts, principles, and processes do you teach in your classroom that would lend themselves well to being assessed by the use of CR items?
- When might you ask students to do a PT to show their application of multiple skills and processes?

Page 42: Sound Assessment Design


Module #3 is a continuation of Module #2!

Module 2

Module 3

Page 43: Sound Assessment Design


Now let’s practice!

Using either the sample CFA Development Template or personal work created in Steps 1 through 5 previously, complete Steps 6 through 9 by selecting the appropriate types of assessments and matching the test items to the learning targets.

Page 44: Sound Assessment Design


Step 10: Define Achievement Levels

When items are written, complete step 10 by describing how information from the scoring guides can be used collectively to determine achievement levels for students. These levels will be used later in the Data Team Process.

7. Selected Response: Write test items; give correct answers.

8. Constructed Response: Write test items; create scoring rubrics.

9. Performance or Personal Communication: Write test items; create scoring rubrics.

10. Define Achievement Levels: Describe how information from the scoring guides can be used collectively to determine achievement levels for students. These levels will be used in the Data Team Process. (In the example below, students complete a 7-question formative assessment. Questions 1-5 are selected response; questions 6 and 7 are constructed response items scored with a 3-point and a 4-point rubric, respectively.)

- Proficient & Higher: correct answers on all 5 SR items, at least 2 out of 3 on CR item 6, and at least 3 out of 4 on CR item 7
- Close to Proficient: correct answers on at least 3-4 SR items, at least 2 out of 3 on CR item 6, and at least 2-3 out of 4 on CR item 7
- Far to Go: correct answers on 1-2 of 5 SR items, at least 1 out of 3 on CR item 6, and at least 1 out of 4 on CR item 7
- Intervention: correct answers on 0-1 SR items, OR 0-1 out of 3 on CR item 6, OR 0-1 on CR item 7

11. Review and Revise: Exchange tests with another group. Evaluate the overall quality of the assessment as well as the individual items within the test. Make suggestions and return the test to the writers for them to make the suggested revisions.

NEXT STEPS:

12. Give the Pre-Assessment to students and collaboratively score; begin the DT process by charting the results for each teacher and for sub-populations.

13. Evaluate the students' understanding of the BIG ideas as you go along with the unit of study by using the Essential Questions, an indicator of what's happening as you continue with the unit.

14. Give the Post-Assessment to students and collaboratively score; chart post-test results. Compare pre-test results with post-test results. Determine next steps.

(Adapted from Larry Ainsworth's resources: Formative Assessment, and Leadership and Learning CFA resources)
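The example achievement-level bands amount to a small classification rule, which a team could apply mechanically when charting results. A sketch follows; the function name is hypothetical, and because the original table leaves some overlap between bands (e.g. a rubric score of 1 on CR item 6 appears in both the "Far to Go" and "Intervention" rows), checking bands from the top down is my assumption, not part of the template:

```python
def achievement_level(sr_correct, cr6_score, cr7_score):
    """Map scores to the example achievement levels from the template.

    sr_correct: number correct out of 5 selected-response items
    cr6_score:  rubric score out of 3 on constructed-response item 6
    cr7_score:  rubric score out of 4 on constructed-response item 7

    Bands are checked from highest to lowest; a student lands in the
    first band whose cut scores are all met (an assumed tie-break).
    """
    if sr_correct == 5 and cr6_score >= 2 and cr7_score >= 3:
        return "Proficient & Higher"
    if sr_correct >= 3 and cr6_score >= 2 and cr7_score >= 2:
        return "Close to Proficient"
    if sr_correct >= 1 and cr6_score >= 1 and cr7_score >= 1:
        return "Far to Go"
    return "Intervention"
```

For example, a student with 4 SR items correct, 2 of 3 on item 6, and 3 of 4 on item 7 would be charted as Close to Proficient.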

Page 45: Sound Assessment Design


Step 11: Review and Revise

Review test items collaboratively to affirm the quality and appropriateness of the test items.


Page 46: Sound Assessment Design


Evaluate Your Test
- Collectively, do the assessment items produce the necessary evidence to determine whether or not the student has mastered the standard(s) targeted for assessment?
- Are the assessment items the most appropriate type to use to measure the targeted standard?
- Do the assessment items require students to demonstrate the same level of rigor as specified by the targeted standard?
- Are the items worded clearly and concisely?
- Are the directions clear, so students clearly understand what they are to do?
- Are the items free of any type of bias?

Page 47: Sound Assessment Design


Using Data to Inform Test Writing Skills

Use of data from a careful item analysis can help a teacher improve his/her test writing skills.

Additionally, looking at student results can give the teacher ideas as to improvements that need to be made in instruction and/or curriculum.

See next slide. Written by Jana Scott, MAP Instructional Facilitator, University of MO-Columbia, 2007.
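The "careful item analysis" mentioned above can be made concrete with a small sketch. The slides do not prescribe a method; the difficulty and discrimination indices below are one common classical-test-theory approach (the function name and the top/bottom-third split are assumptions for illustration):

```python
def item_analysis(responses):
    """Compute per-item difficulty and discrimination indices.

    responses: one list of 0/1 item scores per student
               (rows = students, columns = items).
    Difficulty     = proportion of all students answering the item correctly.
    Discrimination = correct rate in the top third of students (by total
                     score) minus the correct rate in the bottom third.
    """
    n_students = len(responses)
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    # Rank students by total score, then compare top and bottom thirds.
    order = sorted(range(n_students), key=lambda i: totals[i], reverse=True)
    k = max(1, n_students // 3)
    top, bottom = order[:k], order[-k:]
    results = []
    for j in range(n_items):
        difficulty = sum(row[j] for row in responses) / n_students
        discrimination = (sum(responses[i][j] for i in top) / k
                          - sum(responses[i][j] for i in bottom) / k)
        results.append({"item": j + 1,
                        "difficulty": round(difficulty, 2),
                        "discrimination": round(discrimination, 2)})
    return results
```

A very low difficulty value or a near-zero (or negative) discrimination value flags an item worth checking against the causes listed on the next slide.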

Page 48: Sound Assessment Design


Possible Causes for Faulty Items/Low Scores on Items

1. Basic content or skills have not been addressed or taught.
2. Students are unfamiliar with the process needed to solve the problem/answer the question (e.g., problem solving, deductive/inductive thinking, making an inference).
3. Students are unfamiliar with the format needed to answer the question (e.g., political cartoon, letter, graph).
4. Students are unfamiliar with the meaning of the language used in the test item (e.g., compare and contrast, paraphrase, illustrate, evaluate).
5. Lack of reading ability. Vocabulary used in the item stem or stimulus is too difficult.
6. Wording of the item is unclear or confusing.
7. The rubric does not align with the test item. The rubric holds students accountable for something that was not cued in the item stem.
8. The rubric holds students to a standard that is not grade-level appropriate.
9. The item is asking the impossible or improbable (e.g., asking for two similarities and two differences, or for three details, when there are not that many).
10. The stimulus material used as a basis for item development is at fault.

Written by Jana Scott, MAP Instructional Facilitator, University of MO-Columbia, 2007.

Page 49: Sound Assessment Design


Reflection

Based on what you have learned today:
- What steps might you take in order to become a "top notch" writer of formative assessments and of the various types of test items?
- What potential challenges do you foresee? How might these be overcome?
- What tools and/or resources might you use to ensure the assessments and assessment items you write are top quality?

Page 50: Sound Assessment Design


Practice Profile

Missouri Collaborative Work Practice Profile: Common Formative Assessment

Foundations present in the implementation of each essential function: Commitment to the success of all students and to improving the quality of instruction.

Each essential function is rated on four levels, with evidence drawn from the Common Formative Assessment Development & Implementation Template:
- Exemplary Proficiency (Ideal Implementation)
- Proficient
- Close to Proficient (Skill is emerging, but not yet to ideal proficiency. Coaching is recommended.)
- Far from Proficient (Follow-up professional development and coaching is critical.)

Essential Function 1: Educators develop clear and meaningful learning goals to guide instruction and student learning.

Exemplary Proficiency: All of the following criteria are met.
- Learning goal is clearly connected to a big idea/essential learning in the domain
- Learning goal develops deep understanding of underlying concepts and/or acquisition of skills
- Learning goal clearly engages higher order thinking processes
- Learning goal is clearly manageable and can be accomplished in the course of a lesson or unit (may be several periods)
- Learning target is clearly explained to students
- Connections between current learning goal and prior learning are clearly made

Proficient: At least 6 of the criteria are met. Close to Proficient: At least 4 of the criteria are met. Far from Proficient: Less than 4 of the criteria are met.

Essential Function 2: Educators establish clear and measurable student success criteria in a rubric, scoring guide, or checklist.

Exemplary Proficiency: All of the following criteria are met.
- Success criteria are clearly and effectively aligned to learning goals
- Success criteria clearly and effectively relate to what students will say, do, make, or write to show evidence of learning
- Success criteria clearly and effectively reflect ways for students to indicate their current status relative to the learning goals
- Success criteria are communicated in language students can fully understand
- Success criteria are frequently referred to during the learning process

Proficient: At least 3 of the criteria are met. Close to Proficient: At least 2 of the criteria are met. Far from Proficient: Less than 2 of the criteria are met.

Page 51: Sound Assessment Design


Implementation Fidelity

Rate each item Yes, Partially, or No; if partially or no, please explain.

1. Common formative assessment is linked to selected learning standards.
2. Learning goal engages higher order thinking processes.
3. Learning goal can be accomplished in the course of a unit.
4. Learning target is written in language that students can clearly understand.
5. Learning target is clearly explained to students.
6. Success criteria are written in language that students can clearly understand, in a rubric or checklist.
7. Students receive feedback based on the learning goal and their assessment results.
8. The quality of assessment items for measuring mastery is reviewed and items revised as needed.

Total

Page 52: Sound Assessment Design


For Additional Learning

- In-depth additional training from RPDC staff members on how to write quality selected response items, constructed response items, and performance events
- Books contained in the Bibliography on the next slide
- Websites and videos about formative assessment:
  http://www.youtube.com/watch?v=2K8qbI_FzGE
  http://www.amle.org/Publications/WebExclusive/Assessment/tabid/1120/Default.aspx
  http://www.ncpublicschools.org/docs/accountability/educators/fastresearchresources.pdf
  http://www.ncpublicschools.org/accountability/educators/vision/formative

Page 53: Sound Assessment Design


Bibliography

- Common Formative Assessments: How to Connect Standards-Based Instruction and Assessment; Larry Ainsworth, Donald Viegut; Corwin Press; 2006.
- Common Formative Assessment Training Manual, Second Edition; The Leadership and Learning Center; Houghton Mifflin Harcourt; 2011.