
The development of rubrics as an assessment tool to facilitate feedback and enhance learning


The name: rubric

The term rubric is derived from the Latin rubrica, meaning "red earth". It came to refer to instructions written in red ink in manuscripts of various forms during the Middle Ages. Red markings in liturgical documents could indicate how a hymn was to be sung or how a religious service was to be conducted. In legal documents, text in red often marked a heading in a code of law, which led to rubric coming to mean any brief, authoritative rule.


WHAT IS AN ASSESSMENT RUBRIC?

• A checklist of characteristics that makes it easier to assess the quality of a learning product.
• A rubric identifies the traits and components that must be present to indicate the extent to which a learning outcome is achieved.
• A rubric is a set of assessment criteria that specifies the required characteristics for each level of quality.


WHY USE RUBRICS?

• Promote unanxious expectations
• Make grading criteria known to students
• Drive curriculum and pedagogy
• Reduce teacher subjectivity
• Ensure accountability
• Maintain focus on content and performance standards and student work
• Provide opportunities for self-assessment


What is a criterion?

• A characteristic or trait to make decisions by

• A standard on which a decision may be based

• A yardstick for measurement

• What you will base a decision on


Terms to use in measuring range/scoring levels

After you write your first paragraph for the highest level, circle the words in that paragraph that can vary. These are the words you will change as you write the descriptions for the lower performance levels.

• Needs Improvement...Satisfactory...Good...Exemplary

• Beginning...Developing...Accomplished...Exemplary

• Needs work...Good...Excellent

• Novice...Apprentice...Proficient...Distinguished

• Numeric scale ranging from 1 to 5, for example


Levels of achievement

Degrees of Quality    Degrees of Frequency    Degrees of Expertise
excellent             frequently              expert
good                  sometimes               advanced
fair                  rarely                  intermediate
poor                  never                   beginner


Types of rubrics

• Holistic Rubric:
  Scoring decision based on a global look at the performance as a whole
• Analytic Rubric:
  Examines selected criteria separately and in detail


THE COMPONENTS OF A HOLISTIC RUBRIC

• Title

• Different achievement levels needed

• Level descriptor indication

• Level descriptor criteria


Collaboration Rubric

LEVEL 4
Level descriptor: Thorough Understanding of Collaboration
Level descriptor criteria:
• Consistently and actively works toward group goals.
• Is sensitive to the feelings and learning needs of all group members.
• Willingly accepts and fulfills individual role within the group.
• Consistently and actively contributes knowledge, opinions, and skills.
• Values the knowledge, opinion and skills of all group members and encourages their contribution.

LEVEL 3
Level descriptor: Good Understanding of Collaboration
Level descriptor criteria:
• Works toward group goals without prompting.
• Accepts and fulfills individual role within the group.
• Contributes knowledge, opinions, and skills without prompting.
• Shows sensitivity to the feelings of others.
• Willingly participates in needed changes.

LEVEL 2
Level descriptor: Satisfactory Understanding of Collaboration
Level descriptor criteria:
• Works toward group goals with occasional prompting.
• Contributes to the group with occasional prompting.
• Shows sensitivity to the feelings of others.
• Participates in needed changes, with occasional prompting.

LEVEL 1
Level descriptor: Needs Improvement in Understanding of Collaboration
Level descriptor criteria:
• Works toward group goals only when prompted.
• Contributes to the group only when prompted.
• Needs occasional reminders to be sensitive to the feelings of others.
• Participates in needed changes when prompted and encouraged.


THE COMPONENTS OF AN ANALYTIC RUBRIC

• Title
• Levels
• Level descriptors (if applicable)
• Categories
• Category descriptors
• Category level descriptor criteria [evidence expected] (within categories or within category descriptors)


Standards: Level 1 | Level 2 | Level 3 | Level 4

Web Page (HTML) Creation Skills
• Level 1: No HTML formatting tags; text is not broken into paragraphs
• Level 2: Text is broken into paragraphs; headings are used; no other HTML tags
• Level 3: Headings; title; tags such as preformatted text, styles, centering, horizontal lines, lists, etc.
• Level 4: Same as Level 3 plus at least two lists, images as hyperlinks, color or background image, frames, tables, or image map

Web Page Layout
• Level 1: Layout has no structure or organization
• Level 2: Text broken into paragraphs and sections
• Level 3: Headings label sections and create hierarchy; some consistency
• Level 4: Consistent format; extends the information page-to-page; easy to read; attention to different browsers and their quirks

Navigation
• Level 1: One page
• Level 2: One page with title bar added, heading, etc.
• Level 3: Two pages (or one page with links within the page or to other resources); navigation between pages; links work
• Level 4: Title page with other pages branching off, and at least four pages total; navigation path is clear and logical; all links work


FORMAT

(Describe here the task or performance that this rubric is designed to evaluate.)

Levels: Beginning (1) | Developing (2) | Accomplished (3) | Exemplary (4) | Score

Stated Objective or Performance
• Beginning (1): Description of identifiable performance characteristics reflecting a beginning level of performance.
• Developing (2): Description of identifiable performance characteristics reflecting development and movement toward mastery of performance.
• Accomplished (3): Description of identifiable performance characteristics reflecting mastery of performance.
• Exemplary (4): Description of identifiable performance characteristics reflecting the highest level of performance.


Observation checklist

• Reasons for observing:
  – To assess knowledge and skills
  – To assess group interactions
  – To assess communication skills
  – To evaluate the effectiveness of a particular aspect
  – To provide a basis for support, guidance or intervention

• Hints for observing:
  – Know why you are observing
  – Know what you are looking for
  – Plan your checklist
  – Don't try to observe everything

• Structure:
  – increases the information you collect
  – cuts down on time wasted


Example
Checklist for observing microscope skills        Student's Name: __________

Behaviour/Skills | Date | Yes | No | Date | Yes | No
(blank rows for listing the skills to be observed and recording observations on two dates)


Example
Data Collection / Notebook Checklist        Name: __________    Date: __________

Criterion (check Yes or No):
• Documentation is complete
• The information or data collected is accurate
• Written work is neat and legible
• Tables and diagrams are completed neatly
• Each new section begins with an appropriate heading
• Errors are crossed out but not erased
• Spelling and language usage are edited and corrected
• Information is recorded in a logical sequence
• Technological aids are used appropriately
• Notes are collected in a folder or binder
• Colour or graphics are used to enhance the appearance
• Rough work is done separately


Rating scale

• Same usage as an observation checklist
• Records the degree to which particular knowledge, skills or processes are found, or the quality of the performance


Example
Rating scale for affective aspects        Name: __________    Date or period of observation: __________

Criteria (rated: Average / Above Average / Improvement Needed)

Task Attitude
• Shows enthusiasm
• Cooperates with others
• Works hard at improving
• Can work with others on a team
• Shows consideration for the safety and well-being of others

Motivation
• Can work by her/himself
• Is able to understand the tasks to be done and completes them without being told

Reliability
• Can be trusted
• Is able to follow oral or written directions
• Is on time with tasks
• Attends class regularly
• Meets responsibilities

Accepts recommendations

Flexibility

Group interaction skills

Tips For Effective Rubric Design

How to:
• design a rubric that does its job
• write precise criteria and descriptors
• make your rubric student-friendly

The Cookie

Task: Make a chocolate chip cookie that I would want to eat.

Criteria: Texture, Taste, Number of Chocolate Chips, Richness

Range of performance:
– Delicious (14-16 pts)
– Tasty (11-13 pts)
– Edible (8-10 pts)
– Not yet edible (0-7 pts)

The Rubric

Levels: Delicious (4) | Tasty (3) | Edible (2) | Not yet edible (1)

• # chips: Chips in every bite | 75% chips | 50% chips | Less than 50% chips
• Texture: Consistently chewy | Chewy middle, crispy edges | Crunchy | Like a dog biscuit
• Color: Even golden brown | Brown with pale center | All brown or all pale | Burned
• Richness: Buttery, high fat | Medium fat | Low-fat flavor | Nonfat flavor
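The point bands on the previous slide imply simple arithmetic: each of the four criteria is scored on the 1-4 level scale, and the total (4-16) falls into one of the bands. Below is a minimal sketch in Python of that scoring step; the function and variable names are illustrative, not taken from the slides.

```python
# Minimal sketch (illustrative names): score an analytic rubric with four
# criteria, each rated 1-4, and map the total to the holistic bands above.

CRITERIA = ["number of chips", "texture", "color", "richness"]

# (min_total, max_total, band) taken from the "Range of performance" slide
BANDS = [
    (14, 16, "Delicious"),
    (11, 13, "Tasty"),
    (8, 10, "Edible"),
    (0, 7, "Not yet edible"),
]

def score_cookie(scores: dict[str, int]) -> tuple[int, str]:
    """Sum the per-criterion scores (1-4 each) and return (total, band)."""
    if set(scores) != set(CRITERIA):
        raise ValueError(f"expected scores for exactly: {CRITERIA}")
    if not all(1 <= s <= 4 for s in scores.values()):
        raise ValueError("each criterion is scored on a 1-4 scale")
    total = sum(scores.values())
    for low, high, band in BANDS:
        if low <= total <= high:
            return total, band
    raise ValueError("total outside defined bands")  # unreachable for 4-16

if __name__ == "__main__":
    total, band = score_cookie(
        {"number of chips": 4, "texture": 3, "color": 3, "richness": 2}
    )
    print(total, band)  # 12 Tasty
```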

Holistic Or Analytic—Which To Use?

HOLISTIC—views product or performance as a whole; describes characteristics of different levels of performance. Criteria are summarized for each score level.

(level=degree of success—e.g., 4,3,2,1 or “Tasty”)

(criteria= what counts, facets of performance—e.g., research or number of chips or presentation)

Holistic Or Analytic?

HOLISTIC—pros and cons

+Takes less time to create. Well…

+Effectively determines a “not fully developed” performance as a whole

+Efficient for large group scoring; less time to assess

- Not diagnostic

- Student may exhibit traits at two or more levels at the same time.

Holistic Or Analytic?

Analytic=Separate facets of performance are defined, independently valued, and scored.

Example: Music—skill=string improvisation development

Facets scored separately: melody; harmonics; rhythm; bowing & backup; confidence

Holistic Or Analytic?

Analytic—pros and cons

+Sharper focus on target

+Specific feedback (matrix)

+Instructional emphasis

-Time consuming to articulate components and to find language clear enough to define performance levels effectively

Tip

• Don’t use generic or “canned” rubrics without careful consideration of their quality and appropriateness for your project.

• These are your students, not someone else's.
• Your students have received your instruction.

Tip

• Limit the number of criteria
  – Well…
  – Don't combine independent criteria.
    • "very clear" and "very organized" (may be clear but not organized, or vice versa).

It’s hard work…

• Expect to revise…and revise…
  – One problem is that the rubric must cover all potential performances; each should fit somewhere on the rubric.

• “There are no final versions, only drafts and deadlines.”

• When you’ve got a good one, SHARE IT!

When to use these rubrics

• Usually with a relatively complex assignment, such as a long-term project, an essay, or a research-based product.
  – Informative feedback about work in progress
  – Detailed evaluations of final projects

Also

Provide specific "Comments" on your rubric and/or on the student product itself.

Rubric Basic Structure

Objective: Research Paper

Criteria and levels (1 / 2 / 3):
• Number of Sources: 1-4 | 5-9 | 10-12
• Historical Accuracy: Lots of historical inaccuracies | Few inaccuracies | No apparent inaccuracies
• Organization: Cannot tell from which source information came | Can tell with some difficulty from which source information came | Can easily tell from which sources information was drawn
• Use of APA Format: Lots of APA errors | Few APA errors | No apparent APA errors

Uses of Rubrics

• Set evaluator & performer expectations
  – Criteria by which work is judged
  – Difference between excellent & weak work
• Formative student feedback
• Grade assignments
• Standardize grading across graders
• Assess programs (GEC)

Advantages of Using Rubrics

• Clarity
  – Expectations, objectives, grading, feedback
• Objectivity
  – Standardized, consistent, fair, valid, reliable
• Legitimacy
  – Fairness increases student responsibility
• Efficiency
  – Easy to make, use and explain
• Improve skills & end products
  – Instructor, students, peers

Writing Rubrics

Identify and define the assessment objective or purpose

Select and write the needed number of scoring criteria

Select and write the desired levels of performance

If desired, select and write the descriptors
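As a concrete illustration of these four steps, here is a minimal Python sketch that records an objective, criteria, levels, and descriptors as a plain data structure. The field names and layout are illustrative, not prescribed by the slides; the example content comes from the research-paper rubric shown under "Rubric Basic Structure" above.

```python
# Minimal sketch of a rubric as a plain data structure (illustrative names).

rubric = {
    # Step 1: identify and define the assessment objective or purpose
    "objective": "Research Paper",
    # Step 3: select and write the desired levels of performance
    "levels": ["Level 1", "Level 2", "Level 3"],
    # Steps 2 and 4: scoring criteria, each with one descriptor per level
    "criteria": {
        "Number of Sources": ["1-4", "5-9", "10-12"],
        "Use of APA Format": [
            "Lots of APA errors", "Few APA errors", "No apparent APA errors"
        ],
    },
}

# Quick consistency check: every criterion has a descriptor for every level.
assert all(
    len(descriptors) == len(rubric["levels"])
    for descriptors in rubric["criteria"].values()
)
print(f"{len(rubric['criteria'])} criteria x {len(rubric['levels'])} levels")
```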

Electronic Rubric Builders

• Here are a few:
  – Teachnology.com
  – Rubistar
  – Rubric Studio

Calibrating Rubrics: Validity

• Validity – the accuracy with which the rubric assesses the objective or purpose; are we evaluating what we intended?
  – Self Check
  – Colleague Review
  – Student Review
  – Pilot Test

Calibrating Rubrics: Reliability

• Reliability – how consistently the rubric assesses the objective or purpose over time and across raters; are the resulting scores consistent?
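The slides do not prescribe a statistic for checking consistency across raters. One simple, common check is percent exact agreement: the share of submissions on which two raters assign the same rubric level. Below is a minimal Python sketch with an illustrative function name and hypothetical data.

```python
# Minimal sketch (illustrative name, hypothetical data): percent exact
# agreement between two raters scoring the same set of student work.

def percent_agreement(rater_a: list[int], rater_b: list[int]) -> float:
    """Fraction of submissions where both raters assigned the same level."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("both raters must score the same, non-empty set of work")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

if __name__ == "__main__":
    # Rubric levels (1-4) assigned by two raters to ten student products
    a = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
    b = [4, 3, 2, 2, 4, 1, 3, 3, 4, 3]
    print(f"exact agreement: {percent_agreement(a, b):.0%}")  # 80%
```

Low agreement usually points back to vague descriptors, so this check feeds naturally into the revision cycle described earlier.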

Rubric Resources

• Documenting Excellence – General Education Rubrics: http://www.documentingexcellence.com/examples/rubricgened/rubric.htm
• Authentic Assessment Toolbox: http://jonathan.mueller.faculty.noctrl.edu/toolbox/rubrics.htm
• Rubrics.com: http://www.rubrics.com/
• Teacher Created Rubrics for Assessment: http://www.uwstout.edu/soe/profdev/rubrics.shtml
• Sinclair Community College – General Education Rubrics: http://www.sinclair.edu/about/gened/genedrubrics/index.cfm
• CSU – Institutional Research, Assessment & Planning: http://www.csufresno.edu/ir/assessment/rubric.shtml


Some URLs on assessment rubrics:
• http://school.discovery.com/schrockguide/assess.html
• http://www.cloudnet.com/~edrbsass/edsci.htm
• http://users.massed.net/~gailly/CollaborationRubric
• http://lrs.ed.uiuc.edu/students/tbarcalow/490asa/ASAResources.htm
• http://www.cmsdnet.net/alliance/ritterla/webtv.htm
• http://www.odyssey.on.ca/~elaine.coxon/rubrics.htm
• http://bragg-es.odedodea.edu/devers/rubrics.html
• http://www.grand.k12.ut.us/curric/rubrics.html
• http://www.odyssey.on.ca/~elaine.coxon/Reporting/assessment2.htm
• http://home.iprimus.com.au/renaats/english_OUTCOMES.htm
• http://www.arp.sprnet.org/inserv/eval5.htm
• http://jawbone.clarkston.wednet.edu/pages/classwebs/rubrics.htm
• http://www.coe.ilstu.edu/phklass/eaf493/rubric.htm
• http://www.music.miami.edu/assessment/rubricsDef.html
• http://perrynet.sparcc.org/webunits/bb/Hero/rubrics.html
• http://pegasus.cc.ucf.edu/~jmorris/rubric.htm
• http://www.kapaams.k12.hi.us/netshare/cinch/assessment_rubrics.htm