A Guide for College Assessment Leaders
Ursula Waln, Director of Student Learning Assessment
Central New Mexico Community College
Facilitating Group Development of Program Rubrics
›Rubrics help reduce subjectivity in evaluations that are inherently subjective
– Appropriate when:
› Learning manifests in varying degrees of sophistication
› Formative attainment goals can be described
› Evaluation seeks to differentiate levels of performance
– “Either/or” dichotomies (such as right or wrong, present or not, can do it or cannot) are more appropriately evaluated with tests or checklists
When Are Rubrics Appropriate?
›Clarify program goals
– Promote a shared understanding of the desired student learning outcomes (SLOs)
– Help faculty see connections between course and program learning outcomes
Rubrics at the Program Level
[Diagram: course SLOs aligning with program SLOs]
›Serve as norming devices
– Identify benchmarking indicators of achievement
– Describe progressive levels of competency development in ways that clearly guide ratings
›Help aggregate diverse classroom assessments
– Provide the means to translate findings into a common program assessment language
Rubrics at the Program Level
CNM Liberal Arts Effective Communication Rubric
In written, oral, numeric or visual formats, the student will:
Did it awesomely | Mastery | 90-100% | A | 3 points
Did it | Proficient | 80-89% | B | 2 points
Kind of did it | Developing | 70-79% | C | 1 point
Didn't do it | Non-attempt or Emerging | 0-69% | D-F | 0 points
Demonstrate organization and/or coherence of ideas, content, and/or formulas
– Mastery: Material is sharply focused and organized. The student presents a logical organization of ideas around a common theme that demonstrates an advanced understanding of the subject matter.
– Proficient: Material is mostly focused and organized. The student presents logical constructions around a common theme that reflect meaning and purpose.
– Developing: The student's ideas and organizational patterns reflect a common theme that demonstrates a basic understanding of the subject matter. Ideas are disorganized or may lack development in some places.
– Non-attempt or Emerging: The material lacks focus and organization, with few or no ideas around a common theme. The student struggles to demonstrate her/his understanding of the subject matter.
Produce communication appropriate to audience, situation, venue, and/or context
– Mastery: Demonstrates a thorough understanding of context, audience, and purpose that is responsive to the assigned task(s) and focuses on all elements of the work.
– Proficient: Demonstrates adequate consideration of context, audience, and purpose and a clear focus on the assigned task(s) (e.g., the task aligns with audience, purpose, and context).
– Developing: Demonstrates a basic awareness of context, audience, purpose, and the assigned task(s) (e.g., begins to show awareness of the audience's perceptions and assumptions).
– Non-attempt or Emerging: Struggles to demonstrate attention to context, audience, purpose, and the assigned task(s) (e.g., assumes the instructor or self as audience).
›Course-level assessments
– Connect to program rubrics through SLO alignment
– Need to be based on course outcomes to be useful at the course level
– May be formative or summative, direct or indirect, qualitative or quantitative
– Can be unique
Rubrics Can Foster Academic Freedom
[Diagram: assessment types – formative/summative, direct/indirect, qualitative/quantitative]
Varying assessment type and focus → a broader body of evidence
– Internal and/or external stakeholder input
› Students, employers, faculty, community members
› Surveys, focus groups, interviews
– Standardized/licensing/certification exam results
– Accreditor and/or consultant feedback
– Practicum/internship evaluations
– Peer/benchmark comparisons
[Diagram: internal and external assessment sources]
›Program-level assessments can vary too
Rubrics help piece together findings from disparate assessments to reveal a bigger picture
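To illustrate the aggregation idea, here is a minimal Python sketch, using entirely hypothetical ratings, that rolls course-level ratings on the rubric's shared 0-3 point scale into a program-level summary. The criterion names and data are invented for illustration; only the 0-3 scale comes from the sample rubric above.

```python
from statistics import mean

# Hypothetical ratings on the shared 0-3 rubric scale
# (0 = Non-attempt/Emerging, 1 = Developing, 2 = Proficient, 3 = Mastery).
# Each list holds ratings gathered from different courses that were
# mapped to the same program rubric criterion.
ratings = {
    "organization": [3, 2, 2, 1, 3, 2],
    "audience_awareness": [2, 2, 1, 3, 2, 0],
}

def program_summary(ratings):
    """For each criterion, report the mean score and the share of
    ratings at Proficient (2) or above."""
    summary = {}
    for criterion, scores in ratings.items():
        summary[criterion] = {
            "mean": round(mean(scores), 2),
            "proficient_or_above": round(
                sum(1 for s in scores if s >= 2) / len(scores), 2
            ),
        }
    return summary
```

Because every course's findings are first translated to the common scale, disparate classroom assessments become directly comparable at the program level.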
›Start with well-written, measurable program outcome statements
– Upon successful completion of this program, the student will be able to…
›Make any necessary revisions before developing rubrics
First Things First
›Solicit faculty input and agreement through inclusive processes:
1. Brainstorming
2. Grouping
3. Prioritizing
4. Rubric Refinement
Build Shared Ownership
›Ask the faculty how students demonstrate learning related to the competency
– Remind everyone that brainstorming is a process of generating ideas without censure
– Get the ideas down in writing where all can see (e.g., using a flip chart, projection, or note cards)
– Note themes that emerge as potential component criteria, but avoid pigeonholing ideas, as this can stifle or narrow the flow of ideas
Hold a Group Brainstorming Session
›Group ideas to identify key criteria for assessing the competency
›Combine related ideas, with a focus on broad representation
›Rephrase ideas as needed to describe indicators of learning
Collaboratively Group the Ideas
›Reorder or number the indicators of learning based on progressive levels of sophistication/mastery
›Discuss options for performance-level headings and organization
– Number of levels
– Words that could be used as headings
– Ascending or descending order of sophistication
Collaboratively Prioritize
›One person (or a small group) can now draft a rubric based on the group’s input
›Strive for clear, concise descriptions of observable demonstrations of learning
– Broad enough to be applicable to multiple courses
– Specific enough to provide unambiguous rating scales
Draft a Rubric
›Ask the faculty for input
– Performance-level descriptions need to be truly representative
– All faculty need to be able to translate their course learning outcomes to the program rubric
›Revisit and revise as needed
– After implementation, follow up with the faculty to assess how well the rubric worked
Collaboratively Refine the Rubric
› Curriculum mapping
› Student Learning Outcome statement writing/refinement
› Facilitating rubric development
› Rubric drafting
› Course-level assessments
› Program-level assessments
› Data analysis
› Representing findings in graphics
Ursula Waln
505-224-4000 x 52943
LSA 101C
ASSESSMENT ASSISTANCE AVAILABLE