
Building Assessments in Competency-Based Programs

Association for the Assessment of Learning in Higher Education June 2016

© 2016 University of Phoenix | All Rights Reserved

Conna Bral, EdD, Program Dean
Erin Hugus, MA, Instructional Designer
Mary Tkatchov, MA, Editor

Center for Competency-Based Education, University of Phoenix

Introduction


AGENDA
− Background of CBE Assessment
− Role of Industry Expertise
− Assessment and Rubric Development
− Lessons Learned


Poll Question: In which stage of competency-based education program development is your institution?


Authentic Performance-Based Assessments

Authentic assessments
− Wiggins (1990) defined authentic assessments as assessments that "require students to be effective performers with acquired knowledge"
− "[Authentic assessment] presumes that students will produce something that reflects not a narrow, compartmentalized repetition of what was presented to them, but an integrated scholarship which connects their learning housed in other disciplines and which is presented in a setting consistent with that in which the learning is likely to be most useful in the future." (Tanner, 1997)
− "The CBE program relies upon a strong foundation for the assessment of student learning outcomes…" (HLC Elements of Good Practice in CBE, 2015)


Performance-based assessments
− Provide a more direct measure of students' skills and abilities and are more motivating to students (Hancock, 2007)
− Allow for evaluation of both process and product (Messick, 1994)
− "In an age … in which the workplace will require 'new ways to get work done, solve problems, or create new knowledge'—the assessment of students will need to be largely performance based so that students can show how well they are able to apply content knowledge to critical thinking, problem solving, and analytical tasks…" (The Partnership for 21st Century Skills, 2007)
− Performance-based assessment requires students to use high-level thinking to perform, create, or produce something with transferable real-world application (Stanford, 2008)
− "Proposed learning outcomes emphasize performance…" (Evaluation Considerations: Council of Regional Accrediting Commissions, 2015)


Backward Design

From Understanding by Design by Grant Wiggins and Jay McTighe
− Starts with the end in mind
− "Effective curriculum is planned backward from long-term, desired results through a three-stage design process"
  − Desired Results = Competencies
  − Evidence = Assessments and Rubrics
  − Learning Plan = Learning Activities and Materials
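To make the three-stage mapping concrete, here is a minimal Python sketch of how a single competency can anchor its evidence and learning plan; the dataclass fields and example strings are illustrative assumptions, not the presenters' tooling:

```python
from dataclasses import dataclass, field

@dataclass
class Competency:
    """Desired result: what the student must be able to do."""
    statement: str
    # Evidence: the assessments and rubrics that demonstrate mastery.
    assessments: list[str] = field(default_factory=list)
    # Learning Plan: the activities and materials that prepare students.
    learning_activities: list[str] = field(default_factory=list)

# Design works backward: the competency comes first, then its evidence and plan.
example = Competency(
    statement="Evaluate research evidence to support a recommendation",
    assessments=["Performance task: written recommendation scored with a rubric"],
    learning_activities=["Guided source analysis", "Annotated bibliography"],
)
```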


The Performance Assessment Cycle
1. Competencies (Outcomes)
2. Validation
3. Assessments (performance tasks)
4. Validation and Revisions
5. Scoring Rubrics
6. Scorer Training and Calibration
7. Scoring and Data Collection
8. Data Analysis (reliability)


Poll Question: How does your institution determine the knowledge and skills to assess in a new program?


Industry Expertise
− In a Hart Research Associates (2013) employer survey, 78% of employers indicated there should be more emphasis on "the application of knowledge and skills to real-world settings"
− Subject matter experts (SMEs)
  − Focus groups
  − Advisory boards
− Professional standards
− Industry expertise drives competency development


Competency Development

Focus group with SMEs from industry (including faculty)
− Brainstorm knowledge, skills, and abilities for the role(s) the program prepares students for
− Cluster them into related categories
− Write competency statements
− Align the statements with professional standards


Competency Validation (Desired Results)
− Validation by faculty council
− Industry expert focus group at a national conference
− Alignment with professional standards


Assessment Development
1. Draft: SMEs draft assessment instructions and rubrics; the Curriculum Team and Assessment Dean provide guidance
2. Revision: the Curriculum Team revises and edits drafts
3. Validation: the College Faculty Council validates assessments
4. Final Review: the Curriculum Team, Assessment Dean, and College Assessment Manager review and finalize assessments


Poll Question: How does your institution utilize rubrics?


Rubrics
− Provide consistency and reliability for assessing outcomes of performance-based assessments
− Enhance transparency
− Clearly describe levels of quality
  − Not a checklist or rating scale
  − Represent a developmental sequence across levels
− Provide data about student learning


Rubric Structure Applied

Grading Criterion (generic)
(1) Does Not Meet Expectations: Parts are missing or incomplete, and/or quality is poor
(2) Approaches Expectations: All parts are complete, but quality needs improvement
(3) Meets Expectations: All parts are complete, and quality is acceptable
(4) Exceeds Expectations: All parts are complete, and quality is exceptional

Use of Research to Support Ideas
(1) Does Not Meet Expectations: Research to support ideas is missing or irrelevant, OR writing contains mostly quoted material
(2) Approaches Expectations: Ideas are minimally supported by relevant research
(3) Meets Expectations: Ideas are adequately supported by relevant research
(4) Exceeds Expectations: An in-depth understanding of the topics is demonstrated through exceptional use of relevant research to support ideas
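As an illustration of the structure above, a four-level analytic rubric can be represented as a simple mapping from criterion to level descriptors and used to total scores. This is a minimal Python sketch under assumed names (the dict layout and the score_submission helper are hypothetical, not the presenters' system):

```python
# A four-level analytic rubric: each criterion maps level -> descriptor.
RUBRIC = {
    "Use of Research to Support Ideas": {
        1: "Research is missing or irrelevant, or writing is mostly quoted material",
        2: "Ideas are minimally supported by relevant research",
        3: "Ideas are adequately supported by relevant research",
        4: "Exceptional use of relevant research demonstrates deep understanding",
    },
    # ...additional criteria follow the same shape
}

def score_submission(scores: dict[str, int]) -> int:
    """Validate that each score is a defined level, then total the points."""
    for criterion, level in scores.items():
        if level not in RUBRIC[criterion]:
            raise ValueError(f"{criterion}: no level {level} defined")
    return sum(scores.values())

print(score_submission({"Use of Research to Support Ideas": 3}))  # prints 3
```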


Rubric Language Library

Depth
(1) Does Not Meet Expectations: missing, incomplete, minimal attention to, not comprehensive, cursory, serious deficiencies, misses, omits, ignores, lacks, little or no
(2) Approaches Expectations: vague, general, too narrow, minimal, few, some, attempts to
(3) Meets Expectations: adequate, appropriate, sufficient, complete, specific, includes
(4) Exceeds Expectations: thorough, deep or in depth, substantial, comprehensive, explicitly, complex/with complexity, wide variety of

Quality
(1) Does Not Meet Expectations: illogical, unsupported, inappropriate, superficial, off topic, ambiguous, irrelevant, inaccurate, poor/poorly, erroneous, demonstrates misunderstanding of
(2) Approaches Expectations: vague, basic, general, weak, underdeveloped, repetitive, redundant, inconsistent
(3) Meets Expectations: adequate/adequately, sufficient, sound, effective, consistent, relevant, appropriate/appropriately, moderate/moderately, accurate/accurately, clear
(4) Exceeds Expectations: exceptional, skillfully, skillful use of, insightful, logical, creative, well articulated, compelling, persuasive, engaging, deep and thoughtful judgments, error-free, precise


Poll Question: How do you validate performance-based assessments and related rubrics?


Validation and Content (Learning Plan)
− Validation
  − Validation activity
  − Calibration and training
  − Reliability studies (see the agreement sketch below)
  − Evidence
− Content
  − SMEs develop learning activities and materials that support the assessments and competencies
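To ground the reliability step, here is a minimal sketch, assumed rather than taken from the deck, of two common scorer-agreement measures, percent agreement and Cohen's kappa, computed over paired rubric scores from two calibrated scorers:

```python
from collections import Counter

def percent_agreement(rater_a: list[int], rater_b: list[int]) -> float:
    """Proportion of submissions the two scorers rated identically."""
    return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

def cohens_kappa(rater_a: list[int], rater_b: list[int]) -> float:
    """Agreement corrected for chance, given each scorer's level distribution."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both scorers independently assign the same level.
    p_e = sum((freq_a[lvl] / n) * (freq_b[lvl] / n) for lvl in freq_a)
    return (p_o - p_e) / (1 - p_e)

# Example: rubric levels 1-4 assigned by two scorers to eight submissions.
a = [3, 4, 2, 3, 1, 4, 3, 2]
b = [3, 4, 2, 2, 1, 4, 3, 3]
print(percent_agreement(a, b))  # 0.75
print(cohens_kappa(a, b))       # ~0.65
```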


Lessons Learned
− Quality
− Plan for Future Revisions
− Reference Library
− SME Training
− Continuous Evaluation During Development


Questions?

[email protected]
[email protected]
[email protected]
