Lance Speelmon Scholarly Technologist Enhancing OSP for Programmatic and Institutional Assessment


Lance Speelmon, Scholarly Technologist

Enhancing OSP for Programmatic and Institutional Assessment

Many thanks…

• Shoji Kajita, Nagoya University

• Lynn Ward, Indiana University

• John Gosney, Indiana University

IUPUI: Institutional Profile

• Indiana University Purdue University at Indianapolis

• Founded 1969 with a strong local mission

• Blended campus

• Metropolitan research university

• 20+ schools (15 with professional/pre-professional foci)

• Commuter campus, ~30,000 students (~20,000 undergraduates)

Approach to General Education

• Early 1990s: General Education based in the schools (distributive model)

• Accreditation prompts internal reflection

• Campus mandate for change, specifically a centrally coordinated approach and specific learning outcomes for general education

• 1998: Campus adopts a competency- or ability-based model

Principles of Undergraduate Learning (PULs)

• Core Skills
  • written and oral communication
  • ability to comprehend, interpret, and analyze texts
  • analytical skills (quantitative reasoning)
  • information and technological literacy

• Critical Thinking

• Integration and Application of Knowledge

• Intellectual Depth, Breadth, and Adaptiveness

• Understanding Society and Culture

• Values and Ethics

Assessment Needs

• Document and demonstrate the effectiveness of IUPUI’s approach to general education (HLC/NCA and ICHE)

• Document student achievement in programs subject to specialized accreditation (Education, Engineering, Visual Communications, etc.)

• Standard reports that aggregate and summarize assessment data across courses and programs

• Filter and group on demographic and academic criteria
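The kind of standard report described above might look like the following sketch. The field names and ratings are invented for illustration; the point is the pattern of filtering on demographic or academic criteria, then grouping and summarizing across courses.

```python
# Illustrative sketch of aggregated assessment reporting (data invented):
# filter evaluation records on simple criteria, then summarize by group.
records = [
    {"course": "EDUC 301", "program": "Secondary Ed", "level": "Junior", "rating": 3},
    {"course": "EDUC 401", "program": "Secondary Ed", "level": "Senior", "rating": 4},
    {"course": "ENGR 296", "program": "Engineering",  "level": "Senior", "rating": 2},
]

def summarize(records, group_by, **filters):
    """Mean rating per group, after applying simple equality filters."""
    groups = {}
    for r in records:
        if all(r[k] == v for k, v in filters.items()):
            groups.setdefault(r[group_by], []).append(r["rating"])
    return {g: sum(v) / len(v) for g, v in groups.items()}

by_program = summarize(records, "program")            # mean rating per program
seniors = summarize(records, "course", level="Senior")  # seniors only, per course
```

The same pattern extends to any criterion carried on the record, which is exactly why a standardized record structure matters for cross-program reporting.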

The Elephant in the US Living Room

• Department of Education – Spellings Commission

• 2006 report: A Test of Leadership: Charting the Future of U.S. Higher Education

• Recommendation 3:

Higher education institutions should measure student learning using quality assessment data from instruments such as, for example, the Collegiate Learning Assessment…

• Risks homogenized education

• We want a better solution…

Original Vision

Challenges

• Site-centric nature of OSP tools; no way to easily aggregate data across sites

• No tools to simplify management of very large sites

• Customization also makes it difficult to aggregate data; each department uses a different evaluation form and rating scale.

• No canned reports; every report requires an experienced XML programmer who understands the underlying data structures

• Academic programs more concerned with their own disciplinary outcomes than PULs

Current Vision: Phase 1: Goal/Outcome Linking

• Instructor or program administrator creates and publishes a goal set; the goal set becomes an aggregation point

• Instructors in the program can link any course assignment, matrix cell, or wizard page to one or multiple goals

• Students can attach examples of their work directly to one or multiple goals

• Standardization of evaluation form elements makes it possible to aggregate and report data across courses and programs
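The linking model above can be sketched roughly as follows. The class and field names are assumptions for illustration, not OSP's actual schema; the point is that a published goal set is the aggregation point, and many items across courses can link to one goal.

```python
# A minimal sketch of Phase 1 goal/outcome linking (names assumed).
from collections import defaultdict

class GoalSet:
    """A published goal set that serves as an aggregation point."""

    def __init__(self, name, goals):
        self.name = name
        self.goals = set(goals)
        self.links = defaultdict(list)  # goal -> linked items

    def link(self, item, *goals):
        """Link an assignment, matrix cell, or wizard page to one or more goals."""
        for goal in goals:
            if goal not in self.goals:
                raise ValueError("unknown goal: " + goal)
            self.links[goal].append(item)

    def evidence_for(self, goal):
        """Aggregate every linked item for a goal, across courses."""
        return list(self.links[goal])

# Two courses link work to the same published goal set
puls = GoalSet("PULs", ["Critical Thinking", "Values and Ethics"])
puls.link("EDUC 301: Lesson Plan", "Critical Thinking")
puls.link("EDUC 401: Practicum Reflection", "Critical Thinking", "Values and Ethics")
```

Because every link points back into one goal set, evidence for "Critical Thinking" can be gathered across both courses without any per-course report code.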

Goal/Outcome Linking

[Diagram: assignments from EDUC 301 and the EDUC 401 Practicum Portfolio aggregate under the Principles of Teacher Education goal set]

Filtering and Grouping

Secondary Education

Phase 2: Goal/Outcome Mapping

[Diagram: multiple sets of Program Outcomes map up to Institutional Outcomes]

Campus-level Aggregation via Mapping

[Diagram: program-level goal sets roll up to the IUPUI campus level]
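One way to picture Phase 2 mapping, as a hedged sketch (the outcome names are invented for illustration): ratings recorded against program outcomes roll up to whichever institutional outcome each one is mapped to, giving campus-level aggregation without re-scoring anything.

```python
# Sketch of Phase 2 outcome mapping (names and scores invented).
program_to_institutional = {
    "Ed: Plans effective instruction":  "Critical Thinking",
    "Ed: Communicates with learners":   "Written and Oral Communication",
    "Eng: Designs experiments":         "Critical Thinking",
}

# Ratings recorded per program outcome, in two different programs
ratings = [
    ("Ed: Plans effective instruction", 3),
    ("Eng: Designs experiments", 4),
    ("Ed: Communicates with learners", 2),
]

def roll_up(ratings, mapping):
    """Collect program-level ratings under their mapped institutional outcome."""
    rollup = {}
    for outcome, score in ratings:
        rollup.setdefault(mapping[outcome], []).append(score)
    return rollup

campus = roll_up(ratings, program_to_institutional)
# "Critical Thinking" now holds scores contributed by two different programs
```

The mapping table is the only campus-level artifact; each program keeps assessing against its own disciplinary outcomes, which addresses the earlier concern that programs care more about those than the PULs.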

PUBLISHING AND LINKING ITEMS TO MATRICES

Create and Publish Matrix

Manage Site Associations

Associate Site(s) with Matrix

Create Assignment in “Associated” Site

Link Assignment to Matrix Cell(s)

Student View of Linked Assignment

Linked Assignment in Matrix Cell

Linkable Tools

• Assignments

• Matrices (cells can be linked to other cells)

• Wizards (pages can be linked to cells)

OTHER OSP ENHANCEMENTS

The Problem

Matrix authors must select forms and evaluators in each cell, even when the same choices are used in every cell.

The Solution: Matrix Defaults


Use or Override Defaults in Cell

Feedback settings do not use defaults
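The resolution rule described on these slides can be sketched as follows (a sketch only; the default keys and values are assumed): a cell inherits the matrix defaults unless it overrides them, and feedback settings stay per-cell.

```python
# Sketch of matrix defaults with per-cell override (values assumed).
MATRIX_DEFAULTS = {"form": "Standard Evaluation Form", "evaluators": ["dept_chair"]}

def cell_settings(cell_overrides):
    """Resolve a cell's settings: overrides win, otherwise defaults apply."""
    settings = dict(MATRIX_DEFAULTS)  # copy, so defaults are never mutated
    settings.update(cell_overrides)
    return settings

inherited = cell_settings({})  # takes both defaults
custom = cell_settings({"evaluators": ["practicum_supervisor"]})  # overrides one
```

The author only touches the cells that differ, instead of repeating the same form and evaluator choices in every cell.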

The Problem

• No workflow support for reviewers (providers of formative feedback)

• Students cannot solicit feedback from peers, advisors, etc.

• No way for individuals who are not CIG members to provide feedback

Reviewer Workflow

• “Request Feedback” button

• Email notification

• Eventually …

Recipient notification

Visual indicator of new feedback

Student Initiated Feedback

External Reviewers

• Email notification provides direct link to cell

• Reviewer need not be a member of the portfolio site

• Reviewer must have a server login and password

• Eventually … a reviewer dashboard in My Workspace to aggregate pending feedback requests
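The request-feedback flow described above might be sketched like this. The server name, link format, and record shape are all hypothetical; the sketch only shows the moving parts: record a pending request, then send an email containing a direct link to the cell.

```python
# Sketch of student-initiated feedback requests (URL format hypothetical).
pending_requests = []  # would back the eventual reviewer dashboard

def request_feedback(student, reviewer_email, cell_id,
                     server="https://sakai.example.edu"):
    """Record a pending request and build the notification message."""
    link = server + "/osp/cell/" + cell_id  # hypothetical direct-link shape
    pending_requests.append({"reviewer": reviewer_email, "cell": cell_id})
    return (student + " has requested your feedback.\n"
            "Open the cell here (server login required): " + link)

message = request_feedback("Jane Student", "mentor@example.edu", "cell-42")
```

Keeping the pending requests in one place is what would let a My Workspace dashboard later aggregate them per reviewer.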

The Problem

Users in the evaluator (or reviewer) role can open all cells in a user’s matrix, even if they were not selected as an evaluator for the cell.

More granular access control is needed to support a range of implementation scenarios, from highly sensitive and secure to open and collaborative.

Per Cell Access Control

• Cells can be opened only by the designated evaluators/reviewers

• Revision of matrix permissions to provide much greater flexibility and granularity of access
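The per-cell rule above amounts to a check like the following sketch (the cell record shape is assumed): holding the evaluator or reviewer role is no longer enough; the user must be designated on that specific cell.

```python
# Sketch of per-cell access control (record shape assumed).
def can_open(user, role, cell):
    """Only the owner or the cell's designated evaluators/reviewers get in."""
    if role == "owner":
        return user == cell["owner"]
    if role in ("evaluator", "reviewer"):
        return user in cell.get(role + "s", set())
    return False

cell = {"owner": "student1", "evaluators": {"prof_a"}, "reviewers": set()}
opened_by_designee = can_open("prof_a", "evaluator", cell)  # designated on this cell
opened_by_role_only = can_open("prof_b", "evaluator", cell)  # role alone is not enough
```

Tightening or loosening a deployment then becomes a matter of how widely users are designated per cell, covering the secure-to-collaborative range the slide mentions.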

Revised Matrix Permissions

Tool-level permissions (focus is on authoring activities)

Revised Matrix Permissions

Matrix-level permissions

The Problem

Cells that have been evaluated and returned for additional evidence or other modifications look just like cells that have never been submitted.

The Solution: Returned Status

WHAT’S NEXT?

For Spring and Summer 2009

• Standardized evaluation form and reports

• Auto-population of portfolio sites based on membership of associated course sites

• Participant and evaluator notifications

• Bring Wizards into functional parity with Matrices and consolidate into a single tool

• Merge IU enhancements with community code, pending community acceptance

QUESTIONS?

Lance Speelmon, [email protected]

Lynn Ward, [email protected]

John Gosney, [email protected]