
Page 1: Service Assessment Programmes: The SCONUL Experience

Stephen Town, Cranfield University

Chair, SCONUL Working Group on Performance Improvement

Page 2: Quality in Higher Education

• Quality Assurance – e.g. Audit, TQA, ISOs, Bologna

• Traditional patterns of Peer Review – e.g. RAE, Library Reviews

• Batteries of Performance Indicators – CVCP, SCONUL, HESA

• Quality Culture – e.g. IIP, TQM, Satisfaction Surveys

Page 3: The University Context (from the Library Assessment Conference, Charlottesville, VA, September 2006)

Universities have two “bottom lines”

1. Financial (as in business)

2. Academic, largely through reputation in:
  • Research (the priority in “leading” Universities)
  • Teaching (& maybe Learning)

Page 4: Library Pressures for Accountability

The need is therefore to demonstrate the Library contribution in these two dimensions:

1. Financial, through “value for money” or related measures

2. Impact on research, teaching and learning

This also implies that “competitive” data will be highly valued

Page 5: The SCONUL Response

The SCONUL Working Group on Performance Improvement

• Ten years of “toolkit” development to assist in performance measurement and improvement (for both management and advocacy)

• SCONUL ‘Top concern survey’ 2005, leading to VAMP (the Value and Impact Measurement Programme)

Page 6: Frameworks: Issues

• Balanced Scorecard
• EFQM
• Key Performance Indicators

Page 7: Quality: Issues & Solutions

• Benchmarking
  – Process Benchmarking
  – Statistical Benchmarking
  – Peer Group Benchmarking
• Customer Satisfaction Surveys
  – LibQUAL+
  – SCONUL Satisfaction Survey
  – Priority Research
  – Opinion Meters
• Charter Mark
• Customer Relationship Management
• Investors in People
• Quality Assurance
  – QAA Guidelines for Assessors
  – ISO 9001
• Quality Maturity Model

Page 8: SCONUL Satisfaction Survey

Page 9: SCONUL LibQUAL+ Results 2006

Page 10: CMM (‘Capability Maturity Model’)

• 1 Initial
• 2 Repeatable
• 3 Defined
• 4 Managed
• 5 Optimising

Page 11: Statistics: Issues & Solutions

• SCONUL Statistical Questionnaire
• SCONUL Statistics on the Web
• SCONUL Annual Library Statistics
• SCONUL Statistics: Trend Analysis
• Higher education library management statistics (HELMS)
• E-Measures Project

Page 12: New SCONUL Statistics Measures

• 2d: Breakdown of 'unique serial titles' into:
  – print only (2e)
  – electronic only (2f)
  – print and electronic (2g)
• 2k: 'number of electronic databases'
• 2l: 'number of electronic books'
• 4r: 'number of successful requests for full-text articles'
• 4s: 'number of successful accesses to electronic books'
• 7g: Breakdown of 'electronic resources' into:
  – 'subscriptions to electronic databases' (7h)
  – 'expenditure on e-books' (7j)
  – 'expenditure on other digital documents' (7k)

Page 13: SCONUL Statistics Web Site

Page 14: Impact & Value: Issues & Solutions

• Impact
  – Impact Initiative (LIRG / SCONUL)
  – Institutional Critical Success Factors for Information Literacy

• Value
  – Contingent Valuation
  – Transparency Costing
  – Staff Value Added

Page 15: Conclusions of Impact Measurement

“Helps us to move library performance on from simply counting inputs and outputs to looking at what difference we really make.”

Payne et al., 2004

Page 16: SCONUL VAMP Objectives

• New instruments & tools for currently missing measures

• A full, coherent framework for performance, improvement and innovation

• Persuasive data for University Senior Managers, to prove value, impact, comparability, and worth

Page 17: VAMP Project Structure

• Phase 1 (March–June 2006)
  – Critical review
  – SCONUL Member Survey
  – Gap analysis & synthesis
  – SCONUL Conference Workshops

• Phases 2 & 3 (July 2006 – April 2007)
  – Development of new measures & techniques
  – Review and re-branding of existing tools
  – Web site development
  – Dissemination & maintenance strategy

Page 18: VAMP Web Site

• Showcasing approaches to performance measurement in the following areas:
  – Frameworks
  – Impact
  – Quality
  – Statistics
  – Value

• Providing detailed methods on how to apply the different approaches

• Sharing experience from those who have already applied the approach

• Discussion areas

Page 19: Communities of Practice

“groups of people who share a passion for something that they know how to do, and who interact regularly to learn how to do it better”

“coherence through mutual engagement”

Etienne Wenger, 1998 & 2002

Page 20

[Diagram: proposed VAMP web site structure. The VAMP Home Page links to Techniques, offering Simple Introductions and Detailed Techniques, and to a Community of Practice comprising a Member’s Forum (blog?) and Techniques in Use (wiki?).]

Page 21

J. Stephen Town

[email protected]