
May 2010

The evaluation of learning and development in the workplace:

Scanning the external environment

Professor Sharon Mavin, Dean of Newcastle Business School

Lesley Lee, Deputy Director of HR, Northumbria University

Dr. Fiona Robson, Senior Lecturer in HRM


Contents

1. Introduction
2. Learning evaluation in practice: What happens in organisations?
2.1. UK (General)
2.2. UK (Public sector)
2.3. International
3. Review of toolkits
4. How can organisations assess their current provision?
5. The context of Higher Education
5.1. Current challenges in the sector
5.2. Future challenges
6. Training and development practices in Higher Education
7. The evaluation of learning and development in Higher Education
7.1. An overview
7.2. Feedback provided from delegates at HEFCE LGM Conference
8. Evaluation of learning and development in Higher Education: SWOT analysis
8.1. Strengths
8.2. Weaknesses
8.3. Opportunities
8.4. Threats
9. Evaluation of learning and development in Higher Education: PESTLE analysis
9.1. Political
9.2. Economic
9.3. Sociological
9.4. Technological
9.5. Legal
9.6. Environmental
10. Conclusions
11. References
12. Appendix One
13. Appendix Two
14. Appendix Three


1. Introduction

This paper leads on from the desk review by looking at more of the practical elements related to the effective evaluation of learning and development interventions. It is split into two parts: the first provides examples of current resources and practices in organisations, drawing upon good practices that may be replicated for this HEFCE LGM project; the second focuses specifically on the Higher Education sector to provide more context-specific information to guide the project. Figure 1 illustrates the sources of information used to inform this section of the research.

Figure 1
Sources of information

- Leadership Foundation research papers
- Review of online policies and procedures from a range of Universities
- External feedback from the LGM Conference
- Researchers' reviews of partners' documentation
- Evaluation of wider policies and practices in the public and private sectors
- Feedback from practitioners on the current stage of maturity of evaluations
- Feedback from practitioners on the CIPD toolkit


Part One: Current organisational practices: An overview

2. Learning evaluation in practice: What happens in organisations?

2.1. UK (General)

It is evident from this study that a diverse range of practices exists within organisations both in the UK and internationally. An unexpected finding, however, was that only a very small number of private sector organisations provide details of their practices and supporting policies and documentation online.

Where private sector organisations provided this information, it was often linked to the dissemination of their CSR activities (for example, Kingfisher) or formed part of their recruitment programmes. Table 1 gives an overview of four organisations that provide online data about their practices, together with the perceived strengths of their documentation and processes.

Table 1

Natural History Museum
Information provided online: learning and development strategy; learning and development plan; application for funding; feedback form.
Perceived strengths:
- Very detailed
- Specific L&D mission supported by six strategic aims; also maps how each aim will be achieved
- Links with external accreditation (IiP)
- Provides valuable context on the diverse nature of the workforce and the roles performed (including unpaid)
- Clear responsibilities for all stakeholders (including employees)
- Priority attendance for employees where the learning need was identified within their development plans
- Very open about the budget
- Explicit section on evaluation: all courses are evaluated, with details of how they are monitored and how they are reported
- Reinforces the importance of transfer of learning as part of the evaluation process
- Very clear feedback form which requires links to how the department benefits

Kingfisher plc
Information provided online: training webpage.
Perceived strengths:
- Provides facts and figures to describe their previous commitment
- Strong focus on providing vocational qualifications

Rolls-Royce
Information provided online: learning and development webpage.
Perceived strengths:
- Has set up a new Centre of Excellence specifically to deal with L&D
- Clear links with strategic training priorities
- Customised (online) system so that employees can address their specific needs
- Combination of online and face-to-face interventions available
- Key organisational issues are prioritised, e.g. business ethics

Asda
Information provided online: 'Training at Asda'.
Perceived strengths:
- Clear commitment to developing all employees
- Focus on tailored programmes
- Programmes are linked to core values

2.2. UK (Public sector)

An internet-based search identified a diverse range of public sector organisations that provide sufficient online resources to gain an understanding of their practices. For the purpose of this research, five organisations (identified below in Figure 2) have been analysed:

Figure 2
Organisations whose documentation was reviewed as part of the secondary research

- Trafford Metropolitan Borough Council
- Office for Fair Trading (OFT)
- Thames Valley Police
- Heddlu Dyfed-Powys Police
- Lincolnshire County Council (Children's Services)

The perceived strengths of each organisation (based solely on the documentary analysis) are summarised in the table below:

Table 2
Perceived strengths of learning, development and evaluation information from selected public sector organisations

Trafford Metropolitan Borough Council
- Clear learning and development strategy
- Clear link between identifying training needs and corporate/service needs
- Emphasises the importance of management commitment
- Links to the modernisation agenda
- Links to wider organisational objectives
- States that evaluation is important and links to Kirkpatrick
- Evaluation should be undertaken at individual and organisational level

Office for Fair Trading (OFT)
- Reinforces a partnership approach between stakeholders
- Emphasises the importance of a supportive environment
- Coherent strategy to identify learning needs and then take the next steps
- Clear commitment to training and development evaluation: details the arrangements for monitoring and evaluation; explains how the data will be communicated; evaluation at different levels

Thames Valley Police
- Very clear about where the learning and development plans and policy sit within the overall training strategy
- Clear diagrams to show learning and development at different levels within the organisation, e.g. department, team, individual
- Clear emphasis on ensuring that all training and development delivered meets the agreed standards: details what should be monitored, when and by whom; clear about which elements should be evaluated at stages 3 (behaviour) and 4 (results) of Kirkpatrick's model, and the criteria used

Heddlu Dyfed-Powys Police
- Clear mission and vision statements which link to the overall organisational strategy
- Identifies key themes/priorities
- Demonstrates successes from the previous year and the impact they have had on the organisation (though these cannot always be measured tangibly)
- Links to appropriate national documents and priorities, e.g. the Home Office Strategic Plan
- Identifies the need to undertake 'environmental scanning' so that future needs can be planned for and met through appropriate interventions (including changes in legislation and the development of national programmes)
- Two full pages on evaluation feature in the Force Learning and Development Strategy:
  - The Force Evaluator must be involved with new programmes at the design stage so that the purpose, content and timing of evaluation can be established from the outset
  - Clear list of priorities for evaluating learning and development; the first two current priorities are the Police Race and Diversity Learning Development Programme and the Initial Police Learning Development Programme
  - Evaluation data is reported to the Trainer, the Evaluation Sponsor, the Force Learning Manager and the Training Prioritisation Group
  - Formal system for monitoring recommendations made in evaluation reports

Lincolnshire County Council (Children's Services)
- Identifies both national and local drivers (e.g. the Children Act 2004 and Every Child Matters, 2004)
- Clear link to the Council's overall People Strategy
- Identifies key themes and workforce development priorities
- Reiterates the importance of evaluation and outlines the reasons why
- Introduces the pilot of a five-stage evaluation process

2.3. International

3. Review of toolkits

As part of this project all of the HR Practitioners were asked to review the CIPD toolkit on Learning

Evaluation and highlight the strengths and weaknesses of the ideas posed and the tools that have

been provided. In addition they were asked to explore which tools could be contextualised to

enable them to be more useful within the Higher Education context. A copy of the evaluations

received from the Practitioners is available as Appendix One, and some of the common issues are

reported below in Figure 3.

Figure 3
Common views from the practitioners' evaluation of the CIPD toolkit

- Too many tools in the toolkit – it would be impractical to try to use them all
- Some useful tools, some of which may need adapting
- Individually designed evaluations for each event would be very time (and resource) consuming
- Some good ideas, but in practice they would take too long, e.g. interviews before every intervention
- The roles played by different stakeholders are important and could be disseminated more widely
- There is potential in using critical incident techniques
- The ability of managers to support the process will be important
- No consistent approach across the three partners in terms of getting feedback from trainers
- ROI is important, but also a complex and time-consuming area (a simple illustration follows this list)
- Need to avoid using too many forms
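The point about ROI can be made concrete. The sketch below is a minimal illustration only (it is not drawn from the CIPD toolkit, and the function name and figures are hypothetical): a simple ROI calculation that discounts the estimated benefit of an intervention by the evaluator's confidence that the training caused the improvement. Because the inputs are estimates rather than measurements, this is one reason practitioners describe the area as complex and time consuming.

```python
# A minimal, illustrative sketch (not from the CIPD toolkit): a simple
# return-on-investment calculation for a learning intervention.
# All names and figures are hypothetical.

def training_roi(total_cost: float, estimated_benefit: float,
                 confidence: float = 1.0) -> float:
    """Return ROI as a fraction of cost.

    `confidence` (0.0-1.0) discounts the estimated benefit, reflecting
    uncertainty about how much of the improvement the training caused.
    """
    discounted_benefit = estimated_benefit * confidence
    return (discounted_benefit - total_cost) / total_cost

# Example: a 12,000 GBP programme believed to have contributed 30,000 GBP
# of benefit, with 60% confidence that the training drove the improvement.
roi = training_roi(total_cost=12_000, estimated_benefit=30_000, confidence=0.6)
print(f"ROI: {roi:.0%}")  # prints "ROI: 50%"
```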

4. How can organisations assess their current provision?

There are a number of ways in which organisations could assess their current provision. Robson and Mavin (2010) have created a framework which incorporates a diverse range of literature and links to 'best practice' as a way of evaluating an organisation's approach to evaluation in an objective manner. A copy of this framework is available in Appendix Two.
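To illustrate how a criterion-based framework of this kind might be applied in practice, the sketch below tallies 'Met / Partially met / Not met' judgements into a simple summary. The criteria shown and the 0/0.5/1 scoring are placeholder assumptions for illustration, not part of the Robson and Mavin (2010) framework itself.

```python
# A minimal, illustrative sketch: tallying 'Met / Partially met / Not met'
# judgements from a criterion-based framework. The criteria and the
# 0 / 0.5 / 1 scoring are placeholders, not part of the published framework.
from collections import Counter

SCORES = {"Met": 1.0, "Partially met": 0.5, "Not met": 0.0}

assessments = {
    "L&D policy is clear about the importance of evaluation": "Met",
    "Role of line managers is clearly defined": "Partially met",
    "Evidence of measuring return-on-investment": "Not met",
}

counts = Counter(assessments.values())
overall = sum(SCORES[j] for j in assessments.values()) / len(assessments)

for criterion, judgement in assessments.items():
    print(f"{judgement:<13}  {criterion}")
print(f"Summary: {dict(counts)}; {overall:.0%} of the available score achieved")
```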

Following their review of the CIPD Evaluation toolkit the three practitioners from the partner

Universities in the project also agreed that it would be useful to respond to some of the issues

identified within Tool 1.1. The specific questions that they answered are shown below in Figure 4.

Figure 4

Questions from Tool 1.1 answered by the practitioners:

1. How often do learners and their managers meet to discuss performance and learning and development issues?
2. Do learners and managers take joint ownership of learning and development?
3. How well are people able to explain the contribution that learning and development makes to (a) their own performance, (b) their team's performance and (c) the organisation's ability to meet its performance targets?
4. Do you use performance review meetings to evaluate the impact of learning and development activities on individual performance?
5. How many examples do you have of improvements in organisational performance that can be attributed to learning and development activities?
6. Does top management review the contribution of learning and development activities to the achievement of the organisation's goals and targets?
7. Does top management review the cost and benefits/ROI of learning and development activities?
8. How many improvements have been made to your learning and development interventions as a result of evaluation information?
9. How effective are your evaluation processes in achieving continuous improvement in your learning and development activities?

Finally, an international source of good practice in evaluation was identified: the Australian Public Service Commission (APSC), whose checklists were used as a further self-analysis tool. The self-assessment areas covered in their report entitled 'Evaluating learning: A framework for judging success' are summarised below in Figure 5.

Figure 5

Key areas explored in APSC learning and development evaluation maturity model

1. Evaluation decisions and planning

2. Relevance

3. Appropriateness

4. Reaction

5. Capability acquired

6. Performance on the job

7. Outcomes


Part Two: Addressing the context of evaluation in Higher Education

5. The context of Higher Education

On 1 December 2008 there were 382,760 staff employed in the HE sector, of whom 179,040 (46.8%) were academic professionals; 252,520 staff were employed on full-time contracts and 130,240 on part-time contracts (HESA, 2010). This demonstrates the size of the sector and the number of employees who are likely to be exposed to learning and development interventions. The significant proportion of part-time contracts raises a pertinent question about whether these employees have the same access to learning and development initiatives. It also suggests that it will be essential to undertake primary research with a diverse selection of staff, including part-time staff. When reviewing evaluation data it will be useful to establish whether part-time employees highlight any specific issues in terms of opportunities to engage with learning and development interventions.
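As a quick arithmetic check on the HESA figures above, the quoted proportion and the part-time share follow directly from the headcounts: 179,040 / 382,760 ≈ 46.8% (academic professionals) and 130,240 / 382,760 ≈ 34.0% (part-time contracts), i.e. roughly one in three staff.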

Blackmore and Blackwell (2004) identify some differences in the infrastructure for developing staff, commenting on the historical differences in the size of staff development functions in pre- and post-1992 Universities. Whilst the authors believe that great strides have been taken in reducing this gap, they argue that staff development could still be used more strategically throughout the sector.

5.1. Current challenges in the sector

Whitchurch's (2008) work on the changing role of non-academics has clear implications for the design, delivery and evaluation of learning interventions, as she explores the wider and more diverse range of activities that are carried out. This will also have implications for learning and development at a team or departmental level because [University staff] "are moving laterally across functional and organisational boundaries" (Whitchurch, 2008:1). Further implications can be drawn from her assertion that "the situation is more dynamic and multi-faceted than might be suggested solely by an examination of job descriptions or organisation charts". This is potentially significant: in the past, learning and development strategies may have evolved from these documents, and it suggests that they are no longer such a reliable evidence base.

Blackmore and Blackwell (2004) argue that as University staff are required to do more and more, it is essential that they are supported appropriately to take on their new roles and responsibilities. This is particularly pertinent for the growing number of professional support staff. Other current challenges in the sector include:

- The change from the UK Research Assessment Exercise to the Research Excellence Framework
- External accountability – the NSS, the QAA, and School-based accreditations such as AACSB
- The current economic climate
- Uncertainty caused by the political climate; awaiting details of the implications of the new coalition Government
- Media scrutiny in terms of financial performance etc. (Burgoyne et al, 2009)
- The requirement for academic staff studying for Doctorates to engage in appropriate training and development activities to develop knowledge and skills
- Leadership development is an ongoing issue (Burgoyne et al, 2009)
- Audit cultures (Deem and Lucas, 2007)
- Learning and development offerings interpreted as managerialism (Adams, 2005)
- Conflicting pressures: pressure to collaborate as well as to compete against other institutions (Barnett, 2003)
- Increasing legislative requirements (Gordon and Whitchurch, 2007)
- Synchronising organisational planning cycles with setting up and reviewing the need for, and focus of, training and development programmes
- The amount of re-organisation within Universities; even where this is supported by development, coaching and mentoring, there is still a sense of the supporting interventions having to compete for the time and energy of staff
- A lot of mandatory training (Adams, 2005), where it may be more difficult to motivate people to engage in the event and to evaluate it positively afterwards; the work of Adams suggests that there may be ways to reduce this likelihood when providing 'universal' training
- In some circumstances, i.e. mentoring or coaching, the relationship may be key and comparable to doctor/patient in relation to confidentiality (Adams, 2005)

5.2. Future challenges

Looking specifically at this research project, we need to consider the views expressed by the Rugby Team (2005:11), who make an interesting point in stating "we need to find measures that are reliable and appropriate, not just easy to collect". Identifying and collecting appropriate performance information is an ongoing issue; reliance on the standard KPIs is likely to be insufficient. Future challenges include:

- Finding efficient ways to collect meaningful evaluation data
- Continuing to enhance the cluster of performance areas: research, teaching and student progression (Guest and Clinton, 2007)
- Financial challenges
- New legislation providing the entitlement to 'time off for training', which will have financial costs attached as well as indirect costs where work has to be covered, and a possible short-term impact on productivity; Universities will need to consider how best to manage this process
- Government priorities and vision for Higher Education
- Establishing and maintaining the teaching/research balance
- The increasing focus on students as customers; this may increase the pressure for more effective performance and a more customer-service-led orientation

Figure 6 shows challenges to UK HE uncovered by Burgoyne et al (2009:56).

Figure 6
Challenges to UK HE: market forces; the global agenda; Government initiatives; rapid change; RAE [now REF] performance; student recruitment; financial performance.

6. Training and development practices in Higher Education

As part of this study the research team reviewed the training and development practices of a number of UK-based Universities, evaluating the information that was available via their websites. Five Universities were randomly selected, ensuring the inclusion of both pre- and post-1992 Universities. This sample was not intended to be representative, as that would be beyond the scope of this study, though it does provide an insight into the approach taken by Universities. Figure 7 identifies the Universities whose documents were reviewed.

Figure 7

Universities evaluated as part of the process

- University of Manchester
- University of Brighton
- University of the West of England (UWE)
- University of Liverpool
- Bournemouth University

Table 3 shows the documents reviewed as part of this process and highlights the perceived

strengths which could be considered for the purpose of the current project.


Table 3
Review of documents from UK Universities

University of Manchester

People and Organisational Development Strategy
- Clear links with the overall strategic plan
- Commits to a review of activities to ensure they are consistent with, and appropriate to, the new strategic plan
- Strong emphasis on the equal need to support and develop support and professional staff
- 14 specific objectives related to learning and development
- Includes a specific objective on evaluation: develop and pilot an evaluation framework to cover the return on investment at corporate/team/individual level

Faculty of Humanities Training and Development Strategy
- Clear links between operational objectives, training and development objectives and service delivery
- Clear links between training and other priorities, e.g. improving student satisfaction
- Clear links with faculty strategy
- Keenness to learn from good practice in other parts of the University

Staff Training and Development Policy
- Clear responsibilities for each group of stakeholders
- Details diverse ways of identifying learning needs
- Specific section on the importance of evaluation, which places responsibility on managers to define success criteria

Staff Training & Development Brochure
- Clear identification of pre-requisites
- Detailed information on the purpose of sessions/interventions

University of Brighton

Staff Development Policy statement (September 2009)
- Is clear about the different types of learning
- Clarifies the important role played by managers and leaders
- Clear roles and responsibilities of different stakeholders
- Is clear about the evaluations that need to be carried out at different levels

Staff Development Review Process
- Clear links to learning and development needs within the staff development review process

University of the West of England (UWE)

Learning and Development Policy
- Clear definition of responsibilities, with very detailed notes on the function of the HR Department
- Clear links with the appraisal system
- Full section on monitoring and evaluation
- Details the reports that are received by the Academic Board

University of Liverpool

HR Strategy
- Emphasises the importance of staff development

Learning and Development Programme 2009/10
- Good link between organisational values and learning and development initiatives and interventions
- Clear diagrams to illustrate the role played by HR
- Requires staff to outline the reasons why they want to attend
- Requires staff to attend within the time period in which the learning is required
- Interesting section on personal development planning
- Identifies the requirement to evaluate programmes

Bournemouth University

Staff Development Policy (September 2009)
- Makes clear links with strategic objectives
- Reinforces benefits at the individual as well as the organisational level
- Refers to staff development priorities
- Provides clear links to corporate planning and strategy documents
- Details eight purposes of staff development
- Links to probation and induction
- Commitment to opportunities for part-time staff
- Identifies responsibilities for stakeholders
- States that an annual review of staff development will be made to the senior management team
- Clear details on implementation
- Clear section on evaluation

7. The evaluation of learning and development in Higher Education

7.1. An overview

Secondary research carried out by the research team found limited evidence of specific evaluation policies or associated documents. From an extensive web-based review, only one paper is available which specifically looks at the way that Universities evaluate the learning and development of their staff: Evaluation of staff learning and development at Anglia Ruskin University Library (Cefai, 2009). This paper focuses only on support staff, which addresses one of the staff groups that will be investigated within the current paper; however, some of the findings and recommendations may be more widely generalisable.

An important contemporary study into leadership development (the Baseline study of leadership development in higher education) by Burgoyne et al (2009) reviewed how leadership development interventions have been evaluated. Burgoyne et al (2009:5) are clear that the evaluation of these interventions was generally one of the weaknesses in institutions:

"evidence from the case studies and interviews suggests that much evaluation is of an informal kind. Explicit and systematic evaluation is not the norm".

There also appears to be other common ground with the desk review compiled for the current study, as the authors commented that "there is room in the sector for more alignment of leadership development with strategic goals and with organisational change" (Burgoyne et al, 2009:2). It appears significant that a recommendation from this large study was "helping to introduce more systematic practices on the evaluation of leadership development initiatives" (Burgoyne et al, 2009:5).

7.2. Feedback provided from delegates at HEFCE LGM Conference

Attendance at the Leading Transformational Change Conference provided an opportunity to discuss the current project with key stakeholders from other HEIs and to seek their views on some of the key issues. Feedback from this event was that there was definitely a perceived need for the work that we are doing, and that it has the potential to add real value to institutions. All parties agreed that this would be a complex process and that the nature of the academic environment must be considered throughout. In addition to gaining feedback, we received information on other institutions that may have tackled this issue in various forms, which will be explored at a later date. A copy of the comments received is provided as Appendix Three.


8. Evaluation of learning and development in Higher Education: SWOT analysis

8.1. Strengths

- There is evidence that some meaningful data is being collected (this provides a useful starting point)
- Can build on some of the pedagogical work that focuses on evaluating the work of students
- Evidence of pockets of good practice which could be disseminated and shared more widely
- Some evidence of good supporting infrastructures in terms of well-developed learning and development strategies

8.2. Weaknesses

- Still a focus on individually focused learning and development rather than organisational development (Burgoyne et al, 2009)
- Uncertainty about what the 'bottom line' is in Higher Education
- "Explicit and systematic evaluation is not the norm [in the sector]" (Burgoyne et al, 2009:5)
- Wider range of jobs for academics to undertake (teaching, scholarship, research, consultancy, community service and administration)
- Tendency not to pay sufficient attention to professional support staff, focusing instead on evaluating the learning and development of academics
- Staff joining from outside the sector may have very different expectations of what sort of training and development opportunities will be provided (Gordon and Whitchurch, 2007)
- Looking at postgraduate research, a major piece of work by The Rugby Team (2005, 2009) confirmed that in this area of HEIs there is no evidence of systematic national approaches to measuring development activities

8.3. Opportunities

- Use of technology, e.g. the 'My Northumbria' portal, to make the process simpler
- Meaningful evaluations can help to justify continuing expenditure on L&D activities
- Effective evaluations could be used to provide evidence for internal promotions, thereby increasing the significance of evaluation
- Considering the use of student learning as an evaluation measure
- More institutions are implementing more formal coaching and mentoring support
- If we accept some of the arguments put forward by The Rugby Team (2005) in terms of the wider benefits to stakeholders, there may be an opportunity to demonstrate benefits to the nation (in terms of outputs), the sector (in planning more coherent approaches) and individuals (in terms of maximising employability)


8.4. Threats

- Reduction in funding and ensuing cuts to budgets within Universities – they may identify learning and development as an area where cost savings can be made
- Competing priorities
- Continuing to ensure fairness/equality of opportunity when prioritising with reduced resources
- Different stakeholders will have different priorities (The Rugby Team, 2005), so careful consideration of decision making will be required
- In the past, institutions have not been subject to the same market forces as the private sector; this may change in the future (Burgoyne et al, 2009)
- Resistance (line managers, staff, trade unions)

9. Evaluation of learning and development in Higher Education: PESTLE analysis

9.1. Political

The current political climate and uncertainty due to the election results; details of the likely implications are awaited. The impact on funding will also weigh heavily on Universities.

9.2. Economic

The UK is still within a recession, with all of the issues that brings in terms of the status of customers, suppliers and other stakeholders. This may impact upon student numbers in coming years. There are likely to be funding implications for Universities, so difficult decisions may need to be made.

9.3. Sociological

Universities need to keep on top of demographic changes so that they can respond appropriately with current and future provision. The use of social networking sites continues to grow, so an appropriate response is required. Nationwide, there is an understanding of the importance of (and the benefits to be gained from) working with and supporting the local community; this can also extend to the roles that students play within the local community. Economic issues may impact on the local community, and there could be a knock-on effect on Universities in some instances.

9.4. Technological

The choice of software and internet options continues to grow, so it is essential that Universities keep up to date with developments and ensure that they have up-to-date and compatible equipment and software. Whilst there have been few reported incidents of severe computer viruses, it is essential that security and maintenance continue to be proactively managed. There are implications of maintenance (and of moving forwards) in terms of both the cost of equipment and the time needed to manage the process.


9.5. Legal

Health and safety remains a critical issue, and one that no University can afford not to pay careful attention to in terms of maintaining standards and keeping up to date with changes in the law and agreed good practice. As large employers, Universities need their HR Departments to be constantly preparing for notified changes in employment law and to ensure that these are communicated to all relevant employees; there are also likely to be training implications to ensure that all stakeholders are aware of their responsibilities. Where Universities provide childcare options for staff, there will be ensuing legislation to comply with (e.g. child protection requirements). Other relevant issues may include dealing appropriately with Freedom of Information requests and dealing with serial litigants.

9.6. Environmental

The environments for which Universities have some element of responsibility or concern are constantly growing, in line with the increasing emphasis placed on Corporate Social Responsibility. There are also growing requirements to provide environmental data. There is a need for a joined-up strategy which encompasses the wide range of issues related to the environment.

10. Conclusions

This paper demonstrates a number of key points which clearly identify the complex environments (both internal and external) in which Universities operate. There is evidence of some good practice within the sector; however, this appears to be spread across a range of institutions, and a more cohesive approach could be employed. The case studies provide some useful examples of individual initiatives, though there is also still scope to investigate good practices outside the sector. The identification of constraints as well as opportunities provides a good basis for moving forward with this project by considering how some of the potential sector-specific problems could be overcome.


11. References

Adams, J.C. (2005) Leading horses to water: The management of mandatory staff development in Higher Education Institutions in the UK. London: Leadership Foundation for Higher Education.

Barnett, R. (2003) Beyond All Reason: Living with Ideology in the University. Buckingham: Open University Press.

Blackmore, P. and Blackwell, R. (2004) How can staff development be more strategic? London: Leadership Foundation for Higher Education.

Blandy, R., Dockery, M., Hawke, A. and Webster, E. (1999) Does training pay? Evidence from Australian enterprises [Online]. Available at: http://www.ncver.edu.au/research/proj/nr8010.pdf

Burgoyne, J., Mackness, J. and Williams, S. (2009) Baseline study of Leadership Development in Higher Education. London: Leadership Foundation for Higher Education.

Coryn, C.L.S. (2006) 'A conceptual framework for making evaluation support meaningful, useful and variable', Evaluation Journal of Australasia, 6(1), pp.45-51.

Cummings, D. (2006) 'What if: The counterfactual in program evaluation', Evaluation Journal of Australasia, 6(2), pp.6-15.

Deem, R. and Lucas, L. (2007) 'Research and teaching cultures in two contrasting UK policy contexts: Academic life in Education Departments in five English and Scottish Universities', Higher Education, 54, pp.115-133.

Doucouliagos, C. and Sgro, P. (2000) Enterprise return on a training investment [Online]. Available at: http://www.ncver.edu.au/research/proj/nr8021.pdf

Foxon, M. (1989) 'Evaluation of training and development programs: A review of the literature', Australian Journal of Educational Technology, 5(2), pp.89-104.

Gordon, G. and Whitchurch, C. (2007) 'Managing Human Resources in Higher Education: The implications of a diversifying workforce', Higher Education Management and Policy, 19(2), pp.135-155.

ICVET (2008) Professional development evaluation: Models and tools [Online]. Available at: http://www.icvet.tafensw.edu.au/resources/pd_evaluation_models.htm (Accessed: 1 March 2010).

Ryan, B. (2003) 'Death by evaluation? Reflections on monitoring and evaluation in Australia and New Zealand', Evaluation Journal of Australasia, 3(1), pp.6-16.

Smith, C. and Beno, B. (1993) Guide to staff development evaluation. California: Community College League of California.

The Rugby Team (2005) Evaluation of Skills Development of Early Career Researchers: A strategy paper from the Rugby Team [Online]. Available at: http://www.rcuk.ac.uk/cmsweb/downloads/rcuk/researchcareers/rugbyteamstrategyreport.pdf (Accessed: 7 March 2010).

The Rugby Team (2009) The Rugby Team Impact Framework: One year on [Online]. Available at: http://www.vitae.ac.uk/CMS/files/upload/RTIF_update_Sept09.pdf (Accessed: 7 March 2010).

University of Tasmania (2003) Project Evaluation Toolkit [Online]. Available at: http://www.utas.edu.au/pet/sections/introducing.html (Accessed: 1 March 2010).

Whitchurch, C. (2008) Professional Managers in UK Higher Education: Preparing for complex futures. London: Leadership Foundation for Higher Education.

W.K. Kellogg Foundation (1998) W.K. Kellogg Foundation Evaluation Handbook [Online]. Available at: http://www.wkkf.org/knowledge-center/resources/2010/W-K-Kellogg-Foundation-Evaluation-Handbook.aspx (Accessed: 1 March 2010).


12. Appendix One

Practitioners’ feedback on the CIPD Toolkit

The three practitioners' comments (from Lesley, Richard and Rosemary) are set out below against each area for review.

General areas for review: Strengths and weaknesses – please give examples of specific tools where appropriate

Lesley: Generally I found the toolkit too unwieldy. My perception is that it is aimed at a practitioner just setting off on the development of a learning and development programme – and not at an organisation that has typically delivered in excess of 600 learning events per year. The toolkit was too 'back to basics' to be helpful. Any toolkit produced as a result of our project would, I hope, achieve a good balance between rigorous assessment, realistic approaches and meaningful data. I am hoping for a generic range of tools that can be used across all learning and development – rather than an approach which promotes individual evaluation for each individual event. Recognising that tools may vary in complexity, it is also worth asking for tools where the output from each tool can be compared in some way. In the language of the CIPD, 'systematic' rather than just 'reaction' or 'targeted'. There were many strengths of the CIPD toolkit in the messages it relayed: e.g. the important emphasis on learning needs analysis; the involvement of the individual and line manager when evaluating the impact on job behaviours/skills (evaluation is not just about HR evaluating!); the greater emphasis on explaining why we do evaluation (something I think we could do more on at Northumbria); and the linkage to L&D strategy and evaluation strategy.

Richard: The toolkit gives a very thorough and clear description of learning evaluation issues faced by those involved in L&D in most sectors. There are undoubtedly tools (referred to below) that we can use on the project. What the toolkit has done is make me realise the importance of building in evaluation work from our own sector (SEDA being an obvious source) in order to engage academic staff. The structure set out on page 18 serves as a reminder/touchstone for staff learning and development. Kirkpatrick's model (p19) still seems the most useful evaluation model available, and one I am very happy to use, but it is still only a model – can it really be the only appropriate one? I find it hard not to still focus on outcomes from learning and development (p23); doing so connects and emphasises the importance of learning to what an organisation wants to achieve (cf. IiP). Re 'Main approaches' (p39), a mix of 'targeted' and 'reaction level' is likely to be most readily achievable and so makes most sense to me. Exploring who the key stakeholders in this project should be, and their role, would be valuable; in relation to that, Tool 2.2 (p45) seemed attractive. The second paragraph on page 94 caught my eye; the anthropological paradigm is an attractive one. I think that reflection sheets/journals (p102) have considerable evaluation potential. Unit 11 is very interesting but doesn't include much that is new; clarity about organisational culture and values is relevant here, I think – particularly the behaviours that are valued and those that should be valued: how closely aligned are they, and what are the consequences? The challenge for evaluation is to establish a baseline against which learning/development/change can be assessed.

Rosemary: It is important to note that my perspectives on evaluation come from the context of management and leadership development. Clearly there are many forms of staff development, including IT training, health and safety and CPD-related activity, some of which lend themselves more readily to a quantitative approach.

Which tools do you think could be adopted in your organisation?

Lesley: We would like an ROI tool; reflection sheets/journals may be worth exploring; self-complete questionnaires. Tool 13.1 is interesting in that it places significant focus on the TNA (the why) and on the support once individuals return to the day job. I am not sure how much we could do with the pre-event briefing (not sure we could rely on managers to do this), though it could be helpful if sent to delegates from HR once booked on a programme/workshop. I like the immediately-after-event and three-months-after evaluations, probably because this is something we try to do: we have good success with the former (probably because we build time into the workshop), but not with the post-workshop follow-up.

Rosemary: I would like a tool that measures a level of return in terms of promotions – tracking career moves.

Are there any tools which could be valuable if adapted for the HE sector?

Tool 13.1 might be worth an adapted version.

Specific questions for consideration

Tool 5.2 introduces the critical incident technique. Do you think this could be a useful approach?

Lesley: I think critical incident interviews are particularly helpful, for example in coaching, and similarly when working with those requesting a training intervention for groups of staff, to identify training needs. However, I think this method would be too resource intensive for general use.

Richard: The critical incident interview is always a useful technique in my view, as it is grounded in what actually happens and what people do in those circumstances. It is analogous to behavioural questions at interview, where people helpfully access their experience. Responses cannot always be taken at face value, but probing follow-on questions can help. I would also ask some more questions around the outcome of their actions.

Rosemary: This technique could have its place. It relies heavily on people's memory and perspective; if an incident has been bad, there may be a tendency for an individual to re-frame it. The negative/positive incident questioning feels uncomfortable. Starting from a neutral level with clean questions to gain an insight into an experience may be a more constructive start point.

Looking at Tool 6.3 – what data collection methods do you currently use? Does this tool provide you with any enthusiasm for using additional/different methods?

Lesley: We have used face-to-face, group and telephone interviews, though this is more the exception than the standard – e.g. to evaluate pilots, for very senior staff, and when looking to develop programmes/workshops for groups of staff, to understand the learning needs (using critical incidents, actually!). We have not used observation – I think this level of observation could really only be undertaken with a line manager who can 'witness' the before and after. We use desk research in the loosest sense: comparing rating scores across workshops.

Richard: We use self-complete questionnaires and interviews most commonly. Observations are carried out after a small number of training courses (e.g. interviewing skills), but observation in the workplace is not used; it is unlikely to be well received and is too time consuming. Re desk research: there are lots of data out there, and this approach is clearly in tune with the sector, but we simply do not have the time.

Rosemary: Data collection methods currently used vary. We use focus groups, online and hard-copy questionnaires, 1:1 meetings, and peer observation and feedback. I personally am not looking at this tool with enthusiasm for using different methods.

Do you currently gain formal feedback from facilitators? If yes, how does the data compare with the questions shown in Tool 9.9?

Lesley: Yes, we do receive feedback, though it tends to focus on the extremes for short/one-day workshops or workshops delivered infrequently. We receive more in-depth feedback – and work with facilitators to achieve change/improvement – when programmes/workshops are more frequent, attract higher volumes, or are programmes (a number of days) rather than one day only.

Richard: Interviews with facilitators are done sometimes, but not in depth. This has been particularly important in order to engage the external facilitators we use on our leadership and management programmes.

Rosemary: We do not currently gain formal feedback from facilitators. We do, as peer groups of developers, share how an event went, reflect on the feedback from delegates and look at any changes that could benefit the intervention.

How practical do you think it would be to use something like Tool 10.8?

Lesley: We follow a similar process when delivering Belbin team building, though this is undertaken more informally/verbally. I am not sure of the value of this method as a formal process when looking at evaluation overall.

Richard: It seems very appropriate for the type of learning that I am likely to be facilitating more of, namely working as a 'consultant' with a number of departments, working with existing teams on real issues. The scope of the form wasn't entirely clear, but it could be developed for use. (I thought Tool 10.9 potentially very useful; it is couched in clear and appropriate language.)

Rosemary: Tool 10.8 formalises self-reflection following an intervention. These may be useful prompt questions, or there may be alternative self/team reflection questions that could be useful.

How would you rate Tool 11.3? Would this approach be viable in your organisation?

Lesley: I am not sure that managers would have the time or inclination to complete such a form for every member of their team who attends a particular learning event.

Richard: I thought this was useful, suitably short and clear. Completing such a questionnaire does depend on the manager's abilities concerning feedback, because it is important to share such relevant information about performance with the learner. It must be done sensitively and appropriately to help consolidate learning, not as a chance to harangue or harass.

Rosemary: Personally, I find rating someone from 1 to 5 unhelpful. This could have a possible use if the manager were to rate the individual prior to the learning intervention and then again some months after, or possibly if descriptors were used rather than numbers. I am not sure this would be viable in this organisation. Something of a similar style is used to rate managers by line reports in one area of the University for PDR, and it is not always used effectively. In another area of the University this has moved on and become an informal discussion, which is proving extremely effective.

Would the increased time investment to carry out 11.4 verbally in interview format be worth it (i.e. in preparation and delivery time)?

Lesley: Too resource intensive for general use, though I can see how this may be of benefit at the local level when evaluating team events.

Richard: I thought Tool 11.4 worthwhile, but in a shortened form; it fits when working with teams.

Rosemary: I believe the time may be well invested, but in times of 'doing more for less', when time is at a premium, I am not sure the ROI would justify the outcomes.

What percentage of your training provision do you think could be evaluated using a framework such as Tool 11.6?

Lesley: A very small percentage.

Richard: I could only envisage using this on a small number of occasions. Learners need to be very clear that they want this level of feedback, and to know, before participating, that they will be observed. It will help them to be clear about what the learning intervention is trying to address. Such feedback is time consuming for all involved, but potentially very powerful.

Rosemary: This could be used in the majority of delivered development courses. I feel that the questions are good as a guide, but again the ratings could be subjective. The peer observation scheme we offer gives a safe environment in which to give and receive constructive feedback; maybe a hybrid of the two could be a benefit.

What are your views on Unit 12 (How effectively has the money been spent)? Where would you like to see the focus of our work?

Lesley: A great question to which we are seeking to find the answer! How can we do this in the least resource-intensive way possible?

Richard: As for all organisations for whom profit is not the only bottom line, it presents a challenge. There is absolutely no doubt that HE must be able to demonstrate wise and effective stewardship of public money. Always being clear about the learning budget and working within it, whilst lobbying for more, is crucial. Benchmarking HE expenditure on staff learning is important here, and we could usefully bring such information together in the project. Related to this is the use of appropriate language within the HE culture (which I realise is not homogeneous); e.g. using the term ROI can be poorly received. Students as customers is a related and vexed issue, even more so when the concept is applied to staff. In this project I envisage trying to achieve a balance that doesn't lag behind contemporary value-for-money issues in HE, but neither forges too far ahead, risking alienating some staff. Reporting to top management is vital (p300), although Tool 12.10 is not a particularly good example. I could envisage using Tool 12.7 (checklist for benefits); Tool 12.8 has some attractions but requires a sophisticated competency framework (and will appear too mechanical to some in HE). Tool 12.9 seems appropriate for some training, e.g. health and safety and some HR topics (where money is lost through employment tribunals). The most practical and effective approach I've encountered was introduced as part of assessing the value of coaching; I'll share it in more detail, but it simply requires the learner to assess the cost of their part of the 'business' and to estimate their percentage confidence of the impact made by specific training.

Rosemary: I believe that tools enabling managers and development professionals to see an ROI would be well received. It fits with linking development interventions to ensuring that organisational objectives are met.

How realistic would the use of Tool 13.1 be in your organisation? Would any adaptations be needed?

Lesley: I like the robustness of the questions, though I worry that the time element could be prohibitive. I particularly like the need for the involvement of the manager.

Richard: In my view, this would not be realistic in UCL: it is too procedural and 'mechanical', and so unlikely to get buy-in. It is important to state that this may not be a view universally held within HR. Simply, and from a practical point of view, our competency framework is not sophisticated enough to support this approach; concerted work on competencies could change that, though. (Tool 13.1 could be usefully used with some work. We have many external events, and the form overleaf would be a good way to monitor activity; we currently do this, but could be more concerted.)

Rosemary: I think this tool could be useful, although I am 'put off' by the number of forms; managers can be guided to have conversations around this without so many forms. In reality, we have a development application form and a follow-up evaluation form (on the reverse), and getting people to fill these in is an interesting challenge.

To consider: Would it be useful for each institution to complete Tool 1.1 as a self-evaluation? This would provide a useful resource for when we need to disseminate the findings of our study; in addition, it may enable us to measure progress at the end of the project.

Lesley: Yes, I would be happy to do so.

Richard: My initial reaction to this was that it was too bureaucratic, but on completing it I found that it asked some very helpful and relevant questions. Simply, if each partner institution's representative completed it, useful aspects for discussion are likely to arise. Doing it more widely with key stakeholders would be a possibility, but is still not essential in my view; I certainly would not use it more widely because of questionnaire fatigue (something we are likely to have to consider).

Rosemary: This would be useful.


13. Appendix Two

Framework developed by Robson and Mavin (2010)

Met Partially met

Not met

Comments Clarification /additional information needed from practitioners?

Policy documents 1 The learning & development

policy is clear about the importance of evaluation.

2 There is a clear emphasis on senior management commitment in The learning & development policy

3 The learning & development policy features clear objectives

4 The role of line managers is clearly defined within the learning & development policy

5 The role to be played by employees is clearly defined within the learning & development policy

6 The role of the HR/L&D team is clear in the evaluation process

7 It is clear that the training process begins with a training needs analysis

8 It is clear how the learning function‟s performance is measured.

9 It is clear how the learning function‟s strategic activity translates into business results.

10 There is evidence that learning and development opportunities are available for all staff

11 The organisation has an evaluation strategy

12 There is evidence that the organisation has communicated learning and development priorities to its workforce

13 Learning and development key result areas are included in the human resource management strategy and business plans.

14 There is consistency throughout the documentation on the purpose of evaluation e.g.

To improve the quality of the learning in terms of the delivery

To assess the effectiveness of the overall learning interventions

To justify the interventions

To justify the role of

28 May 2010

learning in the organisation

House-keeping

15 Documentation is written in accessible language

16 There is evidence that the policy documents are regularly updated

Other associated HR policies e.g. performance management

17 There is a clear link between learning and development and the performance management process.

18 There a clear link to the importance of learning and development in associated policies

19 There a clear link to the importance of evaluation in associated policies

20 Line managers‟ performance in supporting the evaluation process is monitored

Evaluation tools e.g. questionnaires or interview schedules

21. The evaluation criteria require participants to consider how they will use their learning.
22. There is evidence of the use of focus groups or interviews as part of the evaluation process.
23. The evaluation process covers stage 1 of Kirkpatrick (reaction).
24. The evaluation process covers stage 2 of Kirkpatrick (learning).
25. The evaluation process covers stage 3 of Kirkpatrick (behaviour).
26. The evaluation process covers stage 4 of Kirkpatrick (results).
27. Differentiated evaluation forms are used to reflect the diversity of the learning interventions that are used.
28. Evaluation is considered in terms of benefits to the team/department/directorate as well as the individual employee.
29. The evaluation requires the respondents to consider alignment to strategic objectives.
30. The evaluation requires the respondents to consider alignment to local objectives.
31. The evaluation criteria include a link to performance management objectives.
32. The evaluation questionnaire includes the following question: What was/has been the most important learning for you? Why? How did you learn this? (encouraging reflection on immediate and more distant learning experiences, and looking to uncover those insights that help people to see that they can learn and how they learn)

33. The evaluation questionnaire includes the following question: How does this learning match your expectations, and the expectations of your manager/team/the organisation? (pointing up the importance of the value expectations of various stakeholders and the criteria they might apply to assessing learning opportunities and LTD interventions)
34. The evaluation questionnaire includes the following question: How can you apply/what have been the results of applying what you learned to the needs of your job, your team/department and the strategic priorities of the organisation? (keeping the focus on the desirable and potentially rewarding links between learning and the achievement of the organisation's strategic intentions and related hard and soft metrics)
35. The evaluation questionnaire includes the following question: How could you integrate what you have learned so far into further learning opportunities available to you, and into the changing demands of your job, your team/department and the organisation as a whole? (pointing the way to future learning opportunities and looking for the multiplicative effects of various learning opportunities and methods)
36. Evaluation forms capture appropriate equal opportunities data.
37. Training participants are required to demonstrate their new knowledge and/or skills.

Specific documentation on evaluation

38. There is evidence that the organisation evaluates its evaluation data.
39. There is evidence of a formal process for collecting evaluation data.
40. There is a clear process to show what happens to the evaluation data.

41. There is evidence of measuring return-on-investment (an illustrative calculation follows this framework).
42. There is evidence of measuring return-on-expectations.
43. Training interventions are evaluated again after a period of more than one month.
44. There are clear links between the training programmes on offer and the organisation's strategic priorities.
45. Trainers/facilitators receive copies of evaluation feedback.
46. Trainers/facilitators are required to demonstrate how they act upon the feedback.
47. The purpose of evaluating learning is clear (proving, controlling, improving, reinforcing).
48. Where appropriate, there is evidence that the organisation uses pre- and post-test data.

Misc

49. The organisation has a clear learning and development budget.
50. The organisation can produce data to show how and where learning is delivered in the organisation.
51. Evaluation results are presented in an appropriate manner.
52. The organisation knows the costs incurred in the evaluation process.
53. The organisation can evidence how it has responded to the evaluation feedback.
54. Overall feedback is shared with the learners.
55. The HR Director (or most appropriate person) communicates the results of evaluations to the Senior Management Team.
56. There is evidence that the organisation has used innovative methods to capture its evaluation data.
57. The trainer/facilitator is asked to evaluate the sessions that they deliver.
58. The organisation has a named person with responsibility for this area.
59. There is evidence to show that line managers are encouraged to help their staff to transfer their knowledge into the workplace.
60. There is evidence to suggest that the organisation is trying to develop a learning culture.
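Item 41 refers to measuring return-on-investment. As a minimal illustrative sketch only (the cost and benefit figures below are hypothetical and not drawn from this study), ROI is conventionally calculated, in the approach often associated with Phillips' extension of Kirkpatrick, as net programme benefits divided by programme costs, expressed as a percentage:

```python
# Illustrative only: hypothetical figures showing the standard
# return-on-investment calculation referred to in framework item 41.
programme_costs = 20_000       # total cost of the intervention (GBP)
monetised_benefits = 50_000    # estimated monetary value of its benefits (GBP)

net_benefits = monetised_benefits - programme_costs          # 30,000
roi_percent = (net_benefits / programme_costs) * 100         # 150.0

print(f"ROI: {roi_percent:.0f}%")  # prints "ROI: 150%"
```

On these hypothetical figures the programme returns 150%. In practice the difficulty lies in monetising the benefits credibly, which is one reason return-on-expectations (item 42) is often considered alongside ROI.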


14. Appendix Three

Feedback received from delegates at the HEFCE LGM Conference

1 Are you aware of the work that is going on within the researcher development agenda through RCUK and the Rugby Team? You could contact Tony Bromley @ Leeds, who is collecting case studies too.

2 Do not leave out cohort effects, which seem to build confidence.

3 Challenges:
- Determining the impact of the L&D.
- Costing the impact of its effect on the bottom line.

4 Key challenges for evaluation:
- Against organisational goals – so they need to be clear.
- Effect on core business – might vary with institutional mission.
- Applicability of the learning to the wider community.

5 Challenges:
- Selection of hard and soft measures.
- The time-frame over which the evaluation takes place.

6 I think this is a really important area to develop. We do need to be able to evidence the positive impact of L&D.

7 What about individuals identifying behaviour change for themselves, with a subsequent reward for the best examples? This would at least develop a set of case studies.

8 We are also looking at validation of our programmes and offerings. Very happy to exchange ideas and/or join up with you on this.

9 Solutions:
- Relate all L&D provision to the University's strategic plan.
- Follow it through using the PDR process.

10 Are you looking at what business/industry does to evaluate training and development? Views from beyond the sector might be interesting.

11 Challenges:
- Time lag – the learning happens first as an event, then as a process (e.g. 'trying', 'forgetting'), and only then, possibly, as an impact – and trapping this downstream is difficult.

12 Have you noticed that the Leadership Foundation is running an event on ROI in February?

13 Challenges:
- Shifting the responsibility (for evaluation) more on to the learner.
- Employees in HE are a learning community, but prone to forget when it suits!
- Whose toolkit?

14 Very happy to send you our evaluation reports/processes if it helps, or to be interviewed as part of your research.

15 Challenges:
- Identifying whether it was the learning intervention or another variable.
- Looking at benefits/ROI from one-to-one coaching.
Solution:
- Pre-course focus to include one-to-ones with participants confirming needs and expectations – then post-course follow-up.

16 Cranfield challenges:
- Gaining line management involvement.
- Getting individuals to reflect themselves.
- A non-labour-intensive, effective process for data gathering.
Solutions:
- Finding case studies, storytelling, formal levels 1-3 Kirkpatrick questionnaires etc.

17 Challenges:
- Understanding the motivation behind the participation, and how participants come to learn what we think they are coming to learn.
- Engaging line managers in the process (pre and post learning).
- Investment in preparation, i.e. knowing what changes need to happen as a result of doing the learning.
- Tools/processes to measure whether the learning has taken place and whether it has effected a change/improvement (a simple illustrative sketch follows this list).
- SMART objectives in training programmes.
- Building in work-based projects/learning, connecting learning to the job.
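Several of the challenges above concern evidencing whether learning has actually taken place (see also items 43 and 48 of the framework in Appendix Two). The minimal sketch below, using entirely hypothetical pre- and post-test scores, shows the simplest form such a measure can take; a real evaluation would need larger samples, an appropriate statistical test and, ideally, a comparison group to rule out other variables (the concern raised in item 15).

```python
# Illustrative only: hypothetical pre- and post-test scores for five
# participants in a training intervention. A paired comparison like this
# is one simple way to evidence learning gain (framework item 48).
pre_scores = [52, 61, 48, 70, 55]   # scores before the intervention
post_scores = [68, 72, 63, 81, 70]  # scores after the intervention

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_gain = sum(gains) / len(gains)

print(f"Individual gains: {gains}")                   # [16, 11, 15, 11, 15]
print(f"Mean learning gain: {mean_gain:.1f} points")  # 13.6 points
```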