
Evaluating HRD Programs

Chapter 7

HRD3e. Contributed by Wells Doty, Ed.D., Clemson Univ.


Effectiveness

- The degree to which a training program (or other HRD program) achieves its intended purpose.
- Measures are relative to some starting point.
- Measures how well the desired goal is achieved.


HRD Evaluation

Textbook definition: “The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.”


In Other Words…

Are we training:
- the right people
- the right “stuff”
- the right way
- with the right materials
- at the right time?


Evaluation Needs

- Descriptive and judgmental information is needed: both objective and subjective data.
- Information is gathered according to a plan and in a desired format.
- Information is gathered to support decision making.


Purposes of Evaluation

- Determine whether the program is meeting its intended objectives.
- Identify strengths and weaknesses.
- Determine the cost-benefit ratio.
- Identify who benefited most or least.
- Determine future participants.
- Provide information for improving HRD programs.


Purposes of Evaluation – 2

- Reinforce major points to be made.
- Gather marketing information.
- Determine whether the training program is appropriate.
- Establish a management database.


Evaluation Bottom Line

- Is HRD a revenue contributor or a revenue user?
- Is HRD credible to line and upper-level managers?
- Are the benefits of HRD readily evident to all?


How Often Are HRD Evaluations Conducted?

- Not often enough!
- Frequently, only end-of-course participant reactions are collected.
- Transfer to the workplace is evaluated even less frequently.


Why HRD Evaluations Are Rare

- Reluctance to have HRD programs evaluated.
- Evaluation requires expertise and resources.
- Factors other than HRD also drive performance improvements, e.g., the economy, equipment, and policies.


Need for HRD Evaluation

- Shows the value of HRD.
- Provides metrics for HRD efficiency.
- Demonstrates a value-added approach for HRD.
- Demonstrates accountability for HRD activities.
- Everyone else has it… why not HRD?


Make-or-Buy Evaluation

- “I bought it, therefore it is good.”
- “Since it’s good, I don’t need to post-test.”
- But who says it’s:
  - Appropriate?
  - Effective?
  - Timely?
  - Transferable to the workplace?


Evolution of Evaluation Efforts

1. Anecdotal approach: Talk to other users.

2. Try before buy: Borrow and use samples.

3. Analytical approach: Match research data to training needs.

4. Holistic approach: Look at overall HRD process, as well as individual training.


Models and Frameworks of Evaluation

- Table 7-1 lists nine frameworks for evaluation.
- The most popular is that of D. Kirkpatrick:
  - Reaction
  - Learning
  - Job Behavior
  - Results


Kirkpatrick’s Four Levels

- Reaction: focuses on trainees’ reactions to the training.
- Learning: did they learn what they were supposed to?
- Job Behavior: was the learning used on the job?
- Results: did the training improve the organization’s effectiveness?


Issues Concerning Kirkpatrick’s Framework

- Most organizations don’t evaluate at all four levels.
- The framework focuses only on post-training.
- It doesn’t address inter-stage improvements.
- What are your thoughts?


Other Frameworks/Models – 1

- CIPP: Context, Input, Process, Product
- CIRO: Context, Input, Reaction, Outcome
- Brinkerhoff:
  - Goal setting
  - Program design
  - Program implementation
  - Immediate outcomes
  - Usage outcomes
  - Impacts and worth


Other Frameworks/Models – 2

- Kraiger, Ford, & Salas:
  - Cognitive outcomes
  - Skill-based outcomes
  - Affective outcomes
- Phillips:
  - Reaction
  - Learning
  - Applied learning on the job
  - Business results
  - Return on investment (ROI)


A Suggested Framework – 1

- Reaction: did trainees like the training? Did the training seem useful?
- Learning: how much did they learn?
- Behavior: what behavior change occurred?


A Suggested Framework – 2

- Results:
  - What were the tangible outcomes?
  - What was the return on investment (ROI)?
  - What was the contribution to the organization?


Data Collection for HRD Evaluation

Possible methods:
- Interviews
- Questionnaires
- Direct observation
- Written tests
- Simulation/performance tests
- Archival performance information


Interviews

Advantages:
- Flexible
- Opportunity for clarification
- Depth possible
- Personal contact

Limitations:
- High reactive effects
- High cost
- Face-to-face threat potential
- Labor intensive
- Trained interviewers needed


Questionnaires

Advantages:
- Low cost to administer
- Increased honesty
- Anonymity possible
- Respondent sets the pace
- Variety of options

Limitations:
- Possible inaccurate data
- Response conditions not controlled
- Respondents set varying paces
- Uncontrolled return rate


Direct Observation

Advantages:
- Non-threatening
- Excellent way to measure behavior change

Limitations:
- Possibly disruptive
- Reactive effects are possible
- May be unreliable
- Trained observers needed


Written Tests

Advantages:
- Low purchase cost
- Readily scored
- Quickly processed
- Easily administered
- Wide sampling possible

Limitations:
- May be threatening
- Possibly no relation to job performance
- Measures only cognitive learning
- Relies on norms
- Concern for racial/ethnic bias


Simulation/Performance Tests

Advantages:
- Reliable
- Objective
- Close relation to job performance
- Includes cognitive, psychomotor, and affective domains

Limitations:
- Time consuming
- Simulations often difficult to create
- High cost to develop and use


Archival Performance Data

Advantages:
- Reliable
- Objective
- Job-based
- Easy to review
- Minimal reactive effects

Limitations:
- Criteria for keeping/discarding records
- Information system discrepancies
- Indirect
- Not always usable
- Records prepared for other purposes


Choosing Data Collection Methods

- Reliability: consistency of results, and freedom from collection-method bias and error (illustrated in the sketch below).
- Validity: does the instrument measure what we want to measure?
- Practicality: does it make sense in terms of the resources used to get the data?


Type of Data Used/Needed

- Individual performance
- System-wide performance
- Economic


Individual Performance Data

- Individual knowledge
- Individual behaviors
- Examples:
  - Test scores
  - Performance quantity, quality, and timeliness
  - Attendance records
  - Attitudes


System-Wide Performance Data

- Productivity
- Scrap/rework rates
- Customer satisfaction levels
- On-time performance levels
- Quality rates and improvement rates


Economic Data

- Profits
- Product liability claims
- Avoidance of penalties
- Market share
- Competitive position
- Return on investment (ROI)
- Financial utility calculations


Use of Self-Report Data

- The most common method.
- Pre-training and post-training data.
- Problems:
  - Mono-method bias
  - Desire to be consistent between tests
  - Socially desirable responses
  - Response shift bias: trainees adjust their expectations to the training


Research Design

Specifies in advance:
- the expected results of the study
- the methods of data collection to be used
- how the data will be analyzed


Research Design Issues

- Pretest and posttest:
  - Shows the trainee what the training has accomplished.
  - Helps eliminate pretest knowledge bias.
- Control group:
  - Compares the performance of the trained group against that of a similar group that received no training.


Recommended Research Design

- Pretest and posttest with a control group (see the analysis sketch below).
- Whenever possible:
  - Randomly assign individuals to the training group and the control group to minimize bias.
  - Use a “time-series” approach to data collection to verify that the performance improvement is due to the training.
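A minimal sketch of how results from this design might be analyzed (all scores invented; the training effect is estimated as the trained group’s gain minus the control group’s gain, a difference-in-differences):

```python
from statistics import mean

# Hypothetical pretest/posttest scores for a trained group
# and a similar untrained control group.
trained_pre  = [62, 70, 58, 66, 73]
trained_post = [78, 85, 74, 80, 88]
control_pre  = [64, 69, 60, 67, 71]
control_post = [66, 72, 61, 70, 74]

trained_gain = mean(trained_post) - mean(trained_pre)
control_gain = mean(control_post) - mean(control_pre)

# Gain attributable to training, net of whatever changed for
# everyone (economy, equipment, policies, etc.).
effect = trained_gain - control_gain
print(f"Trained gain: {trained_gain:.1f}, control gain: {control_gain:.1f}")
print(f"Estimated training effect: {effect:.1f} points")
```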


Ethical Issues Concerning Evaluation Research

- Confidentiality
- Informed consent
- Withholding training from control groups
- Use of deception
- Pressure to produce positive results


Assessing the Impact of HRD

- Money is the language of business.
- You MUST talk dollars, not HRD jargon.
- No one (except maybe you) cares about “the effectiveness of training interventions as measured by an analysis of formal pretest/posttest control-group data.”


HRD Program Assessment

- HRD programs and training are investments.
- Line managers often see HR and HRD as costs, i.e., revenue users, not revenue producers.
- You must prove your worth to the organization, or you’ll have to find another organization…


Two Basic Methods for Assessing Financial Impact

- Evaluation of training costs
- Utility analysis


Evaluation of Training Costs

- Cost-benefit analysis: compares the cost of training to benefits gained, such as improved attitudes, a reduction in accidents, a reduction in employee sick days, etc.
- Cost-effectiveness analysis: focuses on increases in quality, reductions in scrap/rework, productivity gains, etc.


Return on Investment

Return on investment = Results / Costs
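A worked example with invented numbers: if a program’s measured results are worth $150,000 and its total costs are $50,000, then ROI = 150,000 / 50,000 = 3.0, i.e., three dollars returned for every dollar invested.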


Types of Training Costs

- Direct costs
- Indirect costs
- Development costs
- Overhead costs
- Compensation for participants


Direct Costs

- Instructor:
  - Base pay
  - Fringe benefits
  - Travel and per diem
- Materials
- Classroom and audiovisual equipment
- Travel
- Food and refreshments


Indirect Costs

- Training management
- Clerical/administrative support
- Postal/shipping, telephone, computers, etc.
- Pre- and post-learning materials
- Other overhead costs


Development Costs

- Fee to purchase the program
- Costs to tailor the program to the organization
- Instructor training costs


Overhead Costs

- General organizational support
- Top management participation
- Utilities and facilities
- General and administrative costs, such as HRM


Compensation for Participants

- Participants’ salary and benefits for time away from the job
- Travel, lodging, and per-diem costs


Measuring Benefits

- Change in quality per unit, measured in dollars.
- Reduction in scrap/rework, measured in the dollar cost of labor and materials.
- Reduction in preventable accidents, measured in dollars.
- ROI = Benefits / Training costs (see the sketch below).
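A minimal sketch tying the cost categories above to dollar-valued benefits (every figure is invented for illustration):

```python
# Hypothetical training costs, following the categories above.
costs = {
    "direct": 18_000,        # instructor, materials, equipment, food
    "indirect": 4_000,       # admin support, shipping, telephone
    "development": 10_000,   # purchase fee, tailoring, instructor prep
    "overhead": 3_000,       # general organizational support
    "compensation": 15_000,  # participants' salary/benefits, travel
}

# Hypothetical dollar-valued benefits.
benefits = {
    "quality_improvement": 60_000,
    "scrap_rework_reduction": 25_000,
    "accident_reduction": 15_000,
}

total_costs = sum(costs.values())
total_benefits = sum(benefits.values())
roi = total_benefits / total_costs

print(f"Total costs:    ${total_costs:,}")
print(f"Total benefits: ${total_benefits:,}")
print(f"ROI = {roi:.2f}")  # dollars of benefit per dollar of cost
```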


Utility Analysis

Uses a statistical approach to support claims of training effectiveness:

U = (N)(T)(dt)(SDy) – C

where:
- N = number of trainees
- T = length of time the benefits are expected to last
- dt = true performance difference resulting from training
- SDy = dollar value of untrained job performance (in standard deviation units)
- C = cost of training


Critical Information for Utility Analysis

- dt = the difference in units produced between trained and untrained employees, divided by the standard deviation in units produced by the trained group.
- SDy = the standard deviation of performance in dollars, based on the overall productivity of the organization.
- Both quantities are computed in the sketch below.
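A minimal sketch of the full utility calculation using the definitions above (all production figures and dollar values are invented; dt comes from group means and the trained group’s standard deviation):

```python
from statistics import mean, stdev

# Hypothetical weekly output (units) per employee.
trained_output   = [54, 60, 58, 62, 56]
untrained_output = [48, 52, 50, 49, 51]

# dt: difference in mean units, divided by the trained group's SD.
d_t = (mean(trained_output) - mean(untrained_output)) / stdev(trained_output)

N = 50         # number of trainees
T = 2.0        # years the benefit is expected to last
SD_y = 8_000   # dollar value of one SD of job performance (estimated)
C = 150_000    # total cost of training all N trainees

# U = (N)(T)(dt)(SDy) - C
U = N * T * d_t * SD_y - C
print(f"dt = {d_t:.2f}")
print(f"Utility U = ${U:,.0f}")
```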


Ways to Improve HRD Assessment

- Walk the walk and talk the talk: MONEY.
- Involve HRD in strategic planning.
- Involve management in HRD planning and estimation efforts; gain mutual ownership.
- Use credible and conservative estimates.
- Share credit for successes and blame for failures.


HRD Evaluation Steps

1. Analyze needs.
2. Determine an explicit evaluation strategy.
3. Insist on specific and measurable training objectives.
4. Obtain participant reactions.
5. Develop criterion measures/instruments to measure results.
6. Plan and execute the evaluation strategy.


Summary

- Training results must be measured against costs.
- Training must contribute to the “bottom line.”
- HRD must repeatedly justify itself as a revenue enhancer, not a revenue waster.