1
ROI Certification
Building Capability and Expertise with ROI Implementations
Jack J. Phillips, Ph. D.
Patti P. Phillips, Ph. D.
2
Reaction Objectives
Provide participants with knowledge and skills that are:
• Relevant to their job
• Important to their current job success
• Immediately applicable
• New to their understanding of accountability
• Relevant to their colleagues in similar job situations
3
Learning Objectives
Enable participants to:
• Describe the five critical components of a successful evaluation practice
• Describe the five levels of evaluation
• Describe the six types of data in the chain of impact
• Describe the ten steps in the ROI Methodology
and . . .
4
Learning Objectives
• Follow the 12 guiding principles
• Plan and execute an ROI evaluation project
• Calculate and explain the difference between the benefit-cost ratio (BCR) and the return on investment (ROI)
• Communicate the results of an ROI study to a variety of stakeholders
• Implement the ROI Methodology within their organization
5
Application Objectives
Support participants as they:
• Build support for the ROI Methodology in their organization
• Complete their initial ROI evaluation project
• Plan and implement future ROI projects
• Revise/update internal evaluation strategy/practice
• Brief/teach others in the ROI Methodology
• Change the way they propose, implement, and evaluate programs, processes, and initiatives
6
Impact Objectives
Enable participants to realize positive consequences as a result of applying what they learn, such as:
• Improving program effectiveness
• Improving program efficiencies
• Expanding successful programs
• Redesigning or discontinuing ineffective programs
• Improving relationships with clients and executives
• Enhancing the influence of their function within the organization
7
Setting the Stage

The ROI Methodology:
• Planning Evaluation
• Collecting Data
• Isolating the Effects of the Program
• Converting Data to Money
• Tabulating Costs and Calculating ROI
• Reporting Results
• Implementing ROI
• Forecasting ROI
Program Success will be Measured by:
8
• Ratings achieved on the end-of-course evaluation
• Increase in knowledge gain as reported on the end-of-course evaluation
• Demonstration of knowledge through:
– Course exercises
– Case study presentations
– ROI project plan presentation
– ROI implementation plan
and . . .
Program Success will be Measured by:
9
• ROI project completion following ROI Methodology steps and guiding principles
– Evaluation planning
– Data collection
– Data analysis
– Report submittal
• Steps toward implementing (beyond ROI project) completed as planned.
10
Roles of the ROI Implementation Leader
• Technical Expert
• Consultant
• Problem Solver
• Initiator
• Designer
• Developer
• Coordinator
• Cheerleader
• Communicator
• Process Monitor
• Planner
• Analyst
• Interpreter
• Teacher
11
Skill Areas for Certification
• Planning for ROI calculations
• Collecting evaluation data
• Isolating the effects of solutions
• Converting data to monetary values
• Monitoring program costs
• Analyzing data, including calculating the ROI
• Presenting evaluation data
• Implementing the ROI process
• Providing internal consulting on ROI
• Teaching others the ROI process
12
Certification Projects
Item: Due Date
• Case Study Presentation: During Workshop
• Implementation Plan for the ROI Process: End of Workshop
• ROI Project Plan: End of Workshop
• Implementation complete: 3-6 Months
• ROI project complete: 6 Months, Ideally
13
Case Study Presentation
• Team-based assignment
• Present results to executive audience (you own the study)
• Q and A session
• Critique the case study (you don’t own the study)
14
ROI Project
• Based on a planned or anticipated ROI impact study
• Individual or team-based
• Provide a copy of the Data Collection Plan during the workshop
• Provide a copy of the ROI Analysis Plan during the workshop
• Ask for input from the group
Ideally, complete within 6 months
15
Implementation Plan Requirements
• Specific
• Motivational
• Achievable
• Realistic
• Time-based
Must be within your control!
16
Global Communications
17
Paradigm Shift in Programs

Activity-Based, characterized by:
1. No business need for the program
2. No assessment of performance issues
3. No specific measurable objectives
4. No effort to prepare program participants to achieve results

Results-Based, characterized by:
1. Program linked to specific business needs
2. Assessment of performance effectiveness
3. Specific objectives for application and business impact
4. Results expectations communicated to participants
18
Paradigm Shift

Activity-Based, characterized by:
5. No effort to prepare the work environment to support transfer
6. No efforts to build partnerships with key managers
7. No measurement of results or cost-benefit analysis
8. Reporting on programs is input-focused

Results-Based, characterized by:
5. Environment prepared to support transfer
6. Partnerships established with key managers and clients
7. Measurement of results and cost-benefit analysis (ROI)
8. Reporting on programs is output-focused
19
Definition of Results-Based Programs
• Programs are initiated, developed, and delivered with the end in mind.
• A comprehensive measurement and evaluation system is in place for each program.
• Impact and ROI evaluations are regularly developed.
• Program participants understand their responsibility to obtain results with programs.
• Support groups help to achieve results from training.
20
How Results-Based Are Your Programs?
• Take the assessment entitled “How Results-Based Are Your Programs?” When taking this assessment, try to be candid in selecting the appropriate response.
• Score your assessment using the guidelines provided.
• Compare your scores with others.
• What is considered to be an adequate score?
• What are the potential uses of this survey?
21
Human Capital Perspectives

Traditional View → Emerging View
• Expenses are considered costs → Expenditures are viewed as a source of value
• Function is perceived as a support staff → Function is perceived as a strategic partner
• Involved in setting HR budget → Top executives involved in budget
• Metrics focus on cost and activities → Metrics focus on results
• Metrics created and maintained by HR alone → Top executives involved in metrics design and use
. . . and
22
Human Capital Perspectives
Traditional View → Emerging View
• Little effort to understand the ROI in human capital → ROI has become an important tool
• Measurement focuses on the data at hand → Measurement focuses on the data needed
• Measurement is based on what others measure → Measurement is based on organization needs
• Programs initiated without a business need → Programs linked to specific business needs
• Reporting is input-focused → Reporting is output-focused
23
Increased Interest in the Value of Human Capital

DRIVERS:
• The increasing cost of human capital
• Consequences of improper or ineffective HR practices
• Linkage of human capital to strategic initiatives
• Increased accountability of all functions
• Top executive requirement for HR contribution and human capital ROI
24
A collage of measurement concepts: program impact, profitability, strategic accountability, evaluation, bottom-line contribution, effectiveness, vital signs, benefits vs. costs, economic value added, performance standards, shareholder value, balanced scorecard, value based, and ROI.
25
Three Journeys
1. The need to change the HR measurement mix
2. Setting the investment level for human capital
3. Valuing human capital
Each is explored next…
26
Apex, Inc.
27
Comparison of Approaches to Measure the HR Contribution
Measuring the HR Contribution: Status
28
HR Accountability Progress

A timeline chart (1960s through 2000) tracks HR accountability approaches from Early Approaches through Solid Value-Added Approaches to Leading Edge Approaches, in roughly this order: MBO in Personnel, Feedback Surveys, HR Case Studies, HR Auditing, HR Key Indicators, HR Cost Monitoring, HR Satisfaction Surveys, Competitive HR Benchmarking, HR Profit Center, Human Capital Measurement, HR Macro Studies, ROI Methodology, and Balanced Scorecard.
29
Leading Edge Approaches to Measuring the HR Contribution
• Balanced Scorecard
• HR Profit Center
• Human Capital Measures
• HR Macro Studies
• ROI Process (most promise as an immediate tool)
30
Recommendations for Measurement Categories

Select an approach in each of these categories:
• Attitudinal Data
• Comparative Data
• Human Capital Measures
• Benefit/Cost Analysis (ROI)
31
Common Human Capital Measures
1. Innovation and Creativity
2. Employee Attitudes
3. Workforce Stability
4. Employee Capability
5. Human Capital Investment
6. Leadership
7. Productivity
8. Workforce Profile
9. Job Creation and Recruitment
10. Compensation and Benefits
11. Compliance and Safety
12. Employee Relations
32
Setting the Investment Level
1. Let Others Do It!
33
Motivating Forces
• Cost control
• Lack of infrastructure
• Instability
• Access to expertise
• Short-term focus
• Survival
Approaches
1. Hire fully competent employees
2. Use contract employees
3. Outsource major functions
34
Setting the Investment Level
2. Invest the Minimum!
35
Motivating Forces
• Low-cost industry
• High labor use
• Strong competition
• Employees are dispensable
Approaches
1. Pay minimum wages
2. Provide few benefits
3. Keep training simple
4. Expect turnover and address it
36
Human Resources Development Issues

Training
• Focus: Job-related skills
• Risk for payback: Low
• Time for payback: Short
• Costs per employee: Low
37
Human Resources Development Issues

Education
• Focus: Preparation for the next job
• Risk for payback: Moderate
• Time for payback: Medium
• Costs per employee: Moderate
38
Human Resources Development Issues

Development
• Focus: Cultural change and continuous learning
• Risk for payback: High
• Time for payback: Long
• Costs per employee: High
39
Setting the Investment Level
3. Invest with the Rest!
40
Motivating Forces
• Desire to have best practices
• Benchmarking is acceptable
• Benchmarking is used in all parts of the organization
• Benchmarking can be low cost
• Benchmarking is low risk
Approaches
1. Locate existing reports
2. Participate in existing projects
3. Create a custom project
4. Search the literature
41
Human Capital Investment Benchmarks
1. Human Resource Expenses (HR department costs/budget)
2. Total Investment in Human Capital (total HR expenses plus all salaries and benefits of non-HR staff)
3. HR Expenses by Function
4. HR Expenses by Process/Programming
5. Selected HR Costs
42
Phases of the Benchmarking Process

1. Determining What to Benchmark
2. Building the Benchmarking Team
3. Identifying Benchmark Partners
4. Collecting Benchmarking Data
5. Analyzing the Data
6. Distributing Information to Benchmarking Partners
7. Initiating Improvement from Benchmarking
43
Setting the Investment Level
4. Invest Until It Hurts!
44
Motivating Forces
• Fad chasing
• Happy-employee dilemma
• Quick fixes
• Retention concerns
• Competitive strategy
• Union demands
• We can afford it!
Approaches
1. Pay above-market wages
2. Provide above-market employee benefits
3. Implement most new fads/ programs
4. Provide all types of employee services
45
The Relationship Between Over-Investing and Performance

A curve plots performance against investment in human capital, with an under-investing range, an optimal level, and an over-investing range.
46
Setting the Investment Level
5. Invest as Long as There Is a Payoff!
47
Motivating Forces
• Need to show HR contribution
• Increasing cost of human capital
• Secure funding
• Business partner
• Improve processes
Approaches
1. Measure success of each HR program
2. Collect up to six types of data
3. Use ROI routinely
4. Involve stakeholders
5. Use the data
48
The ROI Methodology
Evaluation Planning:
• Develop objectives of solution(s)
• Develop evaluation plans and baseline data

Data Collection:
• Collect data during solution implementation
  - Level 1: Reaction and Planned Actions
  - Level 2: Learning and Confidence
• Collect data after implementation
  - Level 3: Application and Implementation
  - Level 4: Business Impact
49
Data Analysis:
• Isolate the effects
• Convert data to monetary value
• Tabulate costs of solution
• Calculate the return on investment (Level 5: ROI)
• Identify intangible measures (intangible benefits)

Reporting:
• Generate impact study
50
Methodical Development
Training and Learning
Organization Development
HR Programs
Change Initiatives
Technology Implementation
Quality / Six Sigma
Meetings and Events
Coaching
51
Valuing Human Capital: Three Approaches
1. What we know from Logic and Intuition
2. What we know from Macro Level Research
3. What we know from ROI Analysis
52
1. Logic and Intuition
• Automation has limitations
• People are necessary
• Stock market mystery
• Accounting dilemma
• Last source of competitive advantage
• Superstar Phenomena
53
Superstar Characteristics
• People are the difference
• Good and great
• Great places to work
• Most admired companies
54
2. Macro Level Research
• HR Effectiveness Index
• Gallup Studies
• The Service Profit Chain
• Watson-Wyatt Studies
• Deloitte & Touche Studies
. . . . and many others
55
3. ROI Analysis
• Micro Analysis Tool
• 5,000 studies per year
• Over 40 Countries / 25 Languages
• Variety of Applications
• ROI Certification
• ROI Networks
• ROI Standards
• ROI Best Practices
56
Valuing Human Capital: The Complete Picture
• Micro Analysis (ROI Studies)
• Macro Analysis (Relationships)
• Logic & Intuition (Intangibles)
57
Reliance Insurance Company
Matching Evaluation Levels with Objectives
• Level 1: Reaction
• Level 2: Learning
• Level 3: Application
• Level 4: Business Impact
• Level 5: Return on Investment
58
Measurement in Learning and HR
Level 0: Inputs/Indicators
• Measures the number of programs, participants, audience, costs, and efficiencies
• Current coverage: 100%; 5-year goal: 100%
• Status: This is being accomplished now

Level 1: Reaction and Planned Action
• Measures reaction to, and satisfaction with, the experience, contents, and value of the program
• Current coverage: 100%; 5-year goal: 100%
• Status: Need more focus on content and perceived value
59
Measurement in Learning and HR
60
Level 2: Learning
• Measures what participants learned in the program: information, knowledge, skills, and contacts (take-aways from the program)
• Current coverage: 30-40%; 5-year goal: 80-90%
• Status: Must use simple learning measures

Level 3: Application
• Measures progress after the program: the use of information, knowledge, skills, and contacts
• Current coverage: 10%; 5-year goal: 30%
• Status: Need more follow-up
Measurement in Learning and HR
Level 4: Business Impact
• Measures changes in business impact variables such as output, quality, time, and costs linked to the program
• Current coverage: 5%; 5-year goal: 10%
• Status: This is the connection to business impact

Level 5: ROI
• Compares the monetary benefits of the business impact measures to the costs of the program
• Current coverage: 1%; 5-year goal: 5%
• Status: The ultimate level of evaluation
61
The Results
• Reacted very positively to the program and found it to be very relevant to their work;
• Learned new skills and gained new insights about themselves;
• Utilized the skills and insights routinely with their teams, although they had some difficulty in a few areas;
• Improved several important work unit measures, with some measures improving as much as 28%;
• Achieved an impressive 105% return on investment; and
• Reported an increase in job satisfaction in the work unit.
62
63
Key Issues with This Level of Analysis
• Objectives?
• Credibility of data?
• Source of data?
• Consistent methodology?
• Scope?
• Standards?
• Use of data?
• Cost of process?
• Fear of data?
64
The ROI Process
…Generates six types of data:
1. Reaction to a project or program
2. Learning skills/knowledge
3. Application/Implementation progress
4. Business impact related to the project or program
5. Return on Investment
6. Intangible Benefits
…..and includes a technique to isolate the effects of the program
65
ROI by the Numbers
• Process refined over a 25-year period
• 5,000 impact studies conducted each year
• 100 case studies published on ROI
• 3,000 individuals certified to implement the ROI Methodology
• 15 ROI books developed to support the process
• 600 member professional network formed to share information
• ROI methodology adopted by over 2,000 organizations in manufacturing, service, non-profit, and government settings in over 40 countries
66
ROI Dilemma

70-80% of organizations want to use ROI, but only 15-20% are currently using it: ROI sits high on the wish list and low on the use list. Why the gap?
67
Why Use Impact and ROI Analysis?
Reactive
• Show contributions of selected programs
• Justify/defend budgets
• Identify inefficient programs that need to be redesigned or eliminated
68
Why Use Impact and ROI Analysis?
Proactive
• Align programs with business needs
• Earn respect of senior management/administrators
• Improve support for programs
• Enhance design and implementation processes
• Identify successful programs that can be implemented in other areas
69
Applications
• Learning and Development
• Career Development
• Competency Systems
• Diversity Programs
• E-Learning
• Executive Coaching
• Gainsharing
• Meetings and Events
• Leadership Development
• Organization Development
• Orientation Systems
• Recruiting Strategies
• Safety & Health Programs
• Self-Directed Teams
• Skill-Based/Knowledge-Based Compensation
• Technology Implementation
• Quality Programs
• Wellness/Fitness Initiatives
70
Basic Elements
• An Evaluation Framework
• A Process Model
• Operating Standards and Philosophy
• Case Applications and Practice
• Implementation
71
Evaluation Framework
Level: Measurement Focus
1. Reaction & Planned Action: Measures participant satisfaction and captures planned actions, if appropriate
2. Learning & Confidence: Measures changes in knowledge, skills, and attitudes related to the program
3. Application & Implementation: Measures changes in on-the-job behavior or actions
4. Business Impact: Measures changes in business impact variables
5. Return on Investment: Compares project benefits to the costs
72
Defining the Return on Investment

Benefit-Cost Ratio (BCR) = Monetary Benefits / Program Costs

ROI (%) = (Net Monetary Benefits / Program Costs) x 100
73
ROI Example

Costs for project: $80,000
Benefits from project: $240,000

BCR = $240,000 / $80,000 = 3.0

ROI = ($240,000 - $80,000) / $80,000 x 100 = ($160,000 / $80,000) x 100 = 200%
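The two formulas, applied to this example, can be checked with a short Python sketch (the helper names `bcr` and `roi_percent` are illustrative, not from the ROI Methodology materials):

```python
def bcr(monetary_benefits, program_costs):
    """Benefit-cost ratio: total monetary benefits over program costs."""
    return monetary_benefits / program_costs

def roi_percent(monetary_benefits, program_costs):
    """ROI: net monetary benefits over program costs, as a percentage."""
    return (monetary_benefits - program_costs) / program_costs * 100

# Figures from the example: $80,000 in costs, $240,000 in benefits.
print(bcr(240_000, 80_000))          # 3.0
print(roi_percent(240_000, 80_000))  # 200.0
```

Note the relationship: ROI (%) is simply (BCR - 1) x 100, which is why a BCR of 3.0 corresponds to a 200% ROI.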
74
ROI Target Options
1. Set the value at the same level as other investments, e.g. 15%
2. Set slightly above other investments, e.g. 25%
3. Set at break even - 0%
4. Set at client expectations
Private sector organizations usually go with option #2; public sector organizations usually prefer option #3.
Characteristics of Evaluation Levels
Chain of Impact: Satisfaction, Learning, Application, Impact, ROI

Moving along the chain from Satisfaction to ROI, the value of the information rises from lowest to highest, the customer focus shifts from consumer to client, the frequency of use falls from frequent to infrequent, and the difficulty of assessment grows from easy to difficult.

Customers
• Consumers: the customers who are actively involved in the process.
• Client: the customers who fund, support, and approve the project.
Evaluation Framework and Key Questions
Levels of Evaluation Key Questions Answered
Level 1: Reaction and Planned Action
Was the program relevant to participants’ jobs and mission?
Was the program important to participants’ job/mission success?
Did the program provide new information?
Do participants intend to use what they learned?
Would participants recommend it to others?
Is there room for improvement with facilitation, materials, and the learning environment?
76
Evaluation Framework and Key Questions
Levels of Evaluation Key Questions Answered
Level 2: Learning and Confidence
Do participants know what they are supposed to do with what they learned?
Do participants know how to apply what they learned?
Are participants confident to apply what they learned?
Did participants gain new knowledge, change their attitude, increase awareness?
77
Evaluation Framework and Key Questions
Levels of Evaluation Key Questions Answered
Level 3: Application and Implementation
How effectively are participants applying what they learned?
How frequently are they applying what they learned?
If they are applying what they learned, what is supporting them?
If they are not applying what they learned, why not?
78
Evaluation Framework and Key Questions
Levels of Evaluation Key Questions Answered
Level 4: Business Impact
So what? To what extent does participant application of what they learned improve the measures the program was intended to improve?
How did the program impact output, quality, cost, time, customer satisfaction, employee satisfaction, work habits?
What were the consequences of participants' application of knowledge and skills acquired during the program, process, intervention, change?
How do we know it was the program that improved these measures?
79
Evaluation Framework and Key Questions
Levels of Evaluation Key Questions Answered
Level 5: ROI
Do the monetary benefits of the improvement in business impact measures outweigh the cost of the program?
80
81
Chain of Impact
1. Reaction & Planned Action
2. Learning & Confidence
3. Application & Implementation
4. Impact (isolate the effects of the program)
5. ROI
Intangible Benefits
Needs Assessment → Program Objectives → Evaluation
5. Potential Payoffs → ROI Objectives → ROI
4. Business Needs → Impact Objectives → Business Impact
3. Job Performance Needs → Application Objectives → Application
2. Skills/Knowledge Needs → Learning Objectives → Learning
1. Preferences → Satisfaction Objectives → Reaction
83
Matching Evaluation Levels with Objectives
1. Reaction
2. Learning
3. Application
4. Impact
5. Return on Investment
84
The ROI Methodology
Evaluation Planning:
• Develop objectives of solution(s)
• Develop evaluation plans and baseline data

Data Collection:
• Collect data during solution implementation
  - Level 1: Reaction and Planned Action
  - Level 2: Learning and Confidence
• Collect data after implementation
  - Level 3: Application and Implementation
  - Level 4: Business Impact
85
Data Analysis:
• Isolate the effects
• Convert data to monetary value
• Tabulate costs of solution
• Calculate the return on investment (Level 5: ROI)
• Identify intangible measures (intangible benefits)

Reporting:
• Generate impact study
86
Evaluation Planning
Develop/Finalize Objectives
• Reaction
• Learning
• Application
• Impact
• ROI
87
Evaluation Planning
Data Collection Plan
• Broad Program Objectives
• Measures
• Data Collection Method/Instruments
• Data Sources
• Timing
• Responsibilities
88
Evaluation Planning
ROI Analysis Plan
• Data Items (usually Level 4)
• Methods for Isolating the Effects of the Program/Process
• Methods of Converting Data to Monetary Values
• Cost Categories
• Intangible Benefits
• Communication Targets for Final Report
• Other Influences/Issues during Application
• Comments
89
Evaluation Planning
Project Plan
• Major Milestones
• Deliverables
• Timelines
• Flow
90
Data Collection During Program
Methods (Levels 1 and 2):
• Surveys
• Questionnaires
• Observation
• Interviews
• Focus Groups
• Tests/Quizzes
• Demonstrations
• Simulations
91
Data Collection Post Program
Methods (Levels 3 and 4):
• Surveys
• Questionnaires
• Observations on the job
• Interviews
• Focus Groups
• Action planning/improvement plans
• Performance contracting
• Performance monitoring
92
Isolating the Effects of the Program
• Use of control groups
• Trend line analysis
• Forecasting methods
• Participant's estimate
• Management's estimate of impact (percent)
• Use of experts/previous studies
• Calculate/estimate the impact of other factors
• Customer input
93
Isolating the Effects of the Program

Method (listed by credibility): Best Practice Use (percentages exceed 100%)
1. Comparison Group Analysis: 35%
2. Trend/Forecasting Analysis: 20%
3. Expert Estimation: 50%
4. Other: 20%
94
Example - Use of Control Groups
• Customer Service Compensation
• Six sites chosen for program evaluation
• Each site had a control group and an experimental group randomly selected
• Experimental group received new plan - control group did not
• Observed performance for both groups at the same time
95
Use of Trend Line Analysis

A chart of shipment productivity (percent of schedule shipped, by month, January through the following January) shows a pre-program average of 87.3%, a trend projection averaging 92.3%, and an actual average of 94.4% after team implementation.
96
Example of a Participant's Estimation

Factor that Influenced Improvement: Percent of Improvement Caused By x Confidence Expressed as a Percent = Adjusted Percent of Improvement Caused By
• HR Project: 60% x 80% = 48%
• System Changes: 15% x 70% = 10.5%
• Market Changes: 5% x 60% = 3%
• Process Changes: 20% x 80% = 16%
• Other: ____% x ____% = ____%
• Total: 100%
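The adjustment in the table is a simple product: each factor's claimed share of the improvement is discounted by the participant's confidence in that claim. A minimal Python sketch using the percentages from the example (variable names are illustrative):

```python
# (claimed share %, confidence %) per factor, from the example table
factors = {
    "HR Project":      (60, 80),
    "System Changes":  (15, 70),
    "Market Changes":  (5, 60),
    "Process Changes": (20, 80),
}

# Adjusted share = claimed share x confidence (a conservative discount)
adjusted = {name: share * conf / 100 for name, (share, conf) in factors.items()}
print(adjusted)
# {'HR Project': 48.0, 'System Changes': 10.5, 'Market Changes': 3.0, 'Process Changes': 16.0}
```

Only the adjusted 48% for the HR project would be credited to the program, keeping the claim conservative.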
97
Converting Data to Money
• Profit/savings from output (standard value)
• Cost of quality (standard value)
• Employee time as compensation (standard value)
• Historical costs/savings from records
• Expert input
• External studies
• Linking with other measures
• Participant estimation
• Management estimation
• Estimation from staff
98
Converting Data to Money
Method: Credibility / Resources Needed
• Standard values: High / Low
• Records/reports analysis: High / High
• Databases: Moderate / Moderate
• Expert estimation: Low / Low
99
Example of Converting Data Using External Database
Cost of one turnover*

Middle manager annual salary: $70,000
Cost of turnover: 150% of salary
Total cost of turnover: $105,000

* External data: value obtained from an industry-related study
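The conversion is just salary times the turnover-cost multiple from the external study; a quick sketch:

```python
annual_salary = 70_000        # middle manager, from the example
turnover_cost_rate = 1.50     # 150% of annual salary (external study value)

total_cost_of_turnover = annual_salary * turnover_cost_rate
print(total_cost_of_turnover)  # 105000.0
```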
100
Cost of a Sexual Harassment Complaint
35 complaints cost $852,000 annually:
• Actual costs from records: legal fees, settlements, losses, material, direct expenses
• Additional estimated costs from staff: EEO/AA staff time, management time

Cost per complaint = $852,000 / 35 = $24,343
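The per-complaint figure is the annual total divided by the number of complaints; a quick check in Python:

```python
total_annual_cost = 852_000   # costs from records plus staff-estimated costs
complaints = 35

cost_per_complaint = total_annual_cost / complaints
print(round(cost_per_complaint))  # 24343
```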
101
Example of Linkage with Other Measures

The model links "a compelling place to work" (attitude about the job, attitude about the company, employee behavior, employee retention) to "a compelling place to shop" (service, helpfulness, merchandise, value, customer impression, customer retention, customer recommendations) to "a compelling place to invest" (return on assets, operating margin, revenue growth). In the example, a 5-unit increase in employee attitude drives a 1.3-unit increase in customer impression, which in turn drives a 0.5 increase in revenue growth.
102
Tabulating Program Costs
Direct:
• Program Materials
• Facilitator Costs
• Facilities
• Travel

Indirect:
• Needs Assessment
• Program Development
• Participant Time
• Administrative Overhead
• Evaluation
103
Intangible Benefits
Complaints
Conflicts
Stress
Job Satisfaction
Commitment
Teamwork
Customer Service
Engagement
104
ROI Process Flexibility
• Look forward: pre-program ROI forecast
• Examine accomplishments: end-of-program ROI estimation, using application data (Level 3) or impact data (Level 4)
105
Do not Confuse the CFO
• ROI: Return on Investment (not Information, Intelligence, Inspiration, or Involvement)
• ROE: Return on Equity (not Expectation)
• ROA: Return on Assets (not Anticipation)
• ROCE: Return on Capital Employed (not Client Expectation)
Common Target Audiences

Reason for Communication: Primary Target Audience
• Secure approval for program: Client, top executives
• Gain support for the program: Immediate managers, team leaders
• Build credibility for the training staff: Top executives
• Enhance reinforcement of the program: Immediate managers
• Enhance results of future programs: Participants
• Show complete results of the program: Key client team
• Stimulate interest in HR programs: Top executives
• Demonstrate accountability for client expenditures: All employees
• Market future HR programs: Prospective clients
107
Select Media
• Impact Studies: full report, executive summary, general overview, one-page summary
• Meetings: executive meetings, manager meetings, staff meetings, panel discussions, best practice meetings
• Internal Publications: announcements, bulletins, newsletters, magazines
• Progress Reports: schedules, preliminary results, memos
• Case Studies
• Program Brochures
• Scoreboards
• Electronic Media: e-mail, web sites, video, blogs
108
Implementation Issues
• Resources (staffing / budget)
• Leadership (individual, group, cross functional team)
• Timing (urgency, activities)
• Communication (various audiences)
• Commitment (staff, managers, top executives)
109
Key Implementation Actions
• Determine/establish responsibilities
• Develop skills/knowledge with ROI
• Develop transition/implementation plan
• Conduct ROI studies
• Prepare/revise evaluation policy/procedures/guidelines
• Train/brief managers on the ROI process
• Communicate progress/results
110
Retail Merchandise Company
111
Utility Services Company
112
Utility Services Company
Business Impact

Measure: Monthly Improvement in Six Months (A) x Percent Contribution from Team Building (B) x Average Confidence Estimate (C) = Adjusted Improvement in Six Months
• Productivity: 23% x 57% x 86% = 11.3%
• Quality: 18% x 38% x 74% = 5%
• Efficiencies: 14.5% x 64% x 91% = 8.4%
113
Program costs for 18 participants = $54,300

Annualized, first-year benefits:
• Productivity: $197,000
• Quality: $121,500
• Efficiency: $90,000
• Total: $408,500

ROI = ($408,500 - $54,300) / $54,300 x 100 = 652%
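The adjusted improvements and the final ROI figure can be reproduced from the numbers on the slides (a sketch; note the Quality product works out to about 5.1%, which the slide rounds to 5%):

```python
# (A monthly improvement, B contribution from team building, C confidence)
measures = {
    "Productivity": (0.23, 0.57, 0.86),
    "Quality":      (0.18, 0.38, 0.74),
    "Efficiencies": (0.145, 0.64, 0.91),
}
for name, (a, b, c) in measures.items():
    adjusted = a * b * c * 100       # adjusted improvement, in percent
    print(name, round(adjusted, 1))  # 11.3, 5.1, 8.4

benefits, costs = 408_500, 54_300    # annualized first-year benefits, program costs
roi = (benefits - costs) / costs * 100
print(round(roi))                    # 652
```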
Utility Services Company
Matching Exercise:
The Twelve Guiding Principles of ROI
114
115
Regional Public Utility
116
Level 3 and 4 Objectives Provide:
• Direction to designers and developers
• Guidance to instructors and facilitators
• Goals for participants
• Satisfaction for program sponsors
• A framework for evaluators
117
Needs Assessment → Program Objectives → Evaluation
5. Potential Payoffs → ROI Objectives → ROI
4. Business Needs → Impact Objectives → Business Impact
3. Job Performance Needs → Application Objectives → Application
2. Skills/Knowledge Needs → Learning Objectives → Learning
1. Preferences → Satisfaction Objectives → Reaction
Linking Needs Assessment with Evaluation
Needs Assessment → Program Objectives → Evaluation
4. An absenteeism problem exists → Weekly absenteeism rate will reduce → Monitor absenteeism data for six months
3. Discussions between team leader/supervisor are not occurring when there is an absence → Counseling discussions conducted in 95% of situations when an unexpected absence occurs → Follow-up questionnaire to participants to check frequency of discussions (three months)
2. Deficiency in counseling/discussion skills → Counseling discussion skills will be acquired/enhanced → Skill practice sessions during the program
1. Supervisor prefers to attend training one day per week → Program receives a favorable rating of 4 out of 5 on the structure of the program → Reaction questionnaire at the end of the program
118
Business Alignment and Forecasting: The ROI Process Model
V Model

Start with the initial analysis (needs, top to bottom) and end with measurement and evaluation (bottom to top), with objectives linking the two sides:
5. Payoff Needs → ROI Objectives → ROI
4. Business Needs → Impact Objectives → Impact
3. Performance Needs → Application Objectives → Application
2. Learning Needs → Learning Objectives → Learning
1. Preference Needs → Reaction Objectives → Reaction
119
Program Alignment V-Model

Initial analysis (needs) → objectives → measurement and evaluation:
5. Payoff Needs: Absenteeism is costing $10,000 monthly → ROI Objectives → ROI
4. Business Needs: Unexpected absenteeism is 9% and increasing, greater than the 5% benchmark → Impact Objectives → Impact
3. Job Performance Needs: Discussions between team member and supervisor are not occurring when there is an unplanned absence → Application Objectives → Application
2. Learning Needs: Deficiency in counseling/discussion skills → Learning Objectives → Learning
1. Preference Needs: One-day counseling skills workshop must provide usable, necessary, and relevant skills; facilitator-led; participants are supervisors → Reaction Objectives → Reaction
120
121
122
NEEDS
Start Here
Payoff Needs Sales growth is sluggish, less than competitors
Specific Business Needs A need to improve growth in all product lines. Current growth is only 2%; 10% should be achievable. Also, new accounts’ acquisition is very low, averaging only slightly over 1 per sales rep per month. Five accounts should be easily achievable.
Performance Needs: The sales team appears to be not very aggressive; in particular:
• Providing inadequate follow-up
• Not focusing on cross-selling possibilities
• Not building the proper relationships
• Not prospecting for new clients
• Not negotiating to close
Learning Needs Participants will need to enhance competencies on strategic selling, closing deals, assembling a broader perspective, seeking new clients, providing excellent customer service
Preference Needs Participants need to see this as valuable and necessary at this time, and immediately applicable in their work situations.
OBJECTIVES
ROI Objectives Attain an ROI of over 25%
Impact Objectives Increase sales growth rate of total sales from 2% to 10%
Increase the number of new accounts from 1 per person to 5 per person
Application Objectives: After completing this program, participants should be:
• Providing adequate follow-up
• Focusing on cross-selling possibilities
• Building the proper relationships
• Prospecting for new clients
• Negotiating to close
Learning Objectives After participating in this program, participants should be
able to use each competency at an acceptable level of performance
Reaction Objectives: Participants will perceive this program to be:
• Valuable
• Necessary
• Immediately applicable
EVALUATION
End Here
ROI: Calculating ROI by examining benefits versus costs
Impact: Monitoring the records of the system
Application: Self-assessment questionnaire on the use of new skills, with confirmation from the sales manager
Learning: Skill practice observations and self-assessments
Reaction: Feedback questionnaire at the end of the program
SELLING SKILLS V-Model
(Business Alignment and Forecasting: The ROI Process Model, Levels 1 through 5: Initial Analysis, Project, Measurement and Evaluation)
123
NEEDS
Start Here
Payoff Needs It is taking too long to complete the projects, creating delays, bottlenecks and problems.
Specific Business Needs The time to complete projects has been excessive, averaging ten days and increasing. The penalties for delays have increased to an average of $10,000 per project.
Performance Needs Project team members are not using the designated project management software tool (Microsoft Project).
Learning Needs Project team members have inadequate skills on the use of Microsoft Project software
Preference Needs Project team members must see this program as necessary, relevant to their needs, and important to the success of their projects.
OBJECTIVES
ROI Objectives 20%
Impact Objectives Reduce the time to complete projects by at least 50% in six
months. Reduce the penalties for delayed projects by at least 50% in six months.
Application Objectives Project team members will be using Microsoft Project on
every project within two weeks of the program.
Learning Objectives
Participants will demonstrate use of 75% of the critical routines in a classroom setting.
Reaction Objectives: Participants would rate this program as:
• Four out of five in relevance to current needs
• Necessary for success
• Important to the success of the project
EVALUATION
End Here
ROI: ROI is calculated using benefits versus costs.
Impact: Project completion-time records and project penalties are monitored.
Application Self-assessment questionnaire and input
from centralized software system
Learning Software demonstration observed by the
facilitator
Reaction Feedback at the end of the session using a
standard questionnaire
PROJECT MANAGEMENT V-Model
(Business Alignment and Forecasting: The ROI Process Model, Levels 1 through 5: Initial Analysis, Project, Measurement and Evaluation)
124
NEEDS
Start Here
Payoff Needs Recruiting is taking too long, and the quality of candidates is less than desired
Specific Business Needs: The time to start (the time from requisition approval to on-the-job) is 50 days and growing. The time to performance of employees in their first nine months is at a rating of 3.2 on a five-point scale. A four should easily be obtained.
Performance Needs Current recruiting sources are sluggish and are not producing quality candidates. The current selection system is cumbersome and takes too much time, particularly with executive interviews.
Learning Needs With a new system in place, the stakeholders must understand how to use a new selection system and a new recruiting source.
Preference Needs Participants must see this as a necessary change that will be very useful and valuable.
OBJECTIVES
ROI Objectives 25%
Impact Objectives Reduce the time to start from 50 days to 30 days, and
quality rating would improve from 3.2 to 4 in six months.
Application Objectives:
• Implement a new selection system to eliminate executive interviews
• Drop the two inefficient recruiting sources and focus on website recruiting, which is much more efficient
Learning Objectives
Participants must be aware of the new system and the new procedure on the sourcing.
Reaction Objectives Participants must see this as useful, necessary
and valuable.
EVALUATION
End Here
ROI ROI would be calculated using cost
versus benefit.
Impact
Records will be examined for time-to-start data and quality data.
Application A self-assessment questionnaire for
participants on their use of the new system and source
Learning A quick check of roles and responsibilities
Reaction
Feedback questionnaire at the end of the meeting to launch the new process.
RECRUITING STRATEGIES V-Model
(Business Alignment and Forecasting: The ROI Process Model, Levels 1 through 5: Initial Analysis, Project, Measurement and Evaluation)
125
Nissan Motor Manufacturing Company
126
Wachovia Bank
127
Metro Hospital
128
Regional Health Center
129
Department of Internal Affairs
130
International Car Rental
131
Performance Assessment and Analysis Process
Each level includes:
• Data sources
• Data collection
• Key questions
• Key issues

Results-Based Approach (flow):
1. Problem/opportunity present or anticipated (Level 4)
2. Identify business needs, gaps, and stakeholders
3. Identify job performance needs, gaps, and why (Level 3)
4. Training required? If so, specify skill/knowledge deficiencies of the affected population (Level 2)
5. Identify solutions
6. Identify preferences (Level 1)
7. Develop objectives/evaluation strategy
8. Identify transfer strategy options and Level 2 and Level 3 support for all stakeholders
9. Design solution and stakeholder components
10. Consider resources/logistics/delivery
11. Develop content/materials
12. Implement pre-activity
13. Conduct/implement solution
14. Implement transfer strategy (solution and transfer strategy)
132
Phillips ROI Methodology™

1. Collect data after solution implementation (Levels 3 and 4)
2. Isolate the effects of the solution
3. Convert data to monetary value
4. Tabulate costs of solution
5. Calculate the return on investment (Level 5)
6. Identify intangibles (intangible benefits)
7. Develop report and communicate results

Significant Influences:
• Policy statement
• Procedures and guidelines
• Staff skills
• Management support
• Technical support
• Organizational culture
133
Key Alignment Questions
Level 5 alignment: Potential Payoffs → ROI Objectives → ROI
(Needs Assessment → Program Objectives → Evaluation)

• Is this a problem worth solving?
• Is there a potential payoff?
• What is the actual ROI?
• What is the BCR?
134
Key Alignment Questions
Level 4 alignment: Business Needs → Impact Objectives → Business Impact
(Needs Assessment → Program Objectives → Evaluation)

• What is the specific measure?
• What happens if we do nothing?
• Which business measure improved?
• How much is related to the program?
135
Key Alignment Questions
Level 3 alignment: Job Performance Needs → Application Objectives → Application
(Needs Assessment → Program Objectives → Evaluation)

• What is occurring or not occurring on the job that influences the business measure?
• What has changed?
• Which skills/knowledge have been applied?
136
Key Alignment Questions
Level 2 alignment: Skills/Knowledge Needs → Learning Objectives → Learning
(Needs Assessment → Program Objectives → Evaluation)

• What skills or knowledge are needed to support the job performance need?
• What did they learn?
• Who did they meet?
137
Key Alignment Questions
Level 1 alignment: Preference Needs → Satisfaction Objectives → Reaction
(Needs Assessment → Program Objectives → Evaluation)

• How should the solution be structured?
• What was the reaction to the program?
• Do we intend to implement the program?
Developing Reaction Objectives
138
Objective: At the end of the course, participants will perceive program content as relevant to their jobs.
Measure: 80% of participants rate program relevance 4.5 out of 5 on a Likert scale.
Developing Learning Objectives
139
Objective: At the end of the course, participants will be able to use Microsoft Word.
Measure: Within a 10-minute period, the participant will demonstrate the following Microsoft Word operations to the facilitator with zero errors:
• File: Save, Save As, and Save as Web Page
• Format: font, paragraph, background, and themes
• Insert: tables, including adding and deleting columns and rows
Developing Application Objectives
140
Objective: Participants will use effective meeting behaviors.
Measures:
• Participants will develop a detailed agenda outlining the specific topics to be covered for 100% of meetings.
• Participants will establish meeting ground rules at the beginning of 100% of meetings.
• Participants will follow up on meeting action items within three days following 100% of meetings.
Developing Impact Objectives
141
Objective: Increase market share.
Measure:
• Increase market share among young professionals by 10% within nine months of the new ad launch.

Objective: Improve the quality of the X-1350.
Measures:
• Reduce the number of warranty claims on the X-1350 by 10% within six months after the program.
• Improve overall customer satisfaction with the quality of the X-1350 by 10%, as indicated by a customer satisfaction survey taken six months after the program.
• Achieve top scores on product quality measures included in the industry quality survey.
142
Developing Level 3 and 4 Objectives
143
Evaluation Targets(Large Telecommunications Company)
[Bar chart: percentage of programs evaluated at each level, on a 0% to 100% scale, for Level 1 (Reaction), Level 2 (Learning), Level 3 (Application), Level 4 (Impact), and Level 5 (ROI)]
144
Criteria For Selecting Programs For Level 3 Evaluation
• Significant gaps in performance suspected
• Safety and health of employees at risk
• Learning transfer significantly important to customer service / satisfaction goals
• Learning transfer significantly important to success of company strategic initiatives
• Pilot program delivered
145
Criteria for Selecting Programs for Level 4 & 5 Evaluation
• The life cycle of the program
• The linkage of the program to operational goals and issues
• The importance of the program to strategic objectives
• The cost of the program
• Visibility of the program
• The size of the target audience
• The investment of time
• A comprehensive needs assessment is conducted
• Top executives are interested in the evaluation
146
Evaluation Planning MeetingWho Should Be Involved?
• Program owner
• Program designer
• Program analyst
• Program facilitator
• Business unit partner
• Subject matter expert
• Typical participant
147
Evaluation Planning MeetingFactors For Success
• Credible sources
• Access to data
• Complete coverage
• Move quickly
• Consider outputs to be drafts
• Sponsor sign-off
148
Evaluation Planning MeetingAgenda
• Explain purpose
• Finalize/adjust objectives
• Complete data collection plan
• Complete ROI analysis plan step by step
• Compile ROI project plan step by step
149
Global Financial Services, Inc.
150
Data Analysis and Results
• Rating of 4.23 out of 5 achieved on the relevance of ACT! for specific job applications.
• 92.5% of participants indicated an intention to use ACT! within two weeks of the workshop.
Results at Level 1
151
Data Analysis and Results
• 83% of the participants scored 75 or better on the ACT! use test.
• Participants successfully demonstrated an average of 4.2 out of 5 key features of ACT!
Results at Level 2
which are . . .
152
Key Features of Act!
1. Enter a new contact
2. Create a mail-merge document
3. Create a query
4. Send an e-mail
5. Create a call report
153
Data Analysis and Results
Participants indicated that within 10 days, 92% of new customer prospects are entered into the system.
Participants report an increase in the number of planned follow-up contacts with customers.
Unscheduled audit of daily use resulted in a score of 76% out of a possible 100%.
Results at Level 3
Results at Level 4
Impact Measure | Average Monthly Change | Contribution of ACT! | Annual Value
Customer Complaints | 24.2 | 43% | $575,660
Customer Response | 18 minutes per customer | 72% | N/A
Sales to Existing Customers | $321,000 | 14% | $539,280
Customer Satisfaction | 26% | 49% | N/A
Total: $1,114,940
155
Project Costs
Development costs: $10,500
Materials/software: $18,850
Equipment: $6,000
Instructor (including expenses): $7,200
Facilities/food/refreshments (60 @ $58): $3,480
Participants' time, lost opportunity (58 @ $385): $22,330
Coordination/evaluation: $15,600
Total: $83,960
156
ROI Calculation
ROI (%) = ($1,114,940 − $83,960) / $83,960 × 100 = 1,228%
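The calculation above, and the related benefit-cost ratio, can be sketched in a few lines of Python. The figures come from the slides; the function names are illustrative:

```python
# Illustrative sketch of the two standard formulas in the ROI Methodology.
def benefit_cost_ratio(benefits, costs):
    """BCR: total program benefits divided by program costs."""
    return benefits / costs

def roi_percent(benefits, costs):
    """ROI (%): net program benefits divided by costs, times 100."""
    return (benefits - costs) / costs * 100

benefits, costs = 1_114_940, 83_960  # Global Financial Services example
print(round(benefit_cost_ratio(benefits, costs), 2))  # → 13.28
print(round(roi_percent(benefits, costs)))            # → 1228
```

The difference between the two measures is the numerator: BCR uses total benefits, while ROI uses net benefits (benefits minus costs), which is why the ROI percentage is always 100 points lower than the BCR expressed as a percentage.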
157
Program Profile
Title: Interactive Selling Skills
Target Group: Sales Associates in Electronics
Vendor produced and delivered
3 days (2 days plus 1 day)
Significant use of skill practices
3 groups trained (48 participants from 3 stores)
158
ROI Analysis Profile
Isolating the Effects of Training: control group arrangement; participants' estimates (for backup)
Converting Data to Monetary Values: profit contribution of increased output
Post-Program Data Collection: performance monitoring at 3 months (Level 4); questionnaire at 3 months (Level 3); program follow-up session at 3 weeks, last session (Level 3)
159
Level 1 - Selected Data
Success with Objectives 4.3
Relevance of Material 4.4
Usefulness of Program 4.5
Exercises/Skill Practices 3.9
Overall Instructor Rating 4.1
160
Level 2 - Selected Data
All Participants Demonstrated
That They Could Use The Skills Successfully
161
Level 3 - Selected Data
(2 questions out of 20)

"I utilize the skills taught in the program"
Strongly Agree: 78% | Agree: 22% | Neither Agree nor Disagree: 0% | Disagree: 0% | Strongly Disagree: 0%

Frequency of use of skills
With each customer: 52% | Every third customer: 26% | Several times each day: 18% | At least once daily: 4% | At least once weekly: 0%
162
Level 4 Data: Average Weekly Sales

Post-training data:

Weeks After Training | Trained Groups | Control Groups
1 | $9,723 | $9,698
2 | $9,978 | $9,720
3 | $10,424 | $9,812
13 | $13,690 | $11,572
14 | $11,491 | $9,683
15 | $11,044 | $10,092
Average, weeks 13-15 | $12,075 | $10,449
163
Annualized Program Benefits
Average weekly sales per employee, trained groups: $12,075
Average weekly sales per employee, untrained groups: $10,449
Increase: $1,626
Profit contribution (2% of store sales): $32.50
Total weekly improvement (× 46): $1,495
Total annual benefits (× 48 weeks): $71,760

(46 participants were still in the job after 3 months.)
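The annualization above can be reproduced step by step (a sketch; the 2% profit contribution, the 46 remaining participants, and the 48-week year are the slide's own assumptions):

```python
# Annualized program benefits, retail merchandise example (slide figures).
increase = 12_075 - 10_449                   # weekly sales gain per employee: 1,626
profit_per_week = round(increase * 0.02, 1)  # profit contribution at 2% of sales
weekly_improvement = profit_per_week * 46    # 46 participants still in the job
annual_benefits = weekly_improvement * 48    # 48-week year
print(annual_benefits)  # → 71760.0
```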
164
Cost Summary
48 participants in 3 courses

Facilitation fees (3 courses @ $3,750): $11,250
Program materials (48 @ $35/participant): $1,680
Meals/refreshments (3 days @ $28/participant): $4,032
Facilities (9 days @ $120): $1,080
Participant salaries plus benefits (35% factor): $12,442
Coordination/evaluation: $2,500
Total costs: $32,984
165
Level 5 Data
BCR = $71,760 / $32,984 = 2.18
ROI (%) = ($71,760 − $32,984) / $32,984 × 100 = 118%
166
ROI Example:Retail Merchandise Company
• Collecting post-program data: follow-up session, questionnaire, performance monitoring
• Isolating the effects of the program: control groups, participants' estimates
• Converting data to monetary value: standard values ($71,760 in annual benefits)
• Tabulating program costs: $32,984
• Identifying intangible benefits
• Calculating the return on investment: 118%
167
The ROI Process Takes A Balanced View by Measuring And Reporting:
• Reaction to program
• Learning and attitudes
• Application on the job
• Impact in work unit
• Impact on the customer
• The financial results
• Intangible benefits
• Nature and source of problems and opportunities
168
The Business Case for EI
169
Self Test: How Results-Based Are Your Human Resources Programs?
170
Data Collection Issues
• Objectives
• Type of data
• Instruments
• Methods
• Sources of data
• Timing of collection
• Responsibilities
171
Collecting Program Data
• Surveys
• Questionnaires
• Observation
• Interviews with participants
• Focus groups
• Tests
• Action planning
• Performance contracting
• Performance monitoring

(Applicable across Levels 1 through 4)
172
Classic Evaluation Instruments
• Questionnaires
• Surveys
• Tests
• Interviews
• Focus groups
• Observation
• Performance records
173
Applications of Data Collection Instruments
A. Survey
B. Test
C. Questionnaire
D. Interview
E. Focus groups
F. Observation
G. Performance Records
Matching Exercise
Data Collection Exercise, Part 1

Program: 3-day Leadership Workshop
Audience: 50 middle-level managers

Level 3 Objectives:
• Apply the 11-step goal-setting process with each employee three months after the workshop
• Apply techniques that influence the motivational climate within three months
• Apply techniques that inspire teamwork
• Apply coaching techniques to enhance employee engagement

Level 4 Objective:
• Improve business measures important to your work unit

174
175
Survey/Questionnaire Design
Determine the specific information needed
Review information with stakeholders
Select the type(s) of questions
Keep questions and statements simple
Develop the questions
Design for easy tabulation and analysis
and . . .
176
Survey/Questionnaire Design
Check the reading level
Address the anonymity issue
Test the questions
Review results of the field test
Develop the completed questionnaire
Develop administrative procedures
177
Common Mistakes in Survey/Questionnaire Design
Vague statements/questions
Too many questions
Reading level too high
Improperly worded questions
Confusing instructions
Too difficult to analyze
178
Questionnaire Design Checklist
1. Is the overall length appropriate?
2. Is it a valid instrument?
3. Is it a reliable instrument?
4. Do the questions flow properly?
5. Are the types of questions appropriate for the information desired?
179
6. Are the questions designed to take advantage of data comparisons?
7. Is it designed to minimize distortion?
8. Are the questions designed to ease data tabulation and analysis?
9. Have administrative issues been addressed?
10. Is it easy to read?
Questionnaire Design Checklist
180
11. Are the instructions clear?
12. Have steps been taken to ensure confidentiality?
13. Have provisions been made for demographic data?
14. Is the appearance of the questionnaire adequate?
15. Is a pre-test scheduled?
Questionnaire Design Checklist
181
Selecting Survey Scales
Variance – are there enough choices?
Discrimination – can you tell the difference between choices?
Accuracy – do the scale labels accurately describe the choices?
Symmetry – is the scale balanced appropriately?
182
What Makes an Effective Survey Question?
Focus – every question should focus on a single issue or specific topic
Brevity – short questions present less opportunity for measurement error
Clarity – clear questions are understandable to all respondents
183
Types of Tests
Objective
Criterion-referenced tests

Norm-referenced tests
Performance tests
184
Types of Objective Tests
True/false
Matching items
Multiple choice items
Fill in the blank items
Short answer items
Essay items
185
Steps to Developing Objective Tests
• Focus on one set of related course objectives at a time
• Determine behavioral evidence of capability related to these objectives
• Select a format and an item type that fits the objectives
• Develop 3 to 5 items for each objective
• Sequence items in a logical order
• Prepare test instructions that are simple and easy to understand
• Pilot test
186
Interview Design
• List basic questions to be asked.
• Follow the same principles as survey/questionnaire design.
• Allow for probing.
• Try out the interview.
• Prepare the interviewers.
• Provide instructions to the individual being interviewed.
• Administer the interviews consistently.

(Structured and unstructured)
187
Focus Group Guidelines
• Select topics, questions, and strategy carefully.
• Keep the group size small.
• Ensure that there is a representative sample of the target population.
• Insist on facilitators having appropriate expertise.
• Stay on track and on time.
• Allow equal time for all participants.
• Control over-talking and under-talking.
188
Observation Guidelines
Observations should be systematic
Observers should know how to interpret and record what they see
Observer’s influence should be minimized
Observers must be carefully selected
Observers must be prepared
189
Observation Methods
Behavior Checklist
Coded Behavior Record
Delayed Report Method
Video Recording
Audio Monitoring
Computer Monitoring (software)
190
Typical Sources of Performance Data
• Operating reports
• Departmental reports
• Work unit audits
• Key performance indicators
• Six Sigma reports
• Scorecards
• Dashboards
191
Monitoring Performance Data
Identify appropriate data sources.
Collect data related to objectives only.
Develop new data as needed.
Convert current data to usable items.
Develop a collection plan to include Who, What, Where, and When.
192
Characteristics of Effective Instruments
• Valid
• Reliable
• Simple
• Economical
• Easy to administer
• Easy to analyze data
193
Factors to Consider When Selecting Data Collection Methods
• Time required for participants
• Time required for participants' supervisors
• Costs of method
• Amount of disruption of normal activities
• Accuracy
• Utility
• Culture/philosophy
194
Sources of Information for Program Evaluation
• Participants
• Supervisors of participants
• Subordinates of participants
• Peer group
• Internal staff
• External groups
• Organizational performance records
195
Factors to Consider When Determining Timing of Follow-Up
Availability of data
Ideal time for behavior change (level 3)
Ideal time for business impact (level 4)
Convenience of collection
Constraints on collection
196
Data Collection Exercise
Program: 3-Day Leadership Workshop
Audience: 50 Middle Level Managers (2 Groups)
Follow Up: Anonymous questionnaire in 3 months to collect application and impact data
Assignment:
1. What topics should be included in the questionnaire?
197
Cyber International
Sales Culture at Progress Bank
198
199
Developing ROI with Action Planning
• Communicate the action plan requirement early.
• Describe the action planning process at the beginning of the intervention.
• Teach the action planning process.
• Allow time to develop the plan.
• Have the facilitator approve the action plan.
• Require participants to assign a monetary value for each improvement.

and . . .
200
• Ask participants to isolate the effects of the program.
• Ask participants to provide a level of confidence for estimates.
• If possible, require action plans to be presented to the group.
• Explain the follow-up mechanism.
• Collect action plans at the pre-determined follow-up time.
• Summarize the data and calculate the ROI.
Developing ROI with Action Planning
201
Performance Contract Process Steps
• The participant and supervisor mutually agree on a subject for improvement.
• A specific, measurable goal (or goals) is set.
• The learner participates in the program.
• The contract is discussed, and plans are developed to accomplish the goals.

and . . .
202
Performance Contract Process Steps
• After the program, the participant works on the contract against a specific deadline.
• The participant reports the results of the effort to his or her supervisor.
• The supervisor and participant document the results for the staff.
203
The Performance Contract Should Be:
Written
Understandable (by all involved)
Challenging (requiring a concentrated effort to achieve)
Achievable
Largely under the control of the participant
Measurable and dated
204
Program: 3-Day Leadership Workshop
Audience: 50 middle-level managers (2 groups)
Follow-up: Anonymous 5-page questionnaire in 3 months to collect application and impact data

Assignment:
1. How many responses do you need?
2. How are you going to ensure you receive the appropriate number of responses?
Response Rate Exercise
205
• Progress with objectives
• Action plan implementation
• Relevance of program
• Perceived value
• Use of materials
• Knowledge/skill enhancement
• Skills used
• Changes with work
• Linkage with output measures
Follow-Up Questionnaire Checklist
206
Follow-Up Questionnaire Checklist
• Other benefits
• Barriers
• Enablers
• Management support
• Other solutions
• Recommendations for target audience
• Suggestions for improvement
• Other comments
207
Improvements/accomplishments
Improvement linked with program
Monetary impact
Confidence level
Follow-Up Questionnaire Checklist
(OPTIONAL)
208
Impact Questions for Follow-Up Evaluation
1. How did you use the material from this program?
2. What influence did it have in your work? Team?
3. What is the specific measure influenced? Define it.
4. What is the unit value of the measures? (Profit or Cost)
5. What is the basis of this value?
6. How much did the measure change since the program was conducted?
and . . .
209
Impact Questions for Follow-Up Evaluation
7. What is the frequency of the measure? Daily, weekly, monthly, etc.
8. What is the total annual value of improvement?
9. What are the other factors that could have caused this total improvement?
10.What percent of the total improvement can be attributed to this program?
11.What is your confidence estimate of the above data? 0% = No confidence; 100% = Certainty
Performance Contract Sample
210
211
Option 1, When You Don’t Have a Clue
Option 2, When the Measure is in a Defined Set
Option 3, When the Measure is Known
212
Increasing Response Rates
• Provide advance communication.
• Clearly communicate the reason for the questionnaire.
• Indicate who will see the results.
• Show how the data will be integrated.
• Keep the questionnaire simple and brief.
• Make it easy to respond.
• Use the local manager to help distribute the questionnaires and show support.
• Let the target audience know that they are part of a carefully selected sample.

and . . .
213
• Use one or two follow-up reminders.
• Have the introduction letter signed by a top executive.
• Enclose a giveaway item with the questionnaire.
• Provide an incentive for quick response.
• Send a summary of results to the target audience.
• Distribute the questionnaire to a captive audience.
• Consider an alternative distribution channel.
• Have a third party gather and analyze data.
Increasing Response Rates
and . . .
214
• Communicate the time limit.
• Consider paying for the time it takes to complete the questionnaire.
• Review the questionnaire at the end of the formal session.
• Carefully select the survey sample.
• Allow completion of the survey during work hours.
• Add emotional appeal.
Increasing Response Rates
and . . .
215
• Design the questionnaire to attract attention, with a professional format.
• Let participants know what actions will be taken with the data.
• Provide options to respond.
• Use a local coordinator to help distribute and collect questionnaires.
• Frame questions so participants can respond appropriately, and make the questions relevant.
Increasing Response Rates
216
First Bank
1. Is this situation unusual? Please explain.
2. Should the CEO drop the issue?
3. What are some approaches to resolve this dilemma?
4. What would you do?
217
Isolating the Effects of a Program
A. Control group
B. Trend line analysis
C. Forecasting
D. Participant’s estimate
E. Use of customer input
F. Expert estimates
Matching Exercise
218
Several Factors Contribute to an Improvement After a Program Is Conducted

Systems/procedures changes, external factors, management attention, incentives, and HR programs all feed into the total improvement observed after the program; the effect of HR on that improvement must be isolated.
219
Techniques to Isolate the Effects of Programs
• Use of a control group arrangement
• Trend line analysis of performance data
• Use of forecasting methods of performance data
• Participants' estimates of impact (percent)
• Supervisors' estimates of impact (percent)
• Management's estimate of impact (percent)
• Use of experts/previous studies
• Calculating/estimating the impact of other factors
• Use of customer input
220
Financial Services
1. What are the major problems with the implementation of a control group arrangement illustrated in this case?
2. How can these problems be tackled on a practical basis?
3. Will the same strategy of using control groups work at your organization? Explain.
221
Use of Control Groups
Customer service training
Six sites chosen for program evaluation
Each site had a control group and an experimental group randomly selected
Experimental group received training, control group did not
Collected customer service data for both groups at the same time
222
Control Group Design
Experimental group: M1 → Program → M2
Control group: M1 → M2
223
Post-Test Only, Control Group Design
Experimental group: Program → Measurement
Control group: Measurement
224
Ideal Experiment Design
Group A: M1 → Program → M2
Group B: M1 → M2
Group C: Program → M3
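A minimal sketch of how these designs turn measurements into an estimated program effect (the helper names are illustrative; the sample figures are the average weekly sales from the retail case earlier in the deck):

```python
# Illustrative helpers for the control group designs above.
def post_only_effect(exp_post, ctrl_post):
    """Post-test-only design: experimental measurement minus control."""
    return exp_post - ctrl_post

def pre_post_effect(exp_pre, exp_post, ctrl_pre, ctrl_post):
    """Pre/post control group design: change in the experimental group
    minus change in the control group."""
    return (exp_post - exp_pre) - (ctrl_post - ctrl_pre)

print(post_only_effect(12_075, 10_449))     # → 1626
print(pre_post_effect(100, 130, 100, 110))  # → 20  (hypothetical values)
```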
225
Control Group Problems
It is inappropriate in many settings
Selection of groups
Contamination of control group
Duration / timing
Influences are inconsistent
Too research-based for some organizations
226
[Trend chart: monthly reject rate (1% to 2% scale), January through the following January. Pre-program average: 1.85%. Projected average, using pre-program data as a base: 1.45%. Post-program six-month average: 0.7%. The CPI program was conducted mid-series.]
Micro Electronics, Inc.
227
Questions for Discussion
1. Approximately what improvement in reject rate has resulted from the program?
2. How reliable is this process?
3. When can this process be used?
228
Healthcare, Inc.

[Trend chart: formal internal complaints of sexual harassment by month, October through October two years later. The chart shows the pre-program average, a projected value continuing the pre-program trend, and the post-program average; the Sexual Harassment Prevention Program was conducted mid-series.]
229
Use of Trend Line Analysis: Shipment Productivity

[Trend chart: percent of schedule shipped by month (85% to 100% scale), January through the following January. Pre-program average: 87.3%. Average of trend projection: 92.3%. Actual post-program average: 94.4%. The Team Training Program was conducted mid-series.]
230
Conditions for Trend Line Analysis Use
• Pre-program data are available
• Data items are stable
• Pre-program influences are expected to continue
• No new influences enter the post-program period except for the program
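When those conditions hold, the computation itself is simple: fit a line to the pre-program months, project it into the post-program period, and credit the program with actual minus projected. A sketch with hypothetical monthly data (not the numbers behind the charts above):

```python
# Trend line analysis sketch: least-squares fit on pre-program data,
# projected into the post-program period.
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

pre_months = [1, 2, 3, 4, 5, 6]
pre_shipped = [86.0, 86.5, 87.0, 87.5, 88.0, 88.5]  # hypothetical % shipped
slope, intercept = fit_line(pre_months, pre_shipped)

post_months = [7, 8, 9, 10, 11, 12]
projected = [slope * m + intercept for m in post_months]
actual = [93.5, 94.0, 94.2, 94.6, 94.9, 95.2]       # hypothetical % shipped

# Average monthly gain attributable to the program: actual minus projected.
program_effect = sum(a - p for a, p in zip(actual, projected)) / len(actual)
print(round(program_effect, 2))
```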
231
Woody’s
1. What is the impact of the sales training program on sales?
2. Is this process feasible in your organization? Explain.
232
[Forecast chart: sales plotted against months 3 through 36 on a 200 to 1,800 scale, with regression line Y = 140 + 40X. After the program, actual sales reach $1,500 against a trend-line projection of $1,100; advertising accounts for a further $240 (to $1,340), leaving $160 as the impact of the training program.]
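The forecasting logic in the chart can be written out directly. The regression line and dollar figures are from the chart; month 24 is an illustrative evaluation point, chosen because the trend line yields exactly $1,100 there:

```python
# Forecasting method: project the baseline, subtract other known
# influences, and attribute the remainder to the program.
def trend(x):
    """Baseline sales forecast from the chart's regression line."""
    return 140 + 40 * x

baseline = trend(24)        # 1,100
advertising_effect = 240    # estimated separately
actual_sales = 1_500
program_effect = actual_sales - baseline - advertising_effect
print(program_effect)  # → 160
```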
233
National Bank
234
Monthly increase: 175

Contributing Factors | Average Impact on Results | Average Confidence Level
Sales Training Program | 32% | 83%
Incentive Systems | 41% | 87%
Goal Setting/Management Emphasis | 14% | 62%
Marketing | 11% | 75%
Other | 2% | 91%
235
Questions for Discussion
1. What is the number of new credit card accounts per month that can be attributed to the sales training program?
2. Is this a realistic process for estimating the impact of the program on the increased sales?
3. How could this process be improved?
236
Using Estimates to Isolate the Effects of a Program
Describe the task and the process.
Explain why the information was needed and how it will be used.
Ask participants to identify any other factors that may have contributed to the increase.
Have participants discuss the linkage between each factor and the specific output measure.
and . . .
237
Provide participants with any additional information needed
Obtain the actual estimate of the contribution of each factor. The total must be 100%.
Obtain the confidence level from each employee for the estimate for each factor (100%=certainty; 0%=no confidence). The values are averaged for each factor.
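A minimal sketch of these two steps (each factor's estimated contribution discounted by the confidence in that estimate, per the adjust-for-error guideline); the factors and numbers here are hypothetical:

```python
def attribute_improvement(total_improvement, factors):
    """Discount each factor's estimated share of the improvement by the
    estimator's confidence, and return the amount credited to each factor."""
    credited = {}
    for name, (share, confidence) in factors.items():
        credited[name] = total_improvement * share * confidence
    return credited

# Hypothetical inputs: the raw shares must sum to 100% before adjustment
factors = {
    "Program":             (0.40, 0.80),
    "Incentives":          (0.35, 0.90),
    "Management emphasis": (0.25, 0.60),
}
result = attribute_improvement(200, factors)  # e.g. 200 units of monthly gain
# Program is credited with 200 x 0.40 x 0.80 = 64 units
```

Only the confidence-adjusted amount for the program enters the ROI calculation; the rest stays with the other factors.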
Using Estimates to Isolate the Effects of a Program
238
The Power of Estimates
Research
Comparison with other methods
Handling objections
Management reactions
Participant reactions
239
Key Issues with Estimates
Use as a last resort
Use most credible source for data
Collect data in an unbiased way
Adjust for error
Report it carefully
240
Credibility of Data
• Which of these items have the most credibility? Rank them.
• Why are these items credible or not credible?
• List all the factors that influence the credibility of data.
• Why are we uncomfortable using estimates in our programs?
241
Credibility of Outcome Data is Influenced by the:
Reputation of the source of the data
Reputation of the source of the study
Motives of the researchers
Personal bias of audience
Methodology of the study
Assumptions made in the analysis
Realism of the outcome data
Type of data
Scope of analysis
242
Other Isolation Methods
Supervisors
Managers
Experts
Previous studies
Customers
243
Use of Participants’ & Managers’ Estimate of Training’s Impact
Factor                                                                          Participants   Managers
ISDN knowledge, skills, or experience graduates had before they
attended the training                                                               13%           14%
ISDN knowledge, skills, or experience graduates gained from the training            37%           36%
ISDN knowledge, skills, or experience graduates acquired on their own
after the training                                                                  16%           12%
ISDN reference material or job aids unrelated to the training, e.g.
bulletins, methods & procedure documentation                                         7%            9%
Coaching or feedback from peers                                                     18%           18%
Coaching or feedback from graduates' managers                                        2%            5%
Observation of others                                                                7%            6%
244
National Computer Company (A)
245
Questions for Discussion
1. Is this an appropriate opportunity for using a control group? Explain.
2. What factors should be considered when selecting the groups?
3. What other options should be explored?
4. When should the attempt to use control groups be abandoned?
246
National Computer Company (B)
247
[Chart: Voluntary Turnover Rate by month (J through O), 36%-42% scale, with the point of program implementation marked.]
248
Questions for Discussion
1. Can a trend line analysis be used?
2. What conditions must be met for this approach to be used?
3. How credible is this approach?
249
National Computer Company (C)
250
[Chart: Voluntary Turnover Rate (30%-38%) plotted against Unemployment Rate (4%-7%), with the fitted line Y = 50 - 3(X).]
251
Questions for Discussion
1. How can this data be used to isolate the effects of the HR program?
2. How much of a reduction in voluntary turnover is attributed to the increase in the unemployment rate?
3. What cautions and concerns should be considered?
252
National Computer Company (D)
253
Contributing Factors      Impact on Results   Average Confidence Level
HR Program                      30%                   80%
Unemployment rate               50%                  100%
Management Emphasis              5%                   70%
Competition                     15%                   90%
254
Questions for Discussion
1. Who should provide the input on this isolation estimate?
2. How should the data be collected?
3. What makes this process credible?
4. What makes this process not so credible?
255
Wisdom of Crowds
In this case, the average estimate is near perfect
Estimates are used everywhere
Set up your own experiment
Estimates should be adjusted
Estimates are okay – defend them; don’t prefer them
256
Multi National, Inc. (A)
1. Critique the way in which the data was analyzed to develop the final value. What would you have done differently?
2. Do you think that program benefits should be communicated without the cost of the program? Explain.
3. What cautions or concerns should be addressed when communicating impressive results from training programs?
257
1. What is the ROI of this program?
2. How does this value compare with the one previously reported? Which value would you use?
3. Is there a way to integrate the two studies?
4. How do you assess the credibility of this process?
Multi National, Inc. (B)
258
Examples of Hard Data
Output
Costs
Time
Quality
259
Objectively based
Easy to measure and quantify
Relatively easy to assign monetary values
Common measures of organizational performance
Very credible with management
Characteristics of Hard Data
260
Examples of Soft Data
Customer Service
Employee Development/ Advancement
Work Climate/
Satisfaction
Initiative/
Innovation
Work Habits
261
Subjectively based in many cases
Difficult to measure and quantify, directly
Difficult to assign monetary values
Less credible as a performance measure
Usually behaviorally oriented
Characteristics of Soft Data
262
Converting Data to Money
A. Profit/savings from output
B. Cost of quality
C. Employee time as compensation
D. Historical cost/savings from records
E. Expert input
F. External database
G. Linking with other measures
H. End user/performer estimation
I. Management estimation
J. Estimation from HR staff
Matching Exercise
263
1. Unit of improvement
2. Value of each unit (V)
3. Unit performance change (Δ)
4. Annual performance level change (Δ P)
5. Improvement value (V times Δ P)
Five Steps to Convert a Measure to Money
Example
Spend about 4 minutes with your team to calculate the annual monetary value of improvement in grievances.
Step 1: 1 Grievance
Step 2: V = $6,500
Step 3: Δ P = Reduction of 7 grievances per month due to the program
Step 4: A Δ P =
Step 5: A Δ P x V=
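The five steps can be expressed directly in code. This sketch uses a hypothetical customer-complaint measure rather than the grievance exercise above, so the team calculation is left to you:

```python
# Five steps to convert a measure to money (hypothetical numbers)
unit = "one customer complaint"       # Step 1: unit of improvement
value_per_unit = 1200                 # Step 2: V, cost of one complaint ($)
monthly_change = 5                    # Step 3: delta P, fewer complaints/month
annual_change = monthly_change * 12   # Step 4: annualized change
improvement_value = annual_change * value_per_unit  # Step 5: V x annual change
# 5 x 12 = 60 complaints avoided; 60 x $1,200 = $72,000 per year
```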
264
265
Converting Data
Converting output to contribution – standard value
Converting the cost of quality – standard value
Converting employees' time – standard value
Using historical costs
Using internal and external experts
Using data from external databases
Using participants' estimates
Linking with other measures
Using supervisors' and managers' estimates
Using staff estimates
266
Data Conversion Issues
Use the most credible sources
If two credible sources are available, use the most conservative option
Adjust for the time value of money
Know when to stop this process
267
Standard Values are Everywhere
Finance and Accounting
Production
Operations
Engineering
IT
Marketing and Customer Service
Procurement
Research and Development
HR
Examples of Techniques: Convert Data to Monetary Value
268
Data Conversion Techniques and Examples

Standard Values
o Output to Contribution
o Cost of Quality
o Employee Time

Examples:
Sales – Profit margin
Donations – Overhead margin
Unproductive man hours – Hourly wage
Repackaging – Standard value based on time savings (hourly wage)
OSHA fines – Fines associated with incident
Unit per person per hour – Profit of one additional product produced per person per hour at same cost

Examples of Techniques: Convert Data to Monetary Value
269
Historical Costs
Sexual harassment grievances – Litigation costs
Food spoilage – Cost to replenish food inventory
Turnover, marine engineers – Average replacement costs plus separation costs

Internal / External Experts
Electric utility rate – Internal economist
Life – Internal risk manager

External Databases
Turnover, mid-level manager – ERIC
Turnover, restaurant wait staff –

Examples of Techniques: Convert Data to Monetary Value
270
Link with Other Measures
Employee satisfaction – Linked to customer satisfaction, linked to profit
Customer complaints regarding baggage mishandling – Percent of complaints linked to percent who will not repurchase a seat on the airline, linked to lost revenue

Estimations
o Participant
o Supervisors/Managers
o Staff
Unexpected absence – Supervisor estimate (basis provided) x confidence adjustment
Unwanted network intrusions – Participant estimate (basis provided) x confidence adjustment
271
Cost of a Sexual Harassment Complaint
35 Complaints
Actual Costs from Records: Legal Fees, Settlements, Losses, Material, Direct Expenses
Additional Estimated Costs from Staff: EEO/AA Staff Time, Management Time
Total: $852,000 Annually
Cost per complaint = $852,000 / 35 = $24,343
272
Where to Find Experts
The obvious department
They send the report
It’s in the job title
The directory
Ask
273
What Makes an Expert Credible?
Experience
Neutrality
No conflict of interest
Credentials
Publications
Track record
274
Converting Data Using External Database
Cost of one turnover
Middle Manager $70,000 annual salary
Cost of turnover 150%
Total cost of turnover $105,000
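The table's arithmetic, as a short sketch (the 150% multiple is the external-database turnover-cost factor applied to annual salary):

```python
annual_salary = 70_000
cost_multiple = 1.50   # cost of one turnover as a percentage of salary
cost_of_one_turnover = annual_salary * cost_multiple  # $105,000
```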
275
Finding the Data
Search engines
Research databases
Academic databases
Industry / trade databases
Government databases
Commercial databases
Association databases
Professional databases
276
[Chart: Positive Correlation between Customer Satisfaction and Revenue.]
277
Classic Relationships
Job satisfaction vs. Turnover
Job satisfaction vs. Absenteeism
Job satisfaction vs. Customer satisfaction
Organization commitment vs. Productivity
Engagement vs. Productivity
Customer satisfaction vs. Revenue
Conflicts vs. Productivity
278
Linkage with Other Measures

A Compelling Place to Work: Attitude About the Job, Attitude About the Company → Employee Behavior, Employee Retention
A Compelling Place to Shop: Service, Helpfulness, Merchandise, Value → Customer Impression, Customer Retention, Customer Recommendations
A Compelling Place to Invest: Return on Assets, Operating Margin, Revenue Growth

A 5-unit increase in employee attitude drives a 1.3-unit increase in customer impression, which drives a 0.5-unit increase in revenue growth.
279
Estimating the Value
Use the most credible source
Check for biases
Discuss the value in general terms
Provide information to assist in the estimates
Collect data in a non-threatening way
Adjust for the error
280
Turnover Cost Summary
Job Type / Category                       Turnover Cost Ranges
Entry level – hourly, non-skilled               30-50%
Service / Production workers – hourly           40-70%
Skilled hourly                                  75-100%
Clerical / Administrative                       50-80%
Professional                                    75-125%
Technical                                       100-150%
Engineers                                       200-300%
Specialists                                     200-400%
Supervisors / Team Leaders                      100-150%
Middle Managers                                 125-200%
281
Turnover Costs Summary
Exit cost of previous employee
Recruiting cost
Employment cost
Orientation cost
Training cost
Wages and salaries while training
Lost productivity
Quality problems
Customer dissatisfaction
Loss of expertise/knowledge
Supervisor's time for turnover
Temporary replacement costs
282
Converting Data: Questions to Ask
What is the value of one additional unit of production or service?
What is the value of a reduction of one unit of quality measurement (rejects, waste, errors)?
What are the direct cost savings?
What is the value of one unit of time improvement?
Are cost records available?
Is there an internal expert who can estimate the value?
and . . .
283
Converting Data: Questions to Ask
Is there an external expert who can estimate the value?
Are there any government, industry, or research data available to estimate the value?
Are supervisors of program participants capable of estimating the value?
Is senior management willing to provide an estimate of the value?
Does the staff have expertise to estimate the value?
284
Short-Term Solutions
Are defined in terms of the time to complete or implement the program
Are appropriate when this time is a month or less
Are appropriate when the lag between Levels 3 and 4 is relatively short
Reflect most HR solutions
285
When Estimating Time for Long-Term Solutions
Secure input from all key stakeholders (sponsor, champion, implementer, designer, evaluator)
Be conservative
Have it reviewed by Finance & Accounting
Use forecasting
286
Converting Your Level 4 Measures to Money
Level 4 Measure
Isolation Technique(s)
Data Conversion Technique(s)
287
Total Fitness Company
1. Calculate the annual savings from the improvement.
2. Is this a credible process?
288
Absenteeism linked to program:  7% - 4% = 3%;  3% x 40% = 1.2%
Absence days prevented:  240 days x 120 employees x 1.2% = 346 days
Monetary value:  346 days x $105/day = $36,330, or 346 days x $90/day = $31,140
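The chain of calculations above can be reproduced directly; note that the $90/day figure is the more conservative of the two daily values, which Guiding Principle 4 favors:

```python
# Isolate: absenteeism fell from 7% to 4%; 40% is attributed to the program
rate_change = 0.07 - 0.04            # 3 percentage points
program_share = rate_change * 0.40   # 1.2% attributed to the program

# Convert the rate to days: 240 workdays x 120 employees x 1.2%
days_prevented = round(240 * 120 * program_share)   # ~346 days

# Value the days; the lower per-day figure is the conservative choice
value_high = days_prevented * 105    # $36,330
value_low = days_prevented * 90      # $31,140
```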
289
Data Conversion Test
Is there a standard value?  Yes: add to the numerator.  No: continue.
Is there a method to get there?  No: move to intangible benefits.  Yes: continue.
Can it be done with minimum resources?  No: move to intangible benefits.  Yes: continue.
290
Can you convince others it's credible in 2 minutes?  No: move to intangible benefits.  Yes: convert the data and add to the numerator.
291
Reasons for Developing Cost Data
To determine the overall expenditure
To determine the relative cost
To predict future program costs
To calculate benefits versus costs
To improve the efficiency
To evaluate alternatives
To plan and budget
To develop a marginal cost pricing system
To integrate data into other systems
292
Issues About Tracking Costs
Monitor costs, even if they are not needed for evaluation
Costs will not be precise
Use a practical approach
Minimize the resources to track costs
Estimates are acceptable
Use caution when reporting costs
Do not report costs of a program without reporting benefits (or at least have a plan)
293
How Much Should You Spend on HR?
Overall Expenditures
• Total Expenditures
• Total – Human Capital
• % of Payroll
• % of Revenues
• % of Operating Costs
• Expenditures per Employee
294
How Much Should You Spend on HR?
Functional Area
• Needs Assessment
• Development
• Delivery/Implementation
• Operation/Maintenance
• Evaluation
295
Questions for Discussion
1. Is there a significant difference between estimated and actual costs? Explain.
2. How did you determine what your targets would be?
3. What should you spend?
296
Overall Cost Categories
Analysis costs
Development costs
Delivery costs
Operating / Maintenance costs
Evaluation costs
297
Tabulating Program Costs: Recommended Items
Needs assessment (prorated)
Development costs (prorated)
Program materials
Facilitator / coordinator costs
Facilities costs
Travel / Lodging / Meals
Participants' time (salaries and benefits)
Administrative / Overhead costs
Operations / Maintenance costs
Evaluation costs
298
Prorating Cost
Life cycle approach
Initial cost plus annual updates
299
Example of Prorating
Leadership 101
5-year life cycle
200 participants per year
$75,000 initial development costs
2 groups of 25 are being evaluated at the ROI level
How much development costs should be charged to the ROI project?
300
Overhead Allocation Example
Portion of budget not allocated to specific projects: $548,061
Total number of days dedicated to specific projects/programs: 7,450
Per-day overhead allocation: $______
What is the total overhead allocation for the program that takes 3 days to complete?
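One way to work the example, sketched in code (divide the unallocated budget by total project days, then multiply by the program's duration):

```python
unallocated_budget = 548_061
project_days_total = 7_450

per_day_overhead = unallocated_budget / project_days_total  # about $73.57
program_days = 3
program_overhead = per_day_overhead * program_days          # about $220.70
```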
301
Cost Estimating Worksheet
Costs Classification Matrix
302
Federal Information Agency (A)
1. What types of data should be collected for application and implementation?
2. What business impact measures should be collected?
3. What is the time frame for data collection?
4. Which cost categories should be utilized in capturing the actual cost of the program?
5. Can the value of this program be forecasted? If so, how?
303
1. Please calculate the actual cost of the program for 100 participants. Assume a 5% dropout rate each year.
2. Most of these costs are estimated or rounded off. Is this appropriate? Explain.
3. What issues surface when developing cost data? How can they be addressed?
Federal Information Agency (B)
304
Different Approaches
Cost Benefit Analysis (most common)
Return on Investment (most common)
Payback Period
Discounted Cash Flow
Internal Rate of Return
Utility Analysis
Consequences of not providing learning systems
305
Defining the Benefit Cost Ratio
Benefit/Cost Ratio = Program Benefits / Program Costs

Example
Program Benefits = $71,760
Program Costs = $32,984
BCR = $71,760 / $32,984 = 2.18
306
Defining the Return on Investment
ROI (%) = (Net Program Benefits / Program Costs) x 100

Example
Net Program Benefits = $38,776
Program Costs = $32,984
ROI = ($38,776 / $32,984) x 100 = 117.6%
307
Defining the Payback Period

Payback Period = (Total Investment / Annual Net Savings) x 12

Example
Total Investment = $32,984
Annual Net Savings = $38,776
Payback Period = 0.85 x 12 = 10.2 months
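The three formulas applied to the example numbers above; note that the payback calculation uses net annual savings (benefits minus costs), which is what reproduces the 10.2-month figure:

```python
program_benefits = 71_760
program_costs = 32_984
net_benefits = program_benefits - program_costs     # $38,776

bcr = program_benefits / program_costs              # ~2.18
roi_pct = net_benefits / program_costs * 100        # ~117.6%
payback_months = program_costs / net_benefits * 12  # ~10.2 months
```

BCR compares gross benefits to costs, while ROI compares only the net gain to costs; the two always differ by exactly 100 percentage points.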
308
ROI Target Options
Set the value as with other investments, e.g. 15%
Set slightly above other investments, e.g. 25%
Set at break even - 0%
Set at client expectations
309
A Rational Approach to ROI
Keep the process simple
Use sampling for ROI calculations
Always account for the influence of other factors
Involve management in the process
Educate the management team
Communicate results carefully
Give credit to participants and managers
Plan for ROI calculations
310
The Journey to Increased Accountability

[Chart: Accountability rises over time through Level 1 (Reaction), Level 2 (Learning), Level 3 (Application), Level 4 (Business Impact), and Level 5 (ROI), moving the function from "normal" toward a profit-center mindset.]
311
Cautions When Using ROI
Take a conservative approach when developing both benefits and costs.
Use caution when comparing the ROI in HR with other financial returns.
Involve management in developing the methodology.
Fully disclose the assumptions and methodology.
and . . .
312
Cautions When Using ROI
Approach sensitive and controversial issues with caution.
Teach others the methods for calculating the return.
Recognize that not everyone will buy into ROI.
Do not boast about a high return.
Choose the place for the debates.
Do not try to calculate the ROI on every program.
313
Improper Use of ROI
ROI – return on information
ROI – return on intelligence
ROI – return on involvement
ROI – return on inspiration
ROI – return on implementation
ROI – return on initiative
314
ROI Myths
ROI is too complex for most users.
ROI is too expensive, consuming too many critical resources.
If senior management does not require ROI, there is no need to pursue it.
ROI is a passing fad.
ROI is too subjective.
ROI is for post-analysis only.
315
The Potential Magnitude of an ROI

An ROI of 1,500%+ is possible when:
1. A need is identified, with
2. a performance gap existing or a new requirement introduced,
3. and an effective solution is implemented at the right time for the right people at a reasonable cost,
4. and the solution is applied and supported in the work setting,
5. and linkage exists to one or more business measures.
316
Guiding Principles
1. When a higher level evaluation is conducted, data must be collected at lower levels.
2. When an evaluation is planned for a higher level, the previous level of evaluation does not have to be comprehensive.
3. When collecting and analyzing data, use only the most credible sources.
4. When analyzing data, choose the most conservative among alternatives.
5. At least one method must be used to isolate the effects of the program.
6. If no improvement data are available, it is assumed that little or no improvement has occurred.
317
7. Estimates of improvement should be adjusted for the potential error of the estimate.
8. Extreme data items and unsupported claims should not be used in ROI calculations.
9. Only the first year of benefits should be used in the ROI analysis of short-term projects.
10. Program costs should be fully loaded for ROI analysis.
11. Intangible measures are defined as measures that are purposely not converted to monetary value.
12. The results from the ROI methodology must be communicated to all key stakeholders.
Guiding Principles
318
Typical Intangible Measures Linked with Programs
Job satisfaction
Organizational commitment
Climate
Engagement
Employee complaints
Recruiting image
Brand awareness
Stress
Leadership effectiveness
Resilience
Caring
Career minded
Customer satisfaction
Customer complaints
Customer response time
Teamwork
Cooperation
Conflict
Decisiveness
Communication
319
Identification of Intangible Measures: Timing and Source
Intangible measures can surface at four points: (1) Needs Assessment, (2) ROI Analysis Planning, (3) Data Collection, and (4) Data Analysis.
320
Issues with Intangibles
May be the most important data set
Are not converted to money by definition
Are usually not subjected to “isolating”
Must be systematically addressed
Must be reported “credibly”
321
Reporting Intangibles
Usually presented as a table
Must indicate how the data were collected
Use rules to decide if a measure should be listed
Be prepared for further analysis
322
Communication Challenges
Measurement and evaluation are meaningless without communication
Communication is necessary for making improvements
Communication is a sensitive issue
Different audiences need different information
323
Communication Principles
Keep communication timely
Target communication to specific audiences
Stay unbiased and modest with the message
Carefully select communication media
Keep communication consistent with past practices
Incorporate testimonials from influential individuals
Consider your function's reputation when developing the overall strategy
Use language your audience understands
324
Audience Selection Questions
Are they interested in the program?
Do they really want to receive the information?
Has someone already made a commitment to them regarding communication?
Is the timing right for this audience?
Are they familiar with the program?
How do they prefer to have results communicated?
Are they likely to find the results threatening?
Which medium will be most convincing to them?
325
Common Target Audiences
Reason for Communication                             Primary Target Audience
Secure approval for program                          Client, top executives
Gain support for the program                         Immediate managers, team leaders
Build credibility for the staff                      Top executives
Enhance reinforcement of the program                 Immediate managers
Enhance results of future programs                   Participants
Show complete results of the program                 Key client team
Stimulate interest in programs                       Top executives
Demonstrate accountability for client expenditures   All employees
Market future programs                               Prospective clients
326
Complete Report
General information
Methodology for impact study
Data analysis
Costs
Results
Barriers and enablers
Summary of findings
Conclusions and recommendations
Exhibits
327
The Impact Study Serves Several Purposes:
As the method of communicating results, but only for those audiences needing detailed information.
As a reminder of the resources required to produce major studies.
As a historical document of the methodology, instruments, and processes used throughout the impact study.
As a teaching and discussion tool for staff development.
328
Select Media
Impact Studies
• Full report
• Executive summary
• General overview
• One-page summary

Meetings
• Executive meetings
• Manager meetings
• Staff meetings
• Panel discussions
• Best practice meetings

Internal Publications
• Announcements
• Bulletins
• Newsletters
• Magazines

Progress Reports
• Schedules
• Preliminary results
• Memos
and . . . .
329
Select Media
Case Studies
Program Brochures
Scoreboards
Electronic Media
• E-mail
• Web sites
• Video
• Blogs
330
Impact Study Outline

General Information
• Objectives of study
• Background

Methodology for Impact Study (builds credibility for the process)
Levels of evaluation
ROI process
Collecting data
Isolating the effects of the program
Converting data to monetary values
Costs
Assumptions (Guiding Principles)
331
Impact Study Outline

Results (the results with six measures: Levels 1-5 and intangibles)
General information
Response profile
Participant reaction
Learning
Application of skills / knowledge
Barriers
Enablers
Business impact
General comments
Linkage with business measures
Costs
ROI calculation
Intangible benefits
332
Impact Study Outline
Summary of Findings
Conclusions and Recommendations
Conclusions
Recommendations
Exhibits
333
Communicating with Senior Management
Do they believe you?
Can they take it?
334
Purpose of the Meeting
Create awareness and understanding of ROI
Build support for the ROI methodology
Communicate results of study
Drive improvement from results
Cultivate effective use of the ROI methodology
335
Meeting Ground Rules
Do not distribute the impact study until the end of the meeting
Be precise and to the point
Avoid jargon and HR speak
Spend less time on the lower levels of evaluation data
Present the data with a strategy in mind
336
Presentation Sequence
1. Describe the program and explain why it is being evaluated
2. Present the methodology process
3. Present the reaction and learning data
4. Present the application data
5. List the barriers and enablers to success
6. Address the business impact
337
Presentation Sequence
7. Show the costs
8. Present the ROI
9. Show the intangibles
10. Review the credibility of the data
11. Summarize the conclusions
12. Present the recommendations
338
Communication Progression
First 2 ROI studies:  Detailed study, presented in a meeting
3-5 ROI studies:      Executive summary, no meeting
6+ ROI studies:       One-page summary, no meeting
339
ROI Impact Study: One-Page Summary
Program Title: Preventing Sexual Harassment at Healthcare, Inc.
Target Audience: First and Second Level Managers (655)
Secondary: All employees through group meetings (6,844)
Duration: 1 day, 17 sessions
340
Brief Reports
Executive Summary
Slide Overview
1-page Summary (see example)
Brochure
341
Electronic Reporting
Website
blogs
Video
342
Mass Publications
Announcements
Bulletins
Newsletters
Magazines
343
Case StudyInternal Use
Communicate results
Teach others
Build a history
Serve as a template
Make an impression
344
Case StudyExternal Publication
Provide recognition to participants
Improve image of function
Enhance brand of department
Enhance image of organization
345
[Diagram: Micro-level scorecards (Levels 0-4 for individual programs) roll up into a macro-level scorecard covering 0 Indicators, 1 Reaction, 2 Learning, 3 Application, 4 Impact, 5 ROI, and Intangibles.]
346
Building a Macro Scorecard
Provides a macro-level perspective of success
Serves as a brief report versus a detailed study
Shows connection to business objectives
Integrates various types of data
Demonstrates alignment between programs, strategic objectives, and operating goals
347
Seven Categories of Data
Indicators
Reaction and Planned Action
Learning
Application
Business Impact
ROI
Intangibles
348
Potential Reporting
0. Indicators
1. Number of Employees Involved
2. Total Hours of Involvement
3. Hours Per Employee
4. Training investment as a Percent of Payroll
5. Cost Per Participant
349
Potential Reporting
I. Reaction and Planned Action
1. Percent of Programs Evaluated at this Level
2. Ratings on 7 Items vs. Target
3. Percent with Action Plans
4. Percent with ROI Forecast
350
Potential Reporting
II. Learning
1. Percent of Programs Evaluated at This Level
2. Types of Measurements
3. Self Assessment Ratings on 3 Items vs. Targets
4. Pre/Post – Average Differences
351
Potential Reporting
III. Application
1. Percent of Programs Evaluated at This Level
2. Ratings on 3 Items vs. Targets
3. Percent of Action Plans Complete
4. Barriers (List of Top Ten)
5. Enablers (List of Top Ten)
6. Management Support Profile
352
Potential Reporting
IV. Business Impact
1. Percentage of Programs Evaluated at This Level
2. Linkage with Measures (List of Top Ten)
3. Types of Measurement Techniques
4. Types of Methods to Isolate the Effects of Programs
5. Investment Perception
353
Potential Reporting
V. ROI
1. Percent of Programs Evaluated at This Level
2. ROI Summary for Each Study
3. Methods of Converting Data to Monetary Values
4. Fully Loaded Cost Per Participant
354
Potential Reporting
Intangibles
1. List of Intangibles (Top Ten)
2. How Intangibles Were Captured
355
Appropriate Level of Data
[Matrix: each use of evaluation data is matched to the appropriate levels, 1-5.]

Uses of evaluation data:
Adjust program design
Improve program delivery
Influence application and impact
Enhance reinforcement
Improve management support
Improve stakeholder satisfaction
Recognize and reward participants
Justify or enhance budget
Develop norms and standards
Reduce costs
Market programs
Expand implementation to other areas
356
Delivering Bad News
Never fail to recognize the power to learn and improve with a negative study.
Look for red flags along the way.
Lower outcome expectations with key stakeholders along the way.
Look for data everywhere.
Never alter the standards.
Remain objective throughout the process.
and . . .
357
Delivering Bad News
Prepare the team for the bad news.
Consider different scenarios.
Find out what went wrong.
Adjust the story line to "Now we have data that shows how to make this program more successful." In an odd sort of way, this becomes a positive spin on less-than-positive data.
Drive improvement.
358
Analyze the Results of Communication
Observe reactions
Solicit informal feedback
Collect formal feedback
Monitor blogs
Make adjustments
359
ROI Possibilities
Pre-program ROI forecast
End-of-program ROI forecast with Level 1 data
End-of-program ROI forecast with Level 2 data
Follow-up ROI forecast with Level 3 data
Follow-up ROI evaluation with Level 4 data
360
ROI at Different Levels

ROI with:              Data Collection Timing (Relative to the Initiative)
Pre-Program Forecast   Before
Level 1 Data           During
Level 2 Data           During
Level 3 Data           After
Level 4 Data           After

Moving down the list, credibility, accuracy, cost to develop, and difficulty all increase: the pre-program forecast is the least credible, least accurate, least expensive, and least difficult; ROI with Level 4 data is the most credible, most accurate, most expensive, and most difficult.
361
Pre-Program Forecast ROI Model

Estimate Change in Data → Isolate the Effects of the Program → Convert Data to Monetary Values → Tabulate Program Costs → Calculate the Return on Investment → Identify Intangible Benefits
362
Retail Merchandise CompanyQuestions for Discussion
1. Is a pre-program forecast possible?
2. Which groups should provide input to the forecast?
363
"Expert" Input for Estimate

Source              Sales Increase Estimate (Δ)   Forecasted ROI
Sales Associates              0%                      -100%
Dept. Managers                5%                       -30%
Store Managers               10%                        33%
Sr. Executive                15%                       110%
Analyst                      12%                        95%
Vendor                       25%                       350%
Marketing Analyst             4%                       -40%
Finance Staff                 2%                       -80%
Benchmarking Data             9%                        22%
364
Retail Merchandise CompanyQuestions for Discussion
1. Assess the credibility of each “expert” group.
2. Is there any additional information you need?
3. How would you present this to senior management to make a decision to implement the program?
365
Steps for Pre-Program ROI Forecast
Develop Level 3 and 4 objectives, with as many specifics as possible
Estimate/forecast the monthly improvement in Level 4 data (ΔP)
Convert the Level 4 measure to a monetary value (V)
Develop the estimated annual impact for each measure (ΔP x V x 12)
Estimate fully loaded program costs
and . . .
366
Steps for Pre-Program ROI Forecast
Calculate the forecasted ROI using the total projected benefits
Use sensitivity analysis to develop several potential ROI values with different levels of improvement (ΔP)
Identify potential intangible benefits
Communicate analysis with caution
Steps to Pre-Program Forecast

Source        Monthly Change   Value    Annual Change    Cost     ROI
SME              $25,000        $500       $6,000       $5,000     20%
Vendor           $50,000      $1,000      $12,000       $5,000    140%
Participant      $30,000        $600       $7,200       $5,000     44%
Supervisor       $28,000        $560       $6,720       $5,000     34%
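Each row of the table follows the same arithmetic: the monthly sales change times the 2% profit margin gives the monthly value, which is annualized over 12 months and compared with the program cost. A sketch for the SME row:

```python
def forecast_roi(monthly_sales_change, profit_margin, program_cost):
    """Pre-program ROI forecast from a monthly Level 4 estimate."""
    monthly_value = monthly_sales_change * profit_margin  # V x delta P
    annual_benefit = monthly_value * 12
    return (annual_benefit - program_cost) / program_cost * 100

# SME row: $25,000 monthly sales change, 2% margin, $5,000 cost -> 20%
sme_roi = forecast_roi(25_000, 0.02, 5_000)
```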
367
Measure: Sales
Profit Margin: 2%
368
Sensitivity Analysis

Potential Sales Increase   Potential Complaint Reduction   Expected ROI
(Existing Customers)       (Monthly Reduction)
$25,000                    10                               60%
$25,000                    20                               90%
$25,000                    30                              120%
$50,000                    10                               90%
$50,000                    20                              120%
$50,000                    30                              150%
$30,000                    10                              120%
$30,000                    20                              150%
$30,000                    30                              180%
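A sensitivity analysis like this simply recomputes the forecast ROI over a grid of improvement assumptions. A generic sketch; the per-unit values and program cost here are hypothetical and do not reproduce the table's own (unstated) conversion factors:

```python
def expected_roi(annual_benefit, program_cost):
    """Forecast ROI as a percentage of fully loaded program cost."""
    return (annual_benefit - program_cost) / program_cost * 100

# Hypothetical: each unit of monthly complaint reduction is worth
# $1,000 per year, and the program costs $100,000
program_cost = 100_000
for sales_gain in (250_000, 300_000):    # annualized sales benefit ($)
    for complaint_drop in (10, 20, 30):  # monthly complaint reduction
        benefit = sales_gain + complaint_drop * 1_000
        roi = expected_roi(benefit, program_cost)
```

Presenting several such scenarios, rather than a single point estimate, keeps the forecast honest about its assumptions.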
369
Input to Forecast
Previous experience with same or similar programs
Supplier/Designer experience in other situations
Estimates from supplier/designer
Estimates from SMEs
Estimates from client/sponsor
Estimates from target participants
370
Forecasting ROI from a Pilot Program
Develop Level 3 and 4 objectives
Design/Develop pilot program without the bells and whistles (or use a supplier program)
Conduct the program with one or more “typical” groups
Develop the ROI using the ROI Process model for Level 4 post-program data
Make decision to implement based on results
371
Level 1 Measures
• Program content
• Materials
• Facilitator / coordinator
• Relevance / importance
• Perceived value
• Amount of new information
• Recommendation to others
• Planned improvements
• Opportunity for forecast
372
Important Questions to Ask on Feedback Questionnaires

Planned Improvements
Please indicate what you will do differently on the job as a result of this program:
1. ________________________________________________________
2. ________________________________________________________
3. ________________________________________________________
As a result of any change in your thinking, new ideas, or planned actions, please estimate (in monetary values) the benefit to your organization (i.e., reduced absenteeism, reduced employee complaints, better teamwork, increased personal effectiveness, etc.) over a period of one year. __________________
What is the basis of this estimate?_______________________________________
What confidence, expressed as a percentage, can you put in your estimate?(0%=No Confidence; 100%=Certainty) ____________________%
373
ROI with Level 1 Data

At the end of the program, ask participants:
• What knowledge or skills have been improved?
• What actions are planned with the improved knowledge and skills?
• Which measures will be influenced?
• What impact, in monetary units, will this improvement have in the work unit?
• What is the basis for this estimate?
• What level of confidence do you place on this estimate?

Then, compare total “adjusted” benefits with program costs.
374
Participant No.   Sales Increase Estimate   Basis   Confidence Level
1 $20,000 Sales 90%
2 $9,000 2 sales per day 80%
3 $50,000 Sales increase 70%
4 $10,000 3 sales daily 60%
5 Millions 4 sales each day 95%
6 $75,000 More sales 100%
7 $7,500 3 more sales 80%
8 $25,000 4 sales – 1 sale 75%
9 $15,000 One more sale 30%
10 $20,000 2 new sales 80%
11 $45,000 Sales 90%
12 $40,000 2 sales each day 70%
13 0 No increase 60%
14 $150,000 Many new sales 95%
15 Unlimited Additional sales 50%
16 $37,000 More sales and satisfaction 100%
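One way to turn the table above into "adjusted" benefits: discard entries that are unusable or extreme, then multiply each remaining estimate by the participant's confidence level to discount uncertainty. This is a sketch of that conservative adjustment, not the only defensible approach; the $100,000 outlier cap is a judgment call, not slide data.

```python
# Adjust participant estimates: drop unusable/extreme entries
# ("Millions", "Unlimited", and the $150,000 outlier), then
# weight each remaining estimate by its confidence level.
estimates = [  # (sales increase estimate, confidence) from the slide's table
    (20_000, 0.90), (9_000, 0.80), (50_000, 0.70), (10_000, 0.60),
    (None, 0.95),          # "Millions" -- unusable, discarded
    (75_000, 1.00), (7_500, 0.80), (25_000, 0.75), (15_000, 0.30),
    (20_000, 0.80), (45_000, 0.90), (40_000, 0.70), (0, 0.60),
    (150_000, 0.95),       # extreme outlier -- discarded to stay conservative
    (None, 0.50),          # "Unlimited" -- unusable, discarded
    (37_000, 1.00),
]

OUTLIER_CAP = 100_000  # judgment call: treat anything above this as extreme

adjusted_total = sum(
    est * conf
    for est, conf in estimates
    if est is not None and est <= OUTLIER_CAP
)
print(f"Total adjusted benefit: ${adjusted_total:,.0f}")
```

The adjusted total, rather than the raw sum, is what gets compared with program costs.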
375
Retail Merchandise Company
Questions for Discussion
1. What is your strategy for analyzing this data?
2. How reliable is this data?
3. How could you use this data?
376
Level 2 Evaluation
• Tests
• Opportunity for forecast
• Skill practices
• Self reports
• Exercises
• Observations during the training program
• Checklists by facilitator
• Team assessments
377
ROI with Level 2 Data
Develop an end-of-program test that reflects program content
Establish a relationship between test data and output performance for participants
Predict performance levels of each participant with given test scores
Convert performance data to monetary value
Compare total predicted value of program with program costs
378
Relationship Between Test Scores and Performance

[Chart: On-the-Job Performance (60–90) plotted against Test Scores (1–5)]
379
Relationship Between Test Scores and Sales Performance

[Chart: On-the-Job Sales Increase (60–90) plotted against test-score bands (5%–25%)]
380
Retail Merchandise Company
Questions for Discussion
1. Calculate the forecasted ROI.
2. How reliable is this estimate of ROI at Level 2? What other issues might need to be considered in this process?
3. Is this information useful? If so, how should the information be used?
381
Projected Benefit

$9,698 × .14 × .02 × 48 = $1,303

BCR = $1,303 / $687 = 1.9

ROI = 90%
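The projected-benefit chain can be reproduced step by step. Interpreting the four factors as weekly sales per participant, forecast sales increase, profit margin, and working weeks is an assumption; the figures themselves are the slide's.

```python
# Reproduce the slide's projected-benefit calculation and the
# resulting BCR and ROI.
weekly_sales = 9_698        # $ per participant (slide figure)
forecast_increase = 0.14    # 14% predicted improvement (slide figure)
profit_margin = 0.02        # converts sales to profit (slide figure)
weeks = 48                  # assumed working weeks per year

projected_benefit = weekly_sales * forecast_increase * profit_margin * weeks
program_cost = 687

bcr = projected_benefit / program_cost
roi_pct = (projected_benefit - program_cost) / program_cost * 100

print(round(projected_benefit), round(bcr, 1), round(roi_pct))
```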
382
Retail Merchandise Company
Questions for Discussion
1. What is the ROI for this program?
2. How credible is this approach to calculating ROI?
3. Could this same approach be used to forecast the value prior to the implementation of the program?
383
ROI Calculation

ROI (%) = (Net Benefits / Costs) × 100

BCR = Benefits / Costs
384
ROI Calculation

ROI = ($3,242 − $687) / $687 × 100 = 372%

BCR = $3,242 / $687 = 4.72
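The same calculation in code, using the slide's figures, makes the distinction between the two ratios explicit.

```python
# BCR and ROI on the slide's figures. Note the difference: ROI uses
# *net* benefits (benefits minus costs), while BCR uses total benefits.
benefits = 3_242
costs = 687

bcr = benefits / costs                      # benefits / costs
roi_pct = (benefits - costs) / costs * 100  # net benefits / costs x 100

print(f"BCR = {bcr:.2f}, ROI = {roi_pct:.0f}%")
```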
385
ROI with Level 3 Data

• Develop competencies for the target job.
• Indicate the percentage of job success that is covered in the program.
• Determine the monetary value of the competencies, using salaries and employee benefits.
• Compute the worth of pre- and post-program skill levels.
• Subtract pre-program values from post-program values.
• Compare the total added benefits with the program costs.
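The steps above can be sketched with a single participant; every figure below is hypothetical and would come from salary data and competency assessments in practice.

```python
# Level 3 competency-valuation sketch. Worth of a skill level =
# (salary + benefits) x share of the job the program's competencies
# cover x competency rating (0-1 scale). All figures hypothetical.
salary_plus_benefits = 60_000   # hypothetical fully-loaded annual pay
job_coverage = 0.30             # hypothetical share of job success covered
pre_rating = 0.60               # hypothetical pre-program competency rating
post_rating = 0.75              # hypothetical post-program competency rating
program_cost = 1_800            # hypothetical cost per participant

competency_value = salary_plus_benefits * job_coverage  # value of full competence
pre_worth = competency_value * pre_rating
post_worth = competency_value * post_rating
added_benefit = post_worth - pre_worth                  # post minus pre

roi_pct = (added_benefit - program_cost) / program_cost * 100
```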
386
Advantages of Forecasting
Increases the usefulness of data collection
Focuses attention on business outcomes
Monitors the path to success
Compares forecast to actual results to improve forecasts
387
Forecasting Realities
If you must forecast, forecast frequently
Consider forecasting an essential part of the evaluation mix
Forecast different types of data
Secure input from those who know the process best
Long-term forecasts will usually be inaccurate
388
Forecasting Realities
• Expect forecasts to be biased
• Serious forecasting is hard work
• Review the success of forecasting routinely
• Faulty assumptions are the most serious source of error in forecasting
• Utility is the most important characteristic of forecasting
389
Barriers to ROI Use and Implementation

After mastering the ROI model, it is appropriate to examine implementation in more detail. Please take a few moments to identify the barriers to implementation. List all the “things” that can prevent a successful implementation. Be candid.
390
Overcoming the Barriers

Now, identify the actions needed to minimize, remove, or go around the barriers. List all the “steps” that need to be taken to overcome the barriers.
391
Implementation Issues
Resources (staffing / budget)
Leadership (individual, group, cross functional team)
Timing (urgency, activities)
Communication (various audiences)
Commitment (staff, managers, top executives)
392
Typical Barriers
• I don’t have time for additional measurement and evaluation.
• An unsuccessful evaluation will reflect poorly on my performance.
• A negative ROI will kill my program.
• My budget will not allow for additional measurement and evaluation.
• Measurement and evaluation are not part of my job.

and . . .
393
Typical Barriers
I didn’t have input on this process.
I don’t understand this process.
Our managers will not support this process.
Data will be misused.
The data are too subjective.
394
Building Blocks to Overcome Resistance
Utilizing Shortcuts
Monitoring Progress
Removing Obstacles
Preparing the Management Team
Initiating the ROI Projects
Tapping into a Network
Preparing the Staff
Revising Policies and Procedures
Establishing Goals and Plans
Developing Roles and Responsibilities
Assessing the Climate for Measuring ROI
395
Assessing the Climate for Results
Survey staff (team members)
Survey staff from management perspective
Develop gaps (actual vs. desired)
Plan actions
396
Identifying Champions
You cannot do it alone
Champions have a passion for accountability
Consider a champion from each area
Network the champions
Recognize the champions
397
Measurement and Evaluation Implementation Project Plan for a Large Petroleum Company
Team formed Jan
Policy developed Feb-Apr
Targets set Jan-Feb
Workshops developed Mar-Jul
Application Evaluation Project (A) Apr-Sept
Impact Evaluation Project (B) Jun-Jan
Impact Evaluation Project (C) Sept-Mar
and . . .
398
Measurement and Evaluation Implementation Project Plan for a Large Petroleum Company
ROI Project (D) Nov-Aug
Staff trained Aug-Jan
Vendors trained Feb-Apr
Managers trained May-Aug
Support tools developed Apr-May
Evaluation guidelines developed Feb-Jun
399
Responsibilities for Champions
Designing data collection instruments
Providing assistance for developing an evaluation strategy
Analyzing data, including specialized statistical analyses
Interpreting results and making specific recommendations
and . . .
400
Responsibilities for Champions
Developing an evaluation report or case study to communicate overall results
Providing technical support in any phase of measurement and evaluation
Assisting in communicating results to key stakeholders
401
Responsibilities for Team Members
Ensure that the needs assessment includes specific business impact measures.
Develop application objectives and business impact objectives for each program.
Focus the content of the program on the objectives of business performance improvement, ensuring that exercises, case studies, and skill practices relate to the desired objectives.

and . . .
402
Responsibilities for Team Members
• Keep participants focused on application and impact.
• Communicate rationale and reasons for evaluation.
• Assist in follow-up activities to capture business impact data.
• Provide assistance for data collection, data analysis, and reporting.
• Design simple instruments and procedures for data collection and analysis.
• Present evaluation data to a variety of groups.
403
Getting Team Members Involved
Developing plans
Establishing responsibilities
Designing tools and templates
Selecting programs for higher level evaluation
Driving changes / improvements
404
Participant Responsibilities
Actively participate
Learn what’s needed
Apply and implement program
Secure results
Provide data
405
Conduct Several Studies
Cover a variety of areas
Move from simple to complex
Mix up Levels 3, 4, and 5
Avoid political issues early in the process
406
Conduct Workshops and Briefings
1 to 1½-hour briefings
1-day workshops
2-day workshops
Special topics
407
Creating an ROI Network
Within the organization
Within the local area
Within the community
408
Typical Network Issues
Communication methods
Membership rules
Meeting times
Topics / Issues
Monitoring / Managing
409
Typical Network Topics
Tool / Template sharing
Collaborative projects
Research / Benchmarking
Sounding board
Project critiques
Technology review
410
Key ROI Issues
Time
Cost
Complexity
Accuracy
Credibility
Lack of Skills
411
Cost-Saving Approaches to ROI
• Plan for evaluation early in the process
• Build evaluation into the process
• Share the responsibilities for evaluation
• Require participants to conduct major steps
• Use short-cut methods for major steps
• Use sampling to select the most appropriate programs for ROI analysis
and . . .
412
Cost-Saving Approaches to ROI

• Use estimates in the collection and analysis of data
• Develop internal capability to implement the ROI process
• Utilize web-based software to reduce time
• Streamline the reporting process
413
Tools and Templates
Instruments
Costs
Analysis
Reporting
414
Technology
Reaction / Learning surveys
Test design
Follow-up surveys
Statistics packages
ROI software
Scorecards
415
Suggested Evaluation Targets

Level                       Target
Level 1 - Reaction          100%
Level 2 - Learning          60%
Level 3 - Application       30%
Level 4 - Business Impact   10-20%
Level 5 - ROI               5-10%
416
Worksheet – Project/Program Selection Criteria
List each project/program that fits Level 3 criteria in the left column. Rank each project/program in its category as High Priority (HP), Special Attention (SA), or Business as Usual (BAU).
• Compliance Project/Program
• Customer Service Project/Program
• Sales Program
• Call Center or other Customer Transaction Program
• Organization Sponsored Certification Program
417
Level 3 Priority Ranking
High Priority
• Project/Program clearly must be evaluated at Level 3 in the short term.

Special Attention
• May not be evaluated at Level 3 in the short term, but there are enough issues that an assignment will be made to assess the situation.

Business as Usual
• Continue with current strategy for this program.
418
Worksheet – Project/Program Selection Criteria
List each project/program you are considering evaluating in the left column. Rank each program as 1, 2, 3, 4, or 5 for each of the ten criteria.
• Life Cycle of Project/Program
• Operational Objectives
• Strategic Objectives
• Costs
• Audience Size
• Visibility
• Investment of Time
• Needs Assessment Conducted
• Management Interest
• Quality of Data Collection Processes
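The worksheet above can be tallied programmatically: rate each program 1 to 5 on each of the ten criteria, total the ratings, and rank. The program names and ratings below are made up for illustration; only the criteria come from the slide.

```python
# Selection-worksheet scoring sketch: higher totals suggest stronger
# candidates for Level 4/5 evaluation. Ratings are hypothetical.
criteria = [
    "Life Cycle", "Operational Objectives", "Strategic Objectives", "Costs",
    "Audience Size", "Visibility", "Investment of Time",
    "Needs Assessment Conducted", "Management Interest",
    "Quality of Data Collection Processes",
]

ratings = {  # hypothetical 1-5 ratings, one per criterion, in order
    "Sales Program":       [5, 4, 5, 4, 5, 5, 3, 4, 5, 3],
    "Compliance Program":  [3, 3, 2, 2, 4, 2, 2, 3, 2, 4],
    "Call Center Program": [4, 5, 4, 3, 4, 4, 3, 5, 4, 3],
}

totals = {name: sum(scores) for name, scores in ratings.items()}
ranked = sorted(totals, key=totals.get, reverse=True)
print(ranked)  # highest-scoring candidates first
```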
419
Criteria for Selecting Programs for Levels 4 and 5 Evaluation
• Life cycle of the solution
• Linkage of solution to operational goals and issues
• Importance of solution to strategic objectives
• Top executives are interested in the evaluation
• Cost of the solution
• Visibility of the solution
• Size of the target audience
• Investment of time
420
Results-Based Policy Statement
Provides focus for the staff
Communicates results-based philosophy
Sets goals and targets for evaluation
Determines basic requirements
Serves as a learning tool
421
Results-Based Policy Key Elements
• Purpose / Mission / Direction
• Evaluation targets
• Evaluation support group functions
• Responsibility for results
• Management review of results
• Follow-up process
• Communication of results
422
Evaluation Procedures and Guidelines
• Show how to utilize tools and techniques
• Guide the design process
• Provide consistency in the process
• Ensure that the appropriate methods are used
• Keep the process on track
• Place emphasis on the desired areas
423
Management Influence
Commitment usually refers to the top management group and includes its pledge or promise to allocate resources.
Management support refers to the action of the entire management group and reflects the group’s attitude towards the HR process and staff.
and . . .
424
Management Influence
Management involvement refers to the extent to which executives and managers are actively engaged in the HR process in addition to participating in the program.
Management reinforcement refers to the actions designed to reward and encourage a desired behavior.
425
Why Managers Don’t Support Your Programs
No results
Too costly
No input
No relevance
No involvement
No time
No preparation
Lack of knowledge about HR
No requirements
426
Management Action

Action          Target Group                        Scope               Payoff
Commitment      Top executives                      All programs        Very high
Support         Mid managers, 1st level managers    Several programs    High
Reinforcement   1st level managers                  Specific programs   Moderate
Involvement     All levels                          Specific programs   Moderate
427
The Results Commitment Relationship

[Cycle: Top Management Commitment → Successful Programs → Business Results → Top Management Commitment]
428
CEO Commitment Checklist
429
Ten Commitments
1. Develop or approve a mission
2. Allocate the necessary funds
3. Allow employees time to participate
4. Become actively involved
5. Support the learning effort
430
Ten Commitments
6. Position the function
7. Require evaluation
8. Insist on cost effectiveness
9. Set an example
10. Create an atmosphere of open communication
431
Why Programs Don’t Work

1. Immediate manager does not support the program.
2. The culture in the work group does not support the program.
3. No opportunity to use the program.
4. No time to implement the program.
5. Didn’t learn anything that could be applied to the job.
6. The systems and processes did not support the program.
432
Why Programs Don’t Work
7. Didn’t have the resources available to use the program.
8. Changed job and the program no longer applies.
9. This is not appropriate in our work unit.
10. Didn’t see a need to use the program.
11. Could not change old habits.
433
Questions for Discussion
1. When considering the situation, what specifically can be done to enhance the program success?
2. How important is the role of the manager of participants in programs?
3. What implication does this have for your programs?
434
The Transfer of Success to the Job

[Matrix: role-players (Manager, Participant, Facilitator/Organizer) across timeframes (Before, During, After)]
435
Ideal Management Support

• Gives endorsement and approval for participants to be involved in the program.
• Volunteers personal services or resources to assist in the program’s implementation.
• Makes a pre-program commitment with the participant concerning expected efforts.
• Reinforces the behavior change resulting from the program.
• Conducts a follow-up on program results.
• Gives positive rewards for participants who experience success with the program.
436
Ideal Reinforcement
Helping the participant diagnose problems to determine if the program is needed
Discussing possible alternatives to help the participant apply the skills and implement the program
Encouraging the participant to implement the program
Serving as a role model for the proper use of the skills
Providing positive rewards to the participant when the program is successfully implemented
437
Levels of Management Support

• Supportive: Strongly and actively supports all of our efforts.
• Responsive: Supports programs, but not as strongly as the supportive manager.
• Non-Supportive: Privately voices displeasure with our programs on a formal basis.
• Destructive: Works actively to keep participants from being involved in our programs.
438
ROI: Tools vs. Relationships

Tools:
• Program Developers
• Program Coordinators
• Program Facilitators
• Program Advisors
• Program Managers

Relationships:
• Participants
• Supervisors
• Managers
439
Types of Management Involvement
• As members of advisory committees
• As members of task forces
• As subject matter experts
• As participants
• As program leaders
• As evaluators
• As program sponsors
• As purchasers of services
• In a newly-defined role
• In rotational assignments
440
Potential Manager Involvement

Steps in the Process                        Opportunity   Strategy
Conduct Analysis                            High          Taskforce
Develop Measurement and Evaluation System   Moderate      Advisory committee
Establish Program Objectives                High          Advisory committee
Develop Program                             Moderate      Taskforce
Implement Program                           High          Program leader
441
Potential Manager Involvement

Steps in the Process                  Opportunity   Strategy
Monitor Costs                         Low           Expert input
Collect and Analyze Data              Moderate      Expert input
Interpret Data and Draw Conclusions   High          Expert input
Communicate Results                   Moderate      Manager as participant
442
Concerns About HR From a Key Manager
Results are not there
This is not my responsibility
I don’t have time for HR
I don’t understand what you do
No respect for HR
443
Managers Workshop Objectives

After completing this workshop, each manager should:
• See the results of HR.
• Understand his or her responsibility for HR.
• Identify areas for personal involvement in the HR process.
• Develop specific behaviors to support and reinforce program objectives.
• Realize the importance of the HR function in achieving departmental, division, and company goals.
444
Steps to Develop a Partnership
1. Assess the current status of partnership relationships.
2. Identify key individuals for a partnership relationship.
3. Learn the business.
4. Consider a written plan.
5. Offer assistance to solve problems.
6. Show results of programs.
7. Publicize partners’ accomplishments and successes.
445
Steps to Develop a Partnership

8. Ask the partner to review needs.
9. Have partner serve on an advisory committee.
10. Shift responsibility to the partner.
11. Invite input from the partner about key plans and programs.
12. Ask the partner to review program objectives, content, and delivery mechanisms.
13. Invite the partner to conduct or coordinate a program or portion of a program.
14. Review progress and re-plan strategy.
446
Partnering Principles
1. Have patience and persistence throughout the process.
2. Follow win-win opportunities for both parties.
3. Deal with problems and conflicts quickly.
4. Share information regularly and purposefully.
5. Always be honest and display the utmost integrity in all the transactions.
447
Partnering Principles
6. Keep high standards of professionalism in each interaction.
7. Give credit and recognition to the partner routinely.
8. Take every opportunity to explain, inform, and educate.
9. Involve managers in as many activities as possible.
448
Annual HR Review Agenda

• Review of previous year’s HR programs
• Methods / levels of evaluation
• Results achieved from programs
• Significant deviations from expected results
• Basis for determining HR needs for next year
• Scheduled programs
• Proposed methods / levels of evaluation
• Potential payoffs
• Problem areas in the HR process
• Concerns from top management
449
Action Plan for Improvement

Develop a plan of implementation for improving measurement and evaluation in your organization. Consider all of the items included in this and other modules. Identify a particular time frame and key responsibilities.
450
Issue                          Actions   Time   Responsibility

1. Perception of HR
2. Needs assessment/analysis
3. Objectives
4. Reaction measures
5. Learning measures
6. Application measures
7. Impact measures
8. ROI measures
9. Use of technology
10. Communicating results
11. Management influence
12. Staff development
13. Roles / responsibilities
451
International Car Rental