Human Resources Program-Evaluation Handbook

Jack E. Edwards
U.S. General Accounting Office

John C. Scott
Applied Psychological Techniques, Inc.

Nambury S. Raju
Illinois Institute of Technology

The opinions expressed in this book are those of the authors and do not necessarily reflect the views of the U.S. General Accounting Office or the Federal government.

SAGE Publications
International Educational and Professional Publisher
Thousand Oaks • London • New Delhi
Contents
Preface xxiii
PART I: HUMAN RESOURCES PROGRAM EVALUATION 1
1. Overview of Program Evaluation 3
Dale S. Rose, E. Jane Davidson
Program Evaluation in Human Resources 4
Evaluation Myths 5
Myth 1: It Is Impossible to Measure . . . 5
Myth 2: There Are Too Many Variables to Do a Good Study 6
Myth 3: No One Is Asking for Evaluation, So Why Bother? 6
Myth 4: Negative Results Will Hurt My Program 7
Key Distinctions 7
Process Versus Outcome Evaluation 8
Program Improvement Versus Program Selection 9
Program Evaluation Versus Utility Analysis 9
Who Does Program Evaluation? 10
Choosing Criteria for Success 13
Practical Design Considerations 14
Standards of Proof 15
Designing an Adequate Evaluation 15
Measurement Issues 16
Reliability and Validity 16
Quantitative and Qualitative Data 17
Costs and Benefits 18
Identifying Human Resource Needs 18
Considering Cost-Benefit Trade-Offs 19
Concluding Comments on Costs and Benefits 20
Utilization 21
Evaluation Readiness 22
Communicating Results 22
Applying Findings 23
Conclusion 24
2. Job "Analysis—The Basis for DevelopingCriteria for All Human Resources Programs 27Peter Y. Chen, Jeanne M. Carsten, Autumn D. Krauss
Uses of a Proactive Job Analysis Program 28
Assessing the Need for a Job Analysis Program and Preparing for It 29
Conducting a Job Analysis Program 30
Competence of Job Analysts 31
Sources and Number of SMEs 32
Methods of Collecting Information 32
Questionnaires 33
O*NET Database 34
Steps to Collect Job Information 35
Task-Generation Interviews and Survey 36
KSAO-Identification Interviews and Survey 39
Applications for Job Analysis Results 42
Application to Personnel Selection 42
Application to Training 42
Application to Performance Evaluation and Competency Modeling 43
Application to Employee Physical and Psychological Well-Being 44
Other Applications 44
Conclusion 45
3. Criteria for Human Resources Program Evaluation 49
Stephen David Steinhaus, L. A. Witt
Common Approaches and Pitfalls 50
One Measure to Serve All Masters 50
Getting Past the Obvious 51
Ramifications of Selecting Poor Criteria 52
Characteristics of Good Criteria 53
Reliability, Validity, and Other Measurement Factors 54
Reliability and Validity 54
Measures Based on Clearly Observable Events 55
Measurable 55
Freedom From Bias 56
Relevance 56
Meaningfulness to Stakeholders 57
Focus on Value as Opposed to Return on Investment as a Proxy Measure 57
Actionable Results 58
Practicality 59
Practicality and Costs 59
Realistic and Credible Goals 59
Organization Politics 60
Practical Steps in Criterion Development and Implementation 60
Involve a Broad Project Team 62
Clarify Program Goals and Expected Impacts 62
Review All Available Data 63
Involve Stakeholders Other Than the HR Program Evaluation Team 63
Develop Data Collection Strategy and Tools 64
Implement Data Collection 64
Analyze Criterion Measurements 65
Communicate Results 66
Final Comments 67
PART II: STAFFING 69
4. Recruitment 71
Michael M. Harris, Elliot D. Lasson
Understanding the Recruitment Process 72
Recruitment Sources 73
Traditional Sources 75
Employee Referrals 75
Print Ad 75
Search Firms 76
College Campus Recruitment 76
Radio Ads 77
Internet-Based Approaches 77
Job Boards 77
E-Recruiting 77
Relationship Recruiting 78
Evaluating the Recruitment Function 78
Using Recruitment Outcomes for Evaluation 79
Assessing Costs 80
Advantages of Estimating Costs 80
Potential Concerns in Estimating Costs 81
Using Applicant Predictors and Criteria for Evaluation 81
Assessing Predictor and Criterion Results 81
Advantages of Evaluating Predictors and Criteria 82
Potential Concerns When Evaluating Predictors and Criteria 82
Using Applicant Perceptions of the Recruitment Process for Evaluation 83
Assessing Applicant Reactions 83
Advantages of Assessing Applicant Perceptions 84
Potential Concerns in Assessing Applicant Perceptions 84
Using Organizational Reputation for Evaluation 84
Assessing Organizational Reputation 85
Advantages of Assessing Organizational Reputation 86
Potential Concerns When Assessing Organizational Reputation 86
Conclusions 86
5. Setting Standards 89
Andrew J. Falcone, Nambury S. Raju
Setting Standards for Program Evaluation 90
Criterion-Referenced Versus Norm-Referenced Approaches 91
Nonmeasurement Aspects of Standard Setting 93
Evaluation of the Standard-Setting Programs 94
Subject Matter Expert (SME) Selection 94
Job Experience and Competency 94
Geographic Representation 95
Number of SMEs 96
Job Analysis 97
Examination Specifications 97
Standard-Setting Procedures 98
Angoff Method 99
Nedelsky Method 102
Bookmark Method 104
Summary 106
6. Evaluating Personnel Selection Systems 109
Scott B. Morris, Russell Lobsenz
Program Evaluation Process 110
Forming the Evaluation Team 110
Evaluation Criteria 111
Reliability 112
Types of Reliability Estimates 113
Test-Retest Method 113
Alternate Forms Method 113
Internal Consistency Methods 114
Interrater Reliability 114
Generalizability Theory 114
Interpreting Reliability 115
Reliability Coefficient 115
Standard Error of Measurement (SEM) 115
Effect of Study Design 115
Validity 116
Criterion-Related Validity 117
Content-Oriented Validity 118
Construct Validity 119
Validity Generalization (VG) 120
Selection Decisions 121
Cutoff Scores, Ranking, and Banding 122
Combining Scores From Multiple Employment Tests 122
Test Administration Practices 122
Fairness, Bias, and Discrimination 123
Bias 123
Item Bias 124
Test Bias 124
Illegal Discrimination 124
Perceived Fairness 126
Utility Analysis 126
Conclusion 127
7. Selecting Managers and Executives: The Challenge of Measuring Success 130
Rob Silzer, Seymour Adler
Selection Context 132
Context of Management and Executive Roles 132
Role Complexity and Change 132
Management Versus Executive Positions 132
Impact of the Individual 133
Selection Considerations 133
Multiple Stakeholders 133
Sequential Selections and Candidates 134
Levels of Fit 134
Evaluating Selection Design 135
Evaluating Target Competencies 135
Evaluating Assessment Tools 136
Evaluating Selection Administration 140
Design of Selection Administration 140
Records and Documents 141
Implementation of the Selection Process 141
Evaluating Selection Decisions 143
Data Interpretation 143
Behavioral Indicators 144
Actual Behavior 144
Data Integration 145
Evaluating Selection Outcomes 148
Conclusions 150
PART III: EVALUATING AND REWARDING EMPLOYEES 153
8. Performance Appraisal and Feedback Programs 155
Janet L. Barnes-Farrell, Angela M. Lynch
Goals of Appraisal and Feedback Systems 156
Organizational Perspectives and Goals 156
Appraisers' Perspectives and Goals 158
Workers' Perspectives and Goals 160
Functions of Performance Appraisal 161
Evaluating Performance Appraisal Measurement Functions 162
What Should Be Measured? 162
Who Should Measure? 162
How to Measure? 165
Evaluating the Communication Function of Performance Appraisal 169
Corporate Communication Function 169
Individual Performance Expectations and Feedback 171
Role and Preparation of the Appraiser 171
Timing and Frequency of the Performance Appraisal Communication 171
Communicating What Is Expected 172
Communicating How the Individual Performed 172
Summary and Conclusions 175
9. The Evaluation of 360-Degree Feedback Programs 177
John C. Scott, Manuel London
An Overview of 360-Degree Feedback 178
Administering the Program and Using the Resulting Information 178
Frequency and Method of Delivery 179
Underlying Assumptions About the Benefits of 360-Degree Feedback 179
Criteria for Evaluating 360-Degree Feedback Systems 180
Survey Design 181
Process Components 181
Survey Results 182
Interrater Agreement and Self-Other Discrepancy Scores 182
Relationships Between 360-Degree Feedback and Other Performance Measures 183
Isolating the Unique Contribution of 360-Degree Feedback as Part of a Comprehensive Development Program 183
Methods for Evaluating the Quality of the 360-Degree Program 184
Reviewing Archival Records 184
Stakeholder Assessments 185
Benchmarking Analyses 186
Evaluators of the Survey Program 186
Organizational Leaders 187
Internal HR Staff 187
External 360-Degree Feedback Assessment Experts 188
Evaluating the Quality and Long-Term Effects of 360-Degree Feedback 188
Attitudes About the Process 188
Awareness of Performance Dimensions and Performance Management 189
Creating a Feedback Culture 189
Tracking Change in 360-Degree Feedback Ratings 190
Examining Summary Data and Tracking Change Across the Organization 190
Assessing Sensitivity to Others' Ratings 191
Longitudinal Study 191
Recommendations and Conclusion 193
10. Compensation Analysis 200
Mary Dunn Baker
Who Should Be Involved in the Preparation of Compensation Analyses? 201
Pay Elements Included in a Compensation Study 202
Methods of Analyzing Compensation 203
Simple Pay Equity Analyses 204
Organizationwide "Raw" Average (Median) Salary Comparisons 204
What Factors Influence Pay? 205
Fair Labor Standards Act (FLSA) Average Pay Comparison 207
Average Pay Comparisons by Grade 207
Job Title Cohort Analysis 208
Criticisms of Simple Pay Analyses 208
Applying Inferential Statistical Tests to Simple Pay Models 210
Complex Pay Equity Techniques—Multiple Regression Analysis 211
Explanatory Factors 212
How Are Regression Analyses Structured? 213
Dangers of Using an Overall Regression Method to Assess Pay Equity 215
Consider Practical as Well as Statistical Significance 218
How Well Does the Regression Model Fit the Data? 219
Tainted Variables 220
Common Root Causes of Compensation Disparities 220
Artificial Pay Differences 220
Employment Policies and Practices 221
Summary 222
PART IV: EMPLOYEE EFFECTIVENESS 223
11. Conducting Training Evaluation 225
Miguel A. Quiñones, Scott Tonidandel
Overview of Training Evaluation 226
A Five-Step Model of Training Evaluation 228
Step 1: Identify Training Objectives 228
What Are Training Objectives and Why Do We Need Them? 228
Three Components of Training Objectives 228
Writing Training Objectives 229
Step 2: Develop Evaluation Criteria 230
Importance of the Criteria 230
Kirkpatrick's Levels 231
Additional Evaluation Criteria 232
Matching Criteria to Training Objectives 233
Step 3: Select an Evaluation Design 233
Classical Experimental Designs 233
Alternative Designs 235
Selecting an Optimum Design 237
Step 4: Assess Change Due to Training 238
An Illustrative Example 238
Choosing an Analytic Strategy 239
Step 5: Perform a Utility Analysis 239
Calculating Training Program Costs 240
Calculating Program Benefits 240
Calculating the Utility of a Training Program 241
Summary and Conclusions 241
12. Succession Management 244
Michael M. Harris, Manuel London, William C. Byham, Marilyn Buckner
What Is Succession Management? 245
Methods for Evaluating Competencies 246
Multisource (360-Degree) Feedback Surveys 247
Acceleration Centers℠ 248
Providing Feedback to Pool Members 249
Determining Appropriate Developmental Activities 250
Role of the CEO 250
Line Manager Involvement 251
Identifying the Organizational Level to Be the Target of the Succession Management Process and the Current and Future Requirements 251
Selection Decisions 252
Additional Considerations 252
Evaluating Succession Management 253
A Case Example 253
Conclusion 260
13. A Practical Guide to Evaluating Coaching: Translating State-of-the-Art Techniques to the Real World 262
David B. Peterson, Kurt Kraiger
Research on Coaching 263
Challenges and Issues in Evaluating Coaching 265
Purpose 265
Design 265
Return on Investment and the Impact of Coaching 267
A Practical Guide to Evaluating Coaching 269
Step 1: Lay the Foundation 269
Step 2: Design the Process 273
Recommendations 275
Step 3: Implement the Process 277
Step 4: Analyze the Data 278
Step 5: Present the Findings 279
Final Comments 280
PART V: TEAM AND ORGANIZATIONAL EFFECTIVENESS 283
14. Team Performance 285
Wendy S. Becker, John E. Mathieu
Framework for the Chapter 286
Team Designs Are NOT Panaceas 286
Performance Evaluation as a General Process 287
Measurement Framework for Understanding Team Performance 288
Getting Started: How to Develop Team Performance Measures 290
Five-Step Process for Developing Team Performance Measures 290
Step 1: Review Existing Organizational Measures 291
Step 2: Define Team Measurement Factors 291
Step 3: Identify and Weight Team Member Activities 291
Step 4: Develop Team Performance Measures and Standards 292
Step 5: Create a Feedback System 293
Sources of Measurement in Teams 295
The Future of Team Performance Evaluation 297
Conclusions 298
15. The Evaluation of Job Redesign Processes 301
Steven F. Cronshaw, Sidney A. Fine
Five Principles of Job Redesign Evaluation 305
Principle 1: Job Redesign and Its Evaluation Must Be Understood From a Systems Perspective 306
The Work Organization as a Systems Component 306
The Worker as a Systems Component 306
The Work as a Systems Component 308
Principle 2: The Worker Is the Most Significant Factor in Effective Job Redesign 308
Principle 3: Job Redesign and Its Evaluation Are Continuous Processes 309
Principle 4: A Realistic and Practical Understanding of the Work System Is Needed to Effectively Use Evaluation Results 310
Principle 5: Conditions Before and During the Job Redesign Must Be Considered in Evaluation 311
Worker Criteria for the Evaluation of Job Redesign Programs 311
Adequate Discretion in Decision Making 312
Opportunity to Learn on the Job and Keep on Learning 312
Job Variety 313
Mutual Support and Respect 314
Experienced Meaningfulness of the Work 314
A Desirable Future for the Worker 315
Management Criteria for the Evaluation of Job Redesign Programs 316
Reduction of Bottlenecks and Production Problems 317
Improvement of Work Team Functioning 317
Compliance With Government Laws and Regulations 318
The Summative Evaluation of Job Redesign 319
Bringing Together Worker and Management Criteria in Successful Job Redesign 319
Conclusions 320
16. Organization Development 322
Allan H. Church
Overview of Organization Development 324
A Process for Evaluating OD Interventions 327
Scoping 328
Purpose of the Evaluation 328
The Role of Evaluator and Key Stakeholders 329
Timing of the Evaluation 330
Designing 330
Determining the Level of Impact to Evaluate 331
Identifying the Evaluation Methods 332
Deciding on Data Source and Level of Detail 332
Collecting and Analyzing Data 333
Working With International Populations 334
Ensuring Collection of the Right Amount of Data 334
Communicating 335
Telling a Compelling Story 000
Maintaining Balance and Integrity 336
Understanding Reactions to Feedback 337
Several Case Examples 337
Case 1: Formative Evaluation Feedback Saves the Day 337
Case 2: A Case of Poor Scoping 338
Case 3: Showing That Survey/Action Planning Really Works 339
Conclusion 340
17. Evaluating Diversity Programs 343
Paul Rosenfeld, Dan Landis, David Dalsky
Evaluating Diversity Programs: Barriers and Benefits 344
Barriers: Reasons Diversity Programs Might Not Be Evaluated 345
Superficial Commitment to Diversity 345
Ignorance Is Bliss: Fear of What Might Be Learned 345
Impact, Cost, and Time Involved in Evaluation 345
Benefits: Reasons Diversity Programs Should Be Evaluated 346
Determines Impact, Detects Deficiencies, and Identifies Areas for Improvement 346
Signals Commitment 346
Fends Off the Critics 347
Evaluating Diversity Programs: A Six-Step Plan 347
Step 1. Form the Diversity Evaluation Team 347
Internal Versus External Evaluators 348
Step 2. Develop the Evaluation Plan and Measures of Success 349
Developing the Evaluation Plan 349
Identifying Measures of Success 350
Step 3. Obtain Commitment From Organizational Leaders 350
Step 4. Gather Data 351
Policy and Procedure Documents 351
Demographic Breakouts Showing Trends Over Time 352
Survey Findings 353
Individual and Focus Group Interviews 355
Naturalistic Observations 356
Best Practices From Organizations That Are Recognized as Leaders in Diversity 357
Step 5. Analyze Evaluation Data 358
Step 6. Prepare an Evaluation Report With Action Plan 359
Develop Presentation and Evaluation Report 359
Develop and Implement Action Plan 360
Summary and Conclusions 361
PART VI: ORGANIZATIONAL COMMUNICATIONS 363
18. Evaluating Organizational Survey Programs 365
Jack E. Edwards, Bruce M. Fisher
Methods for Gathering Evaluation Data 366
Reviewing Archival Records 366
Interviewing Stakeholders 367
Evaluators of the Survey Program 367
Internal Survey Staff 367
Consortia 368
External Experts 368
Organization Leaders 369
Rank and File Employees 369
Criteria for Judging Survey Program Quality 369
Qualifications of the Survey Staff 369
Questionnaire Quality 371
Bad Items 371
Inadvertent Mistakes 372
Respondent Inquiries and Concerns 372
International Concerns 373
Generalizability of Survey Findings 373
Response Rates 374
Precision of Findings 375
Data Analysis and Presentation of Findings 376
Analyses and Statistics 376
Presenting Findings 377
Benchmarking and Best Practices 378
Decisions and Changes Linked to Survey Findings 379
Timeliness and Cost 379
Rapidity With Which a Survey Can Be Conducted 380
Cost 381
Summary and Conclusions 383
19. A Practical Guide to Evaluating Computer-Enabled Communications 387
J. Philip Craiger, Virginia Collins, Alex Nicoll
Dimensions of Communication Technologies 388
Dimensions 388
Groupware 390
Evaluating Corporate Needs 390
Strategies for Selecting Among a Set of Alternatives 391
Compensatory Model 391
Noncompensatory Model 392
Application of the Models 393
Results of the Compensatory Model 393
Results of the Noncompensatory Model 393
Which Model to Use? 394
Prevalent Communication Technologies 394
Videoconferencing 395
Discussion Groups 396
Technology-Based Training 398
Instant Messaging 398
Electronic Mail (E-mail) 401
Corporate Web Sites 401
Computer-Enabled Communication: Impact and Policies 402
Evaluating Corporate Communications Policies 402
Acceptable Use Policies 403
Netiquette 403
Policies Regarding the Monitoring of Communications 404
Conclusion 405
20. Customer Service Programs 407
L. A. Witt, Paulette Henry, Margareta Emberger
The Role of Human Resources in Customer Service 408
Identifying Stakeholders (Who) 410
Evaluators of the Customer Service Program 410
Staff Departments 411
Line Departments 411
External Groups 411
Working With the Stakeholders 411
Selecting the Evaluation Criteria (What) 412
Internal Customer Measures 413
Performance Management and Performance Discipline Data 413
Employee Attitudes 415
Medical Incidents 418
External Customer Measures 418
Wallet Expansion 419
Customer Retention 419
Referrals 420
Requests for Rework and Complaints 420
Customer Attitudes 420
Linking HR Programs With Customer Service Outcomes (Why) 422
Summary 422
PART VII: HEALTH AND WORK/LIFE BALANCE 427
21. Health and Safety Training Programs 429
Michael J. Burke, Jill Bradley, Harold N. Bowers
A Systems Approach to Health and Safety Training 430
Assessing Training Needs and the Regulatory Nature of Health and Safety Training 430
Developing Instructional Objectives 432
Selecting and Designing Training Course Content and Delivering Training 433
Enhanced Work Planning and Continuous Improvement Through Training Program Evaluation 433
Measures of Health and Safety Training Program Effectiveness 434
Guidelines for Assessing On-the-Job Behavior (STEP-3) Associated With Health and Safety Training 437
Planning an Evaluation of On-the-Job Behavior 439
Developing and Administering New Training Program Evaluation Forms 440
Analyzing Data, Following Up With Participants, and Reporting Results 443
Issues Concerning the Transfer of Health and Safety Training 443
Conclusion 444
22. Work/Life Balance Policies and Programs 447
E. Jeffrey Hill, Sara P. Weiner
Why Evaluate Work/Life Policies and Programs? 447
Historical Overview 448
Work/Family Focus on Child Care (1970s-1980s) 448
Broad Work/Life Focus (1980s-1990s) 449
Work/Life Business Imperative (Late 1990s to the Present) 449
Evaluating Work/Life Policies and Programs 450
Step 1: Identify Objectives 451
Programmatic Objectives 451
Organizational Objectives 451
Individual Objectives 452
Step 2: Determine Methods 452
Quantitative Methods: Human Resources Databases 453
Quantitative Methods: Surveys 453
Qualitative Methods 454
Step 3: Gather and Analyze the Data 454
P1. Offer the Best (or Superior, or Competitive, or Average) Work/Life Programs in the Industry 454
P2. Promote Work/Life Awareness 455
P3. Ensure Work/Life Usage 455
P4. Improve Continuously 455
O1. Improve Recruiting 455
O2. Increase Retention of the Best Talent 456
O3. Motivate Employees to Contribute Their Best 457
O4. Raise Productivity 457
O5. Move to a Results-Based Culture 458
I1. Fulfillment on and off the Job 458
I2. Manageable Workload 458
I3. Balance Work and Personal Life 459
I4. Seek Synergy 459
Step 4: Link Analysis to Bottom Line Measures 460
ROI 460
Break-Even Analysis 460
Step 5: Make Recommendations Based on the Work/Life Evaluation 460
Summary 464
PART VIII: ISSUES SPANNING HUMAN RESOURCES PROGRAMS 469
23. Evaluation of Human Resource Information Systems 471
Jeffrey M. Stanton, Timothy V. Nolan, John R. Dale
Brief Historical Overview of HRISs 474
Primary Research Strategies for Evaluating an HRIS 476
Assessors Who Can Conduct HRIS Evaluations 477
Internal Groups 477
External HRIS Experts 478
Consortium Participation 479
Criteria for Judging HRIS Quality 479
Financial Criteria 481
Human Infrastructure 484
Technical Quality of System Functioning 485
Reactions 486
Value-Added Functions 487
Benchmarking and Best Practices Criteria 488
Integrating Criteria and Reporting Evaluation Results 489
24. Global Human Resource Metrics 493
Helen De Cieri, John W. Boudreau
Talentship: A Decision Science for HR 494
A Strategic Approach to the Measurement of Global HR 495
A Model for Global HR Metrics 496
External Factors Influencing Global HR Metrics 502
Organizational Factors Influencing HR in MNEs 504
Linking Elements 506
Impact 506
Effectiveness 507
Efficiency 508
Outcomes: MNE Concerns and Goals 510
Summary and Conclusions 511
25. Strategic Planning for Human Resources 514
Edward J. Kelleher, F. Stephen Cobe
Key Strategic Planning Issues for HR 515
The Strategic Planning-HR Interface 515
Strategic Planning Orientations 516
Business Paramount 516
Corporate Command 517
Corporate Strategies 517
Strategic Networks 517
The Strategic Management Process 518
HR Roles in the Strategic Management Process 520
Role 1: Implementation of Corporate and Business Strategies and HR Program Development 520
Implementation Roles in HR 522
Testing the Implementation Models 523
Program Development Models 525
Role 2: HR Strategies 526
Corporate Development Stages Model 526
Role 3: HR Participation in Change Management 527
Role 4: HR Participation in Acquisitions and Mergers 529
Evaluation of HR Strategy 530
Self-Audit Questionnaire for HR Strategy 530
HR Strategy Benchmarking 531
Achievement of Expected Values of Performance 532
Conclusions 533
Glossary: Definitions of Technical and Statistical Terms Commonly Used in HR Program Evaluations 537
Chet Robie, Nambury S. Raju
Index 551
About the Editors 561
About the Contributors 563