The Value of Learning: How Organizations Capture Value and ROI and Translate It into Support, Improvement, and Funds
Wiley
THE AUTHORS
Patricia Pulliam Phillips is president and CEO of the ROI Institute, Inc., the leading source of ROI competency building, implementation support, networking, and research. An expert in measurement and evaluation, she provides support to organizations around the world that want to prove the value of their programs.
Jack J. Phillips is a world-renowned expert on accountability, measurement, and evaluation. He provides consulting services for Fortune 500 companies and major global organizations.
CONTENTS

List of Exhibits, Figures, and Tables xxi
Preface xxvii
Acknowledgments xxxi
Chapter One: Building a Comprehensive Evaluation Process 1
Challenges 2
Global Evaluation Trends • Measurement and Evaluation Challenges • Benefits of Measurement and Evaluation • The Myths of Measurement and Evaluation
Key Steps and Issues 13
Stakeholders • Levels and Steps • Chain of Impact • ROI Process Model • Objectives • Evaluation Planning • Data Collection • Analysis • Isolation of the Effects of Learning and Development • Conversion of Data to Monetary Values • The Cost of Programs • The Return on Investment Calculation • Intangible Benefits • Data Reporting • Operating Standards • Implementation Issues
Final Thoughts 31
Chapter Two: Defining Needs and Objectives: Ensuring Business Alignment 32
The Challenge 32
Business Alignment Issues • Begin with the End in Mind • Required Discipline • The Needs Analysis Dilemma
Payoff Needs 36
Key Questions • Obvious vs. Not So Obvious • The Reasons for New Programs • Determining Costs of the Problem • The Value of Opportunity • To Forecast or Not to Forecast
Business Needs 44
Determining the Opportunity • Defining the Business Measure—Hard Data • Defining the Business Need—Soft Data • Using Tangible vs. Intangible—A Better Approach • Finding Sources of Impact Data • Identifying All the Measures • Exploring “What If...?”
Job Performance Needs 53
Analysis Techniques • Taking a Sensible Approach
Learning Needs 58
Subject-Matter Experts • Job and Task Analysis • Observations • Demonstrations • Tests • Management Assessment
Preference Needs 61
Key Issues • Impact Studies
Levels of Objectives for Programs 65
Reaction and Planned Action • Learning Objectives • Application and Implementation Objectives • Business Impact Objectives • ROI Objectives • The Importance of Specific Objectives
Final Thoughts 73
Chapter Three: Measuring Inputs and Indicators 74
Measuring Input and Indicators 75
Defines the Input • Reflects Commitment • Facilitates Benchmarking • Explains Coverage • Highlights Efficiencies • Provides Cost Data
Tracking Participants 78
Tracking Hours 80
Tracking Coverage by Jobs and Functional Areas 81
Tracking Topics and Programs 82
Tracking Requests 84
Tracking Delivery 85
Tracking Costs 86
Pressure to Disclose All Costs • The Danger of Costs Without Benefits • Sources of Costs • Learning Program Steps and Costs • Prorated Versus Direct Costs • Employee Benefits Factor • Major Cost Categories • Cost Reporting
Tracking Efficiencies 94
Tracking Outsourcing 95
Tracking for the Scorecard 96
Defining Key Issues 97
Input Is Not Results • Reports to Executives Should Be Minimized • The Data Represent Operational Concerns • These Data Must Be Automated
Final Thoughts 98
Chapter Four: Measuring Reaction and Planned Action 100
Why Measure Reaction and Planned Action? 101
Customer Service • Early Feedback Is Essential • Making Adjustments and Changes • Predictive Capability • For Some, This Is the Most Important Data • Comparing Data with Other Programs • Creating a Macro Scorecard
Sources of Data 106
Participants • Participants’ Managers • Internal Customers • Facilitators • Sponsors/Senior Managers
Areas of Feedback 107
Content vs. Non-Content • The Deceptive Feedback Cycle • Key Areas for Feedback
Timing of Data Collection 114
Early, Detailed Feedback • Pre-Assessments • Collecting at Periodic Intervals • For Long Programs with Multiple Parts
Data Collection with Questionnaires and Surveys 115
Questionnaire/Survey Design • Intensities • Questionnaire/Survey Response Rates • Sample Surveys
Data Collection with Interviews and Focus Groups 123
Improving Reaction Evaluation 123
Keep Responses Anonymous • Have a Neutral Person Collect the Forms • Provide a Copy in Advance • Explain the Purpose of the Feedback and How It Will Be Used • Explore an Ongoing Evaluation • Consider Quantifying Course Ratings • Collect Information Related to Improvement • Allow Ample Time for Completing the Form • Delayed Evaluation • Ask for Honest Feedback
Using Data 127
Building the Macro-Level Scorecard
Shortcut Ways to Measure Reaction and Planned Action 129
Final Thoughts 130
Chapter Five: Measuring Learning and Confidence 132
Why Measure Learning and Confidence? 132
The Importance of Intellectual Capital • The Learning Organization • The Learning Transfer Problem • The Compliance Issue • The Use and Development of Competencies • The Role of Learning in Programs • The Chain of Impact • Certification • Consequences of an Unprepared Workforce
The Challenges and Benefits of Measuring Learning 137
The Challenges • The Benefits
Measurement Issues 140
Objectives • Typical Measures • Timing • Cognitive Levels of Bloom’s Taxonomy
Data Collection Methods 144
Questionnaires/Surveys • Objective Tests • Criterion-Referenced Tests • Performance Tests • Technology and Task Simulations • Case Studies • Role Playing/Skill Practice • Assessment Center Method • Exercises/Activities • Informal Assessments
Administrative Issues 157
Reliability and Validity • Consistency • Monitoring • Pilot Testing • Readability • Scoring • Reporting • Confronting Test Failures
Using Learning Data 161
Final Thoughts 162
Chapter Six: Measuring Application and Implementation 163
Why Measure Application and Implementation? 163
The Value of Information • A Key Transition Time • The Key Focus of Many Programs • The Chain of Impact • Barriers and Enablers • Reward Those Who Are Most Effective
Challenges of Measuring Application and Implementation 166
Linking Application with Learning • Designing Data Collection into Programs • Applying Serious Effort to Level 3 Evaluation • Including Level 3 in the Needs Assessment • Developing ROI with Application Data
Key Issues 168
Methods • Objectives • Topics to Explore • Sources • Timing • Responsibilities
The Use of Questionnaires 172
Progress with Objectives • Relevance/Importance of the Program • Knowledge/Skill Use • Changes with Work/Action Items • Improvements/Accomplishments • Define the Measure • Provide the Change • Monetary Value • Total Impact • List of Other Factors • Improvements Linked with the Program • Perceived Value • Links with Output Measures • Success of the Program Team • Barriers and Enablers • Management Support • Appropriateness of Program and Suggestions for Improvement • Checklist • Improving Response Rates
Data Collection with Interviews 189
Types of Interviews • Interview Guidelines
Data Collection with Focus Groups 191
Applications for Focus Group Evaluation • Guidelines
On-the-Job Observation 193
Guidelines for Effective Observation
The Use of Action Plans and Follow-Up Assignments 196
Developing the Action Plan • Successful Use of Action Plans • Action Plan Advantages and Disadvantages
The Use of Performance Contracts 203
Transfer of Learning 204
Developing ROI for Level 3 206
Data Use 209
Final Thoughts 211
Chapter Seven: Measuring and Isolating the Impact of Programs 212
Why Measure Business Impact? 213
Higher-Level Data • Breaking the Chain of Impact • A Business Driver for Many Programs • “Show Me the Money” Data • Easy to Measure • Common Data Types
Types of Impact Measures 215
Hard Versus Soft Data • Tangible Versus Intangible • Scorecards • Specific Measures Linked to Programs
Business Performance Monitoring 219
Identify Appropriate Measures • Convert Current Measures to Usable Ones • Developing New Measures
The Use of Action Plans to Develop Business Impact Data 221
Set Goals and Targets • Define the Unit of Measure • Place a Monetary Value on Each Improvement • Implement the Action Plan • Provide Specific Improvements • Isolate the Effects of the Program • Provide a Confidence Level for Estimates • Collect Action Plans at Specified Time Intervals • Summarize the Data and Calculate the ROI • Advantages of Action Plans
Use of Performance Contracts to Measure Business Impact 227
The Use of Questionnaires to Collect Business Impact Data 228
When You Don’t Have a Clue • When the Measure Is a Defined Set • When the Measure Is Known • Response Rates
Selecting the Appropriate Data Collection Method for Each Level 235
Isolating the Effects of the Program 237
Identifying Other Factors: A First Step • Using Control Groups • Using Trend-Line Analysis • Forecasting • Using Estimates • Calculating the Impact of Other Factors
Use of the Techniques 255
Final Thoughts 255
Chapter Eight: Identifying Benefits and Costs, and Calculating ROI 257
Why Calculate Monetary Benefits? 258
Value Equals Money • Impact Is More Understandable • Money Is Necessary for ROI • Monetary Value Is Needed to Understand Problems • Key Steps to Convert Data to Money
Standard Monetary Values 262
Converting Output Data to Money • Calculating the Cost of Quality • Converting Employee Time Using Compensation • Finding Standard Values
Data Conversion When Standard Values Are Not Available 271
Using Historical Costs from Records • Using Input from Experts to Convert Soft Data • Using Values from External Databases • Linking with Other Measures • Using Estimates from Participants • Using Estimates from the Management Team • Using Staff Estimates
Technique Selection and Finalizing the Values 282
Use the Technique Appropriate for the Type of Data • Move from Most Accurate to Least Accurate • Consider the Resources • When Estimates Are Sought, Use the Source with the Broadest Perspective on the Issue • Use Multiple Techniques When Feasible • Apply the Credibility Test • Review the Client’s Needs • Is This Another Project? • Consider a Potential Management Adjustment • Consider the Short-Term/Long-Term Issue • Consider an Adjustment for the Time Value of Money
Why Monitor Costs? 288
Why Measure ROI? 289
Fundamental Cost Issues 290
Monitor Costs, Even If They Are Not Needed • Cost Will Not Be Precise • Disclose All Costs • Fully Loaded Costs • Reporting Costs Without Benefits
Cost-Tracking Issues 293
Prorated Versus Direct Costs • Employee Benefits Factor
Major Cost Categories 294
Initial Analysis and Assessment • Development of Solutions • Acquisition Costs • Application and Implementation Costs • Maintenance and Monitoring • Support and Overhead • Evaluation and Reporting
Cost Accumulation and Estimation 296
Basic ROI Issues 296
Definition • Annualized Values: A Fundamental Concept
BCR/ROI Calculations 297
Benefit/Cost Ratio • ROI Formula • ROI Targets • ROI Is Not for Every Program (the two formulas are summarized after this chapter’s entries)
Other ROI Measures 303
Payback Period • Discounted Cash Flow
Final Thoughts 304
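For quick reference, the two calculations at the heart of Chapter Eight follow the standard benefit/cost and ROI formulas of the ROI Methodology; the worked numbers below are illustrative only and are not drawn from the book:

\[
\mathrm{BCR} = \frac{\text{Program Benefits}}{\text{Program Costs}}
\qquad
\mathrm{ROI}\ (\%) = \frac{\text{Program Benefits} - \text{Program Costs}}{\text{Program Costs}} \times 100
\]

For example, a program producing $750,000 in monetary benefits against $425,000 in fully loaded costs yields a BCR of 1.76 and an ROI of about 76 percent. The payback period listed under “Other ROI Measures” is the complementary view: total investment divided by annual savings gives the time needed to recover the cost.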
Chapter Nine: Measuring the Hard to Measure and the Hard to Value: Intangible Benefits 307
Why Intangibles Are Important 308
Intangibles Are the Invisible Advantage • We Are Entering the Intangible Economy • More Intangibles Are Converted to Tangibles • Intangibles Drive Programs
Measurement and Analysis of Intangibles 310
Measuring the Intangibles • Converting to Money • Identifying Intangibles • Analyzing Intangibles
Customer Service 316
Team Effectiveness 319
Cooperation/Conflict • Decisiveness/Decision Making • Communication
Innovation and Creativity 320
Innovation • Creativity
Employee Attitudes 324
Employee Satisfaction • Organizational Commitment • Employee Engagement
Employee Capability 326
Experience • Knowledge • Learning • Competencies • Educational Level • Attention
Leadership 333
360-Degree Feedback • Leadership Inventories • Leadership Perception
Job Creation and Acquisition 335
Productivity Versus Job Growth • Importance of Job Creation and Growth • Recruitment Sourcing and Effectiveness • Recruitment Efficiency
Stress 339
Networking 343
Final Thoughts 345
Chapter Ten: Reporting Results 348
Why the Concern About Communicating Results? 348
Communication Is Necessary to Make Improvements • Communication Is Necessary to Explain Contributions • Communication Is a Politically Sensitive Issue • Different Audiences Need Different Information
Principles of Communicating Results 350
Communication Must Be Timely • Communication Should Be Targeted to Specific Audiences • Media Should Be Carefully Selected • Communication Should Be Unbiased and Modest • Communication Must Be Consistent • Testimonials Are More Effective Coming from Respected Individuals • The Audience’s Opinion of the Program Will Influence the Communication Strategy
The Process for Communicating Results 352
The Need for Communication 354
Planning the Communications 355
The Audience for Communications 356
Basis for Selecting the Audience
Information Development: The Impact Study 359
Communication Media Selection 361
Meetings • Interim and Progress Reports • Routine Communication Tools • E-Mail and Electronic Media • Program Brochures and Pamphlets • Case Studies
Presenting Information 364
Routine Feedback on Program Progress • The Presentation of Results to Senior Management • Streamlining the Communication • Building Scorecards
Reactions to Communication 372
Using Evaluation Data 372
Final Thoughts 373
Chapter Eleven: Implementing and Sustaining a Comprehensive Evaluation System 375
Why the Concern About Implementing and Sustaining Evaluation? 375
Resistance Is Always Present • Implementation Is Key • Consistency Is Needed • Efficiency Is Necessary
Implementing the Process: Overcoming Resistance 377
Assessing the Climate 378
Developing Roles and Responsibilities 379
Identifying a Champion • Developing the Evaluation Leader • Establishing a Task Force • Assigning Responsibilities
Establishing Goals and Plans 382
Setting Evaluation Targets • Developing a Timetable for Implementation
Revising or Developing Policies and Guidelines 383
Preparing the L&D Team 384
Involving the L&D Team • Using Measurement and Evaluation as a Learning Tool • Teaching the L&D Team
Initiating Evaluation Studies 385
Selecting the Initial Program • Developing the Planning Documents • Reporting Progress • Establishing Discussion Groups
Preparing the Sponsors and Management Team 387
Removing Obstacles 388
Dispelling Myths • Delivering Bad News
Monitoring Progress 390
Final Thoughts 390
Appendix: How Results-Based Are Your Workplace Learning and Performance Programs? An Assessment for the L&D Staff 391
Glossary 403
Index 407
About the Authors 423
PRAISE

—Michael J. Offerman, president, Capella University
"Understanding the value of learning is critical for all business professionals. This book provides specific tools and techniques for evaluating learning effectiveness. A must read for anyone interested in the value of learning."
—Tamar Elkeles, Ph.D., vice president, Learning and Development, QUALCOMM