Latest Resources

Optimizing Investments in Learning

by Peggy Parskey, Associate Director, Center for Talent Reporting & John Mattox, II, PhD, Managing Consultant, Metrics That Matter

You might wonder why L&D leaders need guidance for optimizing investments in learning. Recent research from The Learning Report 2020 (Mattox, J.R., II & Gray, D., Action-Learning Associates) indicates that L&D leaders struggle to convey the impact of L&D. When asked what they would do better or differently to communicate value to business leaders, 36% of L&D leaders indicated they would improve processes related to communication. More telling, 50% indicated they would improve measurement. To convey value, leaders need a story to tell. Without measurement, there is not much of a story.

This is where Talent Development Reporting principles (TDRp) are so relevant. It begins with a measurement framework that recommends gathering three types of data: efficiency, effectiveness, and outcomes. Efficiency data tells the story of what happened and at what cost. How many courses did L&D offer? How many people completed training? How many hours of training did learners consume? What costs did we incur? Effectiveness data provides feedback about the quality of the learning event (e.g., instructor, courseware, and content) as well as leading indicators of the success of the event (e.g., likelihood to apply, estimates of performance improvement, estimates of business outcomes, and estimates of ROI). Outcomes are the business metrics that learning influences, such as sales, revenue, customer satisfaction, and employee satisfaction. TDRp recommends gathering data for all three types of measures so L&D leaders can describe what happened and to what effect.

The table below (Table 11.1 in Learning Analytics) shows a variety of measures for each category.

Key Performance Measures for L&D

Efficiency Measures

  • Number of people trained
  • Number of people trained by learning methodology (instructor-led, e-learning, virtual)
  • Reach (percentage of people trained in the target population)
  • Cost of training per program
  • Decrease in costs
  • Cost of training per learner
  • Cost of training per hour

Effectiveness Measures

  • Satisfaction with training
  • Knowledge and skills gained due to training
  • Intent to apply learning on the job
  • Expectation that training will improve individual performance on the job
  • Expectation that individual performance improvement will lead to organizational performance improvement
  • Return on expectations

Outcome Measures

  • Increase in customer satisfaction
  • Increase in employee performance
  • Decrease in risk
  • Increase in sales
  • Increase in revenue

Source: Learning Analytics ©2020, John R. Mattox II, Peggy Parskey, and Cristina Hall. Published with permission, Kogan Page Ltd.

In addition to this guidance about what to measure, TDRp provides guidance on how to report results using methods familiar to business leaders. Using these measurement and reporting approaches, L&D leaders can tell a robust story that business leaders can connect with.

In April 2020, Learning Analytics, Second Edition (by John Mattox, Peggy Parskey, and Cristina Hall) was published. The book helps L&D leaders improve the way they measure the impact of talent development programs. The second edition includes a chapter on Optimizing Investments in Learning, where TDRp is discussed in detail. TDRp is featured heavily in the book because it helps L&D leaders connect their efforts to business outcomes by measuring the right things and reporting them in a way that business leaders understand.

In Chapter 11, the authors connect TDRp to the Portfolio Evaluation methodology. This approach implies that business leaders are interested in how learning and development programs impact four business areas: growth, operational efficiency, foundational skills, and risk. By aligning courses with one of these business areas and using TDRp, L&D leaders can demonstrate the effectiveness of the courses in preparing the workforce to improve each business area (portfolio).

The book also provides guidance on how to use TDRp to spur L&D leaders to act on the data they report. The book indicates L&D leaders need to shift reporting in three ways to make it more actionable (see graphic below). Reports should provide:

  • Analytics that improve business decisions
  • Insights that link to evolving business challenges
  • Information that connects talent data to business impacts

Using Data to Spur Action

Source: Learning Analytics ©2020. Reproduced with permission, Kogan Page, Ltd.

Another critical aspect of conveying a meaningful message that drives action is to tell a compelling story. Chapter 11 of the book includes three critical elements of a data-driven story:

  • Scene Setting—Connect all data and results back to the desired business outcomes and goals.
  • Plot Development—Emphasize logical and clear connections between learning results and outcomes; focus on the central message of the results, not peripheral findings; and note any areas where L&D is creating value for the organization.
  • Resolution—Clearly and succinctly outline the justification for key findings; suggest improvements, recommendations, and next steps. Continue the conversation on how to help L&D improve.

2020 CTR Annual Conference

LEARN MORE ABOUT THE USE OF DATA AND LEARNING ANALYTICS


Join us at the 2020 Virtual CTR Annual Conference, October 27, for the session, Everything You Wanted to Know about Learning Analytics but Were Afraid to Ask.

LEARN MORE >>

Why We Need Human Capital Metrics and Reporting in Company Disclosures

by Jeff Higgins
Founder and CEO, Human Capital Management Institute

Can we agree that the coronavirus pandemic is above all a human crisis requiring people-centric measurement and solutions?

The last great crisis in the U.S. was the financial crisis of 2008-2009. If the pandemic is a uniquely human crisis, then the most powerful driver of our economic recovery is what we do next with our human capital and people practices.

Given the importance of people and human capital practices in organizations, does it not make sense that such a critical resource and source of value creation be included in public disclosures?

Without standard human capital metrics, everyone is flying blind (risk planning, investors, policy makers, and corporate executives).

Human capital risk planning is a new focus for many companies. It’s a topic frequently mentioned but has seen little action beyond companies adding boilerplate legal risk language in public reporting.

Reliable, validated human capital reporting has long been a black hole for investors, including institutional investors representing employee pension funds. This may change as investors and policy makers press for insightful evidence of risk mitigation and future sustainability, aided by global standards like the ISO 30414 Human Capital Reporting Guidelines (December 2018) and the U.S. SEC's proposed rule on human capital disclosure (August 2019). Recently, the SEC has reaffirmed its desire for enhanced disclosure and more focus on workforce health and welfare.

“We recognize that producing forward-looking disclosure can be challenging and believe that taking on that challenge is appropriate,” explained the SEC in an April 2020 letter.

While critical for investors, human capital reporting is just as important for employees. Think about a person deciding where to work. Wouldn’t they want to know key human capital metrics to make their decision, such as the diversity of employees and leaders, the amount invested in developing employees, the percentage of employees and leaders who take advantage of training, the retention or turnover rate, and culture measures, such as leadership trust index or employee engagement score?

It is critical to note that organizations disclosing more about their human capital have historically performed better in the market.

EPIC research shows that reliable human capital data is often lacking, yet raw data is readily available, along with proposed ISO human capital reporting guidelines.

The enhanced usage of people data and analytics has been a positive business trend long before the COVID-19 crisis. But stakeholders, including executives and investors, have not always had information that was relevant, valuable, and useful for decision making.

Turning people data into decisions requires skill, effort, discipline, and money, all of which are in short supply in many HR departments. Priorities have often been elsewhere: building employee engagement and experience metrics, along with other “hot” technology-driven vendor offerings.

Even experts acknowledge that the value of human capital is not always easy to measure. Nevertheless, ISO guidelines do exist for internal and external human capital reporting. Until now, relatively little has been offered publicly by most U.S. companies.

“We may not always want to know what the data tells us,” said one member. As we have seen with the COVID-19 crisis, not wanting to know what the data tells us can be a recipe for disaster, and not one that leads to economic recovery.

The Need for Transparency Has Never Been Greater

Many of the critical metrics and practices needed most are not transparent to the very stakeholders (such as employees, investors, and government policy makers) who must start and sustain the economic recovery.

Moving forward, enhanced transparency around workforce management and human capital measures will be needed. Without such data, organizations will have a difficult time adapting to changing markets and workforce needs and building critical partnerships. Such data is vital to building trust within the workforce and attracting, engaging, and retaining talent.

Increasingly, as organizations collect more data about their workforce, employees will want to benefit from that data.

“We saw less concern about privacy when we saw the workforce data was shared and being used to protect the workforce,” said one member.

Investing In and Developing the Workforce Has Never Been More Important

The power of learning, development, and people investment has never been greater nor more directly tied to economic results. A greatly changed world requires a new level of learning investment, adoption, and innovation.

Will organizations see the need for learning investment and people development as a continuing need? Will they measure it to better manage it? Will they begin to see human capital measures as a means for learning about what is working and what is not? Will they begin to see human capital reporting as a leading indicator of recovery? 

2020 CTR Annual Conference

IN-DEPTH LOOK AT ISO'S L&D AND HR METRICS FOR HUMAN CAPITAL REPORTING

Join us at the 2020 Virtual CTR Annual Conference for this session, where industry experts share the definitions of the 23 ISO disclosure metrics and how to use them in your human capital reporting strategy.

LEARN MORE >>

Three Measures You Should Capture Now During COVID-19

I hope everyone is healthy and managing through these very trying times. And, I hope the same for your family and friends. I know this pandemic has not been easy.

All of us by now are well into our Covid-induced new world where we have shifted from instructor-led training (ILT) to virtual instructor-led training (vILT) and eLearning, supplemented by content available through our employee portals. On top of all of these changes, you have probably had to create new courses on safety and return-to-work procedures. Perhaps you have provided guidance to your leaders about how to manage remotely in stressful times. Some have even gone into cost reduction mode to help offset the loss of income.

While you’ve been really busy, I hope you are ready to ‘come up for air’ now. If so, here are a few measurements to think about. If you have not already implemented them, there is still time to collect the data before it gets lost.

Make Time to Collect and Analyze Data

First, be sure you are collecting data on your vILT and any new courses you have put online. You may be using a new platform like Zoom that is not set up to connect with your learning management system (LMS), so the number of participants may not be recorded and participants may not receive the typical follow-up survey. In this case, be sure to set up a system to capture the number of participants manually—perhaps by asking instructors to send in counts. And consider sending a survey to at least a sample of participants—even if you have to do it manually. If you cannot easily generate the email list of participants, send the survey to a sample of all employees and make it more general. For example, ask whether they have taken a vILT course, taken an online course, or downloaded content in the last three months. Then get feedback on each.

Feedback, especially for vILT, is important so you can compare it to ILT and understand what participants like better about it as well as what they dislike. Surprisingly, some organizations are reporting higher or equal scores for vILT. And these results are from vILT that was probably rushed into production (i.e., presentations from ILT that were repurposed for vILT). Imagine how much better vILT could be if it were actually designed for virtual delivery. This is the data you and your leaders will need to decide whether to permanently change your mix to less ILT in favor of more vILT when the pandemic is over.

Capture Your Efficiency Savings

Second, be sure to capture the efficiency savings that result from switching to vILT and online courses. Typically, vILT and eLearning will be less expensive because there is no instructor or participant travel, and the course may be shortened. To be fair, you should only count the savings where an alternative to ILT was provided. Of course, there will be savings from simply cancelling ILT with no replacement, but that is not really a savings. It is just a reduction in offerings. You can estimate the cost of ILT and the cost of vILT so you can calculate the difference. Don’t be afraid of estimating—it’s a common business practice. If you have too many offerings to make a calculation for each course, come up with the average ILT and vILT cost, find the difference, and multiply it by the number of participants.
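As a rough illustration, here is a minimal Python sketch of that aggregate calculation. The cost figures and participant count are hypothetical placeholders, not benchmarks; substitute your own estimates.

```python
# Minimal sketch of the aggregate savings estimate described above.
# All figures are hypothetical placeholders; replace with your own estimates.

avg_ilt_cost_per_participant = 1200.0   # est. instructor, travel, facilities, materials
avg_vilt_cost_per_participant = 400.0   # est. platform and instructor time
participants_switched = 500             # only participants offered a vILT alternative

savings_per_participant = avg_ilt_cost_per_participant - avg_vilt_cost_per_participant
total_savings = savings_per_participant * participants_switched

print(f"Estimated efficiency savings: ${total_savings:,.0f}")
# Estimated efficiency savings: $400,000
```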

Capturing the dollar savings from switching to vILT is important because there is real value there and it should factor into your decision about continuing with a mix of learning modalities after the pandemic ends. The savings will be especially large for organizations where participants or instructors incur travel and lodging costs.

Calculate Your Opportunity Costs

Third, calculate the reduction in opportunity costs from switching to vILT and e-learning. Opportunity cost is the value of participants’ time while in class and traveling to and from class. This adds up—especially for half-day or full-day courses—and can easily exceed the accounting costs (i.e., room rental, supplies, instructor, etc.) for some courses.

Calculating opportunity cost is simple. Take the average hourly employee labor and related cost (including benefits, employer paid taxes, etc.) from HR and multiply it by the reduction in hours from switching to vILT and e-learning. For example, suppose you replaced an 8-hour ILT with a 5-hour vILT, which also saved participants one hour of travel time.  The reduction in hours would be 3+1=4 hours. Multiply the 4 hours by the number of participants and by the average hourly compensation rate. You can use the average hours for an ILT course and compare it to the average for vILT or e-learning to make the calculation at an aggregate level.
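Here is the same worked example as a short Python sketch; the participant count and hourly rate are hypothetical and would come from your own HR data.

```python
# Sketch of the opportunity-cost calculation from the example above.
# Participant count and hourly compensation are hypothetical placeholders.

ilt_hours = 8.0            # original instructor-led course
vilt_hours = 5.0           # virtual replacement
travel_hours_saved = 1.0   # travel time no longer required
participants = 200         # hypothetical
avg_hourly_comp = 50.0     # labor plus benefits and employer-paid taxes (from HR)

hours_saved = (ilt_hours - vilt_hours) + travel_hours_saved  # 3 + 1 = 4 hours
opportunity_cost_savings = hours_saved * participants * avg_hourly_comp

print(f"Opportunity cost savings: ${opportunity_cost_savings:,.0f}")
# Opportunity cost savings: $40,000
```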

It’s important to account for savings in travel time—because employees’ time is valuable. By eliminating travel, you are giving time back to employees when you move to vILT or e-learning. Share the opportunity cost savings with your senior leadership, CFO, and CEO along with the savings in course costs. The opportunity cost savings, in particular, is likely to be an impressive number. Think about this savings when you decide whether to go back to ILT or keep using vILT.

I hope you find these measures helpful. You really need them to make an informed decision about what to do when the pandemic ends. I think the profession has a great opportunity to use this experience to become much more efficient and effective in the long run.

L&D Leadership in a Time of Great Change

[FREE VIRTUAL CONFERENCE] July 14 & 15, 2020. 12:00 – 4:15 pm EDT. This two-day virtual conference details the specific tasks L&D organizations have to carry out to achieve significant and continuous workforce productivity. Presentations will detail how to organize these tasks so that training executives can perform them systematically, purposefully, with understanding, and with a high probability of accomplishment.

Hear case studies, winning strategies, and evidence-based results from strategic management leaders and learning executives charged with the awesome responsibility of managing L&D in today’s new business climate. Register today!

Talent Leaders Virtual Exchange Americas

Virtual Event | June 17, 2020
REGISTER NOW

The Talent Leaders Virtual Exchange will give talent executives the much-needed opportunity to hear how their peers are dealing with unprecedented disruption during this global pandemic. In a time when the situation changes almost daily, it’s important that talent executives make quick decisions in the moment while also planning for a post-pandemic world.

The Talent Leaders Virtual Exchange will offer insights from many perspectives on what is working now and what strategies are being put into place for the future. Topics covered include:

  • Communicating with a concerned, remote workforce
  • Lessons learned in the face of a global pandemic
  • Capturing the voice of the employee in a crisis
  • How talent leaders are delivering value during the time of COVID-19
  • Managing your employer brand in uncertain times
  • Preparing for a post-pandemic workforce

>> REGISTER

 

Forward to the Past—Reflections on How Our Profession Evolves (Or Not)

If you follow this blog, you know that I’ve been struggling this past year with the profession’s new focus on upskilling employees for future needs. I have been trying to understand two things:

  1. How future needs can be reliably identified now, and
  2. Assuming needs can be identified, why would anyone want to provide training now, when the skills and knowledge will not be needed or applied for years to come—resulting in scrap rates of 100%?

To me, this seems to go against everything we have learned over the last 20 years in terms of good performance consulting and designing with application and impact in mind.

I was excited to see multiple topics on upskilling and reskilling at a recent CLO Symposium. I looked forward to learning more about this topic, given its incredible popularity.

During the sessions, I had expected to learn that the identifiable future skills would be tangible skills—like programming or new manufacturing techniques (which would be difficult to predict with enough specificity to train for today). Instead, the future skills identified by the presenters were all soft skills like communication, teamwork, innovation, creative thinking, and problem solving. This reminded me of current research citing soft skills as very important to future success.

That same research also indicated that employers are looking for those same soft skills in their employees today. If these skills are needed today, we can use our performance consulting tools to identify the gaps and design learning to address them. This all means that these soft skills can be applied today, which means we are really not talking about future skills—we are talking about skills to meet current needs, which are considered important for the future.

The CLO Symposium presenters shared these skills in the context of being ‘new’—as in, organizations should start to train employees on them. This is where I felt I was fast-forwarding to the past. When Caterpillar U was founded in 2000, we had a brand new LMS containing hundreds of courses. In the ‘old days’ training organizations used to brag about how many courses they had in their ‘paper’ catalog. Guess what was in that catalog…soft skill courses like communications, team building, writing, and problem solving. The catalog included a lot of job-function skills too, but the point is, soft skill courses are not new. They have always been in demand and always will be. So, can we please apply a little historical context and be more careful about how we define ‘future skills’? What we are really talking about is ‘forever skills.’

As I reflect on the past, it’s interesting to note what happened to learning leaders bragging about how many courses they offered. Company executives began asking for results. They wanted to know how these hundreds of courses specifically aligned to business goals and needs. Executives demanded measures beyond how many courses were offered. This led to ‘strategic alignment,’ where the emphasis is on how courses proactively and specifically align to meet the goals of the enterprise.

For many years, strategic alignment has been a hot topic at industry conferences, and many books have been written on the topic. How many times did I hear strategic alignment mentioned at the CLO Symposium? Zero. I understand that focus areas change and evolve, but we don’t seem to be building on our past. Instead, we’re re-inventing it, as with identifying soft skills as future skills. We will advance more quickly and soundly as a profession if we have a sense of our past and can skip the re-learning phases.

Let’s remember that soft skills will always be important as will strategic alignment. There is no need to cycle back and forth and rediscover the value of each. A holistic future can include all that we have learned which will provide the strongest foundation upon which to build, and then we can focus more of our energy on inventing what is truly new and special.

Thoughts on the Coronavirus and What It Means for L&D

The situation surrounding COVID-19 is evolving on a daily basis. Uncertainty is high and it may be weeks or even several months before we know how all this plays out. Several things are clear, though.

First, the U.S. and the world are likely headed into an economic “timeout” and quite possibly a recession. Hopefully, this will be brief, but even a brief downturn will cause significant harm. Many in the service industries are going to lose their jobs or have their hours severely reduced. Other companies will experience falling sales as people cut back or delay their spending. Most organizations have already restricted travel. Even if the worst of the virus is over by summer and travel restrictions are lifted, companies may implement cost reduction measures for the remainder of the year to partially offset a decline in revenue for the first half of the year.

So, if your organization is one which is negatively impacted, now is the time to implement the recession planning that we have talked about for years. Be prepared to start reducing variable expenses, including the use of vendors and consultants. Where appropriate, shift from instructor-led learning to virtual ILT, e-learning, and performance support. Prioritize your work and be prepared to focus on only the most important. Get advice from your governing body or senior leaders on where they would like you to focus your efforts. If you have not already done so, clearly align your programs to the top goals and needs of your CEO.

Second, COVID-19 presents an excellent opportunity to make significant progress in moving away from instructor-led learning and towards virtual ILT and e-learning. Most organizations have been moving in this direction for years but say they are not yet at their desired mix of ILT to vILT and e-learning. Well, now is the perfect opportunity to make this transition. Travel restrictions will prevent employees from traveling to a class and will also prevent instructors from flying to class locations. And social distancing discourages bringing employees together for training which is why so many universities have announced that they are shifting entirely to remote learning for at least the next month or so. The private sector should be equally responsive. Use the existing platforms to conduct classes virtually and ramp up the marketing of your e-learning and portal content.

Third, the virus also presents a once-in-a-lifetime opportunity to highlight performance support. What can we in L&D do to help our colleagues adapt to this new world and still do their jobs? People will be working remotely from home, perhaps for the first time. What performance support do they need? Most will not need a 30-minute e-learning course on working remotely but they will need steps to take to get connected. They will need help setting up virtual meetings. What can you provide them to make their life easier and to prevent a flood of calls to IT or HR for support? And, building on this, moving forward what opportunities do you have to replace some of your existing ILT or e-learning with performance support? Take this opportunity to truly integrate performance support into all that you do.

On a personal note, I wish each of you the best and hope you stay healthy. The situation will stabilize and eventually return to normal, but in the meantime let’s see what we can do to help each other through these challenging times.

Top Five New Year’s Resolutions for the Learning Profession

January is always a good time to reflect on the past and ponder the future. It’s also a good time to make some resolutions to improve in the coming year and decade. Here are my top five suggestions for the profession in the areas of measurement, analytics, reporting, and running learning like a business.

1. Resolve to measure more at levels 3 (application) and 4 (impact).

In November, ATD released its latest survey on evaluation, which showed that 60% of respondents evaluated at least one program at level 3, up from 54% in the 2015 study. Likewise, 38% measured at least one program at level 4, a slight increase from 35% in 2015. The good news is that the numbers are moving in the right direction. The bad news is that they should be much higher. Over 80% measure levels 1 and 2. This should be the target for level 3 as well. And we should measure level 3 for more than just a few programs. It should be measured and managed for all important programs. Once we start measuring application at a higher rate, we can turn our attention to measuring more programs at level 4. Phillips has shown us that this measure is what CEOs most want to see from us, and we’re not giving it to them. We need to fix this.

2. Continue our shift from order takers to strategic business partners.

This will require us to have a discussion with the CEO before the year starts to discover the goals for the coming year and the goal owners. We need to talk with each goal owner to determine if learning has a role to play, and if it does, work with their staff to recommend an appropriate program. The goal owner then needs to approve the program, and we need to reach agreement on the planned impact, other key measures of success, timing, and roles and responsibilities. One of the most important responsibilities of the goal owner will be to ensure that supervisors reinforce the learning with their employees in order to achieve a high application rate. All of this should be completed before the new year starts.

3. Be more disciplined and honest about how well aligned courses are with your organization’s goals.

Most learning departments would say they are well aligned, but really they’re not. What do I mean by “alignment”? It means the courses were designed (or purchased) in response to an identified need in support of a goal. If you say you are “aligned” to your organization’s top five goals, you should have a strategic alignment table that lists the top five goals—and under each one, the programs designed (or purchased) specifically to help achieve that goal, as in the sketch below. Instead, most use backward mapping, which starts with the course catalog and then seeks to map each course to all relevant goals. Since most courses indirectly support most goals, many conclude they are well aligned. For example, you might find that courses to improve communication skills are “aligned” to each of the top five goals (like increasing sales). The two approaches are very different—which explains why such a high percentage report in surveys that they are aligned.
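As a simple sketch of the forward-design approach, an alignment table can be represented as a plain mapping from goals to programs; the goals and programs here are hypothetical examples.

```python
# Sketch of a strategic alignment table as a simple mapping.
# Goals and programs are hypothetical; a program appears under a goal only
# if it was designed or purchased specifically to support that goal.

alignment_table = {
    "Increase sales by 10%": [
        "Consultative Selling Workshop",
        "New Product Launch Training",
    ],
    "Reduce safety incidents by 20%": [
        "Plant Safety Refresher",
    ],
    "Improve customer retention": [
        "Call Center Service Excellence",
    ],
}

for goal, programs in alignment_table.items():
    print(goal)
    for program in programs:
        print(f"  - {program}")
```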

4. Be more sophisticated in your analysis of data.

Historically, data has been aggregated and reported as a total or average. Say, for example, we report that the average participant reaction (level 1) is 76% favorable or that the average application rate is 51%. These are excellent summary measures but often do not tell the entire story. Perhaps the overall 76% favorable rating is due to the majority of participants being very satisfied but a sizable minority being very dissatisfied. We would want to know this so we can explore the reasons (i.e., wrong target audience, bad instructor, etc.). Look at the distribution, not just the average or total, and then use the information at a “by-name” level to take action (i.e., letting a supervisor know they are going to have to work harder with an individual to get application). This is the potential of microdata, which is really just a fancy term for data at the individual employee level. Ken Phillips is doing interesting work in this area.
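Here is a minimal sketch of that idea in Python; the ratings are hypothetical level 1 scores on a 1-5 scale, showing how a reasonable-looking favorable average can hide a polarized audience.

```python
# Why the distribution matters: the same "favorable" average can hide
# a very dissatisfied minority. Ratings below are hypothetical.

from collections import Counter

ratings = [5, 5, 5, 5, 5, 5, 5, 1, 1, 2]  # mostly delighted, a dissatisfied minority

favorable = sum(r >= 4 for r in ratings) / len(ratings)
print(f"Favorable: {favorable:.0%}")  # Favorable: 70%

print(sorted(Counter(ratings).items()))
# [(1, 2), (2, 1), (5, 7)] -- the distribution reveals the dissatisfied group
```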

5. Create and use more management reports.

Today, the industry primarily generates scorecards and dashboards. Scorecards show historical data in tabular form (like the number of participants or courses for the last six months), while dashboards include visual and/or interactive displays. Both, however, are designed to inform by sharing historical results. In other words, they are backward looking. Of course, it is important to know where we have been, so we need scorecards and dashboards—however, we also need to look forward. For this, we have management reports that show the plan (or targets) for the year, a comparison of year-to-date results with plan, a forecast (how the year is likely to end), and the forecast compared to plan. They are designed to help leaders manage their initiatives to deliver promised results by answering two questions: “Are we on plan year to date?” and “Are we going to end the year on plan?” Management reports complement scorecards and dashboards by providing that forward look, which can be used to actively manage initiatives.
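Here is a bare-bones sketch of those two questions, with hypothetical plan and year-to-date numbers; the naive run-rate forecast stands in for whatever forecasting approach your organization actually uses.

```python
# One row of a management report: plan, year-to-date, and forecast.
# All numbers are hypothetical placeholders.

plan_full_year = 10_000      # planned participants for the year
actual_ytd = 4_200           # actual participants through June
months_elapsed, months_total = 6, 12

plan_ytd = plan_full_year * months_elapsed / months_total
forecast = actual_ytd / months_elapsed * months_total  # naive run-rate forecast

print(f"On plan YTD? {actual_ytd >= plan_ytd} ({actual_ytd:,} vs {plan_ytd:,.0f})")
print(f"Forecast vs plan: {forecast:,.0f} vs {plan_full_year:,}")
# On plan YTD? False (4,200 vs 5,000)
# Forecast vs plan: 8,400 vs 10,000
```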

 

Corporate Learning Week 2020

Corporate Learning Week | March 23-26 | Austin TX | www.clweek.com

This year’s Corporate Learning Week (CLW) conference celebrates 25 years as an essential gathering for senior corporate learning leaders and innovators. If you’re a learning, training, and development professional who wants to increase performance, maximize profits, make more strategic and tactical decisions, and offer more effective training and development to future-proof your organization against workforce knowledge shortages—then this conference is for you.

This year’s event will focus on helping L&D leaders to utilize innovative technology to deliver measurable, quantifiable results—results that empower diverse talent and ultimately drive business outcomes. In short, CLW 2020 will help address the fundamental questions and challenges that remain in today’s environment:

  • How do you make your L&D more effective in the face of disruption?
  • How do you find and realize new training potential in a rapidly digital environment?
  • How do you align your training and development outcomes to corporate goals?

CLW 2020 will equip L&D practitioners with practical skills to go from strategy-setting to execution and to communicate the business outcomes and results back to senior executives.

Reserve your spot today!

Upskilling Revisited: What Strategy Makes the Most Sense?

In last month’s blog I shared my skepticism about some of the current upskilling initiatives. I have been thinking more about it since then. In fact, it is hard not to given the near daily references to the need for upskilling. Just this morning I saw a reference to a World Economic Forum study on the future of jobs which indicated that 54% of employees will require significant reskilling or upskilling by 2022.

Several things come to mind. First, I wonder about the terminology. How should we define these two terms? The term “upskilling” seems to suggest providing a higher level of skill while “reskilling” suggests a different skill, which may be higher, lower, or the same in terms of competency. Is that what we mean? And how does this differ from what we have been doing in the past? Haven’t you been reskilling and upskilling your employees for years? Isn’t that what the term training means? (Unless you have been “downskilling” your employees!) Is “reskilling” just the new term for “training”? Don’t get me wrong. If you can get a larger budget by retiring the term “training” and emphasizing “reskilling or upskilling”, then go for it. But let’s be careful about how we use the terms.

Second, I wonder about the methodology being used to conclude that upskilling is needed. As I mentioned last month, one of my worries is that we are not doing a needs analysis to confirm this need or identify precisely what the need is. I know some organizations are hiring management consultants to tell them what skills will be needed in the future but how do the consultants know what your needs will be? I know other L&D departments are asking their own organization leaders the same question, but how good is the knowledge of your own leaders? My concern is that this methodology will generate the type of common, somewhat vague, needs we now hear about all the time like digital literacy or fluency.

Third, what exactly do we do once we have identified these vague needs? A traditional needs analysis would identify the performance that needs to be improved and then, assuming learning does have a role to play in improving that performance, recommend training specifically designed to close the gap. If we were to provide training to improve digital fluency, exactly what will be improved? Are we back to increasing competency simply to increase competency? Training for training’s sake? So, my concern here is that even if we increase competency in these areas of “need”, nothing of measurable value will improve.

Fourth, what is the time frame? Put another way, how exactly is the question being asked of leaders about future skills? Is it about “future skills” with no time period specified, or is it about skills that will be needed in one year, five years, or ten years? When does the “future” start? It seems to me that the further out we go, the more likely we are to mis-identify the actual needs, especially given the rapid pace of change. Even if we could identify a need one or two years in the future, do you really want to develop a course today to address it, given that the need may evolve? Moreover, if you begin upskilling employees today, they are not going to have an opportunity to apply the new skill for one or two years, by which time they will have forgotten what they learned.

Here is my suggestion for discussion. Doesn’t it make more sense to focus on the skills needed today and in the very near-future and to deploy the learning at the time of need rather than months or years ahead of time? Ask leaders what skills their employees need today or will need in the very near-future, and then follow up with a proper needs analysis to confirm or reject their suggestions. If we can do a better job meeting the current and very near-future needs of the workforce, won’t our employees always be well positioned for the future? And isn’t a reskilling/upskilling initiative tailored to specific employees and delivered at the time of need likely to be far more impactful than an approach based on vague needs for a large population delivered months or years before the new skills will actually be used?

Reskilling—Are We Thinking About This Clearly?

It seems that “reskilling” is on everyone’s mind these days. How often do we hear about an organization’s need to reskill its workforce for the future? And, this isn’t just coming from L&D—it is also coming from senior leaders. So, reskilling is the shiny new object that is attracting a lot of attention; and as my regular readers know, the brighter that new object, the more skeptical I become that we are on the right path.

Before sharing my doubts about reskilling, let me first acknowledge that there are situations where it makes perfect sense. For example, suppose that a manufacturing plant is going to be modernized in six months. The current employees do not have the skills to run the new automated plant but some would be interested in learning. I think this would be a perfect case for reskilling so the company can keep its existing employees who are loyal and committed to the success of the organization. By all means, provide the training for the new skills they will need to succeed. And, if there are not enough positions in the newly automated factory, provide new skills for the remaining employees to fill other positions in the company. So, I don’t have any issue with this use of reskilling where specific, existing employees need new skills to do a different job because their current job is about to be eliminated.

Let’s explore the other end of the reskilling spectrum, where I do have serious doubts about our current path. Some organizations are looking ahead and trying to define what skills will be needed in the future, often five to 10 years in the future.

In my experience, it is sometimes difficult to get good direction and consensus on skills needed in the short term (like for the next year), let alone for the future. I have a hard time imagining how the process will work to define skills needed in five to 10 years. Who will L&D rely on to define these needs? Department heads, the CFO, CEO, governing body, a consulting company? How good is their forecast likely to be? Do these people have a track record for forecasting skills or will this be their first time? I can say from my experience at Caterpillar that making accurate long-term economic and sales forecasts was virtually impossible. So, in a fast-changing world, I am skeptical of our ability to accurately forecast skills five to 10 years out.

Even if we could accurately forecast skills, does it make any sense to start training for those skills today? The assumption here is that the skills are not needed today but will be needed in the future. Given what we know about how quickly knowledge dissipates and about the importance of immediately applying what was learned, why would we start training now for a future need? Wouldn’t it be better to wait until just before the need actually exists and then offer the training, combined with the opportunity to immediately apply it and receive reinforcement from management?

Now, at this point, some may say that the time frame is not five to 10 years but instead just a few years—maybe even just one year ahead. Furthermore, some may say that leaders can readily identify the skills their current employees have and that these skills will be even more in demand in the future. Examples may be skills like design thinking, adaptability, communication, and teaming. In this case, we can use our standard needs analysis and performance consulting process to identify the skills gaps and address them. Now, however, I would contend that we are no longer talking about reskilling, but instead traditional training.

I believe reskilling connotes new skills that are not needed in one’s job today, but will be needed for a different job either today or in the future. And, I think it makes perfect sense to provide these new skills to existing employees whenever there is a good fit and when the need is immediate. I don’t believe we should spend much time trying to identify skills that will not be needed for five to 10 years down the line; and I certainly don’t believe we should start training for those skills today. Instead, let’s focus our reskilling efforts on those employees who need (or want) to find a new job in the next 12 months and who require new skills. For the vast majority of employees who simply want to keep learning to improve in their present position, let’s just continue to call this “training”.

Human Capital Disclosure May Soon Be Mandated by the SEC

In an earlier blog I talked about the ISO standards for human capital reporting which were published in December. These called for the voluntary public disclosure of measures for both large organizations and small/medium-sized organizations. While some European and Asian governments are likely to adopt the ISO recommendations as law, the United States Congress was never likely to follow suit so it appeared that adoption in the U.S. would be voluntary and slow.

The U.S. Securities and Exchange Commission just published proposed rulemaking which, if implemented, will bring human capital disclosure to U.S. publicly traded companies much sooner than anyone imagined. This is a game changer for our profession and a VERY BIG DEAL!

On August 7, the SEC proposed to fundamentally change the way publicly traded companies report. Under the current rules, the SEC specifies 12 items that must be included in the narrative description that accompanies the financial statements. Companies have to address all that apply to them. For example, these items include a discussion of the principal products and services offered, new products and segments, sources and availability of raw materials, whether the business is seasonal, competitive conditions, and the current backlog of firm orders. The last required item is the number of employees, which is the only human capital measure currently required for disclosure.

The SEC proposes to move away from an explicit list of required items and instead adopt a principles-based approach where each company must discuss whatever is material with the understanding that the list of topics will differ by industry and company. The SEC does, however, share a non-exclusive list of items that it believes will apply to most companies. This list includes five items from the current list of 12 and two new items: HUMAN CAPITAL and compliance with government regulations. (The other five items are revenue-generating activities, development efforts for new products, resources material to the business, any business subject to re-negotiation or cancellation, and seasonality.) Furthermore, simply disclosing the number of employees is no longer sufficient. In their own words:

“Item 101(c) (1) (xii) [the current rules] dates back to a time when companies relied significantly on plant, property, and equipment to drive value. At that time, a prescriptive requirement to disclose the number of employees may have been an effective means to elicit information material to an investment decision. Today, intangible assets represent an essential resource for many companies. Because human capital may represent an important resource and driver of performance for certain companies, and as part of our efforts to modernize disclosure, we propose to amend Item 101 (c) to refocus registrant’s human capital resources disclosures. Specifically, we propose replacing the current requirement to disclose the number of employees with a requirement to disclose a description of the registrant’s human capital resources, including in such description any human capital measures or objectives that management focuses on in managing the business.” (page 48 of the proposed rule Modernization of Regulation S-K Items 101, 103, and 105)

Did you just feel the earth shifting below your feet? You should have. The importance of this proposed rule for our profession simply cannot be overstated. Many in the profession have worked years to increase the visibility and use of human capital measures. The time may finally have come for the U.S.

While it is true that the SEC will not prescribe specific human capital measures, it does provide some examples, including measures for attraction, development and retention of personnel. The test for disclosure, however, is clear: materiality. Can you imagine any CEO or CFO telling analysts, the public and their own employees that people are not a material contributor to the company’s success, especially after saying for years that people are the company’s greatest asset? I don’t think so. So, if this rule making is finalized, human capital disclosure is coming and coming soon.

Most companies will rely on their heads of HR as well as accounting for guidance on what to include in their narrative on human capital. If for no other reason than risk mitigation, these leaders in turn will look to the human capital profession for guidance. And they will find ISO 30414:2018, the human capital reporting standards published in December 2018. These standards recommend the external reporting of 23 measures for large companies and 10 for small and medium-sized companies. These measures will be a natural starting point as companies decide what to discuss, so if you don’t yet have a copy, get one, and be prepared to proactively help your organization be a leader in human capital reporting. The ISO document is available for purchase at https://www.iso.org/standard/69338.html.

The proposed SEC rule is available at https://www.sec.gov/rules/proposed/2019/33-10668.pdf. The rule is 116 pages, but the section on human capital is in section II.B.7, pages 44-54.

Can You Justify Your Learning and Development Projects?

By Jack J. Phillips Ph.D., Chairman, ROI Institute

Daily headlines in the business press focus on the economy. While it is booming in some areas, other areas are slowing, and economic uncertainty exists everywhere. During uncertainty, executives must take steps to make sure the organization can weather the storm—whenever or wherever it occurs.

One way executives meet this challenge is to ensure that expenditures represent investments and not just costs. If an expenditure is considered a cost, it will be frozen, reduced, or in some cases eliminated. If it is considered an investment that is producing a return, it will be protected and possibly even enhanced during difficult times. For example, many learning and development budgets are now being frozen or reduced, even with record profits. While this seems illogical, it happens. Now is the time to reflect on your budget and your programs. Can you withstand top executive scrutiny? Are you ready for ROI?

The ROI Methodology® is the most used evaluation system in the world for measuring the impact and ROI of a few major programs. The ROI Certification® is the process to develop this valuable capability. The certification provides participants with the skills needed to analyze return on investment in practical financial terms. The results are CEO- and CFO-friendly. Over 15,000 managers and professionals have participated in this certification since it was launched in 1995, underscoring the user-friendly nature of this system.

Don’t have the time or budget? Several approaches are available to reduce the amount of time and cost needed to develop this capability. For more information on ROI Certification, contact Brady Nelson at brady@roiinstitute.net.

ROI Institute is the global leader in measurement and evaluation, including the use of return on investment (ROI) in non-traditional applications. This methodology has been used successfully by over 5,000 organizations in over 70 countries.

Bridge the Gap from Training to Application with Predictive Learning Analytics

by Ken Phillips, CEO, Phillips Associates

In my previous blog post, I discussed the concept of scrap learning and how it is arguably the number one issue confronting the L&D profession today. I also provided a formula you could use to estimate the cost of scrap learning associated with your training programs.

In this post, I’ll share with you a revolutionary methodology I’ve been working on for the past several years called Predictive Learning Analytics™ (PLA). The method enables you to pinpoint the underlying causes of scrap learning associated with a training program. It consists of three phases and nine steps that provide you with the data you need to take targeted corrective actions to maximize training transfer (see figure below).

While the specific questions and formulae for the scores are proprietary, I hope you can apply the concepts in your organization using your own survey questions and your own weighting for the indexes. Even if you adopt a simpler process, the concepts will guide you and the article will give you an idea of what is possible.

Unlike other training transfer approaches which focus mostly on the design and delivery of training, PLA offers a holistic approach to increasing training transfer. Built on a foundation of three research-based training transfer components and 12 research-based training transfer factors (see chart below), PLA targets the critical connection among all these elements. In short, PLA provides L&D professionals with a systematic, credible and repeatable process for optimizing the value of corporate learning and development investments by measuring, monitoring, and managing the amount of scrap learning associated with those investments.

Training Transfer Components & Training Transfer Factors

Phase 1: Data Collection & Analysis

The objective of phase one, Data Collection & Analysis, is to pinpoint the underlying causes of scrap learning associated with a training program using predictive analytics and data. Five metrics are produced, providing L&D professionals with both direction and insight as to where corrective actions should be targeted to maximize training transfer. The five measures are:

  • Learner Application Index™ (LAI) scores
  • Manager Training Support Index™ (MTSI) scores
  • Training Transfer Component Index™ (TTCI) scores
  • A scrap learning percentage score
  • Obstacles preventing training transfer

Data for calculating the first three measures (LAI, MTSI, and TTCI scores) is collected from program participants immediately following a learning program using a survey. The survey consists of 12 questions based on the 12 training transfer factors mentioned earlier. Data for calculating the final two measures is collected from participants 30 days post-program using either a survey or focus groups and consists of the following three questions:

  1. What percent of the program material are you applying back on the job?
  2. How confident are you that your estimate is accurate?
  3. What obstacles prevented you from utilizing all that you learned if you’re not applying 100%?

Waiting 30 days post-program is critical because it allows the “forgetting curve” effect—the decline of memory retention over time—to take place and provides more accurate data.
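Because the actual formulae are proprietary, the sketch below is only a rough illustration of how the 30-day responses to the first question might yield a scrap learning estimate; the simple averaging and the sample data are my own assumptions, not the PLA calculation.

```python
# Rough illustration only -- NOT the proprietary PLA formula.
# Estimate scrap learning from question 1 responses ("What percent of
# the program material are you applying back on the job?").

applied_pct = [80, 60, 90, 40, 70, 50]  # hypothetical 30-day survey responses

avg_applied = sum(applied_pct) / len(applied_pct)
scrap_learning_pct = 100 - avg_applied  # learned but not applied

print(f"Average applied: {avg_applied:.0f}%")        # 65%
print(f"Scrap learning:  {scrap_learning_pct:.0f}%") # 35%
```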

LAI Scores

LAI scores predict which participants attending a training program are most likely to apply, at risk of not applying, or least likely to apply what they learned in the program back on the job. Participants who fall into the at-risk and least-likely-to-apply categories are prime candidates for follow-up and reinforcement activities. Examples include email reminders, micro-learning or review modules, and coaching or mentoring to try to move them into the most-likely-to-apply category.
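The index itself is proprietary, but conceptually the categorization might look like this sketch, with purely illustrative scores and thresholds.

```python
# Illustrative only -- NOT the proprietary LAI scoring. Bucket participants
# by a hypothetical 0-100 application-likelihood score.

def categorize(score: float) -> str:
    if score >= 70:
        return "most likely to apply"
    if score >= 40:
        return "at risk of not applying"
    return "least likely to apply"

scores = {"Avery": 85, "Blake": 55, "Casey": 30}  # hypothetical participants

for name, score in scores.items():
    print(f"{name}: {categorize(score)}")
# Blake and Casey would be prime candidates for follow-up and reinforcement.
```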

MTSI Scores

MTSI scores predict which managers of the program participants are likely to do a good or poor job of supporting the training they directed their employees to attend. Managers identified as likely to do a poor job of supporting the training are prime candidates for help and support in improving their approach. This help might take the form of one-on-one coaching; a job aid explaining what a manager should do before, during, and after sending an employee to training; or creating a training program which teaches managers how to conduct pre- and post-training discussions with employees.

TTCI Scores

TTCI scores identify which of the three training transfer components and the 12 training transfer factors affiliated with them are contributing the most and least to training transfer. Any components or factors identified as impeding or not contributing to training transfer are prime candidates for corrective action.

Scrap Learning Percentage

The scrap learning percentage score identifies the amount of scrap learning associated with a training program. It provides a baseline against which follow-up scrap learning scores can be compared to determine the effect targeted corrective actions had on increasing training transfer.

The obstacles data identifies barriers participants encountered in the 30 days since attending the training program that prevented them from applying what they learned back on the job. Waiting 30 days to collect the data allows the full range of training transfer obstacles to emerge. For example, some are likely to occur almost immediately—I forgot the things I learned—while others are likely to occur later—I never had an opportunity to apply what I learned. Frequently mentioned obstacles are prime candidates for corrective actions to mitigate or eliminate them.

Phase 2: Solution Implementation

The objective of phase two, Solution Implementation, is to identify, implement, and monitor the effectiveness of corrective actions taken to mitigate or eliminate the underlying causes of scrap learning identified during phase one. Here is where the “rubber meets the road,” and you have an opportunity to demonstrate your creative problem-solving skills and your ability to manage a critical business issue to a successful conclusion. Following the implementation of the corrective actions, it is time to recalculate the amount of scrap learning associated with the training program. You can then compare the results to the baseline scrap learning percentage calculated during phase one.

Phase 3: Report Your Results

The objective of phase three, Report Your Results, is to share your results with senior executives. Using the data you collected during phases one and two, it is time to show that you know how to manage the scrap learning problem to a successful conclusion.

In Sum

Scrap learning has been around forever; what is different today is that there are now ways to measure, monitor, and manage it. One of those ways is through Predictive Learning Analytics™. Alternatively, you might employ the concepts to build your own simpler model. Either way, we have an opportunity to reduce scrap learning.

If you would like more information about the Predictive Learning Analytics™ methodology, email me at: ken@phillipsassociates.com. I have an ebook that covers the method and a case study illustrating how a client used the process to improve the training transfer of a leadership development program.

Business Alignment: A Critical Success Factor for L&D Organizations

by Peggy Parskey, Associate Executive Director, Center for Talent Reporting

If you have attended a Learning and Development (L&D) industry conference within the past three years or listened to a panel of senior L&D leaders, it’s likely that someone has raised the topic of alignment. Numerous blogs from ATD, SHRM or even technology vendors confirm that alignment is clearly top of mind.

Given the attention paid to the topic of alignment, you might think that L&D has nailed down the principles, process, and outputs of business alignment. Unfortunately, you would be wrong. We haven’t nailed down this process by a long shot. And therein lies the problem: The L&D industry does not have a consistent definition of what alignment is, let alone how to achieve it.

Do a quick Google search on L&D business alignment and you will get thousands of articles on the topic. Click some of the links and you will find as many suggested approaches to alignment as search results. Some authors suggest that business alignment is about getting the right KPIs to support business goals. Others suggest that alignment is about doing a gap analysis between the current and future state to identify the needed training programs. Some bloggers recommend conducting interviews with business leaders as input to an L&D strategy. And a few organizations with whom we have worked view alignment as a simple mapping exercise: “We need to grow sales this year, we have a bunch of sales programs. Voila, alignment!”

With all of these possible approaches, how should an L&D leader navigate the alignment process?

Principles of Effective Alignment

At least four principles should guide leaders who are grappling with achieving business alignment:

  1. Alignment isn’t a nice-to-have; it’s a must-have
  2. Alignment requires engagement and commitment from both L&D and business leaders
  3. Alignment has both strategic and tactical components
  4. Alignment isn’t simply about what L&D offers but also how it delivers those offerings

1. Alignment Isn’t a Nice to Have

Strategic and tactical alignment is critical to ensure that L&D is investing its scarce resources in the right place, at the right time, on the right programs. When L&D doesn’t execute this process or doesn’t manage it effectively, the consequences can be significant. If L&D delivers the wrong programs to the wrong audiences, the organization will lack the capability it needs to achieve its goals. Furthermore, other functions then have to pick up the slack: non-L&D functions may develop shadow learning organizations or hire external resources to fill the gap left by L&D. If L&D wants to fulfill its purpose to develop the needed capability for the current and future workforce and improve performance, then effective alignment is the critical success factor.

2. Alignment Requires Mutual Engagement

Discussions about L&D alignment often imply that the business has a peripheral role to play. L&D leaders get the strategic goals from above or they interview a few business leaders to understand priorities. At that point, the business seems to disappear from view.

If L&D leaders want to achieve effective alignment, the business can’t be a bystander. Senior business leaders must engage with L&D not only to communicate their priorities but also to ensure L&D has a role to play in achieving those priorities. Moreover, the business is an influential voice to ensure that L&D has the appropriate resources and support to deliver on its promise. Furthermore, the business will have the primary responsibility for reinforcement, without which there is likely to be little application to the job. If the business doesn’t understand its role, then it is incumbent on L&D leadership to spell it out and create shared accountability for success.

3. Alignment Has Both Strategic and Tactical Components

The alignment process must start at the strategic level. Business leaders establish strategic priorities, set goals, and then allocate resources to achieve them. Based on these priorities, L&D then determines if and where it plays a role. If the organization plans to launch a new product line, L&D and the business must agree that L&D should own the process to train employees on these new products. If the organization wants to capture a new demographic for its products, the business and L&D may agree that L&D has a minimal role to play. Regardless of the objective, business and L&D leaders need to clarify whether L&D has a role as well as the importance and urgency of that role.

Having reached agreement on the strategic priorities, L&D must also ensure it aligns tactically. As an example, imagine that $500K is allocated to develop customer experience competencies. L&D practitioners then dive into the details to assess specific organizational needs. During this process, performance consultants discover that a primary root cause of underdeveloped customer experience capability is the lack of quality resources for call center personnel. L&D now finds that its requirements have changed: training, while necessary, is not at the heart of the underperformance. At this point, L&D may simply turn the requirement over to the business to develop the needed content for call center employees and invest its scarce resources elsewhere.

The point of this example is that what appears to be a development need at the strategic level may not translate into a development requirement when practitioners study the requirements more fully. It is incumbent on L&D practitioners to adjust their approach to ensure they stay aligned.

4. Alignment Isn’t Just About What L&D Offers

Alignment isn’t just about what programs L&D offers, but increasingly, how it offers them. In a fast-paced world, instructor-led training (ILT) has a long development cycle, is expensive to produce, and requires a large time investment for learners. Yet, according to ATD’s 2018 State of the Industry Report, ILT methods still comprise 67% of all training hours with self-paced at 29% and mobile learning a mere 2%.

Learners need content at their fingertips. They need a rich reservoir of material in different forms (white papers, how-to guides, videos, checklists, case studies) that are easy to find and easy to consume. Content must be high quality and useful on the job.

Alignment requires not simply that L&D builds capability and improves performance, but also that it does so in the most efficient and effective manner appropriate to the need.

Conclusion

Alignment is critical to ensure the L&D function meets the needs of the business by building the necessary skills to run the business and achieve its strategic objectives. This post focused on the key principles at the heart of alignment. In subsequent posts we will explore the importance of treating alignment as a continuous process and building skills to manage the end-to-end process effectively.

I recommend you learn more about alignment from respected industry leaders who can provide guidance on how to achieve meaningful alignment with the business. Below are four resources you should check out:

The Business of Learning by Dave Vance. See Chapter 4 which discusses strategic alignment in depth.

Attend the CTR Measurement and Reporting Workshop, which addresses the importance of alignment and how to engage business partners in the process.

Read the recently published IDC PlanScape document on the importance of L&D alignment to maximize the impact of training investment.

Read the upcoming second edition of “Learning Analytics,” which will be published in February 2020 at www.explorance.com (the current edition may be found here). The second edition has a chapter devoted to strategic and tactical alignment.

We’d like to hear from you. If you’re having business alignment challenges, don’t hesitate to email Dave Vance or Peggy Parskey.

The Greatest Issue Facing L&D Today

Scrap Learning

by Ken Phillips

What is arguably the top issue facing the L&D profession today?

The answer is scrap learning: the gap between training that is delivered and training that is actually applied back on the job. It’s the flip side of training transfer and is a critical issue for both the L&D profession and the organizations L&D supports because it wastes money and time—two precious organizational resources.

Now, you might be wondering, “How big is the problem?”

Two empirical studies, one by KnowledgeAdvisors in 2014 and one by Rob Brinkerhoff and Timothy Mooney in 2008, found scrap learning to be 45 percent and 85 percent, respectively, in the average organization. To add further credibility to these percentages, I’ve conducted three scrap learning studies over the past few years and found the scrap learning percentages associated with three different training programs in three separate organizations to be 64 percent, 48 percent, and 54 percent. Averaged together, these five studies yield a scrap learning figure of approximately 60 percent for the average organization.

To further highlight the magnitude of the scrap learning problem, consider its effect in wasted organizational dollars and time. According to the 2018 ATD State of the Industry report, the average training expenditure per employee in 2017 was $1,296, and the average number of training hours consumed per employee was 34.1. Using the KnowledgeAdvisors and Brinkerhoff scrap learning percentages mentioned above, the table below shows just how much scrap learning costs the average organization in wasted dollars and time.

Cost of Scrap Learning in Wasted Dollars & Time

  • Average training expenditure per employee: $1,296 × 45% scrap learning = $583 wasted
  • Average training expenditure per employee: $1,296 × 85% scrap learning = $1,102 wasted
  • Average training hours consumed per employee: 34.1 × 45% scrap learning = 15 wasted hours
  • Average training hours consumed per employee: 34.1 × 85% scrap learning = 29 wasted hours
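
For readers who want to reproduce these figures, here is a minimal Python sketch of the arithmetic behind the table, using the ATD averages cited above:

```python
# Arithmetic behind the table above: wasted dollars and hours per employee,
# using the 2018 ATD State of the Industry averages cited in the text.
avg_spend_per_employee = 1296.00  # average training expenditure per employee ($)
avg_hours_per_employee = 34.1     # average training hours consumed per employee

for scrap_rate in (0.45, 0.85):   # KnowledgeAdvisors and Brinkerhoff/Mooney figures
    wasted_dollars = avg_spend_per_employee * scrap_rate
    wasted_hours = avg_hours_per_employee * scrap_rate
    print(f"At {scrap_rate:.0%} scrap learning: ${wasted_dollars:,.0f} "
          f"and {wasted_hours:.0f} hours wasted per employee")
```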

Taking all of this data into account reminds me of James Lovell’s famous words during the Apollo 13 mission, when an oxygen tank aboard the space capsule exploded, putting both the flight and crew in great peril: “Houston, we have a problem!”

If you would like to take a crack at estimating the cost of scrap learning associated with one of your training programs, you can use the Estimating the Cost of Scrap Learning Formula below. To gain the most useful insight, make every effort to collect the most accurate data possible for each of the input variables. Also, when selecting an estimated percentage of scrap learning associated with the program (variable 4 in the formula), get input from several people familiar with the program, such as other L&D colleagues, participants who previously attended the program, or perhaps even the managers of program participants, and then compute an average of their estimates. Gaining the input of others will increase both the accuracy and credibility of the estimate and remove any concern that the scrap learning percentage is merely your opinion.

Estimating the Cost of Scrap Learning Formula

Wasted Participant Dollars

The length of a learning program in hours _____
X the number of programs delivered over 12 months _____
X the average number of participants attending one program _____
X the estimated percent of scrap learning (45-85%) associated with the program _____
= the cost of scrap learning in wasted time _____
X the average hourly participant salary + benefits _____
= the cost of wasted participant dollars _____ (A)

Wasted L&D Department Dollars

Administrative expenditures (e.g., materials, travel, facility, facilitator, delivery platform, food, etc.) for one program _____
X the number of programs delivered over a 12-month period _____
X the estimated percent of scrap learning (45-85%) associated with the program _____
= the cost of wasted L&D department dollars _____ (B)

Total Cost of Scrap Learning

Cost of wasted participant dollars (A) _____
+ cost of wasted L&D department dollars (B) _____
= total cost of scrap learning _____
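
To make the arithmetic concrete, here is a minimal Python sketch of the formula above. All input values are illustrative placeholders, not real program data; substitute your own figures:

```python
# A minimal sketch of the Estimating the Cost of Scrap Learning Formula.
# All input values below are illustrative placeholders; substitute your own data.

def scrap_learning_cost(program_hours, programs_per_year, avg_participants,
                        scrap_rate, hourly_salary_benefits, admin_cost_per_program):
    """Return (A, B, total): wasted participant $, wasted L&D $, and their sum."""
    # Wasted participant dollars (A)
    wasted_hours = (program_hours * programs_per_year
                    * avg_participants * scrap_rate)  # cost of scrap in wasted time
    wasted_participant_dollars = wasted_hours * hourly_salary_benefits
    # Wasted L&D department dollars (B)
    wasted_ld_dollars = admin_cost_per_program * programs_per_year * scrap_rate
    return (wasted_participant_dollars, wasted_ld_dollars,
            wasted_participant_dollars + wasted_ld_dollars)

a, b, total = scrap_learning_cost(program_hours=8, programs_per_year=12,
                                  avg_participants=20, scrap_rate=0.54,
                                  hourly_salary_benefits=45.0,
                                  admin_cost_per_program=5000.0)
print(f"Wasted participant dollars (A): ${a:,.0f}")    # $46,656
print(f"Wasted L&D dollars (B): ${b:,.0f}")            # $32,400
print(f"Total cost of scrap learning: ${total:,.0f}")  # $79,056
```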

With this information in hand, you are now ready to pinpoint the underlying causes of scrap learning and take targeted corrective actions to mitigate or eliminate these causes and maximize training transfer. How to do this will be part two of this blog article.

Change the Conversation about Funding Measurement

The L&D profession continues to struggle to get budget and attention for more robust measurement strategies. Learning professionals, particularly those with measurement or analytics responsibilities, appreciate the value of measurement and know that further budget would be well spent but often cannot convince senior leaders in learning to make the investment, let alone senior leaders outside learning. So, how do we make the case for more resources?

My advice is that we take a stealth approach. While some senior leaders will readily see the reason for more robust measurement, in my experience most will not. Senior leaders always have many more requests for budget and staff than they can grant, and typically measurement by itself doesn’t rise to the top of the priority list. And, since you’re already providing learning, measurement seems like an add-on or a ‘nice to have,’ but not something that is essential. This is especially true if most of the measurement currently being done is simply to report activity or historical results.

The alternative is to change the conversation. Instead of talking about measurement, talk about what will be required to deliver the planned results from the learning initiative. In other words, talk about management rather than measurement. Of course, you cannot manage without measures, so management becomes the Trojan horse that brings measurement in. This approach makes measurement the means to an end. It also focuses on the most important purpose of measurement, which is to manage.

How would this work? Start with programs aligned to your organization’s key goals or needs. For these initiatives you need to partner closely with the goal owner, like the head of sales or manufacturing. Both parties need to agree on program specifics like learning objectives, target audience, and completion date. Most importantly, they need to agree on mutual expectations for the impact of the learning, which may be the isolated impact of the Phillips approach or the more subjective expectation of the Kirkpatrick approach.

In either case, both parties will need to agree on all the relevant measures, and the targets for them, required to deliver the planned impact. These measures would include efficiency measures like number of participants, completion rate, completion date, and cost, as well as effectiveness measures like participant reaction, learning, and application rate. These measures will have to be managed to plan through the development, delivery, reinforcement, and impact stages in order to deliver the ultimate measure, which is impact. (For example: the program will have to be completed by all participants by April 30, with a 90 percent initial pass rate and a 90-day application rate of 85 percent.) A simple sketch of such an agreement appears below.
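
One way to capture this upfront agreement is as a simple plan of measures and targets. The following Python sketch is purely illustrative; every name and value is hypothetical, not a prescribed standard:

```python
# Hypothetical plan agreed upfront with the goal owner; values are illustrative.
program_plan = {
    "efficiency": {
        "participants": 250,              # target audience size
        "completion_rate": 1.00,          # all participants complete
        "completion_date": "April 30",
        "budget_dollars": 150_000,
    },
    "effectiveness": {
        "initial_pass_rate": 0.90,        # 90 percent pass on the first attempt
        "application_rate_90_day": 0.85,  # 85 percent applying on the job at 90 days
    },
    "outcome": {
        "impact": "2% increase in sales from learning",  # isolated impact target
    },
}
```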

Notice that you now have a robust measurement strategy that is an integral part of the management of this initiative. In fact, you will not be able to meet expectations without it, and consequently you should refuse to run the learning if a senior leader suggests stripping out the measurement. However, since measurement was not presented separately (no budget or staff identified for it; it is simply part of the plan), the whole issue of stripping it out is not likely to come up. You can follow the same approach for other learning initiatives, including those not directly aligned to the goals of the organization. In every case, identify a measure of success along with the relevant measures and the targets required to deliver that success.

This is the stealth approach—treat measurement as a means to the end—embed measurement in all key initiatives by getting agreement with goal owners and senior leaders upfront on the planned outcome and on targets for all the relevant efficiency and effectiveness measures. In addition to measures to ensure expectations are met, you may need to employ measurement and analytics upfront to better understand the issue, optimum learning solution, or modality. Again, measurement will be integrated into the management of the initiative. It will not be a stand-alone item and will not be presented as such.

You may already have a dedicated measurement group to serve as integral support staff; however, every program manager needs to work with goal owners and leaders to agree on a measure of success and on the relevant efficiency and effectiveness measures and their targets. Program managers need to be comfortable with measurement. If you want to expand your measurement efforts, look for ways to do so as part of key programs and initiatives (the end) rather than simply by expanding your measurement group (the means).

In conclusion, try changing the conversation from measurement to management. Don’t ask senior leaders to fund measurement or to help you with your measurement strategy; they really don’t care. Instead, engage them to manage their program to deliver planned results, which they do care about and which, by necessity, will include measures. Try this approach and see how it works for you.

 

2019 CTR Conference a Great Success

We are just weeks back from our Sixth Annual CTR Conference
in Dallas. We weren’t sure it could top last year’s but I think it did. We had
117 participants and they were very enthusiastic and engaged. Lots of great
sessions and discussions. And of course some delicious food in Grapevine!

Patti Phillips kicked it off with a talk on ROI, emphasizing
how important impact and ROI are and how they can be calculated. She included
some very interesting history and the story of how she met Jack. When we came
back together later in the morning after breakouts, Peggy facilitated a panel
on scrap learning and measurement literacy, and then Brenda Sugrue, 2018 CLO
of the Year, shared some thoughts from her career. Six more breakout sessions
followed in the afternoon leading up to the awards ceremony at the end of the
day.

CTR recognized Jack and Patti Phillips as the first
recipients of the Distinguished Contributor Award, acknowledging their truly
significant contributions to our field including the 1983 Handbook of Training
Evaluation, the concepts of isolated impact and ROI, and the successful
implementation of ROI in over 70 countries. Then we recognized the winners of
the TDRp Excellence Awards: SunTrust Bank for L&D Governance, and Texas
Health Resources and Heartland Dental for L&D Measurement. Jack and Patti
recognized three winners for ROI studies as well.

Day two began with Steven Maxwell of the Human Capital Management Institute sharing his perspective on what CEOs and CFOs want from learning, as well as his thoughts on the newly released human capital reporting standards, both of which got people thinking. After six more breakout sessions, Justin Taylor, General Manager and EVP of MTM, led a panel on the impact of learning to wrap up the conference. Because the event focuses squarely on the measurement, reporting, and management of L&D, everyone could zero in on the issues of most concern to them and share ideas and questions with each other in detail. If you didn’t make this event, you missed hearing from industry thought leaders and leading practitioners.

Next year we are moving the conference to the fall, so mark your calendars for October 28-29, 2020. We will continue to begin CTR Week with a pre-conference workshop on measurement and reporting (October 26-27) and end the week with post-conference workshops. Next year will be the 10th anniversary of TDRp, and we hope you can join us for the celebration.

2019 TDRp Award Winners

The Center for Talent Reporting is pleased to announce the following 2019 TDRp Award winners:

Distinguished Contributor Award

Congratulations to Jack and Patti Phillips, CTR’s first Distinguished Contributor Award winners. This award recognizes outstanding and significant contributions to the measurement and management of human capital. Particularly noteworthy are Jack’s groundbreaking book, Handbook of Training Evaluation, and his concepts of isolated impact and ROI. Jack and Patti’s ROI methodology has been implemented in hundreds of organizations in over 70 countries. They have forever changed the standard by which investments in human capital are measured.

2019 TDRp Award Winner for Learning Governance

Congratulations to SunTrust, CTR’s 2019 TDRp Award winner for Learning Governance. SunTrust earned this award for its best practices in running L&D like a business. In particular, SunTrust provided evidence that L&D participates in a continuous, robust, enterprise-wide governance process that ensures L&D has a voice and input into enterprise priorities and funding. The governance council, moreover, has clear guidelines for funding initiatives and exercises ongoing oversight through disciplined financial budgeting, reporting, and forecasting.

2019 TDRp Award Winners for Learning Measurement

Congratulations to Texas Health and Heartland Dental, which have been awarded the 2019 TDRp Award for Learning Measurement.

Texas Health has demonstrated a depth and breadth of commitment to measurement through a Learning and Education Cabinet, which directs, guides, monitors, evaluates, and reports quarterly on learning according to strategic business goals. In addition, they have established learning-oriented measures not only at the program level but also at the CEO level. They have well-defined roles for measurement and strategic responsibilities. They leverage Metrics That Matter™ technology to organize programs into business-aligned portfolios that drive measurement planning and reporting.

Daniel Gandarilla, VP and Chief Learning Officer, said this about the importance of measurement to Texas Health: “Measurement is one of the most critical aspects of the learning process. It allows us to understand what has happened, the effect, and to tell a more compelling story of how we play a critical role in the success of the organization. The best stories can tell both quantitative results and qualitative experiences. This method of storytelling makes a compelling case and lets everyone know that learning is a part of the business.”

Heartland Dental has demonstrated a commitment to measurement through well-defined measurement roles and the use of a portfolio model to segment their offerings and drive consistent decisions about what, when, and how to measure their solutions. They focus on linking learning to business outcomes and demonstrating the “so what” to their business partners. In addition, they have established specific goals for their measures and incorporate those goals into their regular reporting. Finally, they leverage Metrics That Matter (MTM) to gather and report data to their stakeholders.

Meredith Fulton, Heartland Dental’s Director of Education, described their measurement journey: “Prior to establishing our robust measurement approach, we collected satisfaction and ad-hoc feedback from learners through simple ‘Smile Sheets.’ We felt great about what we were doing, but we did not have a measurement strategy in place for collecting learner or manager feedback that could drive decision making for any course or program. Shifts in our organizational leadership and focus on company growth have helped pave the way to develop an evidence-based strategy.”

After working with MTM to establish a strong measurement process, Heartland has been able to see dramatic changes and increased savings in Learning & Development. “We have also been able to create efficiencies in automating data collection and reporting. By incorporating MTM data and insights into company metrics we have been able to tell a deeper story of the impact of Learning & Development. Additional benefits include being able to socialize best practice metrics from the programs across various stakeholders and business leaders. Our compelling insights, shown in custom reports, infographics, and dashboards, have allowed us to elevate the conversation and our involvement in company decisions,” said Fulton.

Human Capital Reporting Standards

The International Organization for Standardization (ISO) released its first standard for human capital reporting in December. It is titled “Human Resource Management – Guidelines for Internal and External Human Capital Reporting.” The document is 35 pages long, with guidance for internal and external reporting by both large and small organizations. The standard is the culmination of a great deal of work over the last three years by an ISO working group of experts from numerous countries, led by Stefanie Becker of Germany.

Although a number of measures still need to be defined in greater detail, the standard is a major achievement and an important first step toward bringing standardization of measures and reporting to the HR field. Accountants have long had definitions of measures and standard reports, as well as guidance about how the measures should be used. This has served their profession well, and a similar discipline would serve us well. Imagine that in the not-too-distant future everyone in our field might be using the same language for measures and would share a common understanding of how to use the measures in reports to better manage their operations. Imagine the rich benchmarking and best-practice sharing that this standardization and reporting would allow. And imagine the potential for university students in our profession to graduate with this common knowledge, just as accountants do today.

Of course, there are compelling business reasons to move in this direction as well. Today about 84% of the value of S&P 500 firms is attributable to intangible assets, which have little visibility on the balance sheet. Just forty years ago the percentages were reversed, with 17% of the value in intangibles and 83% in tangible assets. So, in the “old days” an investor could look at a balance sheet and get a good feel for the underlying value of a company; namely, its physical assets like buildings, equipment, land, and investments. Today, human capital drives the value of the intangible assets that make up most of the value of many companies. Human capital, however, does not appear on the balance sheet and appears on the income statement only as an expense to be minimized. This is why it is so important to provide visibility and transparency into human capital. Investors, customers, and employees need to know more about the human capital in an organization.

The standards are completely voluntary although the hope is that leading organizations will adopt them to provide this greater transparency for their investors and employees. The working group recognized that measurement and reporting can be burdensome for small organizations so only the most important and basic (and easy to calculate) measures are recommended for reporting. The working group also recognized that some measures, while important to properly manage human capital internally, may not be appropriate for public sharing and thus are recommended for internal use only.

There are 54 measures in all with 23 recommended for external reporting by large organizations. Many large organizations are already measuring most of these but not reporting them publicly. It is hoped they will begin using the ISO definitions as they become available and that they will begin to publicly report some of the measures. In some countries the government may mandate adoption of the ISO standard but that is not the case for the United States where organizations will be free to decide which, if any, of the recommended measures they report internally or externally.

Of the 23 measures recommended for public reporting by large organizations, there are five for recruitment and turnover, five for diversity, three for compliance, and three for health, safety and well-being. Productivity and workforce availability each have two, and cost, leadership and skills & capability each have one. The one for L&D (skills & capability) is total training cost which is recommended for external reporting for both large and small organizations. While not an ideal measure since it focuses on inputs rather than outcomes, it is at least a beginning and gives some indication of how much an organization values its employees.

Four other L&D measures are recommended for internal reporting by large organizations: percentage of employees who participate in training, average formal training hours per employee, percentage of employees who participated in training by category, and workforce competency rate. The first two are also recommended for small organizations.

I recommend that everyone in our profession become familiar with the standard and, if your organization is not already reporting these measures internally, that you consider them for your measurement and reporting strategy going forward. In the future, investors and employees will begin asking for these measures, and you really should be reporting on them internally to best manage your human capital. You can read more about them here in Workforce magazine, and the standard is available for purchase from ISO or ANSI.

Making an Impact Through Learning

I hope you can join us in Dallas for our Annual Conference in five weeks. It will definitely be worth your time and investment. You will find a community of like-minded learning professionals who are passionate about the measurement, reporting and management of L&D. Leading industry thought leaders and practitioners will be there to share their thoughts and experiences with you. And, of course, you will learn a lot about what works and what doesn’t from fellow participants.

Our theme for this year’s conference is Making an Impact through Learning. We have selected speakers and topics to ensure that you will have the latest thinking from the best people in our profession on this always important topic. Like last year, we have scheduled plenty of time for networking so you can get your questions answered and learn from your colleagues.

Our keynoters this year are Patti Phillips and Jeff Higgins. I am sure you are familiar with Patti and Jack Phillips and their work on isolating the impact of learning and calculating ROI. (They will each also be conducting a breakout session.) You may not know Jeff, who has a very interesting background as both an accountant and CFO and as a senior HR executive, so he brings a unique perspective to our field. He has also been a contributor to the International Organization for Standardization’s (ISO) work on human capital reporting standards, which are just now being implemented around the world. You will learn a lot just from these two speakers!

You will also have the opportunity to hear from the current CLO of the Year, Brenda Sugrue, who will share her thoughts on her personal journey and on EY’s transformation. Other industry thought leaders include Jack Phillips on ROI, Ken Phillips on measuring application, Roy Pollock on ensuring results, John Mattox on the future of measurement, Laurent Balague on the best use of surveys, Peggy Parskey, Adri Moralis, and Justin Taylor on reducing scrap learning and on measurement and reporting, Jeff Carpenter on data analytics and adaptive learning, Cushing Anderson on adding value, and Kevin Yates on impact measurement.

In addition, we have some great practitioners lined up to share with you including Gary Whitney (formerly with IHG), Toni DeTuncq and Teddy Lynch (NSA), Laura Riskus (Caveo), Mark Lewis (Andeavor), and Dean Kothia (HP).

Last, please do consider joining us for our pre- and post-conference workshops, which round out CTR Week. We offer a two-day workshop on measurement and reporting and one on the Six Disciplines of Breakthrough Learning. And after the conference we offer four half-day workshops. So, come early or stay late to invest in your own development.

If you have come to one of our conferences or workshops in the past, welcome back. If this will be your first, we look forward to meeting you. Either way, you are an important part of this community and everyone will benefit from your questions and contributions. As you know, the Center for Talent Reporting is a nonprofit and our sole mission is to advance the profession. Each year’s CTR Week is an opportunity to come together, learn from each other, and grow. You will leave energized! Find out more about the conference and workshops at ctrconference.com. Hope to see you in Dallas!

A Look Back (and Ahead) on the Measurement and Management of L&D

The end of December is always a good time of year to take a look back, and a look ahead, at the L&D profession. Looking back, I think we are blessed to have some great thought leaders who have provided a terrific foundation for both the measurement and management of L&D.

On the measurement side, I am particularly thinking of Don Kirkpatrick, who gave us the Four Levels, and Jack Phillips, who gave us isolated impact for Level 4 and ROI for Level 5. I am also thinking about all of the work done by the Association for Talent Development (ATD) to promote measurement and to benchmark key measures through their annual industry survey.

On the management side, I’m grateful again for the contributions of Don Kirkpatrick and Jack Phillips and now Jim and Wendy Kirkpatrick and Patti Phillips, as well as for their guidance on how to manage—particularly with respect to partnering closely with goal owners and focusing on what it takes to achieve Level 3 application. In addition, I appreciate the work by Roy Pollock and his associates in giving us The Six Disciplines of Breakthrough Learning, which is a must-read book for anyone focusing on the measurement and management of L&D.

And there are many more, like Ken Phillips and John Mattox. Beyond L&D, the HR profession in general has benefited tremendously from thought leaders like Jac Fitz-enz, Jeff Higgins, John Boudreau, and Wayne Cascio. Like Kirkpatrick and Phillips did for L&D, these thought leaders basically invented measurement for the rest of HR.

We are very fortunate to have this strong foundation built over the last 30+ years. Looking ahead, the question is: where do we go from here? As a profession, we now have well over 170 measures for L&D and over 700 for HR in general. I don’t think we need more measures. What we do need is a better way to utilize some of the measures we have, especially Level 3 (application), Level 4 (results or impact), and Level 5 (ROI) for L&D. Level 3 is the starting point and should be measured for all key programs. Research by Phillips clearly indicates that CEOs want to see impact and ROI more than any other measures, and this will become increasingly urgent as the next recession draws closer (pencil in 2020 or 2021 for planning purposes). While some progress has been made over the last 10 years, it is not enough, so this remains a high priority for the profession moving ahead.

Another priority for us is to do a much better job managing learning. By this I mean running learning with business discipline, which starts with partnering closely with goal owners and agreeing on specific, measurable goals or targets for the learning (assuming, of course, that learning has a constructive role to play in achieving the business goal). It also means, once specific plans have been made, executing those plans with the same discipline your colleagues in other departments use. This requires monthly reports and comparing results to plan so that corrective action can be taken as soon as possible to get back on plan and deliver promised results.

Managing learning this way is hard for many and some simply do not want accountability. But, it is an area where the payoff of better performance and greater value delivered per dollar is huge. In fact, I would contend that it has a bigger payoff than even measuring at Levels 3-5.

To summarize, I think there is an opportunity to structure our departments differently to enable better management overall. To develop a close partnership with goal owners (like the head of sales, for example) and to really run learning like a business, there needs to be one L&D professional in charge of the program(s) identified to meet the business need. This person would:

  1. Meet with the goal owner initially and oversee the needs analysis
  2. Get agreement up-front with the goal owner on specific measurable plans for the learning program as well as roles and responsibilities for both parties
  3. Supervise the design, development, and delivery of the program
  4. Meet regularly with the goal owner to manage the successful deployment of the learning, including reinforcement by leaders in the goal owner’s organization

I know this may be a challenge in some organizations, but I think it is indispensable for a successful partnership and for accountability within L&D.

I truly enjoy being a part of this great profession and having the opportunity to work with all of you. We have come a long way in a relatively short period of time, and I believe the future is very bright if we continue to build on the foundation that has been laid to take learning measurement, reporting, and management to the next level.

I look forward to what we can accomplish working together! Happy New Year and Best Wishes for the coming year!

New Sample Reports for L&D Now Available

CTR publishes sample L&D reports to provide guidance and ideas to practitioners. We are happy to announce that we’ve completely updated and revised these reports to reflect our latest thinking and the evolution of Talent Development Reporting principles (TDRp).

Get the latest reports here.

If you’re a CTR member, you have access to the editable, Excel versions, which may be used as a starting point or template to customize for your own purposes. Non-members may download a PDF version of the reports to use as a reference.

What’s New

The revised resources include samples for both private and government sectors. Samples are also provided for qualitative, quantitative and mixed outcome measures, including:

  • Lists for effectiveness, efficiency and outcome measures
  • Statements for effectiveness, efficiency and outcome measures
  • Four different Summary Reports, each using different types of outcome measures
  • Four different Program Reports which vary in complexity
  • One Operations Report

Some Tips for Use

We recommend practitioners start by compiling lists of the measures they anticipate using and then create one list for each type of measure. If the measure will be managed to a target or plan during the year, then one of the three report types is recommended. This will be the case for most of your important measures.

If the measure is meant to answer a question, determine whether a trend exists, or provide basic information, then a statement or scorecard is appropriate. These contain actual results, usually by month or quarter.

In contrast, reports include a plan or target for each measure, as well as year-to-date results and, ideally, a forecast of how the measure is expected to end the year.

More Updates To Come

Work is underway to update the lists, statements, and reports for the other HR disciplines, like leadership development and talent acquisition. If you have any questions or would like more information about these reports, please contact David Vance, Executive Director, at DVance@CenterforTalentReporting.org.

 

Impact and ROI…The Discussion Continues

This month, I continue the discussion on ROI (Level 5) and impact (Level 4). After publishing last month’s article, some readers expressed that ROI was unnecessary, too confusing, or just too complicated. I argued that it is a simple, straightforward calculation once you have isolated the impact of learning, which can always be done using the industry-standard participant estimation methodology.

So, let’s take a look at why impact and ROI are so important. L&D exists for many reasons, but I would suggest the most important is to help our organizations accomplish their business goals—like higher sales, increased productivity, reduced costs, and greater customer or patient satisfaction. Of course, L&D is positioned to help achieve HR goals as well, like higher employee engagement, which in turn contributes indirectly to realizing all the business goals. Most L&D departments are also responsible for leadership development, which is sometimes itself a business goal.

These are all very important, high-level business or HR goals. Organizations invest considerable time and money in learning to achieve these goals, so it follows that CEOs would want to know what they are getting for their investment. What difference does learning make? Is it worth doing? Jack Phillips asked CEOs what they most wanted to see from L&D and over 90% said impact and over 80% said ROI. Less than 10% were receiving any impact data at the time and only slightly more were getting any ROI information. The best practice here is to carefully align your learning to these goals, plan it in partnership with the goal owner, set targets for impact and ROI, and then jointly manage the program to ensure the targets are achieved, including the level of application required to deliver the planned impact and ROI.

From both a planning and a follow-up point of view, measures of isolated impact and ROI are the only way to ensure that learning (not L&D, but learning, which must be co-managed by the goal owner and L&D in partnership) delivered on its promise. The CEO, head of L&D, program manager, and goal owner will all want to know how much difference learning made and what opportunities can be identified for improvement or optimization. The upfront agreement on impact will drive a discussion of what is possible and what level of effort will be required from all parties. The upfront agreement on ROI is nothing other than the business case for the investment. Whether it is ROI or net benefits, all parties will want to be sure that benefits are expected to exceed the costs. If not, why would you proceed? Afterwards, everyone will be interested in what impact and ROI were achieved. Actuals will never exactly match plan, and that is okay, but hopefully they will be close. If not, there is a learning opportunity, since either the plan was off or the execution of the plan was off.

So, impact and ROI are important for high-level goals. How about lower-level goals and day-to-day business needs? Here, I think the answer depends. Most organizations have limited resources, so those resources should be focused first on the most important organizational goals. Plus, lower-level goals may not have easily identifiable outcomes, and thus it becomes harder to identify impact. Some learning, like compliance training, onboarding, and basic skills training, is simply essential, and there is no need for a business case or ROI. For this essential learning the goal should be to conduct it as effectively and efficiently as possible. So, employ Levels 1-3 to measure effectiveness and focus on minimizing time and cost to be efficient.

In conclusion, impact and ROI are important for learning professionals, organizational goal owners, and senior leaders like the CEO. These measures are important upfront to help plan the level of effort required and to reach agreement with the goal owner on roles and responsibilities. The higher the desired impact and ROI, the greater the effort will have to be. Once the planning is completed and the program is underway, the targets for impact and ROI can be compared to year-to-date results to determine if mid-course corrections are necessary. In other words, this is not a set-it-and-forget-it exercise. Programs must be managed monthly to achieve the planned impact and ROI. Last, at the end of the program, once the impact and ROI have been estimated, there will be an opportunity to learn from any variances to plan and improve.

ROI: Still Evoking Confusion and Controversy

The Return on Investment (ROI) concept for learning has been around since Jack Phillips introduced it about forty years ago. You might think by now that at least the confusion over its use would have diminished even if the controversy had not. Not even close. I continue to see the concept abused and misused on at least a monthly basis.

Here is an example from a recent blog: “Anyone who is involved in the business world is familiar with the concept of ROI. It’s punchy, with its relatively simple calculation, and can make or break a purchasing decision. But when it comes to learning initiatives, gathering the necessary data to calculate ROI is difficult, to put it mildly.” The author goes on to say that learning initiatives implemented as an integral part of business strategy can be measured by the success of that strategy.

There are several issues with the blog. First, the author appears to be confusing ROI used in the business world with ROI used in the learning world. Typically, a financial ROI is calculated to make the business case for an investment, like a new product line or facility. The definition of this financial ROI is not the same as the learning ROI. The numerator of the financial ROI is the net present value (NPV) of the increase in profit due to the investment. The denominator is the cost of the asset required for the project, such as the cost of a new facility. This will be capitalized as an asset on the balance sheet and depreciated each year.

Contrast this with the learning ROI, which (usually) has one year of contribution to profit in the numerator (so no need for NPV) and no asset whatsoever in the denominator. Instead, the denominator is the cost of the learning initiative, which is an expense item from the income statement. So, the two definitions differ, and the calculation for the financial ROI is actually more complicated than that for the learning ROI. Interestingly, it was exactly this difference in formulas that led my colleagues in accounting at Caterpillar to tell me that I could not keep referring to learning ROI as “ROI,” since it was confusing our leaders. So, I renamed it Return on Learning, or ROL, in 2002. The takeaway here is to remember that the two are not the same and to let those in your organization know that learning ROI is calculated differently.
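
In symbols, the contrast might be summarized as follows (a simplified rendering of the two definitions described above; the financial version abstracts away depreciation details):

$$\mathrm{ROI}_{\text{financial}} = \frac{\mathrm{NPV}(\Delta\,\text{profit due to the investment})}{\text{cost of the capitalized asset}} \qquad\qquad \mathrm{ROI}_{\text{learning}} = \frac{\text{gross benefit} - \text{total cost}}{\text{total cost}}$$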

The next point by the author is that learning ROI is difficult to calculate. The ROI calculation itself is very easy: simply the net benefit of the initiative divided by the total cost. Net benefit is the gross benefit minus total cost. Generally, total cost is easy to calculate; the sticky part is the gross benefit, which is the increase in profit before subtracting the cost of the initiative. The gross benefit, in turn, depends on the isolated impact of the learning initiative, like a 2% increase in sales due just to the learning. Likely, this is what the author had in mind when complaining about the difficulty of calculating ROI.

However, isolation need not be that difficult. The learning team can work with senior leaders to agree on a reasonable impact for planning purposes. And, once the program is completed, there are several methods available to estimate actual impact that do not require special training or hiring a consultant. Often, it will suffice to simply ask the participants and their supervisors what they believe the impact was and then reduce the estimate somewhat to allow for the inherent error in any estimate. While not precise enough for an academic journal article, this is usually all we need in the business world to make appropriate decisions (like go/no-go after a pilot) or identify opportunities for improvement. It will also be close enough to determine whether the investment in learning more than covered the total costs.
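
As an illustration of this estimation approach, here is a minimal Python sketch. The figures, and the 35 percent discount for estimation error, are hypothetical choices for the example, not prescribed values:

```python
# Hypothetical learning ROI estimate from participant/supervisor impact estimates.
# All figures are illustrative; the error discount is one plausible choice.
gross_benefit_estimate = 250_000  # participants' estimate of the profit impact ($)
error_discount = 0.35             # reduce the estimate to allow for estimation error
total_cost = 100_000              # full cost of the learning initiative ($)

adjusted_benefit = gross_benefit_estimate * (1 - error_discount)  # $162,500
net_benefit = adjusted_benefit - total_cost                       # $62,500
roi = net_benefit / total_cost                                    # 0.625

print(f"Adjusted gross benefit: ${adjusted_benefit:,.0f}")
print(f"Net benefit: ${net_benefit:,.0f}")
print(f"Learning ROI: {roi:.1%}")  # 62.5% of the cost returned as net benefit
```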

Last, the author suggests that if the learning is integrated into the business strategy, its success can be measured by the success of the business goal. I strongly agree that we should always start with the end in mind (the business goal) and design the learning so it directly supports the business goal. Further, we need to work in close partnership with the business goal owner to ensure that the learning is provided to the right target audience at the right time and then reinforced to ensure the planned impact is realized. While this does provide a compelling chain of evidence that learning probably had the intended impact, it does not tell us how much impact the learning had or whether the investment in learning was worthwhile. Instead of a measurement, then, we are simply left with a “mission accomplished” statement.

The question remains how much sales would have risen without any training. Suppose sales rose 10 percent after the training. If sales would have increased by 10 percent anyway, the training clearly was not worth doing. How about if sales would have increased 9 percent without training? Was it worth doing? We still need to isolate the impact of training and calculate net benefit or ROI to answer this ultimate question of value – not to “prove” the value of the training but to enable us to make better decisions about programs going forward and avoid investing when the return does not justify the cost.

So, the debate goes on. A friend asked me recently if I still believe in ROI. Yes, I do, but we need to use it wisely. It should not be used defensively to try to prove the value of an initiative or an entire department. In other words, it is not an exercise to deploy at the end of a program or fiscal year when you are under fire. Rather it should be used to help plan programs to ensure they deliver value and to identify opportunities for improvement going forward. It should be used to help decide which programs to continue and to identify ways to optimize programs. ROI will never take the place of careful planning, building relationships with goal owners or smart execution. You will always have to get the basics right first. Once you have done that, then ROI can be a powerful tool to make you a better manager.