How Difficult is Good L&D Measurement?

In the past, I have always said that good measurement of L&D is not that difficult. However, the more we work with individuals and organizations, and the more workshops we teach, the more I find myself reconsidering, for three reasons.

1. Metrics are Not as Simple as They Appear

First, the metrics themselves. Some metrics are easy to understand and easy to measure. For example, the number of courses offered or used is easy to determine if you have a Learning Management System (LMS) and not too difficult even if you use an Excel worksheet. However, many metrics are not as simple as they seem. Take the number of participants, where the user needs to know the difference between unique participants (no duplications allowed) and total participants (the same participant may be counted more than once). Another example is program cost, which sounds straightforward but requires the user to understand how to calculate and use the fully burdened labor and related rate to cost out the hours staff spend on learning development, delivery, and management.
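As a sketch of the fully burdened rate calculation, here is a minimal Python example. The wage, burden multiplier, and hours below are all hypothetical figures chosen for illustration, not benchmarks:

```python
# Illustrative program labor cost using a fully burdened rate.
# All numbers below are assumed for the example, not benchmarks.

base_hourly_wage = 40.00     # average hourly wage of L&D staff
burden_multiplier = 1.40     # adds benefits, employer taxes, and overhead

burdened_rate = base_hourly_wage * burden_multiplier  # $56.00 per hour

# Hours staff spent on the program, by activity
hours = {"development": 120, "delivery": 40, "management": 15}
labor_cost = sum(hours.values()) * burdened_rate

print(f"Fully burdened rate: ${burdened_rate:.2f}/hour")
print(f"Program labor cost:  ${labor_cost:,.2f}")  # $9,800.00
```

The point is simply that program cost is more than wages: the burden multiplier and the inventory of staff hours both have to come from somewhere, and each is a judgment call.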

Two more commonly used metrics fall into the category of easy-to-understand but complicated to measure. Level 2 learning, for example, sounds simple. Just calculate the average test scores, and you are done. In practice, however, many participants must keep taking the test until they receive a passing score. In this case, the final score doesn’t tell you much about how well the course or the test is designed. If you want to know whether a problem needs to be addressed, you need to report the score on the first attempt (or first-time pass rate) or the number of attempts required to pass.
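The difference between final scores and first-attempt measures is easy to see with a small sketch. The attempt records below are made up for illustration:

```python
# Hypothetical test-attempt records: scores in the order each
# participant took the test. Passing score is 80.
attempts = {
    "p1": [55, 72, 81],
    "p2": [85],
    "p3": [60, 78, 83],
    "p4": [90],
}
passing = 80

final_scores = [s[-1] for s in attempts.values()]
first_scores = [s[0] for s in attempts.values()]

avg_final = sum(final_scores) / len(final_scores)
first_time_pass_rate = sum(s >= passing for s in first_scores) / len(attempts)
avg_attempts = sum(len(s) for s in attempts.values()) / len(attempts)

print(f"Average final score:  {avg_final:.1f}")             # looks healthy
print(f"First-time pass rate: {first_time_pass_rate:.0%}")  # the real story
print(f"Average attempts:     {avg_attempts:.2f}")
```

Here the average final score looks fine because everyone eventually passed, while the first-time pass rate and average attempts reveal that half the participants struggled, which is exactly the design signal the final score hides.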

Likewise, the Level 3 application rate sounds easy. Just ask participants if they applied what they learned. In many cases, however, the amount of content dictates that it would be better to ask what percentage of the content they applied, using a decile scale from 0% to 100%. Then there is the issue of when to ask. Ideally, you will have a question about intent to apply as part of the post-event survey and a question about actual application in a follow-up survey two to three months later.

Rounding out the challenge of being knowledgeable about measures is the category of metrics that are hard to understand, let alone measure. I think Level 4 impact falls into this category. To make life interesting, we have two very different versions of Level 4 in learning. Kirkpatrick defines Level 4 as results, meaning the business results, and he advocates creating a compelling chain of evidence to show that learning contributed to those results. Phillips defines Level 4 as the isolated impact of learning, which many in the profession don’t believe can be measured. Jack and Patti Phillips provide five methods to isolate the effect of learning from all other factors, but some of these require statistics and good experimental design, skills which many don’t have.

2. The Number of Measurement Metrics Can Be Overwhelming

My second reason to reconsider the ease of L&D measurement is the sheer number of metrics. We have about 200 metrics for L&D alone and more than 700 for HR. This is a lot for anyone to master.

3. There Are Not Many Good Coaches/Teachers on the Topic of Learning Measurement

My third reason is the measurement staff and their leaders. Most are not exposed to L&D or HR measurement at university, so they learn it on the job. To learn on the job, they need a good teacher. True, many take workshops and read some of the 100+ books on the topic, but this is usually not enough to master the concepts and practice of sound measurement. They need a good teacher or coach in their workplace to show them how to apply what they have learned and to answer all the real-world questions that are going to come up.

Sadly, in many organizations, there may not be a good coach or teacher to help the person new to L&D measurement. Without this coaching, the staff are unlikely to truly master measurement practice. Consequently, they will not be good coaches for those coming after them. So, as a profession, we do not seem to have reached a sustainable equilibrium where the experienced can teach and mentor the less experienced. In other words, those who should know either don’t know or don’t pass on what they know. This may explain why we continue to talk with so many who have only the most basic understanding of measurement and reporting, even though thousands have taken workshops and read books.

What do you think? Is good measurement difficult? What can we do about this?

We will continue this discussion at our Virtual Conference on November 2nd. Join us to learn more and share your thoughts.

The Critical Role of Leaders in L&D Measurement

I was talking with a colleague who had been asked to create a list of metrics for their organization to use. I asked if the leader had provided context or shared areas of interest or perhaps areas of concern where metrics would be important to better understand and manage the issue. They said no, the leader just wanted a list of metrics. Apparently, any list would do.

Unfortunately, this scenario is all too common. Leaders know they should have some measures, so they ask the measurement or analytics team to come up with a list or perhaps even a complete measurement strategy. In this case, the leader is not taking ownership of measurement and may not understand the critical role they play in the measurement and reporting process.

Ideally, measurement should be an extension and enabler of a leader’s management strategy. They should view measurement as essential to their success and champion it within the organization. As the CLO for Caterpillar, I could not succeed without measures and by extension, none of our efforts to improve learning and deliver value could succeed without measurement. We had to know our existing baseline to determine where improvement was necessary. Once we put plans in place, we had to measure monthly to know if we were on track to deliver the promised results by year-end. I literally could not manage without data.

A leader who is committed to running learning with business discipline will understand this. There are, however, good leaders who do not understand their role in measurement. They think the measurement strategy should be delegated to the subject matter experts—namely, the staff with experience in measurement and analytics. In these situations, staff will need to help their leader understand their important role in selecting measures and creating a measurement and reporting strategy. Hopefully, the leaders will be willing to listen.

In addition to conveying how important measurement is, a good leader will provide direction in measure selection and reporting. A good leader will share the measures of most interest to them, or at least share their areas of concern or focus so the measurement analyst can recommend appropriate measures. A good leader will also share the reason to measure, which will dictate the type of report to use. The measures may be used simply to inform (for example, the number of courses or participants) but may also be used to monitor, to ensure measures remain within acceptable bounds (for example, ensuring participant reaction remains above 80% favorable). The leader must provide direction by identifying the most important measures and setting the thresholds for monitoring.
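Monitoring against a leader-set threshold can be as simple as flagging any period where a measure falls outside the agreed bounds. A minimal sketch, with assumed monthly data:

```python
# Assumed monthly data: share of participants rating the program favorably.
threshold = 0.80  # leader-set floor: stay above 80% favorable
monthly_favorable = {"Jan": 0.84, "Feb": 0.82, "Mar": 0.77, "Apr": 0.81}

# Flag any month that falls below the agreed threshold
flagged = {m: v for m, v in monthly_favorable.items() if v < threshold}
for month, value in flagged.items():
    print(f"{month}: {value:.0%} favorable is below the {threshold:.0%} floor")
```

The mechanics are trivial; the leadership decision is choosing which measures get a threshold and where the threshold sits.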

Leaders may also want to measure to manage programs or initiatives to a successful conclusion (for example, reaching 90% of the employees with learning or improving the application rate for key programs to 80%). This takes a special kind of report. Here again, leaders will play an integral role by setting plans or targets for all the key measures. They should not delegate this very important task to measurement analysts. Last, leaders may want to know how good a program was (for example, did it have an impact and was it worth doing?). This also requires a special type of report, and the leader will play an important role in reaching an agreement with goal owners on plans for key measures at the start of the program.

In conclusion, leaders need measures to do their jobs and consequently, leaders must be heavily involved in the creation of a measurement and reporting strategy. Put differently, leaders are the most important users of the measures so they must be involved in the measurement strategy from the beginning and they must own it.

A Continuous Improvement Approach Designed by Learning for Learning

by Kent Barnett, CEO, Performitiv

Ten years ago, a group of professionals came together to create an industry standard for Talent Development Reporting. Under Dave Vance’s leadership, that work turned into Talent Development Reporting principles (TDRp) and the Center for Talent Reporting. This work has withstood the test of time and has become the industry standard we all desired.

We now have the opportunity to leverage TDRp and create a much-needed continuous improvement process that helps our industry optimize its impact. Other industries have created their own continuous improvement processes, such as Six Sigma and the Net Promoter System, with a dramatic impact on their contributions.

Many of us who have contributed to the development of TDRp have created a similar continuous improvement process, the Learning Optimization Model (LOM). The LOM is being deployed by several leading organizations and experts. It has six core components that can be implemented over time.

One of the key components is a new way of measuring effectiveness and impact. It leverages the simplicity and power of Net Promoter Score but is designed specifically for Talent Development. It is a great way to compare formal and informal learning while identifying ways to improve impact.

Another component, the impact process map, helps us understand how all the measures we have fit together. More importantly, it helps us understand what to do with the data.

A third component of the LOM is hard data. Many of us believe that learning needs to take ownership of its financial contributions, similar to manufacturing and sales. A vice president of manufacturing doesn’t control all aspects of gross profit, but it is a hard measure critical to managing manufacturing’s contribution. A vice president of sales doesn’t control all aspects of sales, but sales growth is hard data critical to managing impact. All business units with a P&L have workforce productivity goals, measured by revenue divided by labor cost. Shouldn’t a CLO know the workforce productivity goals for his/her customers and take ownership to help achieve them?
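The workforce productivity measure mentioned above is a simple ratio. A quick sketch, with illustrative figures:

```python
# Workforce productivity = revenue / labor cost (illustrative figures).
revenue = 250_000_000     # business unit revenue for the year
labor_cost = 100_000_000  # fully loaded labor cost for the same period

workforce_productivity = revenue / labor_cost
print(f"Workforce productivity: {workforce_productivity:.2f} "
      "dollars of revenue per dollar of labor")  # 2.50
```

A CLO who knows this ratio for each business unit can frame learning investments in terms the unit’s leaders already manage to.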

Another component of the LOM is financial modeling. Jack and Patti Phillips of the ROI Institute have been an integral part of developing the LOM. Financial modeling is designed to incorporate the ROI process by helping to analyze the best mix of resources prior to rolling out certain strategic programs.

If you’re interested in learning more about the Learning Optimization Model, join me at the CTR Conference, October 27-29, where I will go more in-depth.

2020 CTR Annual Conference


Join us at the 2020 Virtual CTR Annual Conference, October 27, for the session: A Continuous Improvement Approach Designed by Learning for Learning presented by Kent Barnett, Founder and CEO of Performitiv.


Three Measures You Should Capture Now During COVID-19

I hope everyone is healthy and managing through these very trying times. And, I hope the same for your family and friends. I know this pandemic has not been easy.

All of us by now are well into our Covid-induced new world where we have shifted from instructor-led training (ILT) to virtual instructor-led training (vILT) and eLearning, supplemented by content available through our employee portals. On top of all of these changes, you have probably had to create new courses on safety and return-to-work procedures. Perhaps you have provided guidance to your leaders about how to manage remotely in stressful times. Some have even gone into cost reduction mode to help offset the loss of income.

While you’ve been really busy, I hope you are ready to ‘come up for air’ now. If so, here are a few measurements to think about. If you have not already implemented them, there is still time to collect the data before it gets lost.

Make Time to Collect and Analyze Data

First, be sure you are collecting data on your vILT and any new courses you have put online. You may be using a new platform like Zoom that was not set up to connect with your learning management system (LMS), so the number of participants may not be recorded and participants may not receive the typical follow-up survey. In this case, be sure to set up a system to capture the number of participants manually—perhaps by asking instructors to send in counts. And consider sending a survey to at least a sample of participants, even if you have to do it manually. If you cannot easily generate the email list of participants, send the survey to a sample of all employees and make it more general. For example, ask whether they have taken a vILT course, taken an online course, or downloaded content in the last three months. Then get feedback on each.

Feedback, especially for vILT, is important so you can compare it to ILT and understand what participants like better about it as well as what they dislike. Surprisingly, some organizations are reporting equal or higher scores for vILT. And these results are from vILT that was probably rushed into production (i.e., presentations from ILT that were simply repurposed for vILT). Imagine how much better vILT could be if it were actually designed for virtual delivery. This is the data you and your leaders will need to decide whether you want to permanently shift your mix toward less ILT in favor of more vILT when the pandemic is over.

Capture Your Efficiency Savings

Second, be sure to capture the efficiency savings from switching to vILT and online courses. Typically, vILT and eLearning will be less expensive because there is no instructor or participant travel, and the course may be shortened. To be fair, you should only count the savings where an alternative to ILT was provided. Of course, there will be savings from simply cancelling ILT with no replacement, but that is not really a savings. It is just a reduction in offerings. You can estimate the cost of ILT and the cost of vILT and calculate the difference. Don’t be afraid of estimating—it’s a common business practice. If you have too many offerings to make a calculation for each course, come up with the average ILT and vILT cost per participant, find the difference, and multiply it by the number of participants.
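The aggregate calculation described above might look like this, with all cost figures assumed for illustration, as the text suggests estimates are fine:

```python
# Aggregate efficiency savings from replacing ILT with vILT.
# Per-participant costs are rough estimates, not benchmarks.
avg_ilt_cost = 900    # per participant: instructor, room, materials, travel
avg_vilt_cost = 300   # per participant: platform, instructor, shorter course
participants = 1_200  # participants who took vILT instead of ILT

savings = (avg_ilt_cost - avg_vilt_cost) * participants
print(f"Estimated efficiency savings: ${savings:,}")  # $720,000
```

Even rough averages like these produce a defensible figure to share with your CFO, and the estimate can always be refined course by course later.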

Capturing the dollar savings from switching to vILT is important because there is real value there and it should factor into your decision about continuing with a mix of learning modalities after the pandemic ends. The savings will be especially large for organizations where participants or instructors incur travel and lodging costs.

Calculate Your Opportunity Costs

Third, calculate the reduction in opportunity costs from switching to vILT and e-learning. Opportunity cost is the value of participants’ time while in class and traveling to and from class. This adds up, especially for half-day or full-day courses, and can easily exceed the accounting costs (i.e., room rental, supplies, instructor, etc.) for some courses.

Calculating opportunity cost is simple. Take the average hourly employee labor and related cost (including benefits, employer paid taxes, etc.) from HR and multiply it by the reduction in hours from switching to vILT and e-learning. For example, suppose you replaced an 8-hour ILT with a 5-hour vILT, which also saved participants one hour of travel time.  The reduction in hours would be 3+1=4 hours. Multiply the 4 hours by the number of participants and by the average hourly compensation rate. You can use the average hours for an ILT course and compare it to the average for vILT or e-learning to make the calculation at an aggregate level.
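The worked example above translates directly into a short calculation. Here the compensation rate and participant count are assumed values for illustration:

```python
# Opportunity cost saved by replacing an 8-hour ILT with a 5-hour vILT
# that also eliminates one hour of travel per participant.
ilt_hours, vilt_hours = 8, 5
travel_hours_saved = 1
hours_saved = (ilt_hours - vilt_hours) + travel_hours_saved  # 3 + 1 = 4

hourly_comp_rate = 55.00  # assumed fully loaded hourly compensation (from HR)
participants = 500        # assumed participant count

opportunity_savings = hours_saved * hourly_comp_rate * participants
print(f"Opportunity cost savings: ${opportunity_savings:,.0f}")  # $110,000
```

The same formula works at the aggregate level by substituting the average hours for ILT and vILT courses.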

It’s important to account for savings in travel time because employees’ time is valuable. By eliminating travel, you are giving time back to employees when you move to vILT or e-learning. Share the opportunity cost savings with your senior leadership, CFO, and CEO along with the savings in course costs. The opportunity cost savings, in particular, is likely to be an impressive number. Think about this savings when you decide whether to go back to ILT or keep using vILT.

I hope you find these measures helpful. You really need them to make an informed decision about what to do when the pandemic ends. I think the profession has a great opportunity to use this experience to become much more efficient and effective in the long run.

Thoughts on the Coronavirus and What It Means for L&D

The situation surrounding COVID-19 is evolving on a daily basis. Uncertainty is high and it may be weeks or even several months before we know how all this plays out. Several things are clear, though.

First, the US and the world are likely headed into an economic “timeout” and quite possibly a recession. Hopefully, this will be brief, but even a brief downturn will cause significant harm. Many in the service industries are going to lose their jobs or have their hours severely reduced. Other companies will experience falling sales as people cut back or delay their spending. Most organizations have already restricted travel. Even if the worst of the virus is over by summer and travel restrictions are lifted, companies may implement cost reduction measures for the remainder of the year to partially offset a decline in revenue from the first half of the year.

So, if your organization is one that is negatively impacted, now is the time to implement the recession planning that we have talked about for years. Be prepared to start reducing variable expenses, including the use of vendors and consultants. Where appropriate, shift from instructor-led learning to virtual ILT, e-learning, and performance support. Prioritize your work and be prepared to focus only on the most important efforts. Get advice from your governing body or senior leaders on where they would like you to focus. If you have not already done so, clearly align your programs to the top goals and needs of your CEO.

Second, COVID-19 presents an excellent opportunity to make significant progress in moving away from instructor-led learning and towards virtual ILT and e-learning. Most organizations have been moving in this direction for years but say they are not yet at their desired mix of ILT to vILT and e-learning. Well, now is the perfect opportunity to make this transition. Travel restrictions will prevent employees from traveling to a class and will also prevent instructors from flying to class locations. And social distancing discourages bringing employees together for training which is why so many universities have announced that they are shifting entirely to remote learning for at least the next month or so. The private sector should be equally responsive. Use the existing platforms to conduct classes virtually and ramp up the marketing of your e-learning and portal content.

Third, the virus also presents a once-in-a-lifetime opportunity to highlight performance support. What can we in L&D do to help our colleagues adapt to this new world and still do their jobs? People will be working remotely from home, perhaps for the first time. What performance support do they need? Most will not need a 30-minute e-learning course on working remotely but they will need steps to take to get connected. They will need help setting up virtual meetings. What can you provide them to make their life easier and to prevent a flood of calls to IT or HR for support? And, building on this, moving forward what opportunities do you have to replace some of your existing ILT or e-learning with performance support? Take this opportunity to truly integrate performance support into all that you do.

On a personal note, I wish each of you the best and hope you stay healthy. The situation will stabilize and eventually return to normal, but in the meantime let’s see what we can do to help each other through these challenging times.

Upskilling Revisited: What Strategy Makes the Most Sense?

In last month’s blog I shared my skepticism about some of the current upskilling initiatives. I have been thinking more about it since then. In fact, it is hard not to, given the near-daily references to the need for upskilling. Just this morning I saw a reference to a World Economic Forum study on the future of jobs which indicated that 54% of employees will require significant reskilling or upskilling by 2022.

Several things come to mind. First, I wonder about the terminology. How should we define these two terms? The term “upskilling” seems to suggest providing a higher level of skill while “reskilling” suggests a different skill, which may be higher, lower, or the same in terms of competency. Is that what we mean? And how does this differ from what we have been doing in the past? Haven’t you been reskilling and upskilling your employees for years? Isn’t that what the term training means? (Unless you have been “downskilling” your employees!) Is “reskilling” just the new term for “training”? Don’t get me wrong. If you can get a larger budget by retiring the term “training” and emphasizing “reskilling or upskilling”, then go for it. But let’s be careful about how we use the terms.

Second, I wonder about the methodology being used to conclude that upskilling is needed. As I mentioned last month, one of my worries is that we are not doing a needs analysis to confirm this need or identify precisely what the need is. I know some organizations are hiring management consultants to tell them what skills will be needed in the future but how do the consultants know what your needs will be? I know other L&D departments are asking their own organization leaders the same question, but how good is the knowledge of your own leaders? My concern is that this methodology will generate the type of common, somewhat vague, needs we now hear about all the time like digital literacy or fluency.

Third, what exactly do we do once we have identified these vague needs? A traditional needs analysis would identify the performance that needs to be improved and then, assuming learning does have a role to play in improving that performance, recommend training specifically designed to close the gap. If we were to provide training to improve digital fluency, exactly what will be improved? Are we back to increasing competency simply to increase competency? Training for training’s sake? So, my concern here is that even if we increase competency in these areas of “need”, nothing of measurable value will improve.

Fourth, what is the time frame? Put another way, how exactly is the question being asked of leaders about future skills? Is it about “future skills” with no time period specified, or is it about skills that will be needed in one year, five years, or ten years? When does the “future” start? It seems to me that the further out we go, the more likely we are to mis-identify the actual needs, especially given the rapid pace of change. Even if we could identify a need one or two years in the future, do you really want to develop a course today to address it, given that the need may evolve? Moreover, if you begin upskilling employees today, they are not going to have an opportunity to apply the new skill for one or two years, by which time they will have forgotten what they learned.

Here is my suggestion for discussion. Doesn’t it make more sense to focus on the skills needed today and in the very near-future and to deploy the learning at the time of need rather than months or years ahead of time? Ask leaders what skills their employees need today or will need in the very near-future, and then follow up with a proper needs analysis to confirm or reject their suggestions. If we can do a better job meeting the current and very near-future needs of the workforce, won’t our employees always be well positioned for the future? And isn’t a reskilling/upskilling initiative tailored to specific employees and delivered at the time of need likely to be far more impactful than an approach based on vague needs for a large population delivered months or years before the new skills will actually be used?

2019 TDRp Award Winners

The Center for Talent Reporting is pleased to announce the following 2019 TDRp Award winners:

Distinguished Contributor Award

Congratulations to Jack and Patti Phillips, CTR’s first Distinguished Contributor Award winners. This award recognizes outstanding and significant contributions to the measurement and management of human capital. Particularly noteworthy are Jack’s groundbreaking book, Handbook of Training Evaluation and his concepts on isolated impact and ROI. Jack and Patti’s ROI methodology has been implemented in hundreds of organizations in over 70 countries. They have forever changed the standard by which investments in human capital are measured.

2019 TDRp Award Winner for Learning Governance

Congratulations to Sun Trust, CTR’s 2019 TDRp Award winner for Learning Governance. Sun Trust earned this award for their best practices in running L&D like a business. In particular, Sun Trust provided evidence that L&D participates in a continuous, robust, enterprise-wide governance process that ensures L&D has a voice and input into enterprise priorities and funding. The governance council, moreover, has clear guidelines for funding initiatives and exercises ongoing oversight through disciplined financial budgeting, reporting, and forecasting.

2019 TDRp Award Winners for Learning Measurement

Congratulations to Texas Health and Heartland Dental who have been awarded the 2019 TDRp Award for Learning Measurement. 

Texas Health has demonstrated a depth and breadth of commitment to measurement through a Learning and Education Cabinet which directs, guides, monitors, evaluates, and reports quarterly on learning according to strategic business goals. In addition, they have established learning-oriented measures not only at the program level but also at the CEO level. They have well-defined roles for measurement and strategic responsibilities. They leverage Metrics that Matter™ technology to organize programs into business-aligned portfolios that drive measurement planning and reporting.

Daniel Gandarilla, VP and Chief Learning Officer said this about the importance of measurement to Texas Health, “Measurement is one of the most critical aspects of the learning process. It allows us to understand what has happened, the effect, and to tell a more compelling story of how we play a critical role in the success of the organization. The best stories can tell both quantitative results and qualitative experiences. This method of storytelling makes a compelling case and lets everyone know that learning is a part of the business.”

Heartland Dental has demonstrated a commitment to measurement through well-defined measurement roles and the use of a portfolio model to segment their offerings and drive consistent decisions about what, when, and how to measure their solutions. They focus on linking learning to business outcomes and demonstrating the “so what” to their business partners. In addition, they have established specific goals for their measures and incorporate those goals into their regular reporting. Finally, they leverage Metrics that Matter (MTM) to gather and report data to their stakeholders.

Meredith Fulton, Heartland Dental’s Director of Education described their measurement journey, “Prior to establishing our robust measurement approach, we collected satisfaction and ad-hoc feedback from learners through simple “Smile Sheets”. We felt great about what we were doing, but we did not have a measurement strategy in place for collecting learner or manager feedback that could drive decision making for any course or program. Shifts in our organizational leadership and focus on company growth have helped pave the way to develop an evidence-based strategy.”

After working with MTM to establish a strong measurement process, Heartland has been able to see dramatic changes and increased savings in Learning & Development. “We have also been able to create efficiencies in automating data collection and reporting. By incorporating MTM data and insights into company metrics we have been able to tell a deeper story of the impact of Learning & Development. Additional benefits include being able to socialize best practice metrics from the programs across various stakeholders and business leaders. Our compelling insights, shown in custom reports, infographics, and dashboards, have allowed us to elevate the conversation and our involvement in company decisions,” said Fulton.

Alignment Revisited

Tim Harnett, in a recent Human Capital Media Industry Insights article (undated), reminds us of the importance of aligning L&D initiatives to organizational goals. He shares some sobering research indicating that “only 8% of L&D professionals believe their mission is aligned to the company strategy” and “only 27% of HR professionals believe there is a connection between existing [learning] KPIs and business goals”. So, despite perennial intentions to do a better job of alignment, as a profession we still have a long way to go.

I am afraid, though, that he does not go far enough in recommending the actions we need to take. He suggests that “identifying and tracking KPIs related to L&D initiatives is the best way to align L&D to organizational goals and make the business case for development programs”. By KPIs (key performance indicators) he means measures like Level 1 participant satisfaction, learning hours, Level 3 application, and employee satisfaction. While these are all important measures and can indeed help make the case for learning, they actually have nothing to do with alignment.

Here is why. Alignment is the proactive, strategic process of planning learning to directly support the important goals and needs of the organization. Alignment requires L&D leaders to discover the goals and needs of the organization and then go talk to the goal owners to determine if learning has a role to play. If it does, the two parties need to agree on the specifics of the learning initiative including target audience, timing, type of learning, objectives, cost, and measures of success (ideally the outcome or impact of the initiative on the goal or need). They must also agree on the mutual roles and responsibilities required from each party for success including communication before the program and reinforcement afterward.

Measures or KPIs will come out of this process but the measures are NOT the process. It is entirely conceivable to have a learning program with great effectiveness and efficiency measures indicating many employees took it, liked it, learned it, and applied it, but the program was NOT aligned to the goals of the organization and should never have been funded. This is true even if it had a high ROI. Great numbers do not take the place of a great process, and alignment is not determined to have existed by looking back on measures or KPIs.

Conversely, you can easily imagine a program that is definitely aligned to one of the organization’s top goals but was executed so poorly that its effectiveness numbers came in very low. So, alignment is about the process of working with senior organizational leaders to plan learning initiatives which directly address their goals and needs. It must start with the organization’s goals and not with existing learning initiatives.

Last, there is much discussion these days about using employee engagement as an indicator of alignment. It is not one, for all the reasons discussed above. It is simply another measure, not a process. For engagement to be an indicator of alignment, you would have to assume that employees know the organization’s goals as well as senior leaders do, and that learning about those goals is the primary driver of engagement. Both of these assumptions are likely to be false. A focus on employee engagement would be appropriate only if engagement is the highest-priority goal of the organization. In most organizations, business goals like higher revenue, lower costs, and greater productivity are more important, although higher engagement is always a good thing and will contribute indirectly to achieving the business goals.

In conclusion, I am happy to see a focus on this important topic of alignment, but success will require us to work the process of alignment with senior leaders. At the end of this process, every L&D department should have a one-pager listing the organizational goals in the CEO’s priority order and, whenever it is agreed that learning has a role to play, a list under each goal of the key learning programs that are planned. This is how you demonstrate alignment, not through KPIs or measures.

Portfolio Evaluation: Aligning L&D’s Metrics to Business Objectives

by Cristina Hall
Director of Advisory Services – CEB, now Gartner

Using data built on standard metrics has become table stakes for L&D organizations looking to transform and thrive in a rapidly-changing business environment.

A critical challenge that remains for many organizations, however, is how to prioritize and structure their metrics so that the numbers reinforce and showcase the value L&D is contributing to the business.  It’s important to select measurement criteria which reflect L&D’s performance and contextualize them with outcome measures used by the business.

Applying a Portfolio Evaluation approach to Learning and Development provides the linkage needed to address this challenge. It is a clear, outcome-centered framework that can be used to position L&D’s contributions in business-focused terms, at the right level of detail for executive reporting.

How Does L&D Deliver Value?

Delivering training does not, in itself, deliver value.  Training is a tool, a method, to develop employees’ knowledge and skills so they will deliver more value to the business.  The value that training programs deliver aligns to four strategic business objectives.

Driving Growth

Courses aligned to the Drive Growth objective are designed to increase top-line growth, thus growing revenue and market share.  The organization tracks metrics related to sales, renewals, upsells, customer loyalty and satisfaction, etc., while L&D can track leading indicators in each of these areas based on employees’ assessment of how much these areas will improve based on each training program they have attended.  These courses are intended to drive competitive or strategic advantage by focusing on organization-specific processes, systems, products, or skillsets.  Examples include courses that are designed to increase sales, customer retention or repeat business, new product innovation, or help managers best position their teams for business growth.

Increasing Operational Efficiency

Courses aligned to Operational Efficiency increase bottom-line profitability.  The business tracks metrics related to productivity, quality, cost, etc., while L&D can track leading indicators in each of these areas based on employees’ assessment of how much these areas will improve based on each training program they have attended.  These courses are intended to drive competitive or strategic advantage by focusing on organization-specific processes, systems, or skillsets.  Examples include courses that are designed to increase productivity, decrease costs, increase process innovation, or help managers maximize bottom line performance.

Building Foundational Skills

Courses aligned to the Foundational Skills value driver are designed to both ensure that gaps in employee skills can be addressed, and demonstrate that employees can grow and develop to provide even more value to the business; it’s frequently less expensive to fill a minor skill gap than to replace an employee who is already on-boarded and semi-productive. The business tracks metrics related to bench strength, employee engagement, turnover, promotion rates, etc., while L&D can track leading indicators in each of these areas based on employees’ assessment of how much these areas will improve based on each training program they have attended. These courses tend to be off the shelf content, rather than custom designed content specific to the business. Examples include time management, MS Office, and introductory/generalized coaching or sales courses.

Mitigating Risk

Courses aligned to the Mitigate Risk value driver are designed to shield the business from financial or reputational risk by ensuring employee compliance with specific policies or maintenance of specific industry certifications.  The business tracks metrics related to safety, legal costs, etc., while L&D can track leading indicators in each of these areas based on employees’ assessment of how much these areas will improve based on each training program they have attended.  These courses tend to be focused on compliance, regulatory, and safety training, and tend to incorporate content similar to that of other courses in the organization’s industry.

Become a Portfolio Manager

Every learning asset, whether informal or formal, can be tied back to one of the four drivers of value. The variety and depth of metric collection and the performance expectations associated with those metrics differ across each of these value drivers, which is why grouping courses or learning assets into Portfolios is helpful. L&D leaders become investment managers, monitoring and managing assets that are expected to produce comparable results to affect the performance of people, who in turn affect the performance of the business.

Getting Started

  1. Align metrics to Portfolios: what is most important? What data is needed?
  2. Align learning assets to Portfolios: this ensures that the right metrics are collected.
  3. Gather the data: gather training effectiveness data from learners and their managers and combine it with efficiency data from the LMS and Finance and outcome data from the business.
  4. Review, interpret, and share: use the metrics to communicate L&D’s contribution to business goals, confirm alignment, and inform strategic decision-making.
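Step 3 of the list above can be sketched in code. This is a minimal illustration only, with hypothetical course IDs and field names that do not come from the white paper: each data source contributes a few fields per course, and the combined record is what feeds the review and interpretation in step 4.

```python
# Sketch of combining effectiveness, efficiency, and outcome data per course.
# All course IDs, field names, and values are hypothetical illustrations.

effectiveness = {  # from learner and manager surveys
    "SALES-101": {"applied_pct": 72, "satisfaction": 4.3},
}
efficiency = {     # from the LMS and Finance
    "SALES-101": {"participants": 250, "cost": 40000},
}
outcomes = {       # from the business
    "SALES-101": {"revenue_lift_est": 120000},
}

def combine(course_id):
    """Merge all three sources into one record for reporting."""
    record = {"course_id": course_id}
    for source in (effectiveness, efficiency, outcomes):
        record.update(source.get(course_id, {}))
    return record

print(combine("SALES-101"))
```

In practice the sources would be exports rather than inline dictionaries, but the idea is the same: one record per course, drawing on all three kinds of data.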

For additional detail regarding the Portfolio Evaluation approach, download our white paper, Aligning L&D’s Value with the C-Suite.

About CEB, Now Gartner

Leading organizations worldwide rely on CEB services to harness their untapped potential and grow. Now offered by Gartner, CEB best practices and technology solutions equip clients with the intelligence to effectively manage talent, customers, and operations. More information is available at


Informal Learning Evaluation: A Three-Step Approach

by John R. Mattox, II, PhD
Managing Consultant
CEB, now Gartner

You may recall that roughly 20 years ago, eLearning was seen as the next new thing. Learning leaders were keen to try out new technologies, while business leaders were happy to cut costs associated with travel and lodging. The eLearning cognoscenti claimed that this new learning method would deliver the same results as instructor-led training. They passionately believed that eLearning would become so prevalent that in-person classrooms would disappear like floppy discs, typewriters, and rotary telephones. The learning pendulum was ready to swing from the in-person classroom experience to the digital self-paced environment.

In time, the fervor surrounding eLearning waned and practical experience helped shape a new learning world where the pendulum was not aligned to one extreme or the other. Effective formal learning programs now employ a blended approach comprised of multiple components, including in-person instructor-led classes, virtual instructor-led events, self-paced web-based modules and maybe, just maybe, an archaic but valuable resource like a book.

Informal learning is the new hot topic amongst leaders in the L&D field. Three things appear to be driving the conversation: the 70/20/10 model, technology, and learners themselves. While the 70/20/10 model is by no means new—it was developed in the 1980s at the Center for Creative Leadership—it has become a prominent part of the conversation lately because it highlights a controversial thought: all of the money and effort invested to create formal learning accounts for only 10% of the learning that occurs among employees. Only 10%! Seventy percent of the learning comes from on-the-job experience and 20% comes from coaching and mentoring.

These proportions make business leaders ask tough questions like, “Should I continue to invest so much in so little?” and “Will formal learning actually achieve our business goals or should I rely on informal learning?” L&D practitioners are also wondering, “Will my role become irrelevant if informal learning displaces formal learning?” or “How can L&D manage and curate informal learning as a way of maintaining relevance?”

The second influencer—technology—drives informal learning to a large extent by making content and experts easy to access. Google and other search engines make fact finding instantaneous. SharePoint and knowledge portals provide valuable templates and process documents. Content curators like Degreed and Pathgather provide a one-stop shop for eLearning content from multiple vendors like SkillSoft, Udemy, Udacity, and Lynda.

Employees are driving the change as well because they are empowered, networked, and impatient when it comes to learning:

  • 75% of employees report that they will do what they need to do to learn effectively
  • 69% of employees regularly seek out new ways of doing their work from their co-workers
  • 66% of employees expect to learn new information “just-in-time”

As informal learning becomes more prominent, the question that both L&D and business leaders should be asking is simple: “How do we know if informal learning is effective?” The new generation of learners might respond, “Duh! If the information is not effective, we go learn more until we get what we need.” A better way to uncover the effectiveness of informal learning is to measure it.

Here’s a three-step measurement process that should provide insight about the effectiveness of most types of informal learning.

1. Determine what content you need to evaluate

This is actually the most difficult step if you intend to measure the impact of informal learning systematically across an organization. If you intend only to measure one aspect of informal learning (say, a mentoring program), then the work is substantially less. When undertaking a systematic approach, the universe of all possible learning options needs to be defined. Rather than give up now, take one simple step: create categories based on the types of learning provided.

For example, group the types of learning into categories such as:

  • Technology-Enabled Content: eLearning modules, videos, podcasts, online simulations or games
  • Documents: SharePoint resources, standard operating procedures and process documents, group webpages, wikis, and blogs
  • Internet Knowledge Portal

Create as many categories as needed to capture the variety of informal learning occurring in your organization.

2. Determine what measurement tool is best suited for each learning type

Common tools include surveys, focus groups, interviews, web analytics, and business systems that already gather critical operational data like widgets produced or products sold. Web analytics, business systems, and surveys tend to be highly scalable, low-effort methods for gathering large amounts of data. Focus groups and interviews take more time and effort, but often provide information-rich detail.

3. Determine when to deploy the measurement tool to gather information

For an eLearning module, it seems appropriate to include a web-based survey link on the last page of content. Learners can launch the survey and provide feedback immediately after the module is complete. If the content is curated by a vendor (preventing the insertion of a link on the final page of materials), then the module completion recorded in the LMS should trigger the distribution of an email with a link to the survey evaluation. Regardless of the type of learning (instructor-led, virtual, self-paced, etc.), the timing and the tool will vary according to the content.

Is it easy to implement this measurement approach to evaluate the impact of informal learning? For some organizations, maybe. For others, not at all. However, measurement is a journey and it begins by taking the first step.

For More Information…

For guidance about measuring informal learning, contact John Mattox at CEB, now Gartner. To learn more about how to improve L&D measurement initiatives, download the Increasing Response Rates white paper.


Telling a Story With Data


by Christine Lawther, PhD
Senior Advisor—CEB (now Gartner)

Data is everywhere. In our personal lives, we are continually exposed to metrics. The number of “likes” on social media, usage metrics on every bill, and the caloric breakdown of burgers at the most popular fast food chains are all examples of common metrics that society is exposed to on a regular basis.

Looking at data within a business context, data insight is in high demand. More organizations are focusing on doing more with less, so data often becomes the key element that determines decisions on goals, resources, and performance. This increase in data exposure is an opportunity for learning and development (L&D) professionals to showcase their efforts and to shift the conversation so that L&D is viewed not as a cost center but as an essential contributor to the organization’s goals.

One common challenge is that L&D teams are often not staffed with team members who have a rich background in analytics. When instructional designers, facilitators, program managers, and learning leaders hold the responsibility of sharing data, it can be challenging to translate stereotypical L&D metrics into a compelling story that resonates with external stakeholders. That is why it is valuable to tap into some foundational best practices for telling a story with data.

Structure Your Story: The Funnel Approach

If you visualize a funnel, imagine a broad opening where contents are poured in and a stem that becomes increasingly narrow. Apply this visualization as the framework to craft your story: start with broad, generic insights, and then funnel down to specifics. Doing this lets the audience follow the natural flow through diverse metrics while still grasping the overarching picture of L&D performance as a comprehensive whole. For example, it may be helpful to start by outlining overall satisfaction or utilization metrics, and then transition into something slightly more specific, such as breaking out the scores of the key courses within your portfolio that are the biggest contributors to those overall metrics. From there, you can move into more detailed metrics by delving into components such as the highest- and lowest-rated items within a course, time to apply training, barriers to training application, and insightful qualitative sentiments. At the very end of the story, conclude with the specific actions that the team plans to take. Following this approach not only paints a comprehensive picture but also creates momentum for next steps.

Speak Their Language

Metrics that L&D often focuses on (e.g., activity, cost per learner) may not easily translate into insights that resonate with external stakeholders. Each department within an organization may have its own custom metrics. However, it is imperative that a function demonstrate the linkage back to the broader organization. Doing so shows that the team is a good steward of the resources granted to it and reveals how its day-to-day efforts align with the broader organization.

So, how can you demonstrate that leadership should be confident with your decisions? Communicate your impact with metrics that resonate with decision makers. If there are any core metrics that the company tracks, identify ways to directly demonstrate L&D’s linkage to them. If you are unsure, look for organizational metrics that are announced at company-wide meetings or shared on a regular basis. For example, if Net Promoter Score is something that your organization tracks, establish a Net Promoter Score for L&D and include it in your story. If increasing sales is a priority, identify how sales training is contributing to that effort.

Strike a Balance

It can be tempting to share only successes; however, it is vital to also include opportunities for improvement. Why? Because demonstrating transparency is the key to establishing trust. A strong approach is to share an opportunity for improvement along with a few specific actions the department plans to take to address it. Doing this provides a two-fold benefit. First, it demonstrates that you are aware of opportunities to work on. Second, it shows that you have proactively mapped out a plan to address those areas.

If you are finding that your story is entirely positive, consider looking for differences within the population you support. For example, does every region/department/tenure bracket report the same level of impact? Often a team may find that on a holistic level they are doing well; however, when you dig into varied demographics, there may be an area that can drive greater value. By transparently sharing your data to outline both successes and opportunities, the learning organization can become the best at getting better.

CEB Metrics that Matter is committed to helping organizations achieve more precision in strategic talent decisions, moving beyond big data to optimizing workforce learning investments against the most business-critical skills and competencies. To learn more about how we help bridge the gap between L&D and business leaders, download a copy of our white paper, Aligning L&D’s Value with the C-Suite.


Is Learning a “Real” Profession?

One of the hallmarks of a profession is a standard language—an agreed-upon framework for measures and common processes. Think of accounting. The profession has a common language and agreed-upon measures, statements, and processes. For example, accountants have four basic types or categories to organize their hundreds of measures. You may recognize these as revenue (income), expense (cost), assets, and liabilities. Most accounting measures can be grouped into one of these categories. Accountants place these measures into three standard statements, each with a different purpose. The three are the income statement (profit & loss, or P&L for short), the balance sheet, and the cash flow statement. The three taken together provide a holistic view of an organization’s financial position and can be used to manage the organization throughout the year. Last, the profession has a host of practices taught at university, like how to define and use the measures, how to create and use the statements, and how to use measures to analyze issues and solve problems (e.g., ratio analysis, the DuPont model, and the audit process).

What does learning have that is comparable? Unfortunately, very little. Ed Trolley and David van Adelsberg suggested two categories of measures (effectiveness and efficiency) in their groundbreaking 1999 book Running Training like a Business, but the profession did not fully embrace them. We started with their work in 2010 and broke out outcome measures from effectiveness measures for Talent Development Reporting principles (TDRp) to highlight their importance. While more are adopting the three TDRp categories, you can pick up any professional publication and find the terms used inconsistently. This does not happen in accounting. Accountants have standard names and definitions for all their measures while learning professionals do not. People define and calculate the same measures differently. There is no single source of truth, no profession-wide measures library like there is in accounting.

Unlike accounting and prior to TDRp, we have had no standard reports whatsoever. Consequently, there has been no agreement on what measures go into what reports, or how many to include, or what to do with the measures and reports once they’re created. Yes, we have a number of dashboards that typically show actual results for measures like number of participants, hours, courses developed and delivered, and utilization rates, but these really don’t rise to the level of standard reports and there is no agreement in any case on what should be included or how it should be displayed.

Last, we have few standard processes. We have some for instructional design and performance consulting, like ADDIE and SAM, and standards like SCORM and xAPI, but not much agreement as a profession on most other processes. For example, although it’s recommended by all authors and leading practitioners, we don’t always start with the end in mind, we don’t always meet proactively with our CEOs to discover the coming year’s goals, we don’t always meet proactively with the goal owners to reach upfront agreement on the impact of learning and mutual roles and responsibilities, we don’t always align our learning to the organizations’ most important goals, we don’t always create SMART goals for our most important measures, and we don’t always create reports with plan and year-to-date results to let us manage our programs to deliver the agreed-upon goals. Compare these with the audit process, where all accountants know how to do an audit and where there are standard expectations for how an audit is to be conducted.

So, if we want to be a true profession, we need standards just like the accountants have. TDRp is designed to help us get there but we are just at the beginning of this long, but very important, journey. To succeed we all need to work together. Please start using the three categories of measures in your workplace. For your key programs and measures, please start using formats like those in the three TDRp reports. Let us know where you believe we need to change, refine or improve this framework. Tell us if you think there is another category which should become part of the standard or if we need a different type of report. Recently, several have asked for some example dashboards that would be consistent with the TDRp framework so we will work on this. Bottom line, learning is still a very immature field so this is a work in progress. No one has all the answers, but it is time for us to grow up and take the next important step to become a true profession.

Where Does Reporting Fall on the L&D Maturity Spectrum?

Maturity models show a journey from just beginning to collect data all the way through the use of advanced and sophisticated practices and processes. The models can be very useful in conveying the journey that must be followed from immature to mature and in helping organizations assess both where they are today and where they may wish to go in the future.

Unfortunately, since most practitioners don’t really understand what is required to run learning like a business, most of these models show reporting as the second or third step on what is often a five- or six-rung maturity ladder. Properly understood, reporting should be the next-to-highest rung. Let’s see why.

First, we need to define terms, which is where the problem begins. For most, reporting represents the creation of monthly dashboards or scorecards which show only actual results. And typically the measures are low-level efficiency and effectiveness measures like number of courses, hours, and participants, and Level 1. You have all seen these. They may show monthly or quarterly data and may contain bar charts or line graphs. Dashboards may show a speedometer. If this is how we define reporting, then I agree it belongs on the second rung, since it is merely capturing the most elementary data from the first rung and using it only to discern how the measure is trending. Authors of these maturity models often go to great lengths to contrast this low-level “reporting” with higher-level predictive analytics, which these days is almost the highest rung. So, the model encourages you to move beyond elementary reporting (which is good!) to higher-level measures (like Levels 3-5, which is also very good!) to big data and predictive analytics (which may or may not be so good), all the while never mentioning the use of reports for management of the department or program.

If we instead define reporting in line with Talent Development Reporting principles (TDRp), the whole model changes. In TDRp, executive reports are never simply a compilation of actual results (history). A report must include the plan (target or goal) for that measure, year-to-date (YTD) results, a comparison of YTD results to plan, and ideally a forecast of how the year is likely to end if no special (unplanned) action is taken. The sole purpose of the report is to answer the two basic questions every manager should ask for every important measure: (1) Are we on plan year to date? and (2) Are we going to end the year on plan? Trend for the year and comparison to previous years are interesting but largely irrelevant after the plan has been set for the year. Trend and history would have been used to set an achievable, reasonable plan, but after the plan is set, all that matters is whether you can achieve it. So, these reports are absolutely indispensable to managing key programs and the department as a whole. A good manager simply cannot manage without them.
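The structure of such a report line can be sketched in a few lines of code. This is an illustration only: the plan, the monthly actuals, and the simple run-rate forecast are all hypothetical, and a real forecast might incorporate seasonality or management judgment rather than a straight run rate.

```python
# Sketch of a TDRp-style report line: plan, YTD, percent of plan,
# and a simple run-rate forecast of the year-end result.
# All numbers are hypothetical illustrations.

def report_line(measure, annual_plan, monthly_actuals):
    months_elapsed = len(monthly_actuals)
    ytd = sum(monthly_actuals)
    pct_of_plan = ytd / annual_plan * 100   # Question 1: are we on plan YTD?
    forecast = ytd / months_elapsed * 12    # Question 2: will we end the year on plan?
    return {
        "measure": measure,
        "plan": annual_plan,
        "ytd": ytd,
        "pct_of_plan": round(pct_of_plan, 1),
        "forecast": round(forecast),
    }

# Example: six months of participant counts against a 12,000-participant plan.
print(report_line("participants", 12000, [900, 950, 1000, 980, 1020, 1050]))
```

Here the run rate points to a year-end shortfall against plan, which is exactly the signal that should trigger unplanned action now rather than a surprise in December.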

In this light, good reporting should be near the top of the maturity model since it supports the active, disciplined management of the function which, in my opinion, is the top rung. I believe that this active management supported by good reporting will deliver FAR greater results and impact than big data and predictive analytics. In fact, it is not even close. Predictive analytics is typically used to discover relationships among measures which is great but usually impacts a small number of measures or projects. In contrast, the active management of all your key programs, by definition, will affect everything important you do for the year. In other words, if a CLO is trying to decide between investing in predictive analytics and disciplined management of the department using good reporting, my strong advice is to get your disciplined management in place first.  This will provide far more bang for the buck and has the potential to advance your career in management (versus in analytics).

Please take a second look the next time you see a maturity model to understand how the authors have defined the components. I bet they will have a “dumbed-down” definition for reporting and consequently have allocated it to a low rung. If you are constructing your own maturity model or journey chart, I challenge you to aspire to great management of the entire department as your highest rung.

Starting With the End in Mind

The end of the year is always a good time for reflection, and it seems particularly appropriate to reflect on starting with the end in mind. The concept sounds so simple, but most struggle with it in practice. Basically, starting with the end in mind means establishing a plan or goal at the start of the year or project and then working backwards from the goal to determine all the events that must happen to achieve the goal. All of this would be done before the year or project gets underway, including the creation of appropriate measures. This approach is advocated by all the leaders in our field, including Jack and Patti Phillips, Jim and Wendy Kirkpatrick, Roy Pollock et al. in The Six Disciplines of Breakthrough Learning, and many others.

So, what does starting with the end in mind look like in practice? Let’s consider first the type of learning that can contribute to your organization’s goals. Examples include sales training, cost reduction, quality improvement, and efforts to improve employee engagement and leadership. Most learning professionals do set a plan at the start of the year (or course) for the number of participants (a Level 0 or efficiency measure). Is that what we mean by starting with the end in mind? Definitely not. Some set a target for participant reaction (a Level 1 effectiveness measure), like 80% will be satisfied or highly satisfied with the learning. Is that the end? No. How about a certain pass rate or average score on testing (a Level 2 effectiveness measure)? Still no. A few set a goal for how much of the learning is actually applied on the job (a Level 3 effectiveness measure). This is much better than number of participants and Levels 1-2, but it is still not the appropriate end. It doesn’t answer the question of why the learning is being conducted in the first place.

Only impact (an outcome measure which doubles as a Level 4 effectiveness measure) addresses the end, the reason for the training. Therefore, impact must be the starting point when the training is aligned to your company goals. For example, if the goal is to increase sales or employee engagement, what impact do you believe training can have on achieving the goal? A lot? A little? The greater the planned impact, the more effort and time you and the goal owner will have to dedicate to the training. And the more important it will be to set targets for all the critical steps required for success. This is where you need to decide how many employees must take the training, how satisfied you want them to be with the learning, what their minimum passing test scores must be if appropriate, and what percent must successfully apply the learning to achieve the planned impact (end). So, number of participants and Levels 1-3 are all important, but they are not the end. The end is planned impact. Start with planned impact and then work backwards to determine Levels 0-3 as well as roles and responsibilities for the goal owner and the learning department. (Note: Although the Kirkpatricks don’t focus on it, I know they would agree that the investment in learning must make sense financially. Jack Phillips introduced the Level 5 measure of ROI to address this issue. So, we can modify our “end” to be impact that makes sense financially. In other words, don’t invest in learning if it is going to cost more than the impact is worth.)
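To make the backward-planning arithmetic concrete, here is a small sketch with entirely hypothetical numbers. The impact-per-applier figure is an assumed estimate you would negotiate with the goal owner, not a benchmark from any source.

```python
# Sketch: working backwards from planned impact to Level 0-3 targets.
# All figures are hypothetical illustrations, not benchmarks.

planned_impact = 500000     # agreed dollar impact of training on the sales goal (the "end")
impact_per_applier = 2500   # estimated impact per employee who applies the learning
application_rate = 0.80     # Level 3 target: 80% apply what they learned

# Work the chain in reverse: how many must apply, then how many must attend (Level 0).
appliers_needed = planned_impact / impact_per_applier      # 200 employees applying
participants_needed = appliers_needed / application_rate   # 250 participants

print(round(appliers_needed), round(participants_needed))
```

Working the chain in reverse like this turns the planned impact into a Level 0 participant target and a Level 3 application target, which is exactly what upfront agreement with the goal owner requires.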

Let’s conclude our thoughts on starting with the end in mind by considering initiatives within the department to improve efficiency and effectiveness measures. Most learning departments focus on improving a couple of these measures each year. Examples might include reaching a higher percentage of employees, increasing the utilization of online courses, or improving enterprise scores for levels 1-3. In this case the end is indeed the number of participants or a higher application rate. Starting with this end in mind, the next question is what must we do to achieve this goal? Work backwards from your planned level of improvement to identify all the steps required along with appropriate resources (staff and budget), measures, and roles and responsibilities. You will need enough specificity to know each month whether you are on target to meet the planned improvement.

As this year comes to an end, reflect for a minute on what percentage of your programs or initiatives were truly designed with the end in mind. If the percentage is small, you might resolve to improve in 2017.

CTR Is Excited to Announce the Release of a Significantly Updated TDRp Measures Library

CTR is excited to announce the release of a significantly updated version of the TDRp Measures Library. This update focuses on learning and development measures and has expanded from 111 to 183 learning-related measures. The additions include measures related to informal and social learning as well as a more detailed breakdown of the cost measures. This version of the library also includes measures defined by the CEB Corporate Leadership Council, as well as updated references to the HCMI Human Capital Metrics Handbook and to measures defined by ATD, Kirkpatrick, and Phillips.

If you are a CTR member, you have access to the updated version at no additional charge.

If you are not a member, join CTR for $299/year.

Management: The Missing Link in L&D Today by Dave Vance

Despite great progress in so many areas of L&D, there is one area which has not seen much progress: the business-like management of the L&D department and L&D programs. Yes, department heads work to implement new LMSs on time, and program managers and directors work to roll out new programs on time, but there is still an opportunity to dramatically improve our management. Let’s look at programs first and then the department as a whole.

At the program level, a really good manager would work with the goal owner or sponsor to reach upfront agreement on measures of success for the program, like the planned impact on the goal. A really good manager would also work with the goal owner and stakeholders to identify plans or targets for all the key efficiency and effectiveness measures that must be achieved to have the desired impact on the goal. Examples of efficiency measures include the number of participants to be put through the program, completion dates for the development or purchase of the content, completion dates for delivery, and costs. Examples of effectiveness measures include level 1 (participant reaction), level 2 (knowledge check, if appropriate), and level 3 (application of learned behaviors or knowledge). A really good program manager would also work with the goal owner upfront to identify and reach agreement on roles and responsibilities for both parties, including a robust reinforcement plan to ensure the goal owner’s employees actually apply what they have learned. Today, many program managers do set plans for the number of participants and completion dates. Few, however, set plans for any effectiveness measures, and few work with the goal owner to reach agreement on roles and responsibilities, including a good reinforcement plan. Virtually none use monthly reports which show the plan and year-to-date results for each measure, and thus they are not actively managing their programs for success in the same way as their colleagues in sales or manufacturing.

At the department level, a really good department head or Chief Learning Officer would work with the senior leaders in the department to agree on a handful of key efficiency and effectiveness measures to improve for the coming year. Then the team would agree on a plan or target for each as well as an implementation plan for each, including the name of the person responsible (or at least the lead person) and the key action items. Examples of efficiency measures to manage at the department level include the number of employees reached by L&D, the percentage of courses completed and delivered on time, the mix of learning modalities (like reducing the percentage of instructor-led learning in favor of online, virtual, and performance support), utilization rates (of e-learning suites, instructors, or classrooms), and cost. Examples of effectiveness measures to be managed at the department level include level 1 participant and sponsor reaction, level 2 test scores, and level 3 application rates. Both the efficiency and effectiveness measures would be managed at an aggregated level, with efficiency measures summed across the enterprise and effectiveness measures averaged across the enterprise. A really good CLO would use monthly reports to compare year-to-date progress with plan to see where the department is on plan and where it is falling short. A really good department head would use these reports in regularly scheduled monthly meetings to actively manage the department and ensure successful accomplishment of the plans set by the senior department leadership team. Today, very few department heads manage in this disciplined fashion, with plans for their key measures, monthly reports which compare progress against plan, and monthly meetings dedicated just to the management of the department, where the reports are used to identify where management action must be taken to get back on plan.
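
The monthly plan-versus-actual discipline described above can be sketched in a few lines. The measure names and figures below are hypothetical, chosen only to illustrate the kind of report a CLO might review each month.

```python
# Minimal sketch of a monthly report comparing year-to-date (YTD) results with plan.
# Measures and figures are hypothetical, for illustration only.

def status(ytd_plan: float, ytd_actual: float) -> str:
    """Flag measures that are behind plan so management action can be taken."""
    return "on plan" if ytd_actual >= ytd_plan else "BEHIND"

measures = [
    # (measure, annual plan, YTD plan, YTD actual)
    ("Employees reached", 10_000, 5_000, 4_600),
    ("Courses delivered on time (%)", 95.0, 95.0, 97.0),
    ("Level 3 application rate (%)", 60.0, 60.0, 55.0),
]

print(f"{'Measure':<32}{'Plan':>10}{'YTD Plan':>10}{'YTD Act':>10}  Status")
for name, plan, ytd_plan, ytd_actual in measures:
    print(f"{name:<32}{plan:>10,}{ytd_plan:>10,}{ytd_actual:>10,}  {status(ytd_plan, ytd_actual)}")
```

The rows flagged BEHIND are exactly where the monthly management meeting would focus its corrective action.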

In conclusion, there is a great opportunity to improve our management, which in turn would enable us to deliver even greater results. This requires getting upfront agreement on the key measures, on the plan for each one, and on the action items required to achieve each plan. For the outcome measures, it also requires reaching agreement with the goal owner on mutual roles and responsibilities. Once the year is underway, good management also requires using reports in a regular meeting to identify problem areas and take corrective action. Our colleagues in other departments have been doing this for a long time, and with good reason. Let’s get on board and manage L&D like we mean it.

The Promising State of Human Capital Analytics (Webinar by ROI Institute, Co-Sponsored by CTR)

Talk with any human capital executive about the field of human capital analytics and they will generally agree: the best is yet to come. That doesn’t mean that many companies aren’t already performing incredible feats with people data (a few are profiled in this report); the statement is a testament to the opportunity that most can see in this burgeoning field. And it’s a testament to the constant new and innovative ways professionals are using people-related data to impact their organizations. This study surveyed analytics professionals in organizations of all sizes worldwide and asked very pointed questions on how those organizations are using human capital analytics today, if at all. The results were more affirming than they were surprising:

  • Budgets for human capital analytics are increasing along with executive commitment.
  • Relatively few companies are using predictive analytics now, but expect to in the future.
  • Most are using analytics to support strategic planning and organizational effectiveness.

Successful companies tend to be those that purposefully use data to anticipate and prepare rather than to react to daily problems.

It’s clear from both the survey data and the follow-up interviews that were conducted that the future focus of professionals in the human capital analytics field will increasingly be on using analytics to guide strategic decisions and affect organizational performance. To sum up the state of human capital analytics in one word: Promising.


This webinar introduces the new research study that demonstrates the progress that continues to be made with human capital analytics. Participants will learn:

  • Four key findings on the state of human capital analytics
  • How high-performance organizations are building leading human capital analytics
  • What Google, HSBC, LinkedIn, and Intel are doing to drive business performance through analytics
  • What you can do to move your analytics practice forward

We want to thank Patti Phillips and Amy Armitage for the opportunity to co-sponsor this webinar. The recording and PPT have been made available to anyone who missed the webinar.




Webinar: Innovation in L&D: Building a Modern Learning Culture

Tuesday, July 19 @ 11:00 a.m.–12:00 p.m. CST

Join Caveo Learning CEO Jeff Carpenter and a panel of forward-thinking learning leaders from Microsoft, McDonald’s, and Ford as they explore innovation in L&D.

More and more, stakeholders throughout the business are bypassing the learning function to create learning outside the learning & development organization. To win back the hearts of these stakeholders (and win a bigger share of the organizational budget), learning leaders must deliver solutions that are exciting, cutting-edge, efficient—in a word, innovative.

By pushing beyond their comfort zone to embrace new ideas, concepts, and technologies, L&D organizations ensure their continued relevance and enhanced ability to deliver tangible business results.

In this webinar, you’ll learn how to:

  • Foster a culture of innovation and creativity in your learning organization
  • Reexamine and reconfigure outdated training through a lens of strategic innovation
  • Develop innovative training and eLearning programs within the confines of business processes and templates

Register today!

Webinar Presenters

Rich Burton, Group Project Manager at Microsoft

Jeff Carpenter, CEO of Caveo Learning (CTR Advisory Council Member)

Gale Halsey, Chief Learning Officer of Ford Motor Company

Rob Lauber, Chief Learning Officer of McDonald’s Corporation

Who Should Attend

Chief Learning Officers (CLOs), VPs of Training, Training Directors and Managers, Human Resources VPs and Directors, CEOs, and COOs.


  • “You personally meet and interact with the innovative thinkers who are shaping the future of L&D, as well as other like-minded, forward-looking L&D leaders. This is the conference to attend if you are looking to take the next steps to becoming a more effective business partner!” – Don McGray
  • Talent analytics, especially when it comes to learning and development, is getting increased focus. It’s as if the business schools have added it to their curriculum, the way the C-suite is starting to expect meaningful reporting and evidence of contribution. The wave has been building for a while now, and we need to get on quickly so we can ride it successfully. There is no better way to do this than to learn from the industry experts, the ‘inventors’ of the methodology, and those that have seen success. The CTR conference will expose you to lessons, some hard lessons and others inspiring messages of success; either way, valuable insight to enhance what you do when you get back to your organization.
  • “The opportunity to gain knowledge, reaffirm efforts that we do have in place, as well as brainstorm solutions with others who share similar struggles and/or concerns was invaluable. I found it to be a highly useful, practical experience, bookended by industry leaders motivating us to keep thinking strategically and progressively.” – Paul Scott, M.A., Manager, Learning and Development, UT Southwestern Medical Center
  • I was blown away by the quality of the speakers. I also walked away with very valuable tools and resources that I put into use immediately.
  • CTR Week was the single most impactful, relevant, useful, and exciting conference I’ve attended in 25 years! For 2 years, based only on reading the website and attending 2 webinars, I’ve been gushing about CTR to clients in 15 Asia Pacific countries and 3 North American nations. I attended the 5-day CTR Week to “seek first to understand” human capital measurement in my new role as VP of a software training company. As a result of the conference, my new personal business goal is TDRp certification by December 2016. My new corporate goal is TDRp accreditation for my company by June 2017. These will be ranked priorities! We hope to build TDRp-accredited software to help our clients measure the impact of our services. We are so “all-in” on this that I have begun using the concepts with my family of 4 kids. We already set goals. Now, thanks to CTR, we will prioritize them and track our progress using cool new tools – fun, not work – trust me! Thank you, Dave & Andy Vance, Peggy Parskey, Kevin Jones, Cushing Anderson, Jeff Higgins, Jean Martin, Gene Pease, Laurie Bassi, Alwyn Klein, Carrie Beckstrom, Jac Fitz-enz, Patti Phillips, and Ed Trolley for enriching my life both at work and at home! – Sincerely, Mike Reid, Vice President, DevelopIntelligence
  • As a student, the CTR Conference is beneficial for an upcoming young professional. It is a great way to network and learn from mature practitioners and scholars in the field. The vast amount of knowledge that I gained from the conference is unforgettable. I look forward to applying what I learned in my new organization.
  • The CTR conference provided an opportunity to hear not only the What but the How and Why of HR analytics and measurement. The speakers were incredibly thought-provoking, and even if you are familiar with the concepts, the ideas discussed in the sessions will give you ideas on how to improve your processes.
  • Every Talent Management professional can benefit from the CTR conference. You will not only learn how to run your department like a business, but you will learn how to use data to improve your effectiveness.
  • The CTR Conference offered dynamic and thought-provoking information on using analytics and building measurement programs for L&D that support the business. The better we in the L&D community align our function with the business goals, the more we will become valued business partners; this conference provides the tools to carry that out.



What topics or speakers should we consider for the 2017 CTR Conference?


  • Repeat speakers
  • Everyone who presented this year was great! Dave is awesome and super-engaging.
  • More from Laurie Bassi and Jeff Higgins. The Financial Acumen session was great; I would recommend that, and from Laurie, more case studies.
  • New speakers
  • Perhaps consider inviting someone from the Univ. of North Texas or other area college (if in Dallas) who has an academic tilt to the topic.
  • I would be interested in hearing from Josh Bersin and any other leader who helped put CTR on the map.
  • Case studies from industry leaders in talent management, measurement, and talent analytics (Disney, Southwest, TMobile, Amazon, Google, and Starbucks) with a mixture of organizations represented (retail, automotive, manufacturing, public sector, etc.).
  • Get an MTM client who is using the executive reporting module and putting action to the talk in a sustainable way.
  • ADP – Susan Hanold, PhD. She has spoken at ATD several times and is interested in sharing trends and best practices from clients.
  • YES!!! – Jorge Leuro and Todd Kruger, competency experts with Halliburton, they have published numerous SPE papers on this subject and are industry known leaders in competency measurement.
  • CTR Members beyond the Board of Directors or Advisers.
  • Make sure there are non-L&D breakouts at each time frame.
  • Go after current cutting edge organizations that are redefining the training and learning experience. I mentioned Amazon, Khan, Udemy, how about the military, how about SAP discussing gamification, how about colleges with distance learning platforms, or even GA Tech to discuss their MOOC format of their fully accredited MS in CS degree.  How about Tim Ferris where he teaches you how to be world class in the least amount of time and the implications his approach could have on training programs.  Have someone from PeopleDev at Google talk about their informal learning program, and not here’s why it’s hard for companies. I guess I’m saying let’s not talk that there is a problem, let’s find more of those who have figured it out so that can give the attendees more options in our minds for how to rise to the challenge.
  • Invite Alexis Fink, director of Intel’s Talent Intelligence and Analytics group
  • leadership development impact
  • Speakers
  • Esko Kilpi – foundations for Post-Industrial Work
  • Workforce optimization technology solutions, e.g. measuring the impact of augmented reality used to “upskill” workers
  • Folks from:
  • Tamar Elkeles – whose book The Chief Learning Officer has inspired me for 4 years
  • More practitioners from businesses and less from consulting agencies.
  • Success stories or change management
  • From a success standpoint, who has implemented TDRp in their company? Not only success stories, but approaches that have not worked, or obstacles that stopped the process.
  • The TDRp needs more success stories – showing the value in all areas of optimization of your deliverables – not JUST alignment which seems to be the heavy focus. TDRp allows one to increase efficiency and effectiveness and we need to see how that assists people running talent development teams
  • More speakers with stories of how they implemented TDRp or human capital analytics in the work place.
  • Present some case studies — perhaps from Carrie Beckstrom — along with a demonstration of real-live tools and techniques to help practitioners succeed in their effort.
  • Other topics
  • I would like to see them talk about security and sensitive information protection.
  • More about human capital analytics in general (along with the basics of TDRp)
  • HR Analytics as a whole
  • An ROI workshop.
  • More complete, comprehensive coverage of the topic of change management. “Just use your company’s existing change management approach” is a cop-out and doesn’t cut it. The questions from the audience during the conference indicated that attendees wanted more specifics on this subject.
  • If there has been any movement (actual implementation) on any of the 4 disruptive ideas, then I would like to see a speaker discuss what has happened. Regardless, I’d still like to hear an update from Laurie Bassi on how the SEC may require human capital reporting.
  • I think information on Gamification would be interesting.





What were the most beneficial aspects of the CTR Conference?

  • Conference focus
  • That it was entirely devoted to measurement and evaluation.
  • Measurement conversations
  • Networking
  • Networking with like-minded professionals in the same field.
  • Sessions and networking events
  • The most beneficial aspect was the ability to network with like-minded professionals.
  • The classes and networking opportunities were great.
  • Access to experts, practitioners and materials
  • The real examples, learning from experts, access to the slides.
  • Getting an electronic copy of the slides as well as hard copies of the slides for note taking is above the norm.
  • The materials that were provided in the ROI workshop were great…straightforward, easy to understand and implement, etc. Then, those concepts were reinforced all along the way of the conference. I left with a great deal of confidence knowing not only WHAT to implement, but HOW.
  • Caliber/expertise of the speakers
  • Credibility of the speakers in having been CLOs/practitioners and actually walking the talk. Getting practitioners on stage and taking the journey with lessons learned. Loved the format and breakout sessions. Quality of speakers.
  • Experts sharing implementation experiences
  • The wealth of knowledge, practical tips and opportunities to learn from industry leaders as well as share experiences and struggles with like-minded and like-positioned professionals.
  • The knowledge and perspectives shared by thought leaders.
  • The caliber of the attendees and speakers.
  • The speakers and breakout group leaders were amazing. They were knowledgeable and engaging. Fantastic, especially for a subject many don’t find that interesting! 🙂
  • Hearing from the experts in the field – 5 Disruptive Ideas and HOW to move forward (like the ROI session and Laurie Bassi’s session). The conference was very thought-provoking. Even though I already went through the TDRp workshop, it added additional insight.
  • Session topics/content
  • Sessions overall
  • The sessions I attended were well put together and informative, appreciated presenters willingness to answer questions
  • several of the break-out sessions and the information covered
  • Variety of breakout sessions
  • Breakout sessions
  • It was really simple. The presentations were not overly complex and easy to follow.
  • I attended CTR Week. All 4 events dovetailed nicely together.
  • Industry discussions by people who have ‘been there, done that’. Financial practices were specifically applied to L&D.
  • I got something out of every session!! I really appreciated having the expert panels, learning tips for implementation, etc.
  • Specific sessions or content
  • TDRp workshop. Networking with like-minded folks. The order of events was perfect! I attended the whole week. Hearing about the “Wall Street” view from Laurie (ISO committee) and her Human Capital conference (this week). Hearing Kevin Jones’ awesome survey, learning his PwC history and passion, understanding the history of accounting, and mostly, getting Kevin’s viewpoint on the technology players in the industry. I’ve never been so fired up about the ro
  • Gene Pease breakout on analytics.
  • The session with Patti Phillips, the sessions with a panel.
  • The Measurement, Evaluation, and Analytics breakout by Patti Phillips.
  • The panels where they talked about the journey to implement training measurement and the changes they were able to make because of it. Also how to tie TDRP, Kirkpatrick, and Phillips together.
  • Access to models and examples of how others have achieved success
  • new content about measuring the impact of TD
  • Understanding the TDRp tools and how they are being applied or deployed in other organizations. Listening to common concerns along with possible recommendations for how to deploy in my organization.
  • Specific insight from sessions.
  • Using the Kirkpatrick model to guide consulting process.
  • ROI is NOT a unicorn!
  • Valuation is driven by human capital. Human capital appreciates.  Orgs become more valuable as their people learn.
  • CFO allocates budget based on most reasonable biz case. If OD doesn’t present a solid business case, the organization under-invests in training. Sales targets are missed.  Valuation declines.






CTR Conference A Great Success


Our 2016 CTR Conference held in Dallas last month was a great success! We had 134 participants, almost double the number at our last conference and far more than we had hoped for. If you were not there, you really missed a great gathering. Everyone there was interested in measurement, reporting, and management, so there was a tremendous amount of focused energy, sharing, and interaction. Cushing Anderson from IDC and Ed Trolley from NIIT provided the keynotes to start off each day, and they were not shy about sharing their thoughts! We had a great panel on Disruptive Ideas and a very nice tribute to Jac Fitz-enz, the father of human capital analytics. There were 16 breakout sessions hosted by industry thought leaders over the two days, and our participants often struggled to decide which session to attend. Unlike many conferences, most of our speakers came for more than just their session and even asked questions of their colleagues during other sessions, which livened things up considerably. Finally, we followed the 1.5-day conference with two half-day workshops: CLO for a Day and Strategic Alignment. Attendees really had fun with CLO for a Day, which is the only computer-based simulation for our profession. We are starting to plan next year’s conference, so watch for the Save the Date message coming soon.



We Don’t Exist to Increase Employee Engagement: By Dave Vance



A surprising number of L&D professionals seem to believe that our primary mission, our main purpose, is to increase employee engagement. They are wrong, and their mistaken belief will lead them to misallocate resources, choose inappropriate learning, and deliver suboptimal outcomes.

It is always good to start at the beginning, and the beginning for any department is to know why it exists. What is its mission or purpose? Ideally, learning leaders have thought carefully about this and have created a written mission statement or at the very least can articulate the mission informally. In most organizations, the learning department has the capability and potential to do much more than provide or facilitate learning to increase employee engagement. L&D has the potential to help the organization achieve many, if not all, of its goals. This means helping to achieve business goals like increasing revenue by 10%, reducing costs by 5%, improving patient satisfaction by 3 points, or reducing injuries by 20%. It also means helping to achieve HR goals like improving the leadership score on the annual survey by 5 points, meeting all compliance-related goals, and yes, increasing employee engagement. It means providing the basic skills needed by employees new to a position and the more advanced skills needed by experienced employees, especially in knowledge-based companies like those in consulting and accounting. Last, learning can help address many needs and challenges that fall below these high-level goals but which nonetheless must be addressed for the organization’s overall success.

So, learning has a very broad reach and can help an organization achieve many of its goals and address numerous challenges. A good mission statement should reflect this broad reach. For example, “Help our organization achieve its goals” or “Help our organization be successful”. If L&D’s mission is simply to increase employee engagement, then whose mission is it in your organization to help achieve all the other goals and meet the numerous needs and challenges that can be addressed through learning? It is true that your sales department could do its own learning, quality its own, manufacturing its own, customer care its own, and so forth. Basically, any department that needs learning does its own. That leaves L&D to address just HR goals, and maybe just employee engagement if a separate department takes care of leadership development. This is a very sad state for L&D and for your organization as a whole. Think about what this state implies. Most often, other departments do not have learning professionals, so the quality of the needs analysis and the learning is low. Someone is assigned to take care of the training needs on a part-time basis, and they may not be happy about it. And the training people in these departments are all isolated from one another, with no opportunity to pool resources, share knowledge, and specialize.

For most organizations, the learning department can have a much more powerful impact if it has a broad mission which includes helping the organization achieve its business goals and if it is organized to support more than a single department. Let’s not limit ourselves to simply addressing HR goals like increasing employee engagement. Instead, let’s address business and HR goals as well as the basic and advanced skills our employees need for success. What is your mission?

Get To Know Your Board Member (Cushing Anderson)

Cushing Anderson is program vice president for IDC’s Project-Based Services research. In this role, Cushing is responsible for managing the research agenda, field research, and custom research projects for IDC’s Business Consulting, Human Resources, and Learning research programs. Cushing has been actively investigating the link between strategic business planning and training in the extended enterprise, the value of certifications, and the impact of training on IT team performance. He has also extensively researched the value of partner certification programs to software companies, and he conducts regular research on the views and experiences of enterprises with global consulting firms. Cushing speaks on a variety of topics and has received a number of industry accolades. He is on the editorial advisory board of CLO magazine, for which he authors a regular column, and is a Center for Talent Reporting board member. Cushing holds a bachelor’s degree in Government from Connecticut College and earned his M.Ed. in Curriculum and Instruction from Boston College’s Graduate School of Education.

Join Cushing and CTR as he keynotes our Conference on February 24, 2016.

Get To Know Your Board Member (Jean Martin)

Jean Martin


(Executive Director, Talent Solutions Architect, CEB)


As executive director of CEB and CEB’s Talent Solutions Architect, Jean leads CEB’s insight and product development in talent management. Her areas of expertise span the HR spectrum and range from the future of the HR function to leadership to labor market trends. Specifically, Jean spends time working on issues relating to CEO and leadership succession, employee engagement, how companies can attract and keep the best employees, and how companies can seek out top talent globally and build out their global leadership bench.

Jean is often asked to share her knowledge in larger forums and has spoken at venues such as the Gathering of Leaders, the Economist Talent Summit, the Singapore Human Capital Summit, the Wharton Women in Business Conference, and the European Union. Jean also regularly presents to executive teams including Bombardier, Intel, Cisco, BBVA, and Eskom, among others. In addition, her work has appeared in publications such as the Associated Press, Harvard Business Review, the Economist, Fortune, The Wall Street Journal, Bloomberg Businessweek, Forbes, and Human Resource Executive Magazine.

Prior to CEB, Jean served as a special assistant to President Clinton’s Domestic Policy Council. Additionally, Jean was a Presidential Management Fellow, serving as a Special Assistant to the Senior Vice President for small business/community development banking at Bank of America. Also during her time as a PMF, she was a project manager for public-private partnerships in disadvantaged communities at the U.S. Department of Housing and Urban Development and for microfinance and microenterprise development at the U.S. Agency for International Development in Tanzania. Prior to that, she worked as a management consultant with the Institute for Human Services Management.

Jean received a Master of Public Policy with a concentration in Economics and Finance from Harvard University and a Bachelor of Arts with highest distinction from the University of Virginia. She has also served on several non-profit and public boards, such as City Year, Turnaround for Children, and the San Francisco Presidio Planning Commission. She lives in Washington, DC with her husband and three children.