Guest Blog

How to Improve Employee Engagement

by Maria A. Febres-Cordero, Channel & Events Marketing Manager, Explorance

The Annual CTR Conference is almost here! Attendees can once again look forward to an engaging three-day event featuring breakout sessions, keynotes, and informative discussions with the learning industry’s best minds. The theme of this year’s conference is “Building Sustainable Measurement Capability”, putting an emphasis on realizing long-term improvements in how HR and talent are managed and measured.

HR teams are increasingly challenged to substantiate how they contribute to business goals and demonstrate real, ongoing benefits to the organization. Explorance will be contributing two special sessions on how to realize these long-term benefits.

Two Approaches Lead to Similar Benefits

Both of Explorance’s sessions focus on building value and insight for HR. The strategies outlined in each session are about getting more value from employee feedback and assessing where the real value in current policy and programs is.

Creating a Measurement Strategy To Tell Your Value Story
November 2 | 1:40 pm EDT
Presenters: Jennifer Balcom, Director of Consulting, Explorance & Lan Tran, Head of Governance, Technology, and Operations, Kraft Heinz

This session provides insights into how Explorance and Kraft Heinz partnered to create a measurement strategy that answers the key question: Did learning make an impact and positively affect behavior outcomes? This session outlines the approach, tools, and frameworks used in building this strategy.


From Text Analytics to Comment Analysis: How AI Transforms Your Learning Measurement Qualitative Strategy
November 3 | 12:05 pm EDT
Presenter: Steve Lange, Senior Consultant, Explorance

This session introduces the next generation of HR-trained comment analysis tools and how they provide deep insights into qualitative feedback. Pulling from real-world experiences and examples gleaned from the development of Explorance’s BlueML, a comment analysis solution, Steve will reveal the level of sentiment analysis and predictive analytics that is available to provide rapid, action-focused analysis at scale.


As we progress through a period of heightened employee turnover, increasingly referred to as ‘The Great Resignation’, the ability to respond quickly and effectively to employee concerns and sentiment around training is vital. These two sessions will bring you up to speed on the latest developments and effective strategies. Leaders who adopt an effective measurement strategy and can derive value from qualitative employee comments will be better positioned to tackle the current challenges of the talent market.

We can’t wait to see you at CTR’s 8th Annual Conference!


The L&D Industry’s Answer to Measuring Leadership Development

by Kent Barnett, CEO, Performitiv

I think we would all agree that as an industry we do a very poor job of measuring, communicating, and improving the value of leadership development programs. I think most, if not all of us, would also agree that leadership development is one of the most strategic business processes in our respective organizations.

So, why is it that we do such a poor job in this area? The answer is because it’s hard and confusing. The good news is that dozens of leading learning organizations and experts have come together to create a systematic framework to measure leadership development. It is designed to work for large and small organizations and to be flexible and easy to get started with.

As many of you know, the Talent Development Optimization Council (TDOC) was created over three years ago to address two primary issues:

  1. Better Communicate Learning’s Value
  2. Optimize Learning’s Impact

TDOC created the Net Impact System (NIS) to provide a systematic approach to address these two issues. Organizations that have started to apply the NIS principles are seeing huge gains in business results, effectiveness, scrap reduction, outcome indicators, and Net Promoter Score. TDOC is now focused on leadership development.

Attend Kent Barnett’s Session, Measuring Leadership Development

Join Kent Barnett at the 2021 CTR Annual Conference where he will teach you how to build world-class dashboards and management reports for leadership development. If you are interested in learning how to use metrics to demonstrate value, then you can’t miss this event!

When: November 2, 2021
Time: 3:00 – 3:35 pm (EDT)
Speaker: Kent Barnett, CEO, Performitiv

Register for this Event Now!

About Performitiv

Performitiv provides the technology, methodology, and expertise to help understand learning effectiveness, business impact, and continuous improvement opportunities. Their technology helps to streamline, automate, and upgrade evaluation processes to save time, simplify data collection, and improve the overall effectiveness of L&D operations.

Covid Has Changed Our L&D World—Where Do We Go from Here?

by June Sonsalla,
Vice President, Head of Learning & Development,
Thomson Reuters

In 2014 I attended an event focused on Resilience at a boutique hotel near the harbor in San Diego. The day of the conference, it was a temperate 70°F, the palm trees were bending in the breeze, and the air smelled of saltwater and wildflowers. In a pre-pandemic world, resilience lacked urgency. At best, it was a new way to describe change readiness. At worst, it put the onus on employees to manage increasing work volume with the same resources. That day back in 2014 seems like a lifetime ago. I’ve since moved to the Midwest, there’s a different president in the White House, and I must wear a mask while in the grocery store.

Fast-forward to 2020. In March of this year I started a new position at Thomson Reuters and only spent nine days in the office before moving to my home office indefinitely. A global pandemic, increased focus on social justice, and inclusion as a critical input for all talent processes require a completely different way of working to get the same results as in prior years. In addition, adults are reporting more adverse mental health conditions as they navigate uncertainty. All of these variables mean new demands on the human resources function and on each of us individually. If I had a crystal ball, I would have taken better notes during that Resilience workshop I attended back in 2014.

So, how do we overcome new obstacles, move forward, and contribute as much or more value to the organization than before? What matters most to our organizations now? What measures do we use to ensure continued progress and long-term success? And, how do we move ourselves and human resources from merely “getting by” to excelling in the new normal?

A Special Invitation…

I invite you to explore the answers to all of these questions with me at a very special event… the 2020 CTR Conference. During the session, Covid-19: Challenges, Change, and Continued Innovation, a few of my L&D colleagues and I will discuss how the pandemic has accelerated long-term trends creating permanent transformations in what and how L&D delivers strategic value. Hope to see you there for a lively discussion.

Measure and Appreciate Talent Outcomes

by Jeffrey Berk, COO, Performitiv

Learning’s ‘holy grail’ is to measure business outcomes. These include measuring how common business results are connected to learning. Results such as revenue, expenses, quality, productivity, customer satisfaction, risk, cycle time, and employee retention are common examples of business outcomes where L&D desires to show its connection.

However, let’s not forget about important talent outcomes. Talent outcomes are not the business results mentioned above, yet they play a very important role in measuring learning’s impact, especially for strategic, visible, and costly programs. Over the past year, Performitiv has partnered with L&D experts to identify the main sources of talent outcomes, such as culture, engagement, leadership, knowledge and skills, and quality of hire. These outcomes are extremely important to executives for programs such as leadership and onboarding, as well as programs that are part of material change or reorganization.

The next challenge is to figure out how to measure these talent outcomes. Unlike operational business outcomes, talent outcomes are not as clear or obvious. They require evidence of impact rather than perfect or precise statistical measures of it. As a result, we can ask questions on evaluations to understand whether programs were connected to the talent outcomes.

Here are some sample questions of talent outcome indicators on evaluations:

  • Culture: As a direct result of this learning, I feel more connected to our organization’s values and beliefs
  • Engagement: As a direct result of this learning, I am more motivated and committed to this organization
  • Leadership: As a direct result of this learning, I feel I am a significantly better leader for this organization
  • Knowledge and Skills: As a direct result of this learning, I am confident I will significantly improve my job performance
  • Onboarding/Orientation: As a direct result of this learning, I am confident and comfortable to perform my job requirements at a high level

While not perfect, the above guidance may be used to build your own talent outcome indicator questions, as they are reasonable evidence of impact. For example, the Engagement indicator question may be used to help an executive understand whether participants in a program felt significantly more motivated and committed to be with the firm as a result of participating in a program. The program manager can then look to learning objectives, exercises, and support materials to show the connection of the program toward engagement.
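To make the indicator idea concrete, here is a minimal sketch of how responses to questions like those above might be rolled up into a percent-favorable score. The 5-point Likert scale and the "4 or 5 counts as favorable" convention are my own illustrative assumptions, not part of Performitiv's methodology.

```python
# Hypothetical sketch: scoring a talent outcome indicator question from
# 5-point Likert responses (1 = Strongly Disagree ... 5 = Strongly Agree).
# The favorability threshold (>= 4) is an assumption for illustration.

def percent_favorable(responses):
    """Share of respondents answering Agree (4) or Strongly Agree (5)."""
    favorable = sum(1 for r in responses if r >= 4)
    return round(100 * favorable / len(responses), 1)

engagement = [5, 4, 3, 4, 2, 5, 4, 4]  # sample responses to the Engagement question
print(percent_favorable(engagement))   # → 75.0
```

A program manager could track this score across cohorts to see whether changes to learning objectives or support materials move the engagement indicator.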

2020 CTR Annual Conference

Learn More about Learning Measurement to Track Impact

Jeffrey Berk will be discussing Learning Measurement in depth at the 2020 Virtual CTR Annual Conference, October 27-29, 2020. Join him for his session: Learning Measurement Using Impact Process Mapping, October 28 at 2:00 pm (ET). Registration is completely free.


A Continuous Improvement Approach Designed by Learning for Learning

by Kent Barnett, CEO, Performitiv

Ten years ago, a group of professionals came together to create an industry standard for Talent Development Reporting. Under Dave Vance’s leadership, that work turned into Talent Development Reporting principles (TDRp) and the Center for Talent Reporting. This work has withstood the test of time and has become the industry standard we all desired.

We now have the opportunity to leverage TDRp and create a much-needed continuous improvement process that helps our industry optimize its impact. Other industries have created their own continuous improvement processes, such as Six Sigma and the Net Promoter System, with dramatic impact on their contributions.

Many of us who have contributed to the development of TDRp have created a similar continuous improvement process, the Learning Optimization Model (LOM). The LOM is being deployed by several leading organizations and experts. It has six core components that can be implemented over time.

One of the key components is to introduce a new way of measuring effectiveness and impact. It leverages the simplicity and power of Net Promoter Score but is designed specifically for Talent Development. It is a great way to compare formal and informal learning while identifying ways to improve impact.

Another component, the impact process map, helps us understand how all the measures we have fit together. More importantly, it helps us understand what to do with the data.

A third component of the LOM is hard data. Many of us believe that learning needs to take ownership of its financial contributions similar to manufacturing and sales. A vice president of manufacturing doesn’t control all aspects of gross profit, but it is a hard measure critical to managing manufacturing’s contribution. A vice president of sales doesn’t control all aspects of sales, but sales growth is hard data critical to managing impact. All business units with a P&L have workforce productivity goals measured by revenue divided by labor cost. Shouldn’t a CLO know the workforce productivity goals for his/her customers and take ownership to help achieve them?
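The workforce productivity measure described above is simple arithmetic: revenue divided by labor cost. A quick sketch, with illustrative figures only:

```python
# Workforce productivity as defined above: revenue / labor cost.
# The dollar figures below are illustrative, not from the article.

def workforce_productivity(revenue, labor_cost):
    """Dollars of revenue generated per dollar of labor cost."""
    return revenue / labor_cost

# A business unit with $50M revenue and $10M labor cost:
print(workforce_productivity(50_000_000, 10_000_000))  # → 5.0
```

A CLO tracking this ratio for each internal customer over time could show whether learning investments coincide with productivity gains.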

Another component of the LOM is financial modeling. Jack and Patti Phillips of the ROI Institute have been an integral part of developing the LOM. Financial modeling is designed to incorporate the ROI process by helping to analyze the best mix of resources prior to rolling out certain strategic programs.

If you’re interested in learning more about the Learning Optimization Model, Join me at the CTR Conference, October 27-29, where I will go more in-depth.

2020 CTR Annual Conference


Join us at the 2020 Virtual CTR Annual Conference, October 27, for the session: A Continuous Improvement Approach Designed by Learning for Learning presented by Kent Barnett, Founder and CEO of Performitiv.


Optimizing Investments in Learning

by Peggy Parskey, Associate Director, Center for Talent Reporting & John Mattox, II, PhD, Managing Consultant, Metrics That Matter

You might wonder why L&D leaders need guidance for optimizing investments in learning. Recent research from The Learning Report 2020 (Mattox, J.R., II & Gray, D. Action-Learning Associates) indicates that L&D leaders struggle to convey the impact of L&D. When asked what they would do better or differently to communicate value to business leaders, 36% of L&D leaders indicated they would improve processes related to communication. More telling, 50% indicated they would improve measurement. To convey value, leaders need a story to tell. Without measurement there is not much of a story.

This is where Talent Development Reporting principles (TDRp) are so relevant. It begins with a measurement framework that recommends gathering three types of data: efficiency, effectiveness, and outcomes. Efficiency data tells the story of what happened and at what cost. How many courses did L&D offer? How many people completed training? How many hours of training did learners consume? What costs did we incur? Effectiveness data provides feedback about the quality of the learning event (e.g., instructor, courseware, and content) as well as leading indicators of the success of the event (e.g., likelihood to apply, estimates of performance improvement, estimates of business outcomes, and estimates of ROI). Outcomes are the business metrics that learning influences such as sales, revenue, customer satisfaction, and employee satisfaction. TDRp recommends gathering data for all three types of measures so L&D leaders can describe what happened and to what effect.

The table below (Table 11.1 in Learning Analytics) shows a variety of measures for each category.

Key Performance Measures for L&D

Efficiency Measures

  • Number of people trained
  • Number of people trained by learning methodology (instructor-led, e-learning, virtual)
  • Reach (percentage of people trained in the target population)
  • Cost of training per program
  • Decrease in costs
  • Cost of training per learner
  • Cost of training per hour

Effectiveness Measures

  • Satisfaction with training
  • Knowledge and skills gained due to training
  • Intent to apply learning on the job
  • Expectation that training will improve individual performance on the job
  • Expectation that individual performance improvement will lead to organization performance improvement
  • Return on expectations

Outcome Measures

  • Increase in customer satisfaction
  • Increase in employee performance
  • Decrease in risk
  • Increase in sales
  • Increase in revenue
Source: Learning Analytics ©2020, John R. Mattox II, Peggy Parskey, and Cristina Hall. Published with permission, Kogan Page Ltd.

In addition to this guidance about what to measure, TDRp provides guidance on how to report results using methods familiar to business leaders. Using these measurement and reporting approaches, L&D leaders can tell a robust story to business leaders that they can connect with.

In April 2020, Learning Analytics, Second Edition (by John Mattox, Peggy Parskey, and Cristina Hall) was published. The book helps L&D leaders improve the way they measure the impact of talent development programs. The second edition includes a chapter on Optimizing Investments in Learning, where TDRp is discussed in detail. TDRp is featured heavily in the book because it helps L&D leaders connect their efforts to business outcomes by measuring the right things and reporting them in a way that business leaders understand.

In Chapter 11, the authors connect TDRp to the Portfolio Evaluation methodology. This approach implies that business leaders are interested in how learning and development programs impact four business areas: growth, operational efficiency, foundational skills, and risk. By aligning courses with one of these business areas and using TDRp, L&D leaders can demonstrate the effectiveness of the courses in preparing the workforce to improve each business area (portfolio).

The book also provides guidance on how to use TDRp to spur L&D leaders to act on the data they report. The book indicates L&D leaders need to shift reporting in three ways to make it more actionable (see graphic below). Reports should provide:

  • Analytics that improve business decisions
  • Insights that link to evolving business challenges
  • Information that connects talent data to business impacts

Using Data to Spur Action

Source: Learning Analytics ©2020. Reproduced with permission, Kogan Page, Ltd.

Another critical aspect of conveying a meaningful message that drives action is to tell a compelling story. Chapter 11 of the book includes three critical elements of a data-driven story:

  • Scene Setting—Connect all data and results back to the desired business outcomes and goals.
  • Plot Development—Emphasize logical and clear connections between learning results and outcomes; focus on the central message of the results, not peripheral findings; and note any areas where L&D is creating value for the organization.
  • Resolution—Clearly and succinctly outline the justification for key findings; suggest improvements, recommendations, and next steps. Continue the conversation on how to help L&D improve.

2020 CTR Annual Conference


Join us at the 2020 Virtual CTR Annual Conference, October 27, for the session, Everything You Wanted to Know about Learning Analytics but Were Afraid to Ask.


Can you Justify your Learning and Development Projects?

By Jack J. Phillips Ph.D., Chairman, ROI Institute

Daily headlines in the business press focus on the economy. While it is booming in some areas, other areas are slowing, and economic uncertainty exists everywhere. During uncertainty, executives must take steps to make sure the organization can weather the storm—whenever or wherever it occurs.

One way executives meet this challenge is to ensure that expenditures represent investments and not just costs. If an expenditure is considered a cost, it will be frozen, reduced, or in some cases eliminated. If it is considered an investment that is producing a return, it will be protected and possibly even enhanced during difficult times. For example, many learning and development budgets are now being frozen or reduced, even with record profits. While this seems illogical, it happens. Now is the time to reflect on your budget and your programs. Can you withstand top executive scrutiny? Are you ready for ROI?

The ROI Methodology®, the most widely used evaluation system in the world, measures impact and ROI for a few major programs. The ROI Certification® is the process for developing this valuable capability. It provides participants with the skills needed to analyze return on investment in practical financial terms. The results are CEO- and CFO-friendly. Over 15,000 managers and professionals have participated in this certification since it launched in 1995, underscoring the user-friendly nature of the system.

Don’t have the time or budget? Several approaches are available to reduce the amount of time and cost needed to develop this capability. For more information on ROI Certification, contact Brady Nelson at

ROI Institute is the global leader in measurement and evaluation including the use of return on investment (ROI) in non-traditional applications. This methodology has been used successfully by over 5,000 organizations and in over 70 countries.

Bridge the Gap from Training to Application with Predictive Learning Analytics

by Ken Phillips, CEO, Phillips Associates

In my previous blog post, I discussed the concept of scrap learning and how it is arguably the number one issue confronting the L&D profession today. I also provided a formula you could use to estimate the cost of scrap learning associated with your training programs.

In this post, I’ll share with you a revolutionary methodology I’ve been working on for the past several years called Predictive Learning Analytics™ (PLA). The method enables you to pinpoint the underlying causes of scrap learning associated with a training program. It consists of three phases and nine steps to provide you with the data you need to take targeted corrective actions to maximize training transfer (see figure below).

While the specific questions and formulae for the scores are proprietary, I hope you can apply the concepts in your organization using your own survey questions and your own weighting for the indexes. Even if you adopt a simpler process, the concepts will guide you and the article will give you an idea of what is possible.

Unlike other training transfer approaches which focus mostly on the design and delivery of training, PLA offers a holistic approach to increasing training transfer. Built on a foundation of three research-based training transfer components and 12 research-based training transfer factors (see chart below), PLA targets the critical connection among all these elements. In short, PLA provides L&D professionals with a systematic, credible and repeatable process for optimizing the value of corporate learning and development investments by measuring, monitoring, and managing the amount of scrap learning associated with those investments.

Training Transfer Components & Training Transfer Factors

Phase 1: Data Collection & Analysis

The objective of phase one, Data Collection & Analysis, is to pinpoint the underlying causes of scrap learning associated with a training program using predictive analytics and data. Five metrics are produced and provide L&D professionals with both direction and insight as to where corrective actions should be targeted to maximize training transfer. The five measures are:

  • Learner Application Index™ (LAI) scores
  • Manager Training Support Index™ (MTSI) scores
  • Training Transfer Component Index™ (TTCI) scores
  • A scrap learning percentage score
  • Obstacles preventing training transfer

Data for calculating the first three measures (LAI, MTSI, and TTCI scores) is collected from program participants immediately following a learning program using a survey. The survey consists of 12 questions based on the 12 training transfer factors mentioned earlier. Data for calculating the final two measures is collected from participants 30 days post-program using either a survey or focus groups and consists of the following three questions:

  1. What percent of the program material are you applying back on the job?
  2. How confident are you that your estimate is accurate?
  3. What obstacles prevented you from utilizing all that you learned if you’re not applying 100%?

Waiting 30 days post program is critical because it allows for the “forgetting curve” effect—the decline of memory retention over time—to take place and provides more accurate data.
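Since the specific PLA formulae are proprietary, here is a hedged sketch of how the 30-day survey answers above might be combined into a scrap learning percentage. The confidence-weighting step is my own illustrative assumption, not the PLA calculation.

```python
# Illustrative only: derive a scrap learning percentage from the two
# quantitative 30-day survey questions (percent applied, confidence).
# Weighting application estimates by respondent confidence is an
# assumption made for this sketch, not the proprietary PLA formula.

def scrap_learning_pct(responses):
    """responses: list of (percent_applied, confidence) pairs, each 0-100."""
    total_weight = sum(conf for _, conf in responses)
    weighted_applied = sum(applied * conf for applied, conf in responses) / total_weight
    # Scrap learning is the complement of what participants report applying.
    return round(100 - weighted_applied, 1)

survey = [(70, 90), (40, 60), (55, 80)]  # hypothetical participant answers
print(scrap_learning_pct(survey))        # → 43.0
```

Even a simple roll-up like this yields a baseline number that later measurements can be compared against.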

LAI Scores

LAI scores predict which participants attending a training program are most likely to apply, at risk of not applying and least likely to apply what they learned in the program back on the job. Participants who fall into the at-risk and least likely to apply categories are prime candidates for follow-up and reinforcement activities. Examples include email reminders, micro-learning or review modules, and coaching or mentoring to try and move them into the most likely to apply category.

MTSI Scores

MTSI scores predict which managers of the program participants are likely to do a good or poor job of supporting the training they directed their employees to attend. Managers identified as likely to do a poor job of supporting the training are prime candidates for help and support in improving their approach. This help might take the form of one-on-one coaching; a job aid explaining what a manager should do before, during, and after sending an employee to training; or creating a training program which teaches managers how to conduct pre- and post-training discussions with employees.

TTCI Scores

TTCI scores identify which of the three training transfer components and the 12 training transfer factors affiliated with them are contributing the most and least to training transfer. Any components or factors identified as impeding or not contributing to training transfer are prime candidates for corrective action.

Scrap Learning Percentage

The scrap learning percentage score identifies the amount of scrap learning associated with a training program. It provides a baseline score against which follow-up scrap learning scores can be compared to determine the effect targeted corrective actions had on increasing training transfer.

Obstacles Preventing Training Transfer

The obstacles data identify barriers participants encountered in the 30 days since attending the training program that prevented them from applying what they learned back on the job. Waiting 30 days to collect the data allows the full range of training transfer obstacles to emerge. For example, some are likely to occur almost immediately (I forgot the things I learned) while others are likely to occur later (I never had an opportunity to apply what I learned). Frequently mentioned obstacles are prime candidates for corrective actions to mitigate or eliminate them.

Phase 2: Solution Implementation

The objective of phase two: Solution Implementation, is to identify, implement, and monitor the effectiveness of corrective actions taken to mitigate or eliminate the underlying causes of scrap learning identified during phase one. Here is where the “rubber meets the road,” and you have an opportunity to demonstrate your creative problem-solving skills and ability to manage a critical business issue to a successful conclusion. Following the implementation of the corrective actions, it is now time to recalculate the amount of scrap learning associated with the training program. You can then compare the results to the baseline scrap learning percentage calculated during phase one.

Phase 3: Report Your Results

The objective of the third phase: Report Your Results, is to share your results with senior executives. Using the data you collected during phases one and two, it is time to show that you know how to manage the scrap learning problem to a successful conclusion.

In Sum

Scrap learning has been around forever; what is different today is that there are now ways to measure, monitor, and manage it. One of those ways is Predictive Learning Analytics™. Alternatively, you might employ the concepts to build your own simpler model. Either way, we have an opportunity to reduce scrap learning.

If you would like more information about the Predictive Learning Analytics™ methodology, email me at: I have an ebook that covers the method and a case study illustrating how a client used the process to improve the training transfer of a leadership development program.

The Greatest Issue Facing L&D Today

Scrap Learning

by Ken Phillips

What is arguably the top issue facing the L&D profession today?

The answer is scrap learning: the gap between training that is delivered and training that is actually applied back on the job. It’s the flip side of training transfer and is a critical issue for both the L&D profession and the organizations L&D supports because it wastes money and time, two precious organizational resources.

Now, you might be wondering, “How big is the problem?”

Two empirical studies, one by KnowledgeAdvisors in 2014 and one by Rob Brinkerhoff and Timothy Mooney in 2008, found scrap learning to be 45 percent and 85 percent respectively in the average organization. To add further credibility to these percentages, I’ve conducted three scrap learning studies over the past few years and found the scrap learning percentages associated with three different training programs in three separate organizations to be 64 percent, 48 percent, and 54 percent respectively. Averaging these five research studies yields a scrap learning figure of approximately 60 percent for the average organization.

To further highlight the magnitude of the scrap learning problem, consider its effect on the amount of wasted organizational dollars and time. According to the 2018 ATD State of the Industry research report, the average per-employee training expenditure in 2017 was $1,296, and the average number of training hours consumed per employee was 34.1. Using the KnowledgeAdvisors and Brinkerhoff scrap learning percentages mentioned above, the table below shows just how much scrap learning costs the average organization in wasted dollars and time.

Cost of Scrap Learning in Wasted Dollars & Time

Average per-employee training expenditure: $1,296 (X) 45% scrap learning (=) $583 wasted
Average per-employee training expenditure: $1,296 (X) 85% scrap learning (=) $1,102 wasted
Average training hours consumed per employee: 34.1 (X) 45% scrap learning (=) 15 wasted hours
Average training hours consumed per employee: 34.1 (X) 85% scrap learning (=) 29 wasted hours
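The table’s arithmetic is straightforward to reproduce: multiply the per-employee spend (or hours) by the scrap learning percentage. A quick check using the 2018 ATD figures cited above:

```python
# Reproducing the table above: per-employee training spend and hours
# multiplied by the 45% and 85% scrap learning rates from the research.

SPEND_PER_EMPLOYEE = 1296   # dollars, 2017 (2018 ATD State of the Industry)
HOURS_PER_EMPLOYEE = 34.1   # training hours consumed per employee

def wasted(amount, scrap_pct):
    """Portion of a training investment lost to scrap learning."""
    return round(amount * scrap_pct / 100)

print(wasted(SPEND_PER_EMPLOYEE, 45))  # → 583 wasted dollars
print(wasted(SPEND_PER_EMPLOYEE, 85))  # → 1102 wasted dollars
print(wasted(HOURS_PER_EMPLOYEE, 45))  # → 15 wasted hours
print(wasted(HOURS_PER_EMPLOYEE, 85))  # → 29 wasted hours
```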

Taking all of this data into account reminds me of James Lovell’s famous quote during his Apollo 13 mission to the moon when an oxygen tank aboard the space capsule exploded putting both the flight and crew in great peril: “Houston, we have a problem!”

If you would like to take a crack at estimating the cost of scrap learning associated with one of your training programs, you can use the Estimating the Cost of Scrap Learning Formula below. To gain the most useful insight, make every effort to collect the most accurate data possible for each of the input variables. Also, when selecting an estimated percentage of scrap learning for the program (variable 4 in the formula), get input from several people familiar with the program, such as other L&D colleagues, participants who previously attended it, or even the managers of program participants, and then compute an average of their estimates. Gaining the input of others will increase both the accuracy and credibility of the estimate and remove any concerns that the scrap learning percentage is merely your opinion.

Estimating the Cost of Scrap Learning Formula

Wasted Participant Dollars

The length of a learning program in hours _____
X the number of programs delivered over 12 months _____
X the average number of participants attending one program _____
X the estimated percent of scrap learning (45-85%) associated with the program _____
= the cost of scrap learning in wasted time _____
X the average hourly participant salary + benefits _____
= the cost of wasted participant dollars _____ (A)

Wasted L&D Department Dollars

Administrative expenditures (e.g., materials, travel, facility, facilitator, delivery platform, food, etc.) for one program _____
X the number of programs delivered over a 12-month period _____
X the estimated percent of scrap learning (45-85%) associated with the program _____
= the cost of wasted L&D department dollars _____ (B)

Total Cost of Scrap Learning

Cost of wasted participant dollars (A) _____
+ cost of wasted L&D department dollars (B) _____
= total cost of scrap learning _____
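As a worked sketch, the worksheet above can be expressed as a short calculation. The function name and the example figures below are hypothetical illustrations, not from the article:

```python
def scrap_learning_cost(
    program_hours,           # length of one program, in hours
    programs_per_year,       # programs delivered over 12 months
    avg_participants,        # average participants attending one program
    scrap_pct,               # estimated scrap learning, e.g. 0.45 to 0.85
    hourly_cost,             # average hourly participant salary + benefits
    admin_cost_per_program,  # materials, travel, facility, facilitator, etc.
):
    """Estimate the total annual cost of scrap learning for one program."""
    # Wasted participant dollars (A)
    wasted_hours = program_hours * programs_per_year * avg_participants * scrap_pct
    wasted_participant_dollars = wasted_hours * hourly_cost

    # Wasted L&D department dollars (B)
    wasted_dept_dollars = admin_cost_per_program * programs_per_year * scrap_pct

    # Total cost of scrap learning (A + B)
    return wasted_participant_dollars + wasted_dept_dollars

# Hypothetical example: an 8-hour course delivered 10 times a year to an
# average of 20 participants, 45% scrap learning, $50/hour loaded cost,
# and $5,000 of administrative cost per delivery.
total = scrap_learning_cost(8, 10, 20, 0.45, 50.0, 5000.0)
print(f"${total:,.0f}")  # → $58,500
```

Plugging in your own program data, and an averaged scrap percentage gathered from several colleagues, gives a first-order estimate you can defend.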

With this information in hand, you are now ready to pinpoint the underlying causes of scrap learning and take targeted corrective actions to mitigate or eliminate these causes and maximize training transfer. How to do this will be part two of this blog article.

Portfolio Evaluation: Aligning L&D’s Metrics to Business Objectives

by Cristina Hall
Director of Advisory Services – CEB, now Gartner

Using data built on standard metrics has become table stakes for L&D organizations looking to transform and thrive in a rapidly-changing business environment.

A critical challenge that remains for many organizations, however, is how to prioritize and structure their metrics so that the numbers reinforce and showcase the value L&D is contributing to the business.  It’s important to select measurement criteria which reflect L&D’s performance and contextualize them with outcome measures used by the business.

Applying a Portfolio Evaluation approach to Learning and Development provides the linkage needed to address this challenge. It is a clear, outcome-centered framework that can be used to position L&D’s contributions in business-focused terms, at the right level of detail for executive reporting.

How Does L&D Deliver Value?

Delivering training does not, in itself, deliver value.  Training is a tool, a method, to develop employees’ knowledge and skills so they will deliver more value to the business.  The value that training programs deliver aligns to four strategic business objectives.

Driving Growth

Courses aligned to the Drive Growth objective are designed to increase top-line growth, thus growing revenue and market share.  The organization tracks metrics related to sales, renewals, upsells, customer loyalty and satisfaction, etc., while L&D can track leading indicators in each of these areas based on employees’ assessment of how much these areas will improve based on each training program they have attended.  These courses are intended to drive competitive or strategic advantage by focusing on organization-specific processes, systems, products, or skillsets.  Examples include courses that are designed to increase sales, customer retention or repeat business, new product innovation, or help managers best position their teams for business growth.

Increasing Operational Efficiency

Courses aligned to Operational Efficiency increase bottom-line profitability.  The business tracks metrics related to productivity, quality, cost, etc., while L&D can track leading indicators in each of these areas based on employees’ assessment of how much these areas will improve based on each training program they have attended.  These courses are intended to drive competitive or strategic advantage by focusing on organization-specific processes, systems, or skillsets.  Examples include courses that are designed to increase productivity, decrease costs, increase process innovation, or help managers maximize bottom line performance.

Building Foundational Skills

Courses aligned to the Foundational Skills value driver are designed to both ensure that gaps in employee skills can be addressed, and demonstrate that employees can grow and develop to provide even more value to the business; it’s frequently less expensive to fill a minor skill gap than to replace an employee who is already on-boarded and semi-productive. The business tracks metrics related to bench strength, employee engagement, turnover, promotion rates, etc., while L&D can track leading indicators in each of these areas based on employees’ assessment of how much these areas will improve based on each training program they have attended. These courses tend to be off the shelf content, rather than custom designed content specific to the business. Examples include time management, MS Office, and introductory/generalized coaching or sales courses.

Mitigating Risk

Courses aligned to the Mitigate Risk value driver are designed to shield the business from financial or reputational risk by ensuring employee compliance with specific policies or maintenance of specific industry certifications.  The business tracks metrics related to safety, legal costs, etc., while L&D can track leading indicators in each of these areas based on employees’ assessment of how much these areas will improve based on each training program they have attended.  These courses tend to be focused on compliance, regulatory, and safety training, and tend to incorporate content similar to that of other courses in the organization’s industry.

Become a Portfolio Manager

Every learning asset, whether informal or formal, can be tied back to one of the four drivers of value.  The variety and depth of metric collection and the performance expectations associated with those metrics differ across each of these value drivers, which is why grouping courses or learning assets into Portfolios is helpful.  L&D leaders become investment managers, monitoring and managing assets that are expected to produce comparable results to affect the performance of people, who in turn affect the performance of the business.

Getting Started

  1. Align metrics to Portfolios: what is most important? What data is needed?
  2. Align learning assets to Portfolios: this ensures that the right metrics are collected.
  3. Gather the data: gather training effectiveness data from learners and their managers and combine it with efficiency data from the LMS and Finance and outcome data from the business.
  4. Review, interpret, and share: use the metrics to communicate L&D’s contribution to business goals, confirm alignment, and inform strategic decision-making.
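As an illustration of steps 2 through 4, a minimal sketch of aligning learning assets to Portfolios and rolling up effectiveness data might look like the following (all course names, mappings, and scores are hypothetical):

```python
from collections import defaultdict
from statistics import mean

# Step 2: align learning assets to Portfolios (the four value drivers).
course_portfolio = {
    "Advanced Selling": "Driving Growth",
    "Lean Six Sigma Basics": "Operational Efficiency",
    "Time Management": "Foundational Skills",
    "Workplace Safety": "Mitigating Risk",
}

# Step 3: gather effectiveness data, e.g. learners' predicted-improvement
# ratings collected after each course (illustrative values).
responses = [
    ("Advanced Selling", 4.5),
    ("Advanced Selling", 4.1),
    ("Time Management", 3.8),
    ("Workplace Safety", 4.9),
]

# Step 4: review and share one roll-up per Portfolio for executive reporting.
by_portfolio = defaultdict(list)
for course, score in responses:
    by_portfolio[course_portfolio[course]].append(score)

for portfolio, scores in by_portfolio.items():
    print(f"{portfolio}: avg {mean(scores):.2f} across {len(scores)} responses")
```

In practice the scores would come from your evaluation platform and be combined with LMS, Finance, and business outcome data, but the grouping logic is the same.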

For additional detail regarding the Portfolio Evaluation approach, download our white paper, Aligning L&D’s Value with the C-Suite.

About CEB, Now Gartner

Leading organizations worldwide rely on CEB services to harness their untapped potential and grow. Now offered by Gartner, CEB best practices and technology solutions equip clients with the intelligence to effectively manage talent, customers, and operations. More information is available at


Informal Learning Evaluation: A Three-Step Approach

by John R. Mattox, II, PhD
Managing Consultant
CEB, now Gartner

You may recall that roughly 20 years ago, eLearning was seen as the next new thing. Learning leaders were keen to try out new technologies, while business leaders were happy to cut costs associated with travel and lodging. The eLearning cognoscenti claimed that this new learning method would deliver the same results as instructor-led training. They passionately believed that eLearning would become so prevalent that in-person classrooms would disappear like floppy discs, typewriters, and rotary telephones. The learning pendulum was ready to swing from the in-person classroom experience to the digital self-paced environment.

In time, the fervor surrounding eLearning waned and practical experience helped shape a new learning world where the pendulum was not aligned to one extreme or the other. Effective formal learning programs now employ a blended approach comprised of multiple components, including in-person instructor-led classes, virtual instructor-led events, self-paced web-based modules and maybe, just maybe, an archaic but valuable resource like a book.

Informal learning is the new hot topic amongst leaders in the L&D field. Three things appear to be driving the conversation: the 70/20/10 model, technology, and learners themselves. While the 70/20/10 model is by no means new—it was developed in the 1980s at the Center for Creative Leadership—it has become a prominent part of the conversation lately because it highlights a controversial thought: all of the money and effort invested to create formal learning accounts for only 10% of the learning that occurs among employees. Only 10%! Seventy percent of the learning comes from on-the-job experience and 20% comes from coaching and mentoring.

These proportions make business leaders ask tough questions like, “Should I continue to invest so much in so little?” and “Will formal learning actually achieve our business goals or should I rely on informal learning?” L&D practitioners are also wondering, “Will my role become irrelevant if informal learning displaces formal learning?” or “How can L&D manage and curate informal learning as a way of maintaining relevance?”

The second influencer—technology—drives informal learning to a large extent by making content and experts easy to access. Google and other search engines make fact finding instantaneous. SharePoint and knowledge portals provide valuable templates and process documents. Content curators like Degreed and Pathgather provide a one-stop shop for eLearning content from multiple vendors like SkillSoft, Udemy, Udacity, and Lynda.

Employees are driving the change as well because they are empowered, networked, and impatient when it comes to learning:

  • 75% of employees report that they will do what they need to do to learn effectively
  • 69% of employees regularly seek out new ways of doing their work from their co-workers
  • 66% of employees expect to learn new information “just-in-time”

As informal learning becomes more prominent, the question that both L&D and business leaders should be asking is simple: “How do we know if informal learning is effective?” The new generation of learners might respond, “Duh! If the information is not effective, we go learn more until we get what we need.” A better way to uncover the effectiveness of informal learning is to measure it.

Here’s a three-step measurement process that should provide insight about the effectiveness of most types of informal learning.

1. Determine what content you need to evaluate

This is actually the most difficult step if you intend to measure the impact of informal learning systematically across an organization. If you intend only to measure one aspect of informal learning, say a mentoring program, then the work is substantially less. When undertaking a systematic approach, the universe of all possible learning options needs to be defined. Rather than give up now, take one simple step: create categories based on the types of learning provided.

For example, group the following types of learning as:

  • Technology-Enabled Content: eLearning modules, videos, podcasts, online simulations or games
  • Documents: SharePoint resources, standard operating procedures and process documents, group webpages, wikis, and blogs
  • Internet Knowledge Portal

Create as many categories as needed to capture the variety of informal learning occurring in your organization.

2. Determine what measurement tool is best suited for each learning type

Common tools include surveys, focus groups, interviews, web analytics, and business systems that already gather critical operational data like widgets produced or products sold. Web analytics, business systems, and surveys tend to be highly scalable, low-effort methods for gathering large amounts of data. Focus groups and interviews take more time and effort, but often provide information rich details.

3. Determine when to deploy the measurement tool to gather information

For an eLearning module, it seems appropriate to include a web-based survey link on the last page of content. Learners can launch the survey and provide feedback immediately after the module is complete. If the content is curated by a vendor, preventing the insertion of a link on the final page of materials, then the completion of the module, once registered in the LMS, should trigger the distribution of an email with a link to the survey. Regardless of the type of learning (instructor led, virtual, self-paced, etc.), the timing and the tool will vary according to the content.
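As a hedged sketch of that completion-triggered survey, assuming a hypothetical LMS completion event and a simple email invitation (real LMS webhooks, payloads, and mail infrastructure will differ):

```python
from email.message import EmailMessage

# Placeholder survey link; a real deployment would use your survey platform's URL.
SURVEY_URL = "https://example.com/survey"

def on_module_completed(learner_email: str, module_id: str) -> EmailMessage:
    """Build the survey invitation sent when the LMS records a completion."""
    msg = EmailMessage()
    msg["To"] = learner_email
    msg["Subject"] = f"Feedback requested: module {module_id}"
    msg.set_content(
        f"Thanks for completing {module_id}. "
        f"Please share your feedback: {SURVEY_URL}?module={module_id}"
    )
    return msg

# A completion event from the LMS would be handled like so; actual sending
# (e.g. via smtplib.SMTP) is omitted in this sketch.
msg = on_module_completed("learner@example.com", "ELEARN-101")
print(msg["Subject"])  # → Feedback requested: module ELEARN-101
```

The same pattern (event in, survey invitation out) generalizes to other informal learning types; only the trigger and the timing change.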

Is it easy to implement this measurement approach to evaluate the impact of informal learning? For some organizations, maybe. For others, not at all. However, measurement is a journey and it begins by taking the first step.

For More Information…

For guidance about measuring informal learning, contact John Mattox at CEB, now Gartner. To learn more about how to improve L&D measurement initiatives, download the Increasing Response Rates white paper.


Telling a Story With Data


by Christine Lawther, PhD
Senior Advisor—CEB (now Gartner)

Data is everywhere. In our personal lives, we are continually exposed to metrics. The number of “likes” on social media, usage metrics on every bill, and the caloric breakdown of burgers at the most popular fast food chains are all examples of common metrics that society is exposed to on a regular basis.

Looking at data within a business context, data insight is in high demand. More organizations are focusing on doing more with less, and so data often becomes the key element that determines decisions on goals, resources, and performance. This increase in data exposure acts as an opportunity for learning and development (L&D) professionals to showcase their efforts and to truly transition the conversation from being viewed as a cost center to an essential contributor to the organization’s goals.

One common challenge is that L&D teams are often not staffed with team members who have a rich background in analytics. When instructional designers, facilitators, program managers, and learning leaders hold the responsibility of sharing data, it can be rather challenging to translate stereotypical L&D metrics into a compelling story that resonates with external stakeholders. Because of this, it is worth tapping into some foundational best practices for telling a story with data.

Structure Your Story: The Funnel Approach

If you visualize a funnel, imagine a broad opening where contents are poured in and a stem that becomes increasingly narrow. Apply this visualization as the framework to craft your story: start with broad, generic insights, and then funnel down to specifics. Doing this lets the recipient of the story follow a natural flow through diverse metrics while still grasping the overarching picture of L&D performance as a comprehensive whole. For example, it may be helpful to start by outlining overall satisfaction or utilization metrics, and then transition into something slightly more specific, such as breaking out scores of the key courses within your portfolio that are the biggest contributors to those overall metrics. From there, you can move into more detailed metrics by delving into components such as highest/lowest rated items within a course, time to apply training, barriers to training application, and insightful qualitative sentiments. At the very end of the story, conclude with the specific actions that the team plans to take. Following this approach not only paints a comprehensive picture, but it also creates momentum for next steps.

Speak Their Language

Metrics that L&D often focuses on (e.g., activity, cost per learner) may not easily translate into insights that resonate with external stakeholders. Each department within an organization may have its own custom metrics. However, it is imperative that a function can demonstrate the linkage back to the broader organization. Doing this shows that the function is being a good steward of the resources granted to it and also reveals how its day-to-day efforts align with the broader organization.

So, how can you demonstrate that leadership should be confident with your decisions? Communicate your impact with metrics that resonate with decision makers. If there are any core metrics that the company tracks, identify ways to directly demonstrate L&D’s linkage to them. If you are unsure, look for organizational metrics that are announced at company-wide meetings or shared on a regular basis. For example, if Net Promoter Score is something that your organization tracks, establish a Net Promoter Score for L&D and include it in your story. If increasing sales is a priority, identify how sales training is contributing to that effort.

Strike a Balance

It can be tempting to share only successes; however, it is vital to also include opportunities for improvement. Why? Because demonstrating transparency is the key to establishing trust. A strong approach is to share an opportunity for improvement along with a few specific actions the department plans to take to address it. Doing this provides a two-fold benefit. First, it demonstrates that you are aware of opportunities to work on. Second, it shows that you have proactively mapped out a plan to address those areas.

If you are finding that your story is entirely positive, consider looking for differences within the population you support. For example, does every region/department/tenure bracket report the same level of impact? Often a team may find that on a holistic level they are doing well; however, when you dig into varied demographics, there may be an area that can drive greater value. By transparently sharing your data to outline both successes and opportunities, the learning organization can become the best at getting better.

CEB Metrics that Matter is committed to helping organizations achieve more precision in strategic talent decisions, moving beyond big data to optimizing workforce learning investments against the most business-critical skills and competencies. To learn more about how we help bridge the gap between L&D and business leaders, download a copy of our white paper, Aligning L&D’s Value with the C-Suite.


3 Principles of Effective Learning Metrics & Measurement

Contributed by Caveo Learning

As more and more talent development leaders take a serious look at implementing meaningful metrics and measurement across their learning and development organizations, the business relationships and conversations between L&D professionals and stakeholders are changing for the better.

There are times when talent development leaders need to root themselves in foundational principles of talent development metrics. It’s easy to get caught up in new thinking, models, and frameworks, and lose focus on the fundamentals of how to run a learning organization.

No matter what model, framework, system, tool, methodology, or new approach we want to adapt, adopt, and deploy, there are at least three fundamentals that we should never lose sight of—principles that should be applied to all learning metrics.

  1. Differentiate between metrics you actively manage and those you monitor.

The principle is so simple, yet so rarely considered. Just like in medicine, some “vital statistics” are always monitored—the moment something changes in the wrong direction, an alarm sounds, allowing for the “metric” to be actively managed. Similarly, when selecting your metrics, determine which metrics you intend to monitor, and which you intend to actively manage. Further, know what the target thresholds are for the monitored metrics that will trigger the “alarm” for active management. For example, you may want to monitor L&D staff utilization, and only actively manage it should the metric fall out of the acceptable range.
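A minimal sketch of this monitor-versus-manage distinction, with an illustrative acceptable range for staff utilization (the metric name, bounds, and values below are assumptions for illustration only):

```python
# Monitored metrics and their acceptable ranges; a metric stays in
# "monitor" mode until it leaves its range, at which point the "alarm"
# sounds and it moves to active management.
MONITORED = {
    # metric name: (lower bound, upper bound) of the acceptable range, in %
    "staff_utilization_pct": (60.0, 90.0),
}

def needs_active_management(metric: str, value: float) -> bool:
    """Return True when a monitored metric has left its acceptable range."""
    low, high = MONITORED[metric]
    return not (low <= value <= high)

print(needs_active_management("staff_utilization_pct", 85.0))  # → False
print(needs_active_management("staff_utilization_pct", 93.0))  # → True
```

The thresholds themselves should come from your own planning and, where possible, industry benchmarks, as discussed in the next principle.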

  2. Align to industry formulas, where practical.

Another issue that often comes up is defining the specific formula for a metric. Frequently, the formula is not considered deeply enough to add the full value that it can. This can result in metrics with little credibility or validity, and which should not be informing any decisions. It’s true that each organization has its own characteristics, its own language, and specifics that need to be considered. However, using a standard formula defined by an existing industry benchmark can be very helpful when planning your metric strategy, goals, and even budget. Industry benchmarks are only valuable for planning and comparison if you are comparing apples with apples—to do that, the formulas need to match.

CTR does a fantastic job in its L&D metrics library of not only providing the recommended formula, but also noting which industry organization defined it, or even which other industry benchmark it is similar to. A good balance between highly company-specific and industry-aligned formulas will allow for at least some comparison, planning, and target setting against industry metrics. Whether it is CTR’s Talent Development Reporting Principles (TDRp), Bersin by Deloitte, Training magazine, ATD, SHRM, or any others, consider aligning at least some meaningful metrics to an industry definition and formula. With our above example of staff utilization, one TDRp formula we could use is “Learning staff, % of total hours used.”

  3. Measure to inform decisions and actions.

Whether you are monitoring or managing a particular metric, there must be an underlying business reason to do so. Having a metric that does not inform a decision or an action is of no value. Also, consider why you are measuring something and what that metric will influence. Learning metrics are there to help us improve what we do as L&D professionals. Whether it is improving the efficiency of our learning organization, the effectiveness of our learning solutions, alignment to our stakeholders, or the contribution our interventions have on the strategic business goals of the organization, metrics play a critical role in influencing the value we bring to our organizations and, almost more critically, the credibility of L&D in the eyes of external stakeholders and the C-suite.

It’s better to have a few good metrics that inform meaningful decision making and allow for agility in improving the value L&D brings to your organization, rather than hundreds of metrics that offer limited value. Sticking with our example, we could monitor the utilization of learning staff and start actively addressing this metric should the percentage increase above, say, 90%, by engaging a learning solutions provider to assist with some of the workload until it returns to the acceptable range.

Learning leaders are starting to take more notice of the deep value that metrics can bring toward the constant improvement of everything we do in L&D. No matter what the specific model, framework, and approach your organization chooses for learning metrics, there remain some fundamental principles that will help ensure that we ultimately have a metrics strategy that guides us, helps us improve, changes conversations with our stakeholders, and increases our credibility as business leaders.

Learning metrics are our friend, our source of feedback and intelligence, ensuring we are constantly focused on maximizing the value we bring to our organizations.

Caveo Learning is a learning consulting firm, providing learning strategies and solutions to Fortune 1000 and other leading organizations. Caveo’s mission is to transform the learning industry into one that consistently delivers targeted and recurring business value. Since 2004, Caveo has delivered ROI-focused strategic learning and performance solutions to organizations in a wide range of industries, including technology, healthcare, energy, financial services, telecommunications, manufacturing, foodservice, pharmaceuticals, and hospitality. Caveo was named one of the top content creation companies of 2017 by Training Industry Inc. For more information, visit

Take a Business-Minded Approach to Sourcing Learning Partners

By: Gary Schafer
President, Caveo Learning

One of the most important tasks talent development leaders face is selecting outsourcing partners and product vendors. It also happens to be one of the most daunting.

The learning and development organization’s relationship with providers can be a major factor in the success of the business. Learning leaders must weigh many variables in the purchasing process, from the factual (pricing, experience) to the intangible (flexibility, dedication).

Navigating the procurement process can be tremendously difficult. How can a provider’s ability to flex with the challenging demands of the business be analyzed through a formal procurement process? How does one tell if an external learning partner is going to react to changing environments and truly be aligned with the business? How is the commitment of the supplier to the mission, vision, and values of the business measured? What about issues of scalability, global capability, and communication?

How to Develop a Learning Sourcing Strategy

Start by establishing a factual market intelligence base—understand the array of variables around learning services, from expertise to capability, and the rates associated with those services. Create your supplier portfolio, determine a list of qualification criteria, and then winnow your list of potential suppliers.

Identify the types of services your organization needs—learning strategy, audience analysis, curriculum design, instructor-led training, eLearning, change management, etc. Then, determine the demand across the organization, broken out by role.

Next, perform learning-spend and target-cost analyses. You’ll want to analyze supplier rates using both internal and external data sources.

  • Conduct some internal benchmarking with the same supplier—are rates equal across multiple projects? Does the firm charge premiums based on experience? Is pricing consistent across roles?
  • Do the same internal benchmarking across multiple suppliers. Find out if rates are similar for comparable roles and skill levels at different organizations.
  • Develop some external benchmarks using publicly available market data, comparing rates from market analyses and publicly available salary data. Factor for loaded cost (about 1.3 times base salary) to cover benefits, PTO, company-paid taxes, etc., and also allow some room for supplier margin.
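A rough sketch of the external-benchmark arithmetic: the 1.3 loaded-cost factor comes from the text, while the billable-hours and supplier-margin figures below are illustrative assumptions, not standard values.

```python
def benchmark_hourly_rate(base_salary: float,
                          loaded_factor: float = 1.3,    # from the text: ~1.3x base
                          billable_hours: float = 2000.0,  # assumed working hours/year
                          supplier_margin: float = 0.30) -> float:
    """Derive a comparable supplier hourly rate from public salary data."""
    # Loaded cost per hour covers benefits, PTO, company-paid taxes, etc.
    loaded_hourly = base_salary * loaded_factor / billable_hours
    # Leave some room for supplier margin on top of the loaded cost.
    return loaded_hourly * (1.0 + supplier_margin)

# Hypothetical example: a role with a $100,000 base salary.
rate = benchmark_hourly_rate(100_000)
print(f"${rate:.2f}/hour")  # → $84.50/hour
```

Comparing quoted supplier rates against a benchmark like this helps flag quotes that are well above (or suspiciously below) market.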

Investigate the Intangibles

Before moving forward with sourcing service partners and learning products providers, conduct a supplier quality assessment. Create quality profiles for each prospective partner by evaluating the following factors:

  • Strategic planning—Will they assist with communicating and messaging to senior stakeholders throughout the business? Do they provide intelligence around industry trends and best practices? What do they offer in the way of post-project analysis, such as lessons learned and next-steps recommendations?
  • Experience—Does their team have expertise in relevant areas? What is the screening process for potential employees and contractors? How cohesive is the team, and how closely do they adhere to defined processes?
  • Responsiveness—Do they show a commitment to your business needs through effective communication? Is there a proven track record of adherence to deadlines? Do they offer strategic learning support?
  • Quality of work—What processes are in place to ensure solid instructional design? What methodology is used for quality reviews (test plans, style guides, etc.)? Does the supplier set clear review expectations for training deliverables and correct issues in a timely manner?
  • Flexibility—Do the suppliers have the ability and willingness to react to new or changing business needs? What is the capacity to scale a team with appropriate roles and volume?
  • Support—Do they have ready access to tools and templates necessary to speed development? Are there documented methodologies for executing common tasks?

Pricing as a Determining Factor

Suppliers’ rates often (though certainly not always) correlate with their quality profiles. If budget were no object, the learning sourcing strategy would simply mean identifying the most qualified provider; of course, budget is oftentimes the overriding factor, and so we must balance the need for quality with fiscal realities. Thus, try to optimize supplier usage based on rates, hiring high-quality suppliers for critical projects and project management, and settling for more cost-effective options, such as staff augmentation, for lower-importance projects and commodity roles.

Likewise, negotiate for spending reductions based on the type of support required. Each of the three main services models comes with its own potential cost-savings.

  • A project-based agreement can reduce per-unit costs and may be further cost-efficient due to greater opportunity to leverage offshore assets.
  • Contract labor ensures compliance across the organization, and the consolidated labor structure means negotiated volume rates are an option.
  • A managed services model will likely have optimized service levels and adjustable staffing levels, along with efficiencies gleaned from custom redesigned processes.

Be cautious with regard to electronic procurement. There are several tools now available and in use by procurement professionals to streamline the proposal process, but they are not always ideal for learning organizations. These tools are great for gathering uniform responses efficiently, but for learning services, not everything is uniform. E-procurement tools often create barriers that keep providers from telling their full story, essentially reducing the proposal process to 100-word bites of information that make it difficult to recognize a provider’s true value. A good procurement professional can get around some of these limitations by still offering a face-to-face meeting with the top candidates under consideration.

Finally, take care to negotiate a favorable agreement with the external learning partner. Have a negotiating strategy before initiating contract talks, and be ready and willing to walk away if a better partnership can be found elsewhere.

Creating and implementing a sourcing strategy for learning services will greatly reduce the stress of the procurement process, helping identify the ideal partners for your learning organization’s initiatives while optimizing budget. With a well-prepared strategy plan, sourcing service partners becomes a cornerstone of your organization’s success rather than an intimidating task.

Gary Schafer is president of Caveo Learning, recently named a top custom content company by Training Industry Inc. Gary was formerly a management consultant at McKinsey & Co., and he holds an MBA from Northwestern University’s Kellogg School of Management.


TDRp Message Shared with the Legal Profession

Dave Vance shared the TDRp message with learning professionals in the legal profession on December 1st. He spoke on Day 1 of the Professional Development Institute’s two-day annual conference in Washington, D.C. Three hundred people attended the conference, which is dedicated to the professional development of attorneys and legal firm staff members. Dave was invited to speak by several attendees who had heard about TDRp and the Center for Talent Reporting and wondered whether the principles would translate to the legal field. Dave worked with them over the summer and fall to adapt TDRp to the legal environment, creating a Program report and a Summary report reflecting typical legal firm goals and initiatives. The presentation is titled Strategic Talent Development Collaboration, Management, and Reporting.

Presentation slides are available for download.

CTR Is Excited to Announce the Release of a Significantly Updated TDRp Measures Library

CTR is excited to announce the release of a significantly updated version of the TDRp Measures Library. This update focuses on learning and development measures and has expanded from 111 to 183 learning-related measures. The additions include measures for informal and social learning as well as a more detailed breakdown of the cost measures. This version of the library also includes measures defined by the CEB Corporate Leadership Council, along with updated references to the HCMI Human Capital Metrics Handbook and to measures defined by ATD, Kirkpatrick, and Phillips.

If you are a CTR member, you have access to the updated version at no additional charge.

If you are not a member, join CTR for $299/year.

Please Welcome Jeff Carpenter to the CTR Board

Jeff Carpenter is a Senior Principal at Caveo Learning. Jeff works with clients to bridge performance gaps by addressing process improvements as well as front-line knowledge and skill development programs for Fortune 1000 companies.

Jeff has worked in entrepreneurial environments as a senior leader working to build internal organizational structures and business processes while leading teams at many Fortune 500 clients to solve some of their most pressing performance and process issues.

We are excited to welcome Jeff to the CTR Board. His knowledge and expertise will strengthen our board and make CTR even better. We have worked closely with Jeff at our past conferences, and he is a great supporter of CTR and TDRp.

Please Welcome Joshua Craver to the CTR Board

Joshua Craver is a values-based and results-oriented HR executive. In March 2012, he joined Western Union as its global HR Chief of Staff. In January 2013, Joshua took on a new role as Head of Western Union University and VP of Talent Management. Before that, he lived and worked in India, Mexico, and Argentina for over seven years in various HR leadership roles. Based on these experiences, he is well versed in growth-market strategy and execution.

Joshua also worked at the strategy consulting firm Booz Allen Hamilton. Organizations he has consulted for include The World Bank, Georgetown University Hospital, GE, CIBA, Scotiabank, Qwest, Farmers Insurance, Electronic Arts, Citibank, Agilent Technologies, Cigna, DuPont, Nissan, Lowe’s, Chevron, and Cisco. He has also conducted business in over 40 countries.

CTR is happy and honored to have Joshua join our CTR Board. Josh has been one of our greatest supporters. We are excited to have his expertise, energy, and insight.

The Promising State of Human Capital (Report)

CTR is happy to provide The Promising State of Human Capital report, sponsored by CTR, i4cp, and ROI Institute.

This valuable document is available for download thanks to CTR and our partnerships with i4cp and ROI Institute. The report ties in with the ROI Institute webinar of the same name, also posted on our site for download.

Partnership with the Corporate Learning Network

The Center for Talent Reporting is pleased to announce a partnership with the Corporate Learning Network (CLN), an online resource for corporate learning leaders and academic professionals. The Corporate Learning Network believes the future of learning will be created through multi-disciplinary approaches and peer-led exchange. With live conferences, community webinars, and virtual forums, CLN brings together stakeholders across the L&D spectrum to help you realize your plans for improved learning outcomes and organizational success.

Learn more about CLN at

CLN’s goals are similar to CTR’s, and we believe this partnership will further change and growth in HR and L&D.

The Promising State of Human Capital Analytics (Webinar by ROI Institute, Co-Sponsored by CTR)

Talk with any human capital executive about the field of human capital analytics, and they will generally agree: the best is yet to come. That doesn’t mean companies aren’t already performing incredible feats with people data (a few are profiled in this report). Rather, the statement is a testament to the opportunity most can see in this burgeoning field, and to the constantly new and innovative ways professionals are using people-related data to impact their organizations. This study surveyed analytics professionals in organizations of all sizes worldwide and asked pointed questions about how those organizations are using human capital analytics today, if at all. The results were more affirming than surprising:

  • Budgets for human capital analytics are increasing along with executive commitment.
  • Relatively few companies are using predictive analytics now, but many expect to in the future.
  • Most are using analytics to support strategic planning and organizational effectiveness.

Successful companies tend to be those that purposefully use data to anticipate and prepare rather than to react to daily problems.

It’s clear from both the survey data and the follow-up interviews that the future focus of professionals in the human capital analytics field will increasingly be on using analytics to guide strategic decisions and affect organizational performance. To sum up the state of human capital analytics in one word: promising.


This webinar introduces the new research study that demonstrates the progress that continues to be made with human capital analytics. Participants will learn:

  • Four key findings on the state of human capital analytics
  • How high-performance organizations are building leading human capital analytics
  • What Google, HSBC, LinkedIn, and Intel are doing to drive business performance
  • What you can do to move your analytics practice forward

We want to thank Patti Phillips and Amy Armitage for the opportunity to co-sponsor this webinar. The recording and slides are available to anyone who missed the webinar.


The Promising State of HCA



Our Newest CTR Advisory Council Member Todd Harrison

Todd Harrison, Ed.D.
Director, Talent Solutions

CTR is pleased to announce Todd Harrison as our newest Advisory Council Member. Todd will be taking over for Kendell Kerekes. We are thrilled to have such an experienced and renowned individual added to our ranks.


Dr. Harrison joined CEB Metrics that Matter (MTM) in 2012, after nearly 15 years of corporate experience in various learning and development leadership roles, during which he was an active practitioner of the MTM system. At MTM, he is accountable for the MTM Professional Services team of approximately 40 people and provides strategic leadership to the MTM Client Advisory and Consulting teams within this group. These two teams have primary responsibility for delivering ongoing support services to MTM technology clients and for developing and delivering learning and human capital measurement consulting solutions, respectively. Specifically, the Professional Services team helps measure the impact and effectiveness of learning and development programs within an organization, work intended to unlock the potential of organizations and leaders by advancing the science and practice of talent management. Dr. Harrison’s business responsibilities include oversight of a portfolio of more than 600 clients and an annual global P&L goal of nearly $5M.

Dr. Harrison has extensive knowledge and expertise in several areas of talent management, including:

  • Succession Planning
  • Leadership Development
  • Talent Analytics
  • Learning Strategies
  • Employee Engagement
  • Competency Design
  • New Hire Onboarding
  • Performance Management
  • Organization Development



Education

  • Doctorate in Organizational Leadership (2016) – Indiana Wesleyan University
  • Master of Arts in Human Resource Development (1995) – Webster University
  • Bachelor of Science in Journalism (1986) – Murray State University

Professional Experience

  • Director, Talent Solutions, Metrics that Matter: CEB, Chicago, IL (2015 – present)
  • Director, Consulting Services, Metrics that Matter, Chicago, IL (2014 – 2015)
  • Senior Consultant, Metrics that Matter, Chicago, IL (2012 – 2014)
  • Director, Global Leadership & Organizational Development, Stryker, Kalamazoo, MI (2010 – 2012)
  • Director, Leadership & Associate Development, Anthem Blue Cross/Blue Shield, Indianapolis, IN (2002 – 2010)
  • Vice President, Human Resources, Total eMed, Franklin, TN (1999 – 2001)
  • Director, Leadership & Organizational Development, American Rehability Services, Brentwood, TN (1997 – 1999)
  • Lieutenant Colonel (Retired), United States Army (1984 – 2005)

Professional Affiliations

  • Association for Talent Development (1993 – Present)

CEB Platinum Sponsor

CEB is a best practice insight and technology company. In partnership with leading organizations around the globe, we develop innovative solutions to drive corporate performance. CEB equips leaders at more than 10,000 companies with the intelligence to effectively manage talent, customers, and operations. CEB is a trusted partner to nearly 90% of the Fortune 500 and FTSE 100, and more than 70% of the Dow Jones Asian Titans. More at

CEB is a Platinum sponsor of CTR. With their continued support, CTR has continued its mission to bring standards and measures to the HR community. CEB was a major participant in the February 2016 CTR Conference, helping to draw many attendees and making it our largest conference yet. We look forward to continuing our relationship with CEB.

For information on sponsoring CTR, please visit

Join CEB for a selection of webinars in June.


All times are in CT.


Addressing Pay Equity
June 7th 11am-12pm
Join CEB on June 7 for their complimentary webinar as they examine pay equity concerns for organizations with U.S. workforces. Register now at
Keep Quality through Onboarding
June 9th 11am-12pm
Best practice onboarding can maintain and actually improve quality of hire by up to 12%. Hear how the best companies prioritize employee integration in onboarding during CEB’s complimentary webinar June 9. Register now at
Organizing HR to Lead Enterprise Change
June 9th 9am-1:30pm
Only 21% of organizations are able to initiate change as soon as the need arises. Learn how to better influence change implementation where it happens during CEB’s complimentary webinar June 9. Register now at
The Talent Within
June 23rd 11am-12pm
74% of organizations want to increase their internal hiring this year. Hear how the best companies improve internal mobility, and yield greater quality of hire during CEB’s complimentary webinar June 23. Register now at
Four Imperatives to Increase the Representation of Women in Leadership Positions
June 28th 12pm-1pm
Increasing representation of women in leadership positions has a real, tangible impact on an organization’s financial and talent outcomes. Join CEB on June 28 as they discuss the common myths surrounding women in leadership, and how to engage and retain a more gender-balanced workforce. Register now at






CTR Accepts the Community Partnership Award

Dave Vance was honored to accept the Community Partnership Award, presented to CTR by the University of Southern Mississippi Department of Human Capital Development, one of the few departments in the country to offer a PhD in Human Capital Development. CTR was selected in recognition of inviting USM PhD students to our conference in Dallas and for our leadership in the profession through TDRp. The award was presented Friday, April 29, at the Third Annual Awards Ceremony held at the Gulf Park Campus. Six other awards were given, and graduating master’s and PhD students were recognized.

Dave Vance, Executive Director of the Center for Talent Reporting

Dave is a frequent speaker at learning conferences and association meetings. He also conducts workshops and simulations on managing the learning function. Dave is the former President of Caterpillar University, which he founded in 2001. Dave received his Bachelor of Science degree in political science from M.I.T. in 1974, a Master of Science degree in business administration from Indiana University (South Bend) in 1983, and a Ph.D. in economics from the University of Notre Dame in 1988. Dave was named 2006 CLO of the Year by Chief Learning Officer magazine. He was also named 2004 Corporate University Leader of the Year by the International Quality and Productivity Center in its annual CUBIC (Corporate University Best in Class) Awards. Dave’s current research focuses on bringing economic and business rigor to the learning field as well as the development of computer-based simulations to help learning professionals increase their effectiveness and efficiency. His first book, The Business of Learning: How to Manage Corporate Training to Improve Your Bottom Line, was published in October 2010.