Guest Blog

Optimizing Investments in Learning

by Peggy Parskey, Associate Director, Center for Talent Reporting & John Mattox, II, PhD, Managing Consultant, Metrics That Matter

You might wonder why L&D leaders need guidance for optimizing investments in learning. Recent research from The Learning Report 2020 (Mattox, J.R., II & Gray, D., Action-Learning Associates) indicates that L&D leaders struggle to convey the impact of L&D. When asked what they would do better or differently to communicate value to business leaders, 36% of L&D leaders indicated they would improve processes related to communication. More telling, 50% indicated they would improve measurement. To convey value, leaders need a story to tell. Without measurement, there is not much of a story.

This is where Talent Development Reporting principles (TDRp) are so relevant. It begins with a measurement framework that recommends gathering three types of data: efficiency, effectiveness, and outcomes. Efficiency data tells the story of what happened and at what cost. How many courses did L&D offer? How many people completed training? How many hours of training did learners consume? What costs did we incur? Effectiveness data provides feedback about the quality of the learning event (e.g., instructor, courseware, and content) as well as leading indicators of the success of the event (e.g., likelihood to apply, estimates of performance improvement, estimates of business outcomes, and estimates of ROI). Outcomes are the business metrics that learning influences, such as sales, revenue, customer satisfaction, and employee satisfaction. TDRp recommends gathering data for all three types of measures so L&D leaders can describe what happened and to what effect.
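
To make the framework concrete, here is a minimal sketch (our illustration, not part of TDRp itself) of how the three measure types might be organized for a single program; the program name and values are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class ProgramMeasures:
    """TDRp-style measure set for one learning program (illustrative only)."""
    program: str
    efficiency: dict = field(default_factory=dict)     # what happened, at what cost
    effectiveness: dict = field(default_factory=dict)  # quality and leading indicators
    outcomes: dict = field(default_factory=dict)       # business metrics influenced

onboarding = ProgramMeasures(
    program="Sales Onboarding",
    efficiency={"people_trained": 250, "cost_per_learner": 1_150},
    effectiveness={"satisfaction": 4.3, "intent_to_apply": 0.82},
    outcomes={"sales_increase_pct": 6.0},
)
print(onboarding.effectiveness["intent_to_apply"])
```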

The table below (Table 11.1 in Learning Analytics) shows a variety of measures for each category.

Key Performance Measures for L&D

Efficiency Measures

  • Number of people trained
  • Number of people trained by learning methodology (instructor-led, e-learning, virtual)
  • Reach (percentage of people trained in the target population)
  • Cost of training per program
  • Decrease in costs
  • Cost of training per learner
  • Cost of training per hour

Effectiveness Measures

  • Satisfaction with training
  • Knowledge and skills gained due to training
  • Intent to apply learning on the job
  • Expectation that training will improve individual performance on the job
  • Expectation that individual performance improvement will lead to organizational performance improvement
  • Return on expectations

Outcome Measures

  • Increase in customer satisfaction
  • Increase in employee performance
  • Decrease in risk
  • Increase in sales
  • Increase in revenue

Source: Learning Analytics ©2020, John R. Mattox II, Peggy Parskey, and Cristina Hall. Published with permission, Kogan Page Ltd.

In addition to this guidance about what to measure, TDRp provides guidance on how to report results using methods familiar to business leaders. Using these measurement and reporting approaches, L&D leaders can tell a robust story that business leaders can connect with.

In April 2020, Learning Analytics, Second Edition (by John Mattox, Peggy Parskey, and Cristina Hall) was published. The book helps L&D leaders improve the way they measure the impact of talent development programs. The second edition includes a chapter on Optimizing Investments in Learning, where TDRp is discussed in detail. TDRp is featured heavily in the book because it helps L&D leaders connect their efforts to business outcomes by measuring the right things and reporting them in a way that business leaders understand.

In Chapter 11, the authors connect TDRp to the Portfolio Evaluation methodology. This approach implies that business leaders are interested in how learning and development programs impact four business areas: growth, operational efficiency, foundational skills, and risk. By aligning courses with one of these business areas and using TDRp, L&D leaders can demonstrate the effectiveness of the courses in preparing the workforce to improve each business area (portfolio).

The book also provides guidance on how to use TDRp to spur L&D leaders to act on the data they report. The book indicates L&D leaders need to shift reporting in three ways to make it more actionable (see graphic below). Reports should provide:

  • Analytics that improve business decisions
  • Insights that link to evolving business challenges
  • Information that connects talent data to business impacts

Using Data to Spur Action

Source: Learning Analytics ©2020. Reproduced with permission, Kogan Page, Ltd.

Another critical aspect of conveying a meaningful message that drives action is to tell a compelling story. Chapter 11 of the book includes three critical elements of a data-driven story:

  • Scene Setting—Connect all data and results back to the desired business outcomes and goals.
  • Plot Development—Emphasize logical and clear connections between learning results and outcomes; focus on the central message of the results, not peripheral findings; and note any areas where L&D is creating value for the organization.
  • Resolution—Clearly and succinctly outline the justification for key findings; suggest improvements, recommendations, and next steps. Continue the conversation on how to help L&D improve.
2020 CTR Annual Conference

LEARN MORE ABOUT THE USE OF DATA AND LEARNING ANALYTICS


Join us at the 2020 Virtual CTR Annual Conference, October 27, for the session, Everything You Wanted to Know about Learning Analytics but Were Afraid to Ask.

LEARN MORE >>

Can You Justify Your Learning and Development Projects?

By Jack J. Phillips Ph.D., Chairman, ROI Institute

Daily headlines in the business press focus on the economy. While it is booming in some areas, other areas are slowing, and economic uncertainty exists everywhere. During uncertainty, executives must take steps to make sure the organization can weather the storm—whenever or wherever it occurs.

One way executives meet this challenge is to ensure that expenditures represent investments and not just costs. If an expenditure is considered a cost, it will be frozen, reduced, or in some cases eliminated. If it is considered an investment that is producing a return, it will be protected and possibly even enhanced during difficult times. For example, many learning and development budgets are now being frozen or reduced, even with record profits. While this seems illogical, it happens. Now is the time to reflect on your budget and your programs. Can you withstand top executive scrutiny? Are you ready for ROI?

The ROI Methodology® is the most widely used evaluation system in the world for measuring impact and ROI for a few major programs. ROI Certification® is the process for developing this valuable capability. The certification provides participants with the skills needed to analyze return on investment in practical financial terms. The results are CEO- and CFO-friendly. Over 15,000 managers and professionals have participated in this certification since it was launched in 1995, underscoring the user-friendly nature of the system.
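
The core arithmetic is public and simple: ROI (%) = (net program benefits ÷ program costs) × 100, where net benefits are monetary benefits minus fully loaded costs. A minimal sketch with hypothetical numbers:

```python
def roi_percent(monetary_benefits: float, program_costs: float) -> float:
    """ROI (%) = (net program benefits / program costs) x 100."""
    return (monetary_benefits - program_costs) / program_costs * 100

# Hypothetical program: $240,000 in monetary benefits against $150,000 in
# fully loaded costs yields a 60% return.
print(f"ROI: {roi_percent(240_000, 150_000):.0f}%")
```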

Don’t have the time or budget? Several approaches are available to reduce the amount of time and cost needed to develop this capability. For more information on ROI Certification, contact Brady Nelson at brady@roiinstitute.net.

ROI Institute is the global leader in measurement and evaluation, including the use of return on investment (ROI) in non-traditional applications. This methodology has been used successfully by over 5,000 organizations and in over 70 countries.

Bridge the Gap from Training to Application with Predictive Learning Analytics

by Ken Phillips, CEO, Phillips Associates

In my previous blog post, I discussed the concept of scrap learning and how it is arguably the number one issue confronting the L&D profession today. I also provided a formula you could use to estimate the cost of scrap learning associated with your training programs.

In this post, I’ll share with you a revolutionary methodology I’ve been working on for the past several years called Predictive Learning Analytics™ (PLA). The method enables you to pinpoint the underlying causes of scrap learning associated with a training program. It consists of three phases and nine steps that provide you with the data you need to take targeted corrective actions to maximize training transfer (see figure below).

While the specific questions and formulae for the scores are proprietary, I hope you can apply the concepts in your organization using your own survey questions and your own weighting for the indexes. Even if you adopt a simpler process, the concepts will guide you and the article will give you an idea of what is possible.

Unlike other training transfer approaches which focus mostly on the design and delivery of training, PLA offers a holistic approach to increasing training transfer. Built on a foundation of three research-based training transfer components and 12 research-based training transfer factors (see chart below), PLA targets the critical connection among all these elements. In short, PLA provides L&D professionals with a systematic, credible and repeatable process for optimizing the value of corporate learning and development investments by measuring, monitoring, and managing the amount of scrap learning associated with those investments.

Training Transfer Components & Training Transfer Factors

Phase 1: Data Collection & Analysis

The objective of phase one, Data Collection & Analysis, is to pinpoint the underlying causes of scrap learning associated with a training program using predictive analytics and data. Five metrics are produced and provide L&D professionals with both direction and insight as to where corrective actions should be targeted to maximize training transfer. The five measures are:

  • Learner Application Index™ (LAI) scores
  • Manager Training Support Index™ (MTSI) scores
  • Training Transfer Component Index™ (TTCI) scores
  • A scrap learning percentage score
  • Obstacles preventing training transfer

Data for calculating the first three measures (LAI, MTSI, and TTCI scores) is collected from program participants immediately following a learning program using a survey. The survey consists of 12 questions based on the 12 training transfer factors mentioned earlier. Data for calculating the final two measures is collected from participants 30 days post-program using either a survey or focus groups and consists of the following three questions:

  1. What percent of the program material are you applying back on the job?
  2. How confident are you that your estimate is accurate?
  3. What obstacles prevented you from utilizing all that you learned if you’re not applying 100%?

Waiting 30 days post program is critical because it allows for the “forgetting curve” effect—the decline of memory retention over time—to take place and provides more accurate data.
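
The PLA survey questions, weights, and cutoffs are proprietary, but the underlying idea of a weighted index is easy to illustrate. Here is a minimal sketch with entirely hypothetical weights and thresholds, in the spirit of building your own simpler model:

```python
# Hypothetical weights for 12 transfer-factor survey items rated 1-5.
# PLA's actual questions, weights, and cutoffs are proprietary and will differ.
WEIGHTS = [1.5, 1.0, 1.0, 1.2, 0.8, 1.0, 1.3, 1.0, 0.7, 1.0, 1.1, 1.4]

def application_index(responses):
    """Weighted average of the 12 items, scaled to 0-100."""
    assert len(responses) == len(WEIGHTS)
    raw = sum(w * r for w, r in zip(WEIGHTS, responses))
    return 100 * raw / (5 * sum(WEIGHTS))

def risk_category(index, low=60, high=80):
    """Bucket learners for follow-up using illustrative cutoffs."""
    if index >= high:
        return "most likely to apply"
    if index >= low:
        return "at risk of not applying"
    return "least likely to apply"

print(risk_category(application_index([4, 5, 3, 4, 4, 5, 2, 4, 3, 4, 5, 4])))
```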

LAI Scores

LAI scores predict which participants attending a training program are most likely to apply, at risk of not applying, or least likely to apply what they learned in the program back on the job. Participants who fall into the at-risk and least-likely-to-apply categories are prime candidates for follow-up and reinforcement activities. Examples include email reminders, micro-learning or review modules, and coaching or mentoring to try to move them into the most-likely-to-apply category.

MTSI Scores

MTSI scores predict which managers of the program participants are likely to do a good or poor job of supporting the training they directed their employees to attend. Managers identified as likely to do a poor job of supporting the training are prime candidates for help and support in improving their approach. This help might take the form of one-on-one coaching; a job aid explaining what a manager should do before, during, and after sending an employee to training; or creating a training program which teaches managers how to conduct pre- and post-training discussions with employees.

TTCI Scores

TTCI scores identify which of the three training transfer components and the 12 training transfer factors affiliated with them are contributing the most and least to training transfer. Any components or factors identified as impeding or not contributing to training transfer are prime candidates for corrective action.

Scrap Learning Percentage

The scrap learning percentage score identifies the amount of scrap learning associated with a training program. It provides a baseline against which follow-up scrap learning scores can be compared to determine the effect targeted corrective actions had on increasing training transfer.

Obstacles Preventing Training Transfer

The obstacles data identify barriers participants encountered in the 30 days since attending the training program that prevented them from applying what they learned back on the job. Waiting 30 days to collect the data allows the full range of training transfer obstacles to emerge. For example, some are likely to occur almost immediately (“I forgot the things I learned”), while others are likely to occur later (“I never had an opportunity to apply what I learned”). Frequently mentioned obstacles are prime candidates for corrective actions to mitigate or eliminate them.

Phase 2: Solution Implementation

The objective of phase two, Solution Implementation, is to identify, implement, and monitor the effectiveness of corrective actions taken to mitigate or eliminate the underlying causes of scrap learning identified during phase one. Here is where the “rubber meets the road,” and you have an opportunity to demonstrate your creative problem-solving skills and ability to manage a critical business issue to a successful conclusion. Following the implementation of the corrective actions, recalculate the amount of scrap learning associated with the training program. You can then compare the results to the baseline scrap learning percentage calculated during phase one.

Phase 3: Report Your Results

The objective of the third phase, Report Your Results, is to share your results with senior executives. Using the data you collected during phases one and two, it is time to show that you know how to manage the scrap learning problem to a successful conclusion.

In Sum

Scrap learning has been around forever; what is different today is that there are now ways to measure, monitor, and manage it. One of those ways is Predictive Learning Analytics™. Alternatively, you might employ the concepts to build your own simpler model. Either way, we have an opportunity to reduce scrap learning.

If you would like more information about the Predictive Learning Analytics™ methodology, email me at: ken@phillipsassociates.com. I have an ebook that covers the method and a case study illustrating how a client used the process to improve the training transfer of a leadership development program.

The Greatest Issue Facing L&D Today

Scrap Learning

by Ken Phillips

What is arguably the top issue facing the L&D profession today?

The answer is scrap learning: the gap between training that is delivered and training that is applied back on the job. It’s the flip side of training transfer and is a critical issue for both the L&D profession and the organizations L&D supports because it wastes money and time—two precious organizational resources.

Now, you might be wondering, “How big is the problem?”

Two empirical studies, one by KnowledgeAdvisors in 2014 and one by Rob Brinkerhoff and Timothy Mooney in 2008, found scrap learning in the average organization to be 45 percent and 85 percent, respectively. To add further credibility to these percentages, I’ve conducted three scrap learning studies over the past few years and found the scrap learning percentages associated with three different training programs in three separate organizations to be 64 percent, 48 percent, and 54 percent, respectively. Averaged together ((45 + 85 + 64 + 48 + 54) / 5 ≈ 59), these five studies suggest a scrap learning figure of approximately 60 percent in the average organization.

To further highlight the magnitude of the scrap learning problem, consider its effect in wasted organizational dollars and time. According to the 2018 ATD State of the Industry report, the average per-employee training expenditure in 2017 was $1,296, and the average number of training hours consumed per employee was 34.1. Using the KnowledgeAdvisors and Brinkerhoff scrap learning percentages mentioned above, you can see in the table below just how much scrap learning costs the average organization in wasted dollars and time.

Cost of Scrap Learning in Wasted Dollars & Time

Average per-employee training expenditure: $1,296 × 45% scrap learning = $583 wasted
Average per-employee training expenditure: $1,296 × 85% scrap learning = $1,102 wasted
Average training hours consumed per employee: 34.1 × 45% scrap learning = 15 hours wasted
Average training hours consumed per employee: 34.1 × 85% scrap learning = 29 hours wasted

Taking all of this data into account reminds me of James Lovell’s famous quote during his Apollo 13 mission to the moon when an oxygen tank aboard the space capsule exploded putting both the flight and crew in great peril: “Houston, we have a problem!”

If you would like to take a crack at estimating the cost of scrap learning associated with one of your training programs, use the Estimating the Cost of Scrap Learning Formula below. To gain the most useful insight, make every effort to collect the most accurate data possible for each of the input variables. Also, when selecting an estimated percentage of scrap learning associated with the program (variable 4 in the formula), get input from several people familiar with the program, such as other L&D colleagues, participants who previously attended the program, or perhaps even the managers of program participants, and then compute an average of their estimates. Gaining the input of others will increase both the accuracy and credibility of the estimate and remove any concern that the scrap learning percentage is merely your opinion.

Estimating the Cost of Scrap Learning Formula

Wasted Participant Dollars

The length of a learning program in hours _____
X the number of programs delivered over 12 months _____
X the average number of participants attending one program _____
X the estimated percent of scrap learning (45-85%) associated with the program _____
= the cost of scrap learning in wasted time _____
X the average hourly participant salary + benefits _____
= the cost of wasted participant dollars _____ (A)

Wasted L&D Department Dollars

Administrative expenditures (e.g., materials, travel, facility, facilitator, delivery platform, food, etc.) for one program _____
X the number of programs delivered over a 12-month period _____
X the estimated percent of scrap learning (45-85%) associated with the program _____
= the cost of wasted L&D department dollars _____ (B)

Total Cost of Scrap Learning

Cost of wasted participant dollars (A) _____
+ cost of wasted L&D department dollars (B) _____
= total cost of scrap learning _____
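
For readers who prefer code to a worksheet, here is a minimal Python sketch of the same arithmetic; the function name and sample inputs are ours and purely illustrative:

```python
def scrap_learning_cost(program_hours, programs_per_year, avg_participants,
                        scrap_pct, hourly_cost, admin_cost_per_program):
    """Estimate the annual cost of scrap learning for one training program.

    scrap_pct: estimated fraction of learning not applied (0.45-0.85 is typical).
    hourly_cost: average participant salary plus benefits, per hour.
    """
    wasted_hours = (program_hours * programs_per_year
                    * avg_participants * scrap_pct)
    wasted_participant_dollars = wasted_hours * hourly_cost                     # (A)
    wasted_ld_dollars = admin_cost_per_program * programs_per_year * scrap_pct  # (B)
    return wasted_participant_dollars + wasted_ld_dollars                       # (A) + (B)

# Hypothetical example: an 8-hour program run 10 times a year for 20 participants,
# 60% scrap learning, $50/hour loaded cost, $5,000 admin cost per delivery.
print(f"Total cost of scrap learning: ${scrap_learning_cost(8, 10, 20, 0.60, 50, 5_000):,.0f}")
```

With these inputs the estimate is $78,000 a year: $48,000 in wasted participant time plus $30,000 in wasted L&D spend.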

With this information in hand, you are now ready to pinpoint the underlying causes of scrap learning and take targeted corrective actions to mitigate or eliminate these causes and maximize training transfer. How to do this will be part two of this blog article.

Portfolio Evaluation: Aligning L&D’s Metrics to Business Objectives

by Cristina Hall
Director of Advisory Services – CEB, now Gartner

Using data built on standard metrics has become table stakes for L&D organizations looking to transform and thrive in a rapidly-changing business environment.

A critical challenge that remains for many organizations, however, is how to prioritize and structure their metrics so that the numbers reinforce and showcase the value L&D is contributing to the business. It’s important to select measurement criteria that reflect L&D’s performance and contextualize them with outcome measures used by the business.

Applying a Portfolio Evaluation approach to Learning and Development provides the linkage needed to address this challenge. It is a clear, outcome-centered framework that can be used to position L&D’s contributions in business-focused terms, at the right level of detail for executive reporting.

How Does L&D Deliver Value?

Delivering training does not, in itself, deliver value.  Training is a tool, a method, to develop employees’ knowledge and skills so they will deliver more value to the business.  The value that training programs deliver aligns to four strategic business objectives.

Driving Growth

Courses aligned to the Drive Growth objective are designed to increase top-line growth, thus growing revenue and market share.  The organization tracks metrics related to sales, renewals, upsells, customer loyalty and satisfaction, etc., while L&D can track leading indicators in each of these areas based on employees’ assessment of how much these areas will improve based on each training program they have attended.  These courses are intended to drive competitive or strategic advantage by focusing on organization-specific processes, systems, products, or skillsets.  Examples include courses that are designed to increase sales, customer retention or repeat business, new product innovation, or help managers best position their teams for business growth.

Increasing Operational Efficiency

Courses aligned to Operational Efficiency increase bottom-line profitability.  The business tracks metrics related to productivity, quality, cost, etc., while L&D can track leading indicators in each of these areas based on employees’ assessment of how much these areas will improve based on each training program they have attended.  These courses are intended to drive competitive or strategic advantage by focusing on organization-specific processes, systems, or skillsets.  Examples include courses that are designed to increase productivity, decrease costs, increase process innovation, or help managers maximize bottom line performance.

Building Foundational Skills

Courses aligned to the Foundational Skills value driver are designed to both ensure that gaps in employee skills can be addressed, and demonstrate that employees can grow and develop to provide even more value to the business; it’s frequently less expensive to fill a minor skill gap than to replace an employee who is already on-boarded and semi-productive. The business tracks metrics related to bench strength, employee engagement, turnover, promotion rates, etc., while L&D can track leading indicators in each of these areas based on employees’ assessment of how much these areas will improve based on each training program they have attended. These courses tend to be off the shelf content, rather than custom designed content specific to the business. Examples include time management, MS Office, and introductory/generalized coaching or sales courses.

Mitigating Risk

Courses aligned to the Mitigate Risk value driver are designed to shield the business from financial or reputational risk by ensuring employee compliance with specific policies or maintenance of specific industry certifications.  The business tracks metrics related to safety, legal costs, etc., while L&D can track leading indicators in each of these areas based on employees’ assessment of how much these areas will improve based on each training program they have attended.  These courses tend to be focused on compliance, regulatory, and safety training, and tend to incorporate content similar to that of other courses in the organization’s industry.

Become a Portfolio Manager

Every learning asset, whether informal or formal, can be tied back to one of the four drivers of value. The variety and depth of metric collection and the performance expectations associated with those metrics differ across each of these value drivers, which is why grouping courses or learning assets into Portfolios is helpful. L&D leaders become investment managers, monitoring and managing assets that are expected to produce comparable results to affect the performance of people, who in turn affect the performance of the business.

Getting Started

  1. Align metrics to Portfolios: what is most important? What data is needed?
  2. Align learning assets to Portfolios: this ensures that the right metrics are collected.
  3. Gather the data: gather training effectiveness data from learners and their managers and combine it with efficiency data from the LMS and Finance and outcome data from the business.
  4. Review, interpret, and share: use the metrics to communicate L&D’s contribution to business goals, confirm alignment, and inform strategic decision-making (see the sketch after this list).
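
As a rough illustration of steps 1-3 (our sketch, not an official part of the Portfolio Evaluation methodology), the snippet below aligns a handful of hypothetical courses to the four portfolios and rolls up an average effectiveness score per portfolio:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical course-to-portfolio alignment with effectiveness scores (1-5).
courses = [
    ("Consultative Selling",   "Driving Growth",                    4.4),
    ("Lean Process Basics",    "Increasing Operational Efficiency", 4.1),
    ("Time Management",        "Building Foundational Skills",      3.8),
    ("Anti-Harassment Policy", "Mitigating Risk",                   4.0),
    ("Negotiation Skills",     "Driving Growth",                    4.6),
]

by_portfolio = defaultdict(list)
for name, portfolio, score in courses:
    by_portfolio[portfolio].append(score)

for portfolio, scores in sorted(by_portfolio.items()):
    print(f"{portfolio}: {mean(scores):.2f} average effectiveness "
          f"({len(scores)} courses)")
```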

For additional detail regarding the Portfolio Evaluation approach, download our white paper, Aligning L&D’s Value with the C-Suite.

About CEB, Now Gartner

Leading organizations worldwide rely on CEB services to harness their untapped potential and grow. Now offered by Gartner, CEB best practices and technology solutions equip clients with the intelligence to effectively manage talent, customers, and operations. More information is available at gartner.com/ceb.

 

Informal Learning Evaluation: A Three-Step Approach

by John R. Mattox, II, PhD
Managing Consultant
CEB, now Gartner

You may recall that roughly 20 years ago, eLearning was seen as the next new thing. Learning leaders were keen to try out new technologies, while business leaders were happy to cut costs associated with travel and lodging. The eLearning cognoscenti claimed that this new learning method would deliver the same results as instructor-led training. They passionately believed that eLearning would become so prevalent that in-person classrooms would disappear like floppy discs, typewriters, and rotary telephones. The learning pendulum was ready to swing from the in-person classroom experience to the digital self-paced environment.

In time, the fervor surrounding eLearning waned and practical experience helped shape a new learning world where the pendulum was not aligned to one extreme or the other. Effective formal learning programs now employ a blended approach comprised of multiple components, including in-person instructor-led classes, virtual instructor-led events, self-paced web-based modules and maybe, just maybe, an archaic but valuable resource like a book.

Informal learning is the new hot topic amongst leaders in the L&D field. Three things appear to be driving the conversation: the 70/20/10 model, technology, and learners themselves. While the 70/20/10 model is by no means new—it was developed in the 1980s at the Center for Creative Leadership—it has become a prominent part of the conversation lately because it highlights a controversial thought: all of the money and effort invested to create formal learning accounts for only 10% of the learning that occurs among employees. Only 10%! Seventy percent of the learning comes from on-the-job experience and 20% comes from coaching and mentoring.

These proportions make business leaders ask tough questions like, “Should I continue to invest so much in so little?” and “Will formal learning actually achieve our business goals or should I rely on informal learning?” L&D practitioners are also wondering, “Will my role become irrelevant if informal learning displaces formal learning?” or “How can L&D manage and curate informal learning as a way of maintaining relevance?”

The second influencer—technology—drives informal learning to a large extent by making content and experts easy to access. Google and other search engines make fact finding instantaneous. SharePoint and knowledge portals provide valuable templates and process documents. Content curators like Degreed and Pathgather provide a one-stop shop for eLearning content from multiple vendors like SkillSoft, Udemy, Udacity, and Lynda.

Employees are driving the change as well because they are empowered, networked, and impatient when it comes to learning:

  • 75% of employees report that they will do what they need to do to learn effectively
  • 69% of employees regularly seek out new ways of doing their work from their co-workers
  • 66% of employees expect to learn new information “just-in-time”

As informal learning becomes more prominent, the question that both L&D and business leaders should be asking is simple: “How do we know if informal learning is effective?” The new generation of learners might respond, “Duh! If the information is not effective, we go learn more until we get what we need.” A better way to uncover the effectiveness of informal learning is to measure it.

Here’s a three-step measurement process that should provide insight about the effectiveness of most types of informal learning.

1. Determine what content you need to evaluate

This is actually the most difficult step if you intend to measure the impact of informal learning systematically across an organization. If you intend to measure only one aspect of informal learning—say, a mentoring program—then the work is substantially less. When undertaking a systematic approach, the universe of all possible learning options needs to be defined. Rather than give up now, take one simple step: create categories based on the types of learning provided.

For example, group the following types of learning as:

  • Technology-Enabled Content: eLearning modules, videos, podcasts, online simulations or games
  • Documents: SharePoint resources, standard operating procedures and process documents, group webpages, wikis, and blogs
  • Internet Knowledge Portal

Create as many categories as needed to capture the variety of informal learning occurring in your organization.

2. Determine what measurement tool is best suited for each learning type

Common tools include surveys, focus groups, interviews, web analytics, and business systems that already gather critical operational data like widgets produced or products sold. Web analytics, business systems, and surveys tend to be highly scalable, low-effort methods for gathering large amounts of data. Focus groups and interviews take more time and effort, but often provide information rich details.
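
To make the pairing concrete, here is a minimal sketch (our illustration, with hypothetical categories and tool choices) that records the scalability trade-off described above:

```python
# Hypothetical measurement plan: each informal learning category is paired with
# a tool, trading off scalability against richness of detail.
measurement_plan = {
    "Technology-Enabled Content": {"tool": "web-based survey", "scalable": True},
    "Documents":                  {"tool": "web analytics",    "scalable": True},
    "Internet Knowledge Portal":  {"tool": "focus groups",     "scalable": False},
}

for category, plan in measurement_plan.items():
    trade_off = ("broad reach, low effort" if plan["scalable"]
                 else "richer detail, higher effort")
    print(f"{category}: {plan['tool']} ({trade_off})")
```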

3. Determine when to deploy the measurement tool to gather information

For an eLearning module, it seems appropriate to include a web-based survey link on the last page of content. Learners can launch the survey and provide feedback immediately after the module is complete. If the content is curated by a vendor–preventing the insertion of a link on the final page of materials–then the completion of the module when registered in the LMS should trigger the distribution of an email with a link to the survey evaluation. Regardless of the type of learning (instructor led, virtual, self-paced, etc.), the timing and the tool will vary according to the content.
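
As a sketch of the completion-triggered approach described above (the event hook, survey host, and module ID are all hypothetical), the survey invitation might be assembled like this:

```python
from email.message import EmailMessage

SURVEY_BASE = "https://surveys.example.com/eval"  # hypothetical survey host

def on_module_completed(learner_email: str, module_id: str) -> EmailMessage:
    """Build the survey invitation an LMS completion event would trigger."""
    msg = EmailMessage()
    msg["To"] = learner_email
    msg["Subject"] = "Two minutes of feedback on the module you just finished"
    msg.set_content(f"Please rate the module: {SURVEY_BASE}?module={module_id}")
    return msg  # hand off to your mail gateway (e.g., smtplib) for delivery

print(on_module_completed("learner@example.com", "INF-101"))
```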

Is it easy to implement this measurement approach to evaluate the impact of informal learning? For some organizations, maybe. For others, not at all. However, measurement is a journey and it begins by taking the first step.

For More Information…

For guidance about measuring informal learning, contact John Mattox at CEB, now Gartner, at john.mattoxii@gartner.com. To learn more about how to improve L&D measurement initiatives, download the Increasing Response Rates white paper.

About CEB, now Gartner

Leading organizations worldwide rely on CEB services to harness their untapped potential and grow. Now offered by Gartner, CEB best practices and technology solutions equip clients with the intelligence to effectively manage talent, customers, and operations. More information is available at gartner.com/ceb

Telling a Story With Data

by Christine Lawther, PhD
Senior Advisor—CEB (now Gartner)

Data is everywhere. In our personal lives, we are continually exposed to metrics. The number of “likes” on social media, usage metrics on every bill, and the caloric breakdown of burgers at the most popular fast food chains are all examples of common metrics that society is exposed to on a regular basis.

Looking at data within a business context, data insight is in high demand. More organizations are focusing on doing more with less, and so data often becomes the key element that determines decisions on goals, resources, and performance. This increase in data exposure acts as an opportunity for learning and development (L&D) professionals to showcase their efforts and to truly transition the conversation from being viewed as a cost center to an essential contributor to the organization’s goals.

One common challenge is that L&D teams are often not staffed with team members who have a rich background in analytics. When instructional designers, facilitators, program managers, and learning leaders hold the responsibility of sharing data, it can be challenging to translate stereotypical L&D metrics into a compelling story that resonates with external stakeholders. Because of this, it is valuable to tap into some foundational best practices for telling a story with data.

Structure Your Story: The Funnel Approach

If you visualize a funnel, imagine a broad opening where contents are poured in and a stem that becomes increasingly narrow. Apply this visualization as the framework to craft your story: start with broad, generic insights, and then funnel down to specifics. Doing this enables the recipient of the story to understand the natural flow of moving through diverse metrics, but still understand the overarching picture of L&D performance as a comprehensive whole. For example, it may be helpful to start by outlining overall satisfaction or utilization metrics, and then transition into something slightly more specific such as breaking out scores of key courses within your portfolio that are the biggest contributors to those overall metrics. From there, you can move into more detailed metrics by delving into components such as highest/lowest rated items within that course, time to apply training, barriers to training application, and insightful qualitative sentiments. At the very end of the story, one can conclude with specific actions that the team plans to take. Following this approach not only paints a comprehensive picture, but it also creates momentum for next steps.

Speak Their Language

Metrics that L&D often focuses on (e.g., activity, cost per learner) may not easily translate into insights that resonate with external stakeholders. Each department within an organization may have its own custom metrics. However, it is imperative that a function can demonstrate the linkage back to the broader organization. Doing this shows that the function is a good steward of the resources granted to it and also reveals how its day-to-day efforts align with the broader organization.

So, how can you demonstrate that leadership should be confident with your decisions? Communicate your impact with metrics that resonate with decision makers. If there are any core metrics that the company tracks, identify ways to directly demonstrate L&D’s linkage to them. If you are unsure, look for organizational metrics that are announced at company-wide meetings or shared on a regular basis. For example, if Net Promoter Score is something that your organization tracks, establish a Net Promoter Score for L&D and include it in your story. If increasing sales is a priority, identify how sales training is contributing to that effort.
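
Net Promoter Score is a good example because its formula is standard: the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6) on a 0-10 “would you recommend?” question. A minimal sketch with hypothetical course ratings:

```python
def net_promoter_score(ratings):
    """Standard NPS: % promoters (9-10) minus % detractors (0-6), 0-10 scale."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical "how likely are you to recommend this course?" responses.
ratings = [10, 9, 8, 7, 9, 6, 10, 5, 9, 8]
print(f"L&D NPS: {net_promoter_score(ratings):+.0f}")  # +30
```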

Strike a Balance

It can be tempting to share only successes; however, it is vital to also include opportunities for improvement. Why? Because demonstrating transparency is key to establishing trust. A strong approach is to share an opportunity for improvement along with a few specific actions the department plans to take to address it. Doing this provides a two-fold benefit. First, it demonstrates that you are aware of opportunities to work on. Second, it shows that you have proactively mapped out a plan to address those areas.

If you are finding that your story is entirely positive, consider looking for differences within the population you support. For example, does every region/department/tenure bracket report the same level of impact? Often a team may find that on a holistic level they are doing well; however, when you dig into varied demographics, there may be an area that can drive greater value. By transparently sharing your data to outline both successes and opportunities, the learning organization can become the best at getting better.

CEB Metrics that Matter is committed to helping organizations achieve more precision in strategic talent decisions, moving beyond big data to optimizing workforce learning investments against the most business-critical skills and competencies. To learn more about how we help bridge the gap between L&D and business leaders, download a copy of our white paper, Aligning L&D’s Value with the C-Suite.

About CEB (now Gartner)

Leading organizations worldwide rely on CEB services to harness their untapped potential and grow. Now offered by Gartner, CEB best practices and technology solutions equip clients with the intelligence to effectively manage talent, customers, and operations. More information is available at gartner.com/ceb.

3 Principles of Effective Learning Metrics & Measurement

Contributed by Caveo Learning

As more and more talent development leaders take a serious look at implementing meaningful metrics and measurement across their learning and development organizations, the business relationships and conversations between L&D professionals and stakeholders are changing for the better.

There are times when talent development leaders need to root themselves in foundational principles of talent development metrics. It’s easy to get caught up in new thinking, models, and frameworks, and lose focus on the fundamentals of how to run a learning organization.

No matter what model, framework, system, tool, methodology, or new approach we want to adapt, adopt, and deploy, there are at least three fundamentals that we should never lose sight of—principles that should be applied to all learning metrics.

  1. Differentiate between metrics you actively manage and those you monitor.

The principle is so simple, yet so rarely considered. Just like in medicine, some “vital statistics” are always monitored—the moment something changes in the wrong direction, an alarm sounds, allowing for the “metric” to be actively managed. Similarly, when selecting your metrics, determine which metrics you intend to monitor, and which you intend to actively manage. Further, know what the target thresholds are for the monitored metrics that will trigger the “alarm” for active management. For example, you may want to monitor L&D staff utilization, and only actively manage it should the metric fall out of the acceptable range.

  2. Align to industry formulas, where practical.

Another issue that often comes up is defining the specific formula for a metric. Frequently, the formula is not considered deeply enough to add the full value that it can. This can result in metrics with little credibility or validity, and which should not be informing any decisions. It’s true that each organization has its own characteristics, its own language, and specifics that need to be considered. However, using a standard formula defined by an existing industry benchmark can be very helpful when planning your metric strategy, goals, and even budget. Industry benchmarks are only valuable for planning and comparison if you are comparing apples with apples—to do that, the formulas need to match.

CTR does a fantastic job in its L&D metrics library of not only providing the recommended formula, but also noting which industry organization defined it, or even which other industry benchmark it is similar to. A good balance between very-company-specific and industry-aligned formulas will allow for at least some comparison, planning, and target setting against industry metrics. Whether it is CTR’s Talent Development Reporting Principles (TDRp), Bersin by Deloitte, Training magazine, ATD, SHRM, or any others, consider aligning at least some meaningful metrics to an industry definition and formula. With our above example of staff utilization, one TDRp formula we could use is “Learning staff, % of total hours used.”

  3. Measure to inform decisions and actions.

Whether you are monitoring or managing a particular metric, there must be an underlying business reason to do so. Having a metric that does not inform a decision or an action is of no value. Also, consider why you are measuring something and what that metric will influence. Learning metrics are there to help us improve what we do as L&D professionals. Whether it is improving the efficiency of our learning organization, the effectiveness of our learning solutions, alignment to our stakeholders, or the contribution our interventions have on the strategic business goals of the organization, metrics play a critical role in influencing the value we bring to our organizations and, almost more critically, the credibility of L&D in the eyes of external stakeholders and the C-suite.

It’s better to have a few good metrics that inform meaningful decision making and allow for agility in improving the value L&D brings to your organization, rather than hundreds of metrics that offer limited value. Sticking with our example, we could monitor the utilization of learning staff and start actively addressing this metric should the percentage increase above, say, 90%, by engaging a learning solutions provider to assist with some of the workload until it returns to the acceptable range.
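
Continuing the utilization example, here is a minimal sketch of the monitor/manage split in code; the acceptable range and the hour figures are illustrative assumptions:

```python
ACCEPTABLE_RANGE = (70, 90)  # percent; illustrative thresholds

def staff_utilization_pct(hours_used, hours_available):
    """In the spirit of TDRp's 'Learning staff, % of total hours used' formula."""
    return 100 * hours_used / hours_available

def review(hours_used, hours_available):
    pct = staff_utilization_pct(hours_used, hours_available)
    low, high = ACCEPTABLE_RANGE
    if low <= pct <= high:
        return f"{pct:.0f}% utilization: within range, keep monitoring"
    return f"{pct:.0f}% utilization: out of range, actively manage"

print(review(1_450, 1_550))  # ~94% -> triggers active management
```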

Learning leaders are starting to take more notice of the deep value that metrics can bring toward the constant improvement of everything we do in L&D. No matter what the specific model, framework, and approach your organization chooses for learning metrics, there remain some fundamental principles that will help ensure that we ultimately have a metrics strategy that guides us, helps us improve, changes conversations with our stakeholders, and increases our credibility as business leaders.

Learning metrics are our friend, our source of feedback and intelligence, ensuring we are constantly focused on maximizing the value we bring to our organizations.

Caveo Learning is a learning consulting firm, providing learning strategies and solutions to Fortune 1000 and other leading organizations. Caveo’s mission is to transform the learning industry into one that consistently delivers targeted and recurring business value. Since 2004, Caveo has delivered ROI-focused strategic learning and performance solutions to organizations in a wide range of industries, including technology, healthcare, energy, financial services, telecommunications, manufacturing, foodservice, pharmaceuticals, and hospitality. Caveo was named one of the top content creation companies of 2017 by Training Industry Inc. For more information, visit www.caveolearning.com

Take a Business-Minded Approach to Sourcing Learning Partners

By: Gary Schafer
President, Caveo Learning

One of the most important tasks talent development leaders face is selecting outsourcing partners and product vendors. It also happens to be one of the most daunting.

The learning and development organization’s relationship with providers can be a major factor in the success of the business. Learning leaders must weigh many variables in the purchasing process, from the factual (pricing, experience) to the intangible (flexibility, dedication).

Navigating the procurement process can be tremendously difficult. How can a provider’s ability to flex with the challenging demands of the business be analyzed through a formal procurement process? How does one tell if an external learning partner is going to react to changing environments and truly be aligned with the business? How is the commitment of the supplier to the mission, vision, and values of the business measured? What about issues of scalability, global capability, and communication?

How to Develop a Learning Sourcing Strategy

Start by establishing a factual market intelligence base—understand the array of variables around learning services, from expertise to capability, and the rates associated with those services. Create your supplier portfolio, determine a list of qualification criteria, and then winnow your list of potential suppliers.

Identify the types of services your organization needs—learning strategy, audience analysis, curriculum design, instructor-led training, eLearning, change management, etc. Then, determine the demand across the organization, broken out by role.

Next, perform learning-spend and target-cost analyses. You’ll want to analyze supplier rates using both internal and external data sources.

  • Conduct some internal benchmarking with the same supplier—are rates equal across multiple projects? Does the firm charge premiums based on experience? Is pricing consistent across roles?
  • Do the same internal benchmarking across multiple suppliers. Find out if rates are similar for comparable roles and skill levels at different organizations.
  • Develop some external benchmarks using publicly available market data, comparing rates from market analyses and publicly available salary data. Factor for loaded cost (about 1.3 times base salary) to cover benefits, PTO, company-paid taxes, etc., and also allow some room for supplier margin (see the sketch after this list).
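
As a quick sketch of the external benchmark in the last bullet: only the 1.3 loading factor comes from the text; the 2,080 work hours per year and the 25% supplier margin are our illustrative assumptions.

```python
def benchmark_hourly_rate(base_salary, load_factor=1.3, margin=0.25,
                          work_hours_per_year=2080):
    """Rough external benchmark for a supplier's hourly rate.

    load_factor: ~1.3x base salary for benefits, PTO, company-paid taxes
    (from the text); margin and work_hours_per_year are assumptions.
    """
    loaded_hourly = base_salary * load_factor / work_hours_per_year
    return loaded_hourly * (1 + margin)

# Hypothetical instructional designer earning an $85,000 base salary.
print(f"Benchmark rate: ${benchmark_hourly_rate(85_000):.0f}/hour")  # ~$66
```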

Investigate the Intangibles

Before moving forward with sourcing service partners and learning products providers, conduct a supplier quality assessment. Create quality profiles for each prospective partner by evaluating the following factors:

  • Strategic planning—Will they assist with communicating and messaging to senior stakeholders throughout the business? Do they provide intelligence around industry trends and best practices? What do they offer in the way of post-project analysis, such as lessons learned and next-steps recommendations?
  • Experience—Does their team have expertise in relevant areas? What is the screening process for potential employees and contractors? How cohesive is the team, and how closely do they adhere to defined processes?
  • Responsiveness—Do they show a commitment to your business needs through effective communication? Is there a proven track record of adherence to deadlines? Do they offer strategic learning support?
  • Quality of work—What processes are in place to ensure solid instructional design? What methodology is used for quality reviews (test plans, style guides, etc.)? Does the supplier set clear review expectations for training deliverables and correct issues in a timely manner?
  • Flexibility—Do the suppliers have the ability and willingness to react to new or changing business needs? What is the capacity to scale a team with appropriate roles and volume?
  • Support—Do they have ready access to tools and templates necessary to speed development? Are there documented methodologies for executing common tasks?

Pricing as a Determining Factor

Suppliers’ rates often (though certainly not always) correlate with their quality profiles. If budget were no object, the learning sourcing strategy would simply mean identifying the most qualified provider; of course, budget is oftentimes the overriding factor, and so we must balance the need for quality with fiscal realities. Thus, try to optimize supplier usage based on rates, hiring high-quality suppliers for critical projects and project management, and settling for more cost-effective options, such as staff augmentation, for lower-importance projects and commodity roles.

Likewise, negotiate for spending reductions based on the type of support required. Each of the three main services models comes with its own potential cost-savings.

  • A project-based agreement can reduce per-unit costs and may be further cost-efficient due to greater opportunity to leverage offshore assets.
  • Contract labor ensures compliance across the organization, and the consolidated labor structure means negotiated volume rates are an option.
  • A managed services model will likely have optimized service levels and adjustable staffing levels, along with efficiencies gleaned from custom redesigned processes.

Be cautious with regard to electronic procurement. Several tools are now available and in use by procurement professionals to streamline the proposal process, but they are not always ideal for learning organizations. These tools are great for providing a uniform response with efficiency, but for learning services, not everything is uniform. E-procurement tools often create barriers that prevent providers from telling their full story, essentially reducing the proposal process to 100-word bites of information that make it difficult to recognize a provider’s true value. A good procurement professional can get around some of these limitations by offering a face-to-face meeting with the top candidates under consideration.

Finally, take care to negotiate a favorable agreement with the external learning partner. Have a negotiating strategy before initiating contract talks, and be ready and willing to walk away if a better partnership can be found elsewhere.

Creating and implementing a sourcing strategy for learning will greatly reduce the stress of the services procurement process, helping identify the ideal partners for your learning organization’s initiatives while optimizing budget. Rather than an intimidating task, a well-prepared strategy can make sourcing service partners a cornerstone of your organization’s success.

Gary Schafer is president of Caveo Learning, recently named a top custom content company by Training Industry Inc. Gary was formerly a management consultant at McKinsey & Co., and he holds an MBA from Northwestern University’s Kellogg School of Management.

 

TDRp Message Shared with the Legal Profession

Dave Vance shared the TDRp message with learning professionals in the legal profession on December 1st. He spoke on Day 1 of the Professional Development Institute’s two-day annual conference in Washington, D.C. Three hundred people attended the conference, which is dedicated to the professional development of attorneys and legal firm staff members. Dave was asked to speak by several attendees who had heard about TDRp and the Center for Talent Reporting and wondered if the principles would translate to the legal field. Dave worked with them over the summer and fall to adapt TDRp to the legal environment, creating a Program report and a Summary report reflecting typical legal firm goals and initiatives. The presentation is titled Strategic Talent Development Collaboration, Management, and Reporting.

Presentation Slides available for download

http://www.centerfortalentreporting.org/download/washington-conference-slides/

CTR Is Excited to Announce the Release of a Significantly Updated TDRp Measures Library

CTR is excited to announce the release of a significantly updated version of the TDRp Measures Library. This update focuses on Learning and Development measures and has expanded from 111 measures to 183 learning-related measures. The additions include measures related to informal and social learning, as well as a more detailed breakdown of the cost measures. This version of the library also includes measures defined by the CEB Corporate Leadership Council, along with updated references to the HCMI Human Capital Metrics Handbook and to ATD-, Kirkpatrick-, and Phillips-defined measures.

If you are a CTR member, you have access to the updated version at no additional charge: https://www.centerfortalentreporting.org/download/talent-acquisition/

If you are not a member, join CTR for $299/year.

Please Welcome Jeff Carpenter to the CTR Board


Jeff Carpenter is a Senior Principal at Caveo Learning. Jeff works with clients to bridge performance gaps by addressing process improvements as well as front-line knowledge and skill development programs for Fortune 1000 companies.

Jeff has worked in entrepreneurial environments as a senior leader, building internal organizational structures and business processes while leading teams at many Fortune 500 clients to solve some of their most pressing performance and process issues.

We are excited to welcome Jeff to the CTR Board. His knowledge and expertise will enhance our board and make CTR even stronger. We have worked closely with Jeff at past conferences, and he is a great supporter of CTR and TDRp.

Please Welcome Joshua Craver to the CTR Board


Joshua Craver is a values-based and results-oriented HR executive. In March 2012, he joined Western Union as its global HR Chief of Staff. In January 2013, Joshua took on a new role as Head of Western Union University and VP of Talent Management. Prior to this, he lived and worked in India, Mexico, and Argentina for over seven years in various HR leadership roles. Based on these experiences, he is well versed in growth-market strategy and execution.

Joshua also worked at the strategy consulting firm Booz Allen Hamilton. Companies he has consulted with include, but are not limited to, The World Bank, Georgetown University Hospital, GE, CIBA, Scotia Bank, Qwest, Farmers Insurance, Electronic Arts, Citibank, Agilent Technologies, Cigna, DuPont, Nissan, Lowes, Chevron, and Cisco. He has also conducted business in over 40 countries.

CTR is happy and honored to have Joshua join our CTR Board. Josh has been one of our greatest supporters. We are excited to have his expertise, energy, and insight.

The Promising State of Human Capital (Report)

CTR is happy to provide the Promising State of Human Capital Report sponsored by CTR, i4cp, and ROI Institute.

This valuable document is available for download thanks to CTR and our partnerships with i4cp and ROI Institute. The report ties in with the ROI Institute webinar of the same name, posted on our site for download. https://www.centerfortalentreporting.org/the-promising-state-of-human-capital-anayltics-webinar-by-roi-sponsored-by-ctr/

Partnership with the Corporate Learning Network

The Center for Talent Reporting is pleased to announce a partnership with the Corporate Learning Network. CLN is an online resource for corporate learning leaders and academic professionals. The Corporate Learning Network believes the future of learning will be created through multi-disciplinary approaches and peer-led exchange. With live conferences, community webinars, and virtual forums, they bring together stakeholders across the L&D spectrum to help you realize your plans for improved learning outcomes and organizational success.

Learn more about CLN at http://www.corporatelearningnetwork.com/

CLN’s goals are similar to CTR’s, and we believe this partnership will further change and growth in HR and L&D.

The Promising State of Human Capital Analytics (Webinar by ROI Co-Sponsored by CTR)

Talk with any human capital executive about the field of human capital analytics and they will generally agree: the best is yet to come. That doesn’t mean that many companies aren’t already performing incredible feats with people data (a few are profiled in this report); rather, the statement is a testament to the opportunity that most can see in this burgeoning field. And it’s a testament to the constant stream of new and innovative ways professionals are using people-related data to impact their organizations. This study surveyed analytics professionals in organizations of all sizes worldwide and asked very pointed questions about how those organizations are using human capital analytics today, if at all. The results were more affirming than they were surprising:

  • Budgets for human capital analytics are increasing along with executive commitment.
  • Relatively few companies are using predictive analytics now, but expect to in the future.
  • Most are using analytics to support strategic planning and organizational effectiveness.

Successful companies tend to be those that purposefully use data to anticipate and prepare rather than to react to daily problems.

It’s clear in both the data from the survey and the follow-up interviews that were conducted that the future focus of professionals in the human capital analytics field will increasingly be on using analytics to guide strategic decisions and affect organizational performance. To sum up the state of human capital analytics in one word: promising.

Objectives

This webinar introduces the new research study that demonstrates the progress that continues to be made with human capital analytics. Participants will learn:

  • Four key findings on the state of human capital analytics
  • How high-performance organizations are building leading human capital analytics teams
  • What Google, HSBC, LinkedIn and Intel are doing to drive business performance through analytics
  • What you can do to move your analytics practice forward

We want to thank Patti Phillips and Amy Armitage for the opportunity to co-sponsor this webinar. The recording and PPT below are available to anyone who missed the webinar.

PPT: The Promising State of HCA

Recording: https://roievents.webex.com/roievents/lsr.php?RCID=30bd391d319d4709b4203d48f6b7e4c9

Our Newest CTR Advisory Council Member Todd Harrison

Todd Harrison, Ed.D

Director, Talent Solutions

CTR is pleased to announce Todd Harrison as our newest Advisory Council member. Todd will be taking over for Kendell Kerekes, and we are thrilled to add such an experienced and renowned individual to our ranks.

Background

Dr. Harrison joined CEB Metrics that Matter (MTM) in 2012 after nearly 15 years of corporate experience in various learning and development leadership roles, where he was an active practitioner of the MTM system. At MTM, he is accountable for the MTM Professional Services team of approximately 40 people and provides strategic leadership to the MTM Client Advisory and Consulting teams within this group. These two teams have primary responsibility for delivering ongoing support services to MTM technology clients and accountability for developing and delivering learning and human capital measurement consulting solutions, respectively. Specifically, the Professional Services team helps organizations measure the impact and effectiveness of their learning and development programs, with the aim of unlocking the potential of organizations and leaders by advancing the science and practice of talent management. Dr. Harrison’s business responsibilities include oversight of a portfolio of more than 600 clients and an annual global P&L goal of nearly $5M.

Dr. Harrison has extensive knowledge and expertise in several areas of talent management, including:

  • Succession Planning
  • Leadership Development
  • Talent Analytics
  • Learning Strategies
  • Employee Engagement
  • Competency Design
  • New Hire Onboarding
  • Performance Management
  • Organization Development

Education

  • Doctorate in Organizational Leadership (2016) – Indiana Wesleyan University
  • Master of Arts in Human Resource Development (1995) – Webster University
  • Bachelor of Science in Journalism (1986) – Murray State University

Professional Experience

  • Director, Talent Solutions, Metrics that Matter: CEB, Chicago, IL (2015 – present)
  • Director, Consulting Services, Metrics that Matter, Chicago, IL (2014 – 2015)
  • Senior Consultant, Metrics that Matter, Chicago, IL (2012 – 2014)
  • Director, Global Leadership & Organizational Development, Stryker, Kalamazoo, MI (2010 – 2012)
  • Director, Leadership & Associate Development, Anthem Blue Cross/Blue Shield, Indianapolis, IN (2002 – 2010)
  • Vice President, Human Resources, Total eMed, Franklin, TN (1999 – 2001)
  • Director, Leadership & Organizational Development, American Rehability Services, Brentwood, TN (1997 – 1999)
  • Lieutenant Colonel (Retired), United States Army (1984 – 2005)

Professional Affiliations

  • Association for Talent Development (1993 – Present)

CEB Platinum Sponsor

CEB is a best practice insight and technology company. In partnership with leading organizations around the globe, we develop innovative solutions to drive corporate performance. CEB equips leaders at more than 10,000 companies with the intelligence to effectively manage talent, customers, and operations. CEB is a trusted partner to nearly 90% of the Fortune 500 and FTSE 100, and more than 70% of the Dow Jones Asian Titans. More at cebglobal.com.

CEB is a Platinum sponsor of CTR. With their continued support, CTR has continued its mission to bring standards and measures to the HR community. CEB was a major participant in the February 2016 CTR Conference, helping to draw many attendees and making it our largest yet. We look forward to continuing our relationship with CEB.

For information on sponsoring CTR, please visit https://www.centerfortalentreporting.org/sponsorship-opportunities/

Join CEB for a selection of webinars in June.


All times are in CT.

  • Addressing Pay Equity (June 7, 11:00 am–12:00 pm)
    Join CEB on June 7 for their complimentary webinar as they examine pay equity concerns for organizations with U.S. workforces. Register now at http://ceburl.com/1osc

  • Keep Quality through Onboarding (June 9, 11:00 am–12:00 pm)
    Best-practice onboarding can maintain, and actually improve, quality of hire by up to 12%. Hear how the best companies prioritize employee integration in onboarding during CEB’s complimentary webinar June 9. Register now at http://ceburl.com/1ov5

  • Organizing HR to Lead Enterprise Change (June 9, 9:00–10:30 am)
    Only 21% of organizations are able to initiate change as soon as the need arises. Learn how to better influence change implementation where it happens during CEB’s complimentary webinar June 9. Register now at http://ceburl.com/1ov6

  • The Talent Within (June 23, 11:00 am–12:00 pm)
    74% of organizations want to increase their internal hiring this year. Hear how the best companies improve internal mobility and yield greater quality of hire during CEB’s complimentary webinar June 23. Register now at http://ceburl.com/1ov7

  • Four Imperatives to Increase the Representation of Women in Leadership Positions (June 28, 12:00–1:00 pm)
    Increasing representation of women in leadership positions has a real, tangible impact on an organization’s financial and talent outcomes. Join CEB on June 28 as they discuss the common myths surrounding women in leadership and how to engage and retain a more gender-balanced workforce. Register now at http://ceburl.com/1ov8

 


CTR Accepts the Community Partnership Award

Dave Vance was present and honored to accept the Community Partnership Award, given to CTR by the University of Southern Mississippi Department of Human Capital Development, one of the few departments in the country to offer a PhD in Human Capital Development. CTR was selected in recognition of inviting USM PhD students to our conference in Dallas and for our leadership in the profession through TDRp. The award was presented Friday, April 29, at the Third Annual Awards Ceremony held at the Gulf Park Campus, where six other awards were given and graduating master’s and PhD students were recognized.

Dave Vance, Executive Director of the Center for Talent Reporting

Dave is a frequent speaker at learning conferences and association meetings. He also conducts workshops and simulations on managing the learning function. Dave is the former President of Caterpillar University, which he founded in 2001. Dave received his Bachelor of Science degree in political science from M.I.T. in 1974, a Master of Science degree in business administration from Indiana University (South Bend) in 1983, and a Ph.D. in economics from the University of Notre Dame in 1988. Dave was named 2006 CLO of the Year by Chief Learning Officer magazine. He was also named 2004 Corporate University Leader of the Year by the International Quality and Productivity Center in its annual CUBIC (Corporate University Best in Class) Awards. Dave’s current research focuses on bringing economic and business rigor to the learning field as well as the development of computer-based simulations to help learning professionals increase their effectiveness and efficiency. His first book, The Business of Learning: How to Manage Corporate Training to Improve Your Bottom Line, was published in October 2010.

Setting Standards for our Profession: Three Standard Reports by Dave Vance

Last month we talked about the need for standards and introduced three types of measures which would serve to organize our measures just as accountants use four types of measures in their profession. This month we will introduce three standard reports for L&D which contain the three types of measures. These three reports, used together, provide a holistic and comprehensive view of learning activity just as the three standard accounting reports (income statement, balance sheet, and cash flow) do for financial activity.

The three reports are Operations, Program, and Summary. While a version of these reports (called detailed reports) contains just historical information (like actual results by month or quarter), we will focus on reports for executive reporting which in addition to last year’s results include the plan or target for the current year, year-to-date (YTD) progress against plan, and, ideally, a forecast of how the year will end if no special actions are taken (like extra staffing, budget or effort beyond what is currently planned). The reports are meant to be used by the department head and program managers in monthly meetings to actively manage the function to deliver results as close to plan as possible. Of course, before the year starts, L&D leaders will need to decide what measures are important for the year and what reasonable plans or targets would be for those measures.

We will start with the Operations Report which contains effectiveness and efficiency measures. Recall from our discussion last month that effectiveness measures are about the quality of the program (like levels 1-5) and efficiency measures are about how many, at what cost, etc. (like number of participants, hours, utilization rate, and cost). The Operations Report is meant to capture the 5-10 most important effectiveness and efficiency measures at an aggregated level which may be for the enterprise, region, business unit or product group depending on the L&D department’s scope. So, for example, the Operations Report might show last year’s actual, current year plan, YTD results, and forecast for unique and total participants for all the courses being offered. It might also show the average level 1, 2 and 3 for all courses being offered. The department head would use this report monthly to see if the department is on track to meet plan for the year. If it appears the department is not on track, then the senior L&D leaders need to understand why and agree on what actions to take to get back on plan.
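
To make the report structure concrete, here is a minimal sketch in Python of how Operations Report rows might be represented. The measure names, figures, and the 5% tolerance are illustrative assumptions, not anything prescribed by TDRp.

    # Minimal sketch of Operations Report rows; all names and values are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class MeasureRow:
        name: str              # e.g., "Total participants" (hypothetical measure)
        last_year_actual: float
        plan: float            # target agreed before the year starts
        ytd: float             # year-to-date result
        forecast: float        # projected year-end result if no special actions are taken

        def on_track(self, tolerance: float = 0.05) -> bool:
            # Flag a measure whose forecast misses plan by more than the tolerance.
            return abs(self.forecast - self.plan) / self.plan <= tolerance

    operations_report = [
        MeasureRow("Total participants", 9500, 11000, 5200, 10400),
        MeasureRow("Average level 1 score", 4.1, 4.3, 4.2, 4.2),
    ]

    # The monthly review scans for measures whose forecast has drifted from plan.
    for row in operations_report:
        status = "on plan" if row.on_track() else "needs action"
        print(f"{row.name}: forecast {row.forecast} vs. plan {row.plan} -> {status}")

In this hypothetical example the participant forecast trails plan by more than the tolerance, which is exactly the kind of gap the monthly review is meant to surface and address.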

The Program Report also contains effectiveness and efficiency measures but its focus is at the program or initiative level rather than the enterprise level. It is designed to show the key programs in support of a single company goal like increasing sales by 10% or reducing operating cost by 5%. Under each identified program would be the most important effectiveness and efficiency measures. The Program Report also would contain an outcome measure showing the planned impact or contribution from learning on the goal. So, the report pulls together the key programs and measures required to achieve the desired impact on the goal. The owner of the goal (like the SVP of sales) and the L&D leaders would agree on the programs; outcome, effectiveness and efficiency measures; and plans or targets for the measures before the programs begin. The goal owner and L&D also need to agree on their mutual roles and responsibilities, including how the owner plans to reinforce the learning. Program Reports would be used by the department head and program managers each month to ensure everything is on plan to deliver the agreed-upon results.
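
Continuing the sketch, a Program Report could carry the same plan-versus-YTD measures but group them under a single goal and its agreed outcome measure. The goal, program, and figures below are hypothetical.

    # Minimal sketch of a Program Report for one company goal; all data is hypothetical.
    program_report = {
        "goal": "Increase sales by 10%",
        "outcome_measure": {
            "name": "Planned contribution of learning to sales growth",
            "plan": "2 of the 10 percentage points",
        },
        "programs": [
            {
                "name": "Consultative selling workshop",
                "effectiveness": {"level 3 application rate": {"plan": 0.75, "ytd": 0.68}},
                "efficiency": {"participants": {"plan": 400, "ytd": 180}},
            },
        ],
    }

    # Each month, walk every program's measures against plan.
    for program in program_report["programs"]:
        for group in ("effectiveness", "efficiency"):
            for measure, values in program[group].items():
                print(f'{program["name"]} | {measure}: YTD {values["ytd"]} vs. plan {values["plan"]}')

The structural point of the sketch is simply that every program rolls up to one named goal with an agreed outcome measure, mirroring the agreement between the goal owner and L&D described above.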

While the Operations and Program Reports are used to manage the function each month, the Summary Report is designed to show the alignment and impact of learning as well as its effectiveness and efficiency. The Summary Report starts by listing the company’s top 5-7 goals in the CEO’s priority order and then shows the key learning programs aligned to each. (In some cases there will be no planned learning.) Where learning does have a role to play in helping achieve the company goal, the report ideally will include an outcome measure or some other measure of success. Next, the report will share three to four key effectiveness measures and three to four key efficiency measures. The target audience for this report is the CEO, CFO, SVP of HR, and other senior leaders, as well as employees of the L&D department. The report lets senior company leaders know four important things: 1) L&D knows what the company’s goals and priorities are, 2) L&D leaders have talked with goal owners and the two parties have agreed on whether learning has a role to play and, if it does, what kind of contribution may be expected from learning, 3) L&D leaders are focused on improving the effectiveness and efficiency of learning, and 4) L&D leaders know how to run learning like a business by setting plans and then managing each month in a disciplined way to deliver those plans.

Use of these three standard reports, and the processes associated with creating them, will dramatically improve the impact, effectiveness, efficiency, stature, and credibility of L&D. Along with adopting the three types of measures, it will also be a major step toward L&D becoming a true profession with our own standards and reports.

Develop Your L&D Team’s Competencies to Deliver on Business Goals Webinar Recording

In this webinar, Caveo’s Strategic Learning Partner Alwyn Klein uses real-world case studies and proven concepts to show how to develop your L&D team to become proactive and high-performing, balanced with the appropriate mix of roles and competencies. You’ll learn to…

      • Leverage the L&D Compass for your team’s journey
      • Plan team development strategies and tactics
      • Determine future competencies and roles
      • Maintain a focus on current and future business alignment
      • Deliver consistent and measurable business value

Please visit http://www.caveolearning.com/develop-l-and-d-team-webinar for access to the recording.

The 9th Annual Global Learning Summit Singapore

CTR is excited to have the opportunity to be a part of the Global Learning Summit. Dave Vance, our Executive Director, will run a pre-summit workshop on March 7, 2016, and will also be a keynote speaker for the Global Learning Summit conference. Dave was one of the first speakers at the first Global Learning Summit. The conference will be held at the Raffles City Convention & Exhibition Centre, Fairmont Singapore, from March 7 to 9, 2016. Visit http://salvoglobal.com/global-learning-summit-2016/ to learn more and to register.


Want to share and learn with a crowd of your peer-group?

Members of the Center for Talent Reporting community are welcome to share practical best practices and next practices with the Executive Learning Exchange, which curates solutions as a global cohort online. Join the following Google+ community to learn more:

Simple & cost-effective ways to measure big results (Google ID required)

Get To Know Your Board Member (Cushing Anderson)

Cushing Anderson is program vice president for IDC’s Project-Based Services research. In this role, Cushing is responsible for managing the research agenda, field research, and custom research projects for IDC’s Business Consulting, Human Resources and Learning research programs. Cushing has been actively investigating the link between strategic business planning and training in the extended enterprise, the value of certifications, and the impact of training on IT team performance. He has also extensively researched the value of partner certification programs to software companies, and he conducts regular research on enterprises’ views of and experiences with global consulting firms. Cushing speaks on a variety of topics and has received a number of industry accolades. He is on the editorial advisory board of and authors a regular column for CLO magazine, and he is a Center for Talent Reporting board member. Cushing holds a bachelor’s degree in Government from Connecticut College and earned his M.Ed. in Curriculum and Instruction from Boston College’s Graduate School of Education.

Join Cushing and CTR as he keynotes our conference on February 24, 2016.