CTR Exchange and Blog

What CEOs Know (and You Should Too)

By Dave Vance

Some in our profession have not had the opportunity to work closely with senior leaders like a CEO or CFO. With that in mind I would like to share some observations about senior leaders from my experiences at Caterpillar (first as the Chief Economist and later as the CLO) and at other companies since then. Of course it goes without saying that my sample size is small and thus may not be representative of all CEOs, but I believe most will share the traits discussed below.

First, despite their very lofty position, they are still just people. Now I know that some CEOs believe they have transcended to a whole new level of existence, but in my experience most were down to earth and you could just talk with them. This doesn’t mean you can waste their time, because it is incredibly valuable, so you need to be prepared and be clear about what you want from them. At the same time, though, you don’t need to memorize a script or be scared to death about the meeting. It is okay to ask questions, and with most it is okay not to have all the answers. Because they are people, don’t be surprised if they ask questions about your family or interests in addition to work. They really do want to know more about you, and some personal connection time breaks up a day that is otherwise filled with 10-12 hours of work-related issues. So, relax a little bit and enjoy the time you get with them. And be prepared to be surprised at what you might learn about them and the organization.

Second, they have a very high tolerance for some uncertainty and ambiguity. Their tolerance will almost certainly be much greater than yours because they have to deal with it every single day. Having said that, though, we need to provide some context. They will not have a high tolerance for your being unprepared or acting unprofessionally or making excuses. They will, however, understand that a plan for next year is just that: a plan. They know from experience that plans are usually wrong because something unexpected typically happens. So, risk cannot be eliminated because of fundamental uncertainties about the future but it can be managed and that starts with ensuring you have a good process for planning and that you have talked with the right people. So, expect questions about how you came up with the plan and who you talked to. If you talked to the right people (like the Sales SVP to agree on an outcome measure) and made reasonable assumptions, then they are going to feel good about the plan. Put differently, there is nothing they can do to guarantee a perfect plan so instead they will focus on what they can influence (process and people). So, just be humble in setting and communicating plans. Don’t worry that they are not perfect.

Third, they know that you are going to make some mistakes along the way. They themselves are still making mistakes and other senior leaders are still making mistakes, so it is only natural that you will, too. (Unless you believe you are far superior to them!) Many in our profession are afraid that the CEO is going to discover they are not perfect, and thus are afraid to share TDRp-type reports which may show that every measure is not on track to meet plan. News flash: Your CEO already knows you are not perfect, and they don’t expect you to achieve or exceed plan on every single measure. In fact, any good leader would be very skeptical of someone who says they always achieve or exceed plan because that is not how the world works. Unless you set incredibly easy targets to meet, you are going to achieve some, exceed some, and fall short on the rest. This is what your CEO is expecting. I met with our Board of Governors (the CEO and other senior leaders) each quarter, and I never reported that we were on plan to achieve or exceed all measures. Likewise, as an economist it was impossible to get all the forecasts right. Since your CEO already knows this (even if you do not), they will be looking to see if you are honest and transparent, willing to share the bad news as well as the good news. They will then want to see if you can address those areas where you are behind without being defensive or blaming anyone else. They want to know that you are on top of the issues and are doing everything that makes sense to deliver the planned results. So, don’t worry about letting the CEO see you are not perfect. Instead show them that you can manage L&D like a business.

If you have not already discovered the above for yourself, I hope this makes you a little more confident and at ease about meeting and interacting with your CEO and other senior leaders. At the end of the day they are just people just like you, planning in the face of uncertainty, making mistakes, and moving forward despite it all.

Running Learning Like a Business: What Does It Mean and Is It Still Relevant?

By Dave Vance

Running learning like a business simply means applying business discipline to the learning field. More precisely, it means creating a plan with specific, measurable goals and then executing that plan with discipline throughout the year. That’s it. Most organizations already do this at a high level. They create a business plan for the year and then create and use reports on a monthly basis to manage their activities to come as close to plan as possible. At a lower level many other departments already do this as well. Think about a typical sales, quality, or manufacturing department. They establish goals at the start of the year and then they compare their progress each month against those goals, taking appropriate actions as necessary to stay on plan or get back on plan if they are behind. The point is they have a plan and they manage to it.

It is my experience that fewer than 10% of learning departments manage this way. Most would say they work very hard and accomplish a great deal. And they do work hard and they do accomplish a lot. (Most also produce activity reports to show just how busy they are: number of courses, participants, etc.) But they could have an even greater impact on their organization if they worked smarter. They might not deliver more courses or reach more employees, but they will make more of a difference in their organization’s performance. So, what does it mean to work smarter? It means that you plan your actions more carefully and then you execute your plan with discipline. The upfront planning means that you think carefully and critically about what you want to accomplish, why you want to accomplish it, what you will need in terms of resources to accomplish it, and what measures you will use. Invariably this will involve tradeoffs and prioritization since resources are always limited. So, what is more important for you to work on this coming year? Where can you make the greatest difference or add the most value? Once you have some ideas (and they may be different from what you have been working on!), the next task is to settle on a reasonable and achievable goal, which means you must think carefully about what resources will be required. It simply may not be possible to accomplish the goal with your resources, in which case you need to set a more realistic goal. And this will force you to think about the measures to use in setting the plan and lining up your work effort.

While this does require work, I would argue that a disciplined planning process culminating in specific, measurable goals will help ensure you are focused on the right goals for the right reasons with the right resources. In essence this is a learning process, and if you don’t plan this way you deprive yourself of this very important learning opportunity. I think many leaders would say they do think about goals and resources but they stop short of setting specific, measurable goals. In this case they have not thought critically about the level of resources required (without a specific, measurable plan, how do you know what resources will be required?), nor have they created reports with a column for plan to use in managing the learning each month. Theoretically, it would be possible for these leaders to accomplish the same as those with plans, but practically speaking it is far less likely. After all, why do you think your colleagues in other departments go through all this work? They wouldn’t if it didn’t lead to better results. Simply put, running learning like a business will ensure you are focused on the right learning with the right goals, measures and resources to deliver the greatest value to the organization.

While I believe more learning leaders are running learning like a business, most are not. One objection I hear is that this focus on planning and executing is no longer relevant. Some tell me that their organization’s goals change every two or three months and thus it makes no sense to plan. This is ridiculous. I don’t think the company’s goals change every few months and, if they do, I would question senior leadership’s ability to manage the company. It may be that priorities shift during the year, but goals like increasing sales, quality, and employee engagement are not going to change every few months. (They may mean that their own projects or work assignments change frequently, which may actually be a result of not having a good business plan for learning.) Others say they have stable goals but their company’s products (like cell phones) change once per quarter, and thus there is no way to plan. Of course there is. Focus first on the goal of increasing sales and let this be your guide in planning programs. True, new models will come out and you will have to design training programs around them, but you don’t need all the specifics at the start of the year. You have an idea of what is required for each product launch, so use that to plan. Your specific, measurable plans for reaction, learning and application are probably the same for all your product-related learning, and you have ideas about the size of the target audience. So, plan based on what you know and refine the plan as you get better information through the year.

Bottom line, running learning like a business is simply applying time-tested basic business skills to learning. Outside of learning, it is nothing new. And, outside of learning it is still very much relevant. So, let’s move that percentage up from 10% and see what a difference it makes to manage learning this way. Give it a try and see what results you get. I think you will be pleasantly surprised.

Four Approaches to Break the “No One Is Asking” Cycle

By Peggy Parskey


In the last 18 months, I have noticed a concerning shift in the sentiment of learning professionals about ramping up the quality of their measurement and in particular their reporting. This sentiment had been quite prevalent 10-12 years ago, but abated for a while. Now it’s back with a vengeance.
The essence of the sentiment is “Business leaders aren’t asking for this information, so there is no need for us to provide it.” Most L&D practitioners don’t state it quite like that, of course. They say, “Our business leaders only want business impact data.” Sometimes the stated reason is that business leaders consider L&D self-report data to be useless. Regardless of how they say it, the message is clear: “Business leaders aren’t asking for it (or worse, don’t want it), so I’m not going to bother.”
I’m not imagining this shift. Brandon Hall Group recently published a study entitled “Learning Measurement 2016: Little Linkage to Performance.” In their study, they found that the pressure to measure learning is coming mostly from within the learning function itself. In addition, of the 367 companies studied, 13% said there was no pressure at all to measure learning.

 

What’s concerning about this situation?

L&D has been saying for years that it wants to be viewed as a strategic partner and get a seat at the proverbial table. Yet, the sentiment that “no one is asking for measurement data” is equivalent to saying, “Hey, I’m just an order taker. No orders, no product or service.” If we are going to break free of an order-taking mentality, then we need to think strategically about measurement and reporting, not just the solutions L&D creates.

 

The sad reality is that often business leaders do not ask for L&D data because they don’t know what data is available or they don’t understand its value.  Many business leaders assume L&D is only capable of reporting activity or reaction (L1) data. Many still pejoratively refer to L&D evaluations as Happy Sheets or Smile Sheets.  If they believe that’s all L&D can do, then of course they won’t ask or exert any pressure to do more.

 

Equally importantly, innovation would come to a standstill if organizations only produced products and services that resulted from a client request. Henry Ford famously said, “If I had asked people what they wanted, they would have said faster horses.”  Steve Jobs talked about the dangers of group-think as a product development strategy when he said, “It’s really hard to design products by focus groups. A lot of times, people don’t know what they want until you show it to them.”  Simply because your internal client cannot envision what you can provide does not mean you shouldn’t provide it and then iterate to get it right.

 

The big problem with a “no one is asking” mentality is that it is a self-fulfilling prophecy and a downward spiral. At some point, someone needs to break the cycle. It might as well be L&D. (See the Infographic)

 

What can you do differently right now?

Fortunately, as an individual learning practitioner you can break this cycle. Here are four approaches (singly or together) you should consider:

  1. Reframe the problem: If truly “no one is asking,” stop and ask yourself, “Why aren’t leaders asking? Is it because this information has no value or because they don’t understand the value? Is it because they don’t know what L&D can provide?” Based on your answers to these questions, develop a game plan to demonstrate the value of the data you can provide (and then provide it to them).
  2. Find a friendly: If you want to break the ‘negative reinforcing loop’, find a business leader who is willing to work with you, a ‘friendly.’  You can spot a friendly fairly easily. She is interested in how L&D can add value to her business. He views you as a trusted advisor, brainstorming how to build new skills and capabilities within his team. She is innovative and willing to try new approaches even if they might not succeed the first time out. If you have one or two such business leaders with whom you work, seek them out and discuss what data you could provide to answer their questions.
  3. Report on data that matters to the business leader: From a measurement and reporting standpoint, L&D still puts too much of its energy into gathering and reporting data that matters to L&D but is not compelling to the business. The number of employees who attended courses or the results of your Level 1 evaluation are simply not important to the business. Look beyond your current measures and educate yourself on best practices in L&D measurement. Integrate questions about learning application and support into your evaluation instruments. This is data business leaders will care about if you show them how it affects them and their organization.
  4. Tell a compelling story: Do you remember the Magic Eye picture-within-a-picture phenomenon? If you held the picture up to your nose, you might see the constellation Orion buried inside the picture. (I never saw anything.) If you believe your data is meaningful and can help the business, don’t use the Magic Eye approach. Don’t expect your business partner to find the meaning in the data. Rather, tell the story behind the charts and graphs through dialogue. Help your business partner connect the dots; help her understand the consequences of not acting on the data and the benefits if she does.

A real-life example

The ability of employees to apply training in the workplace depends on several conditions, many of them outside the control of the L&D department. Factors such as the quality of the training, the motivation of the employee, the opportunity to apply the training to real work, and the reinforcement of the direct manager all affect the extent to which training is applied.

 

A few years ago, I worked with a company that sold complex financial software. They re-engineered their implementation process to simplify the client experience, reduce implementation time and accelerate revenue recognition. The business leaders identified project management (PM) skills as critical to the success of this new approach.

 

The Process Transformation Team identified an initial group of employees to attend the PM training and pilot the new approach with several clients. When they reviewed the pilot results they were disappointed. Implementation time had not declined appreciably and the client felt that the process was more complex than expected. The Team Leaders investigated and found that the employees’ managers were not reinforcing the training or directing them to support resources when they struggled to apply the PM methodology in a real life setting. They also found that the job aids L&D had created were too cumbersome and not designed to be used in a dynamic client setting.

 

Imagine you are the L&D partner of the Implementation Transformation Business Leader. What data could you have provided to demonstrate that his people were not building and honing skills in this critical discipline?

 

This leader needed a regular stream of L&D data on actual application on the job and barriers to application. He needed data on employees’ perceptions of how this training would impact the business outcomes of simplification, implementation time, and reduced time to revenue recognition. He needed to understand the barriers to application and whether the business or L&D was accountable for addressing them. Notably, he did not yet need incontrovertible proof that the training was improving business outcomes.

 

Data from L&D was essential for this leader to take action and address issues that affected his ability to successfully transform a key client process.  Unfortunately, the business leader didn’t realize that L&D could help him get ahead of this issue and didn’t think to ask. After L&D and the business leader started talking, sharing data and insights, the Leader not only acted, but worked with L&D to develop regular business-oriented reports.

 


Final thoughts

As an L&D practitioner, you can break the negative reinforcing cycle. Why not regularly provide this type of data and use it to create a dialogue about what other insights you can provide?  Why not take the first step to dispel the belief that L&D has no useful data or insights to offer?  It’s in your hands.

 

Have you had a “No One is Asking” moment? I would love to hear from you. Follow me on Twitter @peggyparskey or connect with me on LinkedIn at https://www.linkedin.com/in/peggy-parskey-11634

 

It’s Time to Solve the “Last Mile Problem” in L&D

By Peggy Parskey

 

The telecommunications industry has had to confront a challenge referred to as “The Last Mile Problem.” This phrase describes the disproportionate effort and cost to connect the broader infrastructure and communications network to the customer who resides in the ‘last mile.’

Learning Measurement also has a “Last Mile” problem. We have a wealth of data, automated reporting and data analysts who attempt to make sense of the data. Yet, as I observe and work with L&D organizations, I continually witness “Last Mile Problems” where the data doesn’t create new insights, enable wise decisions or drive meaningful action. If we want our investments in measurement to pay off, we need first to recognize that we have a problem and then second, to fix it.

Do you have a “Last Mile” problem?  Here are six indicators:

  1. One-size-fits-all reporting
  2. Mismatch between reporting and decision-making cadence
  3. Lack of context for assessing performance
  4. Poor data visualizations
  5. Little attention to the variability of the data
  6. Insufficient insight that answers the “so what?” question

Let’s explore each indicator with a brief discussion of the problem and the consequences for learning measurement.

1.    One-size-fits-all reporting

While L&D organizations generate a plethora of reports, not enough of them tailor their reports to the needs of their various audiences. Each audience, from senior leaders to program managers to instructors, has a unique perspective and requires different data to inform decisions.

In an age of personally targeted ads on Facebook (creepy as they may be), we each want a customized, “made for me” experience. A single report that attempts to address the needs of all users will prove frustrating to everyone and meet the needs of no one. Ultimately, the one-size-fits-all approach will lead users to request ad hoc reports, create their own shadow process or simply resort to gut feel since the data provided wasn’t very useful.

2.    Mismatch between Reporting and Decision Making Cadence

Every organization has a cadence for making decisions, and the cadence will vary based on the stakeholder and the decisions he/she will make. Instructors and course owners need data as soon as the course has ended. Course owners will also need monthly data to compare performance across the courses in their portfolio. The CLO will need monthly data but may also want a report when a particular measure is above or below a specific threshold.

In a world of high velocity decision making, decision makers need the right information at the right time for them. When the reporting cycle is mismatched with the timing of decision-making, the reports become ineffective as decision-making tools. The result? See #1: shadow processes or ad hoc reporting to address decision needs.

3.    Lack of context to assess performance

Many L&D reports present data without any context for what ‘good’ looks like.  The reports display the data, perhaps comparing across business units or showing historical performance. However, too many reports do not include a goal, a performance threshold or a benchmark.

Leaders don’t manage to trends nor do they manage to comparative data. Without specific performance goals or thresholds, L&D leaders lose the ability to motivate performance on the front end and have no basis for corrective action on the back end. These reports may be interesting, but ultimately produce little action.

4.    Poor data visualization

Data visualization is a hot topic and there is no shortage of books, blogs and training courses available. Despite the abundance of information, L&D’s charts and graphs appear not to have received the memo on the importance of following data display best practices.

Look at the reports you receive (or create). Do your visuals include unnecessary flourishes (also known as chart junk)? Do they contain 3-D charts? Do they present pie charts with more than five slices? Do your annotations simply describe the data on the chart or graph? If you answered “yes” to any of these questions, then you are making the last mile rockier and more challenging for your clients than it needs to be.

No one wants to work hard to figure out what your chart means. They don’t want to struggle to discern if that data point on the 3D chart is hovering near 3.2 (which is how it looks) or if the value is really 3.0.  They won’t pull out a magnifying glass to see which slice of the pie represents their Business Unit.

Poor visualizations fail to illuminate the story.  You have made it more difficult for your users to gain new insights or make decisions.  Over time, your readers shut down and opt to get their information elsewhere.

5.    Little attention to the variability of the data

As the L&D community increasingly embraces measurement and evaluation, it has also ratcheted up its reporting. Most L&D organizations publish reports with activity data, effectiveness scores, test results, cost data and often business results.

What we rarely consider, however, is the underlying variability of the data.  Without quantifying the variance, we may over-react to changes that are noise or under-react to changes that look like noise but in fact are signals.  We are doing our users a disservice by not revealing the normal variability of the data that helps to guide their decisions.

6.    No answers to the “So-what” question

This last mile problem is perhaps the most frustrating. Assume you have addressed the other five issues. You have role-relevant reports. The reports are synchronized with decision cycles. You have included goals and exception thresholds based on data variability. Your visualizations are top notch.

Unfortunately, we all too often present data “as is” without any insights, context, or meaningful recommendations. In an attempt to add something that looks intelligent, we may add text that describes the graph or chart (e.g. “This chart shows that our learning performance declined in Q2.”). Perhaps we think our audience knows more than we do about what is driving the result. Often, we are rushed to publish the report and don’t have time to investigate underlying causes. We rationalize that it’s better to get the data out there as is than publish it late or not at all.

Amanda Cox, the New York Times Graphics Editor, once said: “Nothing really important is headlined, ‘Here is some data. Hope you find something interesting.’”

If we are going to shorten the last mile, then we have an obligation to highlight the insights for our users. Otherwise, we have left them standing on a lonely road hoping to hitch a ride home.

Recommendations to solve the “Last Mile” problems

There is a lot you can do to address the “Last Mile Problem” in L&D. Here are six suggestions that can help:

  • Create a reporting strategy. Consider your audiences, the decisions they need to make and how to present the data to speed time to insight. Include in your reporting strategy the frequency of reports and data to match the decision-making cadence of each role.
  • Identify a performance threshold for every measure on your report or dashboard that you intend to actively manage.  In the beginning, use historical data to set a goal or use benchmarks if they are available. Over time, set stretch goals to incent your team to improve its performance.
  • Learn about data visualization best practices and apply them.  Start with Stephen Few’s books. They are fun to read and even if you only pick up a few tips, you will see substantive improvements in your graphs and charts.
  • Periodically review the variability of your data to see if its behavior has changed. Use control charts (they are easy to create in Excel) for highly variable data to discern when your results are above or below your control limits.
  • Add meaningful annotations to your graphs and charts. Do your homework before you present your data. If a result looks odd, follow up and try to uncover the cause of the anomaly.
  • For senior leaders, in particular, discuss the data in a meeting (virtual or face-to-face). Use the meeting to explore and understand the findings and agree on actions and accountability for follow up. Employ these meetings to create a deeper understanding of how you can continually improve the quality of your reporting.
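The control-chart and threshold recommendations above can be sketched in a few lines. The example below is a minimal, hypothetical illustration (the monthly scores are made up) of an individuals control chart. It estimates sigma from the average moving range, the standard practice for individuals charts, so that a single unusual month does not inflate the limits and hide itself:

```python
# Sketch of an individuals control chart for one monthly L&D measure.
# The scores are hypothetical; substitute your own monthly results.
from statistics import mean

scores = [4.1, 4.3, 4.0, 4.2, 4.4, 4.1, 2.0, 4.2, 4.3, 4.0]

center = mean(scores)
# Estimate sigma from the average moving range (divided by the d2 constant 1.128),
# which keeps one bad month from widening the limits enough to conceal itself.
moving_ranges = [abs(b - a) for a, b in zip(scores, scores[1:])]
sigma = mean(moving_ranges) / 1.128
upper, lower = center + 3 * sigma, center - 3 * sigma

# Months outside the limits are likely signal, not noise -- investigate before reacting.
signals = [(month, s) for month, s in enumerate(scores, start=1)
           if s > upper or s < lower]
for month, s in signals:
    print(f"Month {month}: {s} is outside [{lower:.2f}, {upper:.2f}] -- investigate")
```

In this made-up series, month 7 falls below the lower control limit and warrants investigation, while the other months’ ups and downs sit within normal variability and call for no reaction at all, which is exactly the over-reaction the article cautions against.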

If you start taking these recommendations to heart, your last mile could very well shrink to a very small and manageable distance. Have you experienced a “Last Mile” problem? I’d love to hear from you.

Follow me on Twitter @peggyparskey or subscribe to my blog at  parskeyconsulting.blogspot.com/

CTR Is Excited to Announce the Release of a Significantly Updated TDRp Measures Library

CTR is excited to announce the release of a significantly updated version of the TDRp Measures Library. This update focuses on Learning and Development measures, expanding the library from 111 to 183 Learning-related measures. The additions include measures related to informal and social learning as well as a more detailed breakdown of the cost measures. This version of the library also includes measures defined by the CEB Corporate Leadership Council, updated references to the HCMI Human Capital Metrics Handbook, and updated references to ATD-, Kirkpatrick-, and Phillips-defined measures.

If you are a CTR member, you have access to the updated version at no additional charge: https://www.centerfortalentreporting.org/download/talent-acquisition/

If you are not a member, join CTR for $299/year.

The Missing Component of L&D Management

By Dave Vance

The L&D field is more exciting than ever, particularly with the advent of smart systems, mobile learning, big data, and greater access to content than ever before. Even predictive analytics is beginning to move beyond workforce planning, retention and recruiting to some applications in the learning arena. What is missing, however, is an improvement in how we manage our programs and departments. Our management practices have not kept pace and often have not improved at all over the last twenty years. Sadly, most masters and Ph.D. programs in organizational or human capital development do not include even a single course in management, and there is often little opportunity to learn good management practices on the job.

Now, I know what some of you are thinking. You would tell me that you do manage. You hire people into the department, set goals for them, and have performance reviews. You create an annual budget. You implement new systems, you roll out new courses and update existing content, and you might even track where staff spend their time. You address issues when they arise and you may have a long-term strategy. All of these activities are important and, yes, are a part of the overall management of the department. But there is a component of management that is missing in most organizations, and it is a critical component – one that has the potential to drive significant value creation. In fact, in most organizations it would drive much greater improvement than adopting a new LMS, implementing a mobile strategy, or doubling the amount of e-learning available.

What is this magical, powerful management component? It is the disciplined process of creating a plan and then executing it. Sounds simple and obvious, right? It is not, which is why very few L&D departments manage this way today. Let’s put this concept in concrete terms. It means the department head and senior department leaders need to decide what they want to do for the coming year and then create SMART goals for each item. Do you want to improve participant satisfaction or the application rate? Great, set a specific, measurable goal like a 5 percentage point increase in the application rate. Do you want to shift your mix of learning from instructor-led to online? You need to set a specific, measurable goal like increasing the percentage of online learning from 25% to 45%. Of course, you won’t accomplish these goals simply because you wrote them down on paper. You need to create action plans for each one, detailing all that will be necessary to achieve the desired improvement, including any additional staff or budget. You will need to assign someone responsibility for it and everyone will need a clear understanding of their roles and responsibilities.

And that is just the start. Once the plan is complete containing all these SMART goals and the year is underway, you will need to actively manage these initiatives to ensure their success. You will need to use reports every month to see if you are on plan or not. If you are not on plan, or if the forecast shows you falling short of plan by year-end, then you need to take management action to get back on plan. So, the department head should have a dedicated (meaning the only agenda item is management of the department) one or two-hour meeting each month with her direct reports to review progress against goals and decide on actions to get back on plan. (At Caterpillar, this meeting was the second Tuesday of every month, and we always used the full two hours.)
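To make the monthly review concrete, here is a minimal sketch (all numbers are hypothetical) of the plan-versus-forecast check such a report supports, using the online-learning mix goal above and a simple run-rate forecast:

```python
# Hypothetical plan-vs-forecast check for one SMART goal:
# grow online learning from 25% to 45% of the mix by year-end.
plan_year_end = 45.0                              # planned year-end value (%)
start_value = 25.0                                # value at the start of the year (%)
actuals = [26.0, 27.5, 29.0, 30.0, 31.5, 32.0]    # monthly results, Jan-Jun (%)

# Simple run-rate forecast: extend the average monthly gain over the remaining months.
months_elapsed = len(actuals)
monthly_gain = (actuals[-1] - start_value) / months_elapsed
forecast = actuals[-1] + monthly_gain * (12 - months_elapsed)

on_plan = forecast >= plan_year_end
print(f"Year-end forecast: {forecast:.1f}% (plan {plan_year_end:.0f}%) -> "
      f"{'on plan' if on_plan else 'take action to get back on plan'}")
```

With these made-up numbers the run rate projects a year-end value of 39%, short of the 45% plan, which is precisely the kind of early signal the monthly management meeting exists to act on.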

This, then, is the essence of good management. You must spend the time upfront to craft a good plan with specific, measurable goals, and then you must execute that plan with discipline. Very few L&D departments do this today. Nearly all can tell you how many courses were delivered last year to how many participants, and they can tell you what participant satisfaction was with those courses. But very few created a business plan before the year started with SMART goals, and fewer yet produced the reports necessary to manage throughout the year. Almost none have a monthly meeting dedicated to using these reports to manage the key programs and initiatives of the department.

So, conceptually, this all sounds pretty straightforward, and colleagues in other departments would assume L&D already operates this way. After all, many of them do and have done so for a long time. (Can you imagine a sales department with no SMART goals and no monthly meetings to compare progress to those goals?) We need to adopt the same disciplined approach to management. It will result in our achieving significantly more than the alternative management approach, which is basically that everyone will work hard and we will achieve whatever we achieve. Given the current state of management in L&D, I am convinced that the discipline of creating a good plan and then executing it will deliver far greater value in most organizations than any other single change, program, or initiative.

Corporate Learning Week Conference, Dave Vance to Speak: CTR Exclusive Discount

Corporate Learning Week
November 7-10, 2016
Renaissance Downtown, Dallas, Texas
www.clnweek.com

Join us on LinkedIn: https://www.linkedin.com/groups/7052790

Theme: Disrupting Ourselves: L&D Innovation at the Speed of Change

Corporate Learning Week 2016 will focus on powerful new tools and capabilities to help you transform your learning organization to align with your organization’s core needs and core values, and to strike the balance between best practices and next practices that will engage today’s dynamic workforce.

New for 2016, we have confirmed two exciting site tours:

– Brinker International (Chili’s & Maggiano’s)

– American Airlines Training Center

Corporate Learning Week 2016 is designed for L&D leaders who are:

– Taking charge of the end-to-end talent operations

– Innovating their learning ecosystems to keep pace with the speed of change

– Rethinking learning measurement across the board

– Looking for ways to accelerate leadership development

Have some fun while you’re away from the office and build stronger relationships within your team by taking part in our featured team building activities:

– Keynote Meet n Greet

– Team Scavenger Hunt

– Dallas Foodie Crawl

– Dedicated Team Bonding

Featured Speakers:

Diane August, Chief Learning Architect, Nationwide
Brad Samargya, Chief Learning Officer, Ericsson
Lisa Cannata, Chief Learning Officer, Orlando Health
Jon Kaplan, CLO & VP of L&D, Discover Financial
Karen Hebert-Maccaro, Chief Learning Officer, AthenaHealth
Katica Roy, Global VP, Learning Analytics, SAP
Tom Spahr, Vice President of L&D, The Home Depot
Marc Ramos, Head of Education for Google Fiber, Google
Matt Smith, Vice President, Talent & Learning, Fairmont Hotels & Resorts
Mary Andrade, Head of Learning Design & Delivery Center of Excellence, McKinsey
Corey Rewis, SVP of Learning & Leadership Development, Bank of America
Nathan Knight, Director of L&D, Sony
Casper Moerck, Head of Learning Technology – Americas, Siemens
Kate Day, VP Workforce Transformation, MetLife
Mong Sai, Director, Learning and Organizational Development, Newegg
Sarah Mineau, AVP Learning & Development, State Farm Insurance
Sheryl Black, Head of Training Operations, American Airlines
Stacy Henry, VP L&D Americas, Bridgestone
Namrata Yadav, Senior Vice President, Head of Diversity & Inclusion Learning, Bank of America
James Woolsey, President, DAU
Sanford Gold, Sr. Director, Global Leadership & Development, ADP
Donna Simonetta, Senior Director, Commercial Services Learning, Hilton Hotels
Melissa Carlson, Director, Leadership Development, Quintiles
Paul Rumsey, Chief Learning Officer, Parkland Health & Hospital System
Priscilla Gill, SPHR, PCC, Leadership and Organization Development, Mayo Clinic
Shveta Miglani, Learning and Development Director, SanDisk
Chenier Mershon, Vice President Operations and Training, La Quinta Inns & Suites
Al Cornish, Chief Learning Officer – Norton University, Norton Healthcare
Angel Green, Director of Talent & Learning, Coca-Cola Beverages Florida
Jeremy Hunter, Creator of The Executive Mind @ Drucker Graduate School of Management
Josh Davis, Director of Research, NeuroLeadership Institute
James Mitchell, CLO & Head of Global Talent Development, Rackspace
Dom Perry, VP of L&D, Brinker (Chili’s & Maggiano’s)
Leah Minthorn, Director of Operations Learning – North America, Iron Mountain
Emmanuel Dalavai, Global Program Manager, Training & Leadership Development, Aviall, A Boeing Company
David Vance, President, Center for Talent Reporting
Sarah Otley, Next Generation Learning Lab Director, Capgemini University, Capgemini

Discounts 

CTR EXCLUSIVE: Save 20% with code 2016CLW_CTR. Register now!

It’s Time to Solve “The Last Mile Problem” in L&D

By Peggy Parskey

The telecommunications industry has had to confront a challenge referred to as “The Last Mile Problem.” This phrase describes the disproportionate effort and cost to connect the broader infrastructure and communications network to the customer who resides in the ‘last mile.’

Learning Measurement also has a “Last Mile” problem. We have a wealth of data, automated reporting and data analysts who attempt to make sense of the data. Yet, as I observe and work with L&D organizations, I continually witness “Last Mile Problems” where the data doesn’t create new insights, enable wise decisions or drive meaningful action. If we want our investments in measurement to pay off, we need first to recognize that we have a problem and then second, to fix it.

Do you have a “Last Mile” problem?  Here are six indicators:

  1. One-size-fits-all reporting
  2. Mismatch between reporting and decision-making cadence
  3. Lack of context for assessing performance
  4. Poor data visualizations
  5. Little attention to the variability of the data
  6. Insufficient insight that answers the “so what?” question

Let’s explore each indicator with a brief discussion of the problem and the consequences for learning measurement.

1.    One-size-fits-all reporting

While L&D organizations generate a plethora of reports, not enough of them tailor their reports to the needs of their various audiences. Each audience, from senior leaders to program managers to instructors, has a unique perspective and requires different data to inform decisions.

In an age of personally targeted ads on Facebook (creepy as they may be), we each want a customized, “made for me” experience. A single report that attempts to address the needs of all users will prove frustrating to everyone and meet the needs of no one. Ultimately, the one-size-fits-all approach will lead users to request ad hoc reports, create their own shadow process or simply resort to gut feel since the data provided wasn’t very useful.

2.    Mismatch between Reporting and Decision-Making Cadence

Every organization has a cadence for making decisions, and the cadence will vary based on the stakeholder and the decisions he/she will make. Instructors and course owners need data as soon as the course has concluded. Course owners will also need monthly data to compare performance across the courses in their portfolio. The CLO will need monthly data but may also want a report when a particular measure is above or below a specific threshold.

In a world of high velocity decision making, decision makers need the right information at the right time for them. When the reporting cycle is mismatched with the timing of decision-making, the reports become ineffective as decision-making tools. The result? See #1: shadow processes or ad hoc reporting to address decision needs.

3.    Lack of context to assess performance

Many L&D reports present data without any context for what ‘good’ looks like.  The reports display the data, perhaps comparing across business units or showing historical performance. However, too many reports do not include a goal, a performance threshold or a benchmark.

Leaders don’t manage to trends, nor do they manage to comparative data. Without specific performance goals or thresholds, L&D leaders lose the ability to motivate performance on the front end and have no basis for corrective action on the back end. These reports may be interesting, but ultimately produce little action.

4.    Poor data visualization

Data visualization is a hot topic and there is no shortage of books, blogs and training courses available. Despite the abundance of information, L&D’s charts and graphs appear not to have received the memo on the importance of following data display best practices.

Look at the reports you receive (or create). Do your visuals include unnecessary flourishes (also known as chart junk)? Do they contain 3-D charts? Do they present pie charts with more than five slices? Do your annotations simply describe the data on the chart or graph? If you answered “yes” to any of these questions, then you are making the last mile rockier and more challenging for your clients than it needs to be.

No one wants to work hard to figure out what your chart means. They don’t want to struggle to discern if that data point on the 3D chart is hovering near 3.2 (which is how it looks) or if the value is really 3.0.  They won’t pull out a magnifying glass to see which slice of the pie represents their Business Unit.

Poor visualizations fail to illuminate the story.  You have made it more difficult for your users to gain new insights or make decisions.  Over time, your readers shut down and opt to get their information elsewhere.

5.    Little attention to the variability of the data

As the L&D community increasingly embraces measurement and evaluation, it has also ratcheted up its reporting. Most L&D organizations publish reports with activity data, effectiveness scores, test results, cost data and often business results.

What we rarely consider, however, is the underlying variability of the data.  Without quantifying the variance, we may over-react to changes that are noise or under-react to changes that look like noise but in fact are signals.  We are doing our users a disservice by not revealing the normal variability of the data that helps to guide their decisions.

6.    No answers to the “So-what” question

This last mile problem is perhaps the most frustrating. Assume you have addressed the other five issues. You have role-relevant reports. The reports are synchronized with decision cycles. You have included goals and exception thresholds based on data variability. Your visualizations are top notch.

Unfortunately, we all too often present data “as is” without any insights, context, or meaningful recommendations. In an attempt to add something that looks intelligent, we may add text that describes the graph or chart (e.g. “This chart shows that our learning performance declined in Q2.”). Perhaps we think our audience knows more than we do about what is driving the result. Often, we are rushed to publish the report and don’t have time to investigate underlying causes. We rationalize that it’s better to get the data out there as is than publish it late or not at all.

Amanda Cox, the New York Times graphics editor, once said: “Nothing really important is headlined, ‘Here is some data. Hope you find something interesting.’”

If we are going to shorten the last mile, then we have an obligation to highlight the insights for our users. Otherwise, we have left them standing on a lonely road hoping to hitch a ride home.

Recommendations to solve the “Last Mile” problems

There is a lot you can do to address the “Last Mile Problem” in L&D. Here are six suggestions that can help:

  • Create a reporting strategy. Consider your audiences, the decisions they need to make and how to present the data to speed time to insight. Include in your reporting strategy the frequency of reports and data to match the decision-making cadence of each role.
  • Identify a performance threshold for every measure on your report or dashboard that you intend to actively manage.  In the beginning, use historical data to set a goal or use benchmarks if they are available. Over time, set stretch goals to incent your team to improve its performance.
  • Learn about data visualization best practices and apply them.  Start with Stephen Few’s books. They are fun to read and even if you only pick up a few tips, you will see substantive improvements in your graphs and charts.
  • Periodically review the variability of your data to see if its behavior has changed. Use control charts (they are easy to create in Excel) for highly variable data to discern when your results are above or below your control limits.
  • Add meaningful annotations to your graphs and charts. Do your homework before you present your data. If a result looks odd, follow up and try to uncover the cause of the anomaly.
  • For senior leaders, in particular, discuss the data in a meeting (virtual or face-to-face). Use the meeting to explore and understand the findings and agree on actions and accountability for follow up. Employ these meetings to create a deeper understanding of how you can continually improve the quality of your reporting.
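
To make the control-chart suggestion above concrete, here is a minimal sketch in Python. The monthly level-3 application rates are hypothetical, and the limits are the common mean ± 3 standard deviations computed from a stable baseline period; a production XmR chart would typically derive limits from moving ranges, but the idea is the same: flag points that are signal, not noise.

```python
# Minimal individuals control chart, per the recommendation above.
# Hypothetical monthly level-3 application rates (%); limits are mean +/- 3
# standard deviations computed from a stable baseline period.
from statistics import mean, stdev

def control_limits(baseline, sigmas=3.0):
    """Lower and upper control limits from a baseline sample."""
    m, s = mean(baseline), stdev(baseline)
    return m - sigmas * s, m + sigmas * s

def signals(values, baseline, sigmas=3.0):
    """Indices of points outside the limits: likely signal, not noise."""
    low, high = control_limits(baseline, sigmas)
    return [i for i, v in enumerate(values) if v < low or v > high]

rates = [52, 55, 53, 54, 51, 56, 53, 52, 55, 38]
print(signals(rates, baseline=rates[:9]))  # flags only the final month's sharp dip
```

Month-to-month wiggles of a point or two stay inside the limits and deserve no reaction; the sharp final dip falls outside them and warrants investigation.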

If you start taking these recommendations to heart, your last mile could well shrink to a short, manageable distance. Have you experienced a “Last Mile” problem? I’d love to hear from you.

Follow me on Twitter @peggyparskey or subscribe to my blog at  parskeyconsulting.blogspot.com/

Are You Spending At Least 20% of Your Time Reinforcing Learning?

By Dave Vance

Training may not be rocket science, but it is a lot more complicated than it first appears. This is especially true when it comes to reinforcing the learning so that it will be applied and have the planned impact. After all, if the learning is not actually applied, there will be no impact and it will be “scrap learning”, regardless of how well it is designed or brilliantly delivered. So, your learning can be well received (high level 1 participant reaction) and the target audience can demonstrate mastery of the content (high level 2), but if it is not applied (level 3) there will be no impact (level 4) and the ROI will be negative (level 5). Unfortunately, all too often we don’t even measure level 3 to find out if the learning was applied. We deliver the training, hope for the best, and move on to the next need. We need to do better.

Our greatest opportunity here is to help the sponsors (who requested the learning for their employees) understand their very important role in managing and reinforcing the learning. Learning professionals can provide advice but cannot do this for them. Sponsors need to understand their role in communicating the need and benefits of the training to their employees before the course is delivered. (This will require clarity upfront on exactly what new behaviors or results the sponsor expects to see.) Ideally, the employee’s supervisor will also communicate expectations immediately preceding the course. The sponsor needs to ensure that the intended target audience takes the course and follow up with those who did not. Once employees have taken the course, the sponsor and direct supervisor need to reinforce the training to make sure the new knowledge, skills or behaviors are being used. They need to again clearly state what they expect as a result of the training and let the employees know what they are looking for. The employees (individually or as a group) should meet with their supervisor to discuss the training and confirm application plans. The sponsor will need to be ready with both positive and negative consequences to elicit the desired results. All of this constitutes a robust, proactive reinforcement plan which should ideally be in place before the training is provided.

As you can tell, this takes time and effort. How much effort? My belief is that it will take at least 20% of the time you dedicate to planning, developing and delivering to do this well. So, if it takes five weeks to plan, develop and deliver a course, are you planning to spend at least 40 hours (20% of those five 40-hour weeks) on reinforcement? Many sponsors and supervisors have no clue about the importance of reinforcement and will not do it unless you can make them understand the importance. Many think that once they have engaged L&D, their job is done and training will take care of everything else. However, this cannot possibly be the case. The target audience employees don’t work for L&D, and L&D professionals cannot make employees in sales, quality, or manufacturing do anything. L&D certainly cannot compel these employees to apply their learning – only the leaders who asked for the training can do this. So, you need to convince the sponsor that they have a very important role in the training. L&D can do the needs analysis, design and develop the training (with help from the SMEs), and deliver it, but only the sponsor and leaders in the sponsor’s organization can make their employees apply it. Impact and results depend on both L&D and on the sponsor – neither one can do it alone. On this we must stand firm as learning professionals. In fact, if a sponsor tells you that they don’t have time for these reinforcement tasks, you need to respectfully decline to provide the learning because it is likely to be a complete waste of time and money. And don’t let them shift the burden of reinforcement to you. It is your responsibility to assist them, but it is their responsibility to do it.

With this understanding, are you dedicating enough time to reinforcement? At Caterpillar we found that we had to redirect resources from design and delivery to reinforcement. In other words, you will likely have to make a trade-off. We decided it was better to deliver less learning but do it well (meaning reinforced so it was applied) rather than deliver more learning which would be scrap. What is your strategy?

Management: The Missing Link in L&D Today

By Dave Vance

Despite great progress in so many areas of L&D, there is one area which has not seen much progress. This is the business-like management of the L&D department and L&D programs. Yes, department heads work to implement new LMS’s on time, and program managers and directors work to roll out new programs on time but there is still an opportunity to dramatically improve our management. Let’s look at programs first and then the department as a whole.

At the program level a really good manager would work with the goal owner or sponsor to reach upfront agreement on measures of success for the program, like the planned impact on the goal. A really good manager would also work with the goal owner and stakeholders to identify plans or targets for all the key efficiency and effectiveness measures that must be achieved to have the desired impact on the goal. Examples of efficiency measures include number of participants to be put through the program, completion dates for the development or purchase of the content, completion dates for the delivery, and costs. Examples of effectiveness measures include levels 1 (participant reaction), 2 (knowledge check, if appropriate), and 3 (application of learned behaviors or knowledge). A really good program manager would also work with the goal owner upfront to identify and reach agreement on roles and responsibilities for both parties, including a robust reinforcement plan to ensure the goal owner’s employees actually apply what they have learned. Today, many program managers do set plans for the number of participants and completion dates. Few, however, set plans for any effectiveness measures and few work with the goal owner to reach agreement on roles and responsibilities, including a good reinforcement plan. Virtually none use monthly reports which show the plan and year-to-date results for each measure and thus are not actively managing their program for success in the same way as their colleagues in sales or manufacturing.

At the department level, a really good department head or Chief Learning Officer would work with the senior leaders in the department to agree on a handful of key efficiency and effectiveness measures to improve for the coming year. Then the team would agree on a plan or target for each as well as an implementation plan for each, including the name of the person responsible (or at least the lead person) and the key action items. Examples of efficiency measures to manage at the department level include number of employees reached by L&D, percentage of courses completed and delivered on time, mix of learning modalities (like reducing the percentage of instructor-led in favor of online, virtual, and performance support), utilization rate (of e-learning suites, instructors or classrooms), and cost. Examples of effectiveness measures to be managed at the department level include level 1 participant and sponsor reaction, level 2 tests, and level 3 application rates. Both the efficiency and effectiveness measures would be managed at an aggregated level, with efficiency measures summed across the enterprise and effectiveness measures averaged across the enterprise. A really good CLO would use monthly reports to compare year-to-date progress with plan to see where the department is on plan and where it is falling short of plan. A really good department head would use these reports in regularly scheduled monthly meetings to actively manage the department to ensure successful accomplishment of the plans set by the senior department leadership team. Today, very few department heads manage in this disciplined fashion with plans for their key measures, monthly reports which compare progress against plan, and monthly meetings dedicated to just the management of the department where the reports are used to identify where management action must be taken to get back on plan.

In conclusion, there is a great opportunity to improve our management which in turn would enable us to deliver even greater results. This requires getting upfront agreement on the key measures, on the plan for each one, and on the action items required to achieve each plan. For the outcome measures it also requires reaching agreement with the goal owner on mutual roles and responsibilities. Once the year is underway, good management also requires using reports in a regular meeting to identify problem areas and take corrective actions. Our colleagues in other departments have been doing this for a long time and with good reason. Let’s get on board and manage L&D like we mean it.

Big Data, Analytics, and L&D

By Dave Vance

Big data and analytics continue to generate a lot of excitement in the human capital field. It seems like a new conference is announced almost monthly on the topic. So, what does it all mean for L&D?

Short answer: Not much yet.

Don’t get me wrong. I like data and I like analytics, and the new applications in HR are very exciting, but the buzz is primarily around using statistical tools like regression and correlation to discover relationships among measures (variables) outside of the learning field. Some of the most often cited analytical work is the determination of the factors that explain retention, which allows organizations to take proactive steps to retain desirable employees. There is also work to better explain the factors behind employee engagement, although even this is often in service of increasing retention. I have not seen any conference descriptions or magazine articles where big data and analytics have been applied to learning.

There are certainly applications of analytics to L&D but most of these have been around for a while and are not receiving much current attention. Let’s begin, though, with some definitions. “Big data” generally refers to the ability to capture more measures at an individual employee level than ever before and, once captured, store them in a database (or databases) that is easy to manipulate for analysis. Think of data on employee engagement scores, ratings, individual development plans, past and current assignments, projects, supervisors, referrals, past employment, participation in company clubs and activities, use of benefits, number of sick days, number of vacation days taken, etc., by name for each employee. The “analytics” part is the application of statistical techniques to tease out relationships between these measures which may have been expected or, in some cases, totally unexpected (no one was looking for them). You can see the potential of putting big data together with analytics. For example, you might discover that a leading indicator of regrettable turnover is high use of sick days or lack of involvement in company activities or lack of opportunities to participate in projects. With this type of knowledge you could identify those at risk and intervene early to keep them from leaving.
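
As a toy illustration of this idea, the Python sketch below correlates one candidate leading indicator, sick-day usage, with regrettable turnover. The data are entirely fabricated, and real HR analytics would use far richer data and models such as logistic regression; but the mechanics are the same: look for relationships among employee-level measures.

```python
# Fabricated toy data illustrating the big-data-plus-analytics idea:
# does a candidate leading indicator (sick days) relate to regrettable turnover?
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# (sick_days, left_company) pairs -- invented purely for illustration
records = [(2, 0), (3, 0), (1, 0), (9, 1), (8, 1), (2, 0), (10, 1), (4, 0)]
sick = [r[0] for r in records]
left = [r[1] for r in records]
print(round(pearson(sick, left), 2))
```

A strong positive coefficient in data like this would suggest sick-day usage as a leading indicator worth investigating, which is exactly the kind of relationship the analytics work described above tries to surface.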

How might this be applied to L&D? You can imagine using big data to show that application and impact are lower for employees with poor supervisors who don’t reinforce the learning. But we already know this. You can imagine using big data to show the importance of identifying the right target audience, but we already know this, too. I am sure there are opportunities to discover some new relationships, but I believe L&D is more mature than other HR disciplines and thus we already have discovered many important relationships.

While big data is not being used much in L&D today, we have been using “little data” quite effectively for some time. We routinely measure number of participants, courses, hours, costs, completion dates, participant reaction and amount learned. We measure application rates but not nearly as often as we should. Typically we do not put by-name employee measures in a database which also has information about the supervisor, employee engagement scores and other potentially interesting measures, so we end up with “little data” instead of “big data”.

And we have been using analytics for some time. We have been using the intended application rate as a leading indicator of actual application for years. We also use the other “smart” questions from the post event survey as leading indicators for application and impact. Patti and Jack Phillips have helped many organizations use regression analysis to isolate the impact of learning from other factors. This effort has really matured lately with the advent of organizations like Vestrics which use general equilibrium modeling (sets of regression equations estimated simultaneously) to isolate the impact of ALL the factors with a high degree of statistical certainty. So, analytics, including high-level analytics, does have a place in L&D.

In summary, in L&D we have not made much use of big data but we have been using little data for a long time. We have not made extensive use of analytics but we have used leading indicators for years and some are using regression to isolate the impact of learning. In part, the lack of application to L&D currently reflects our relative maturity. We have a better understanding of the basic relationships among measures than our colleagues in talent acquisition or total rewards because we have been measuring longer, and thus there is not as compelling a case to be made for trying to discover new relationships. That said, there are opportunities going forward, especially in terms of isolating the impact of learning and demonstrating to nonbelievers the power of strong sponsorship and reinforcement.

Bringing Standards to the L&D Profession: Three Types of Standard Measures

By Dave Vance

Accounting has four types of measures (revenue, expense, assets and liabilities) and three standard statements (income or profit & loss, balance sheet, and cash flow). What do we have in L&D which is similar? Five years ago a group of industry leaders came together to consider a standard framework for L&D. The group included leaders like Kent Barnett, then at Knowledge Advisors; Tamar Elkeles, then at Qualcomm; Laurie Bassi from McBassi and Company; Jack Phillips from the ROI Institute; Jac Fitz-enz from Human Capital Source; Josh Bersin from Bersin & Associates; Cedric Coco from Lowe’s; Karen Kocher from Cigna; Rob Brinkerhoff from Western Michigan University; Kevin Oakes from i4cp; Carrie Beckstrom from ADP; Lou Tedrick from Verizon; and a host of others – 30 in all, including myself. After 24 revisions, we all agreed on a framework which we believed would begin to provide the type of standards that accountants have enjoyed for some time and which would help us become more professional.

These leaders recommended a framework consisting of three types of standard measures and three standard reports. So three and three versus the accountant’s four and three. The three types or categories of standard measures are effectiveness, efficiency, and outcomes. The three types of reports are Operations, Program, and Summary. The goal was to keep the framework simple and easy to use. We also wanted to build on the excellent work done in our profession over the last 70 years, especially that done by the Association for Talent Development (ATD, then called ASTD).

Let’s look at the measures first. Effectiveness measures are those that address the quality of the learning program or initiative. In learning we are fortunate to have the four levels popularized by Don Kirkpatrick and the fifth level (ROI) popularized by Jack Phillips. These five levels all speak to the quality of the learning. Level 1 measures the participant’s or sponsor’s satisfaction with or reaction to the learning – certainly an initial measure of quality. Level 2 measures the amount learned or transference of skills or knowledge, and level 3 measures the application of that knowledge or change in behavior. If they didn’t learn anything or if they don’t apply what they did learn, I think we can all agree that we don’t have a quality program. (This may reflect a lack of engagement or reinforcement by the sponsor, but we still have a problem.) Level 4 measures impact or results and, since this was the reason for undertaking the learning to begin with, if there are no results it is hard to argue we had a quality program. Last, ROI provides a return on our investment, a final confirmation of quality assuming we have properly aligned the learning to our organization’s needs and achieved high quality on the previous four levels. Most organizations have level 1 and 2 measures but relatively few measure at the higher levels.

Efficiency measures are about the number of courses, participants, and classes as well as about utilization rates, completion rates, costs, and reach – to name only the most common. Typically these measures by themselves do not tell us whether our programs are efficient or not. Rather, we need to compare them to something else, which may be last year’s numbers, the historical trend, benchmark data, or the plan (target) we put in place at the beginning of the program. Now we have a basis to decide if we are efficient and if there is room to improve, to become more efficient. All organizations have efficiency measures, with the most common being number of participants, classes, and hours as well as some cost measures like cost per learner or cost per hour.

That leaves outcome measures. Unlike effectiveness and efficiency measures, most organizations are not measuring outcomes and few even talk about them. That is unfortunate because outcome measures are the most important of the three types, especially to senior leaders who make the funding decisions for L&D. In accounting this would be like reporting expenses and liabilities but never talking about revenue or assets. No one would have a complete picture of what we do, and it would be hard for anyone to understand why we have the expenses and why we incur the liabilities. So, what are these all-important outcome measures? Simply put, outcomes represent the impact or results that learning has on your organization’s goals or needs. Suppose a needs analysis indicated that your salesforce would benefit from a consultative selling skills program and product features training. And suppose that you and the head of sales agree that it makes sense, and the two of you further agree on program specifics, including mutual roles and responsibilities, especially how the sponsor will reinforce the learning and hold her own employees accountable. Now, what impact on sales can we expect from this training? How much of a difference can training make? A lot? A little? Enough to be worthwhile? This is the realm of outcome measures, which are sometimes (but not always) subjective but very important nonetheless. Sometimes the level 4 impact or results measure from the list of effectiveness measures will do double duty as an outcome measure. That is okay, and the same happens sometimes in accounting. Or other measures will be selected. In any case, with outcome measures we are at last focused on how we align with corporate goals or needs and what type of impact we can have, and this is what your senior leaders have been waiting for.

Next month we will look at the three reports and how the three types of measures populate these three reports. In the meantime, as a profession, let’s start talking about these three types of measures. It would be a big step forward if we could just adopt a common language, which, by the way, is a precondition to being a true profession.

 


We Don’t Exist to Increase Employee Engagement

By Dave Vance

 

A surprising number of L&D professionals seem to believe that our primary mission, our main purpose, is to increase employee engagement. They are wrong and their mistaken belief will lead them to misallocate resources, choose inappropriate learning, and deliver suboptimal outcomes.

It is always good to start at the beginning, and the beginning for any department is to know why it exists. What is its mission or purpose? Ideally, learning leaders have thought carefully about this and have created a written mission statement or at the very least can articulate the mission informally. In most organizations, the learning department has the capability and potential to do much more than provide or facilitate learning to increase employee engagement. L&D has the potential to help the organization achieve many, if not all, of its goals. This means helping to achieve business goals like increasing revenue by 10%, reducing costs by 5%, improving patient satisfaction by 3 points, or reducing injuries by 20%. This also means helping to achieve HR goals like improving the leadership score on the annual survey by 5 points, meeting all compliance-related goals, and yes, increasing employee engagement. It means providing the basic skills needed by employees new to a position and the more advanced skills needed by experienced employees, especially in knowledge-based companies like those in consulting and accounting. Last, learning can help address many needs and challenges that fall below these high-level goals but which nonetheless must be addressed for the organization’s overall success.

So, learning has a very broad reach and can help an organization achieve many of its goals and address numerous challenges. A good mission statement should reflect this broad reach. For example, “Help our organization achieve its goals” or “Help our organization be successful”. If L&D’s mission is simply to increase employee engagement, then whose mission is it in your organization to help achieve all the other goals and meet the numerous needs and challenges that can be addressed through learning? It is true that your sales department could do its own learning, quality its own, manufacturing its own, customer care its own, and so forth. Basically, any department that needs learning does its own. That leaves L&D to address just HR goals, and maybe just employee engagement if a separate department takes care of leadership development. This is a very sad state for L&D and for your organization as a whole. Think about what this state implies. Most often, other departments do not have learning professionals, so the quality of the needs analysis and the learning is low. Someone is assigned to take care of the training needs on a part-time basis, and they may not be happy about it. And training people in these departments are all isolated from one another with no opportunity to pool resources, share knowledge, and specialize.

For most organizations, the learning department can have a much more powerful impact if it has a broad mission which includes helping the organization achieve its business goals and if it is organized to support more than a single department. Let’s not limit ourselves to simply addressing HR goals like increasing employee engagement. Instead, let’s address business and HR goals as well as the basic and advanced skills our employees need for success. What is your mission?


A New Year’s Resolution to Better Manage Learning

By Dave Vance

In the spirit of the New Year, here are my suggestions for ten steps to better manage learning to have greater impact on your company and to run your department more effectively and efficiently.

  1. Be clear about your mission. Why was the learning function created and what is it expected to do? Is your primary purpose to help the business achieve its goals, increase employee engagement, improve retention, or what? The answer is hugely important.
  2. If your mission is to help the company achieve its business goals, know what those goals are each year. Meet with the owners of those goals and reach agreement on whether learning has a role to play.
  3. If your mission is to increase employee engagement or retention, then meet with the head of HR and other senior leaders to plan how learning can contribute.
  4. In either case, set specific, measurable outcome-related goals for your learning initiatives. How much of an impact can learning have on achieving the business or HR goals? A lot? A little? You and the owner of each goal need to be crystal clear on this.
  5. Once you have agreement on the planned impact of learning, you and the goal owner need to reach agreement on your respective roles and responsibilities. What, exactly, must each of you do for learning to have the agreed-upon planned impact? The planned impact will only happen if both of you work together to make it happen.
  6. Set specific, measurable goals for the L&D department for the coming year. These generally will involve targeted improvements in effectiveness and efficiency measures across all the learning you offer. Examples of effectiveness measures would be improvement in participant and sponsor satisfaction, first-time pass rates, and application rates. Examples of efficiency measures would be number of participants, percentage of courses or hours that are web-based or blended, percentage of employees reached by learning, and completion dates.
  7. Create a convincing business case for the coming year incorporating the planned impacts and actions and the costs to achieve them. Include this in a written business plan for the year which would include an executive summary, last year’s accomplishments, a discussion of current and proposed staffing and spending, the business case, work plans, and a measurement & evaluation strategy. Have the plan approved by your CEO and governing board.
  8. Create reports containing plan and year-to-date results to use each month to manage the department to deliver the planned results. Update monthly. You should have a report for each major goal supported (program report) and a report for the key effectiveness and efficiency measures (operations report). You should also create a one- to two-page, high-level report with no learning jargon to share with the CEO, governing body, senior leaders, and employees to highlight your alignment to and impact on key company goals and your efforts to improve effectiveness and efficiency (summary report).
  9. Use these reports in regularly scheduled monthly meetings to review progress against plan and, when results are not on plan (which will be often in the real world), to discuss and decide on appropriate management actions to get back on plan.
  10. Look for ways to celebrate your success and to continuously improve. It is a journey and you will never be perfect.

Note that the ten steps above did not include a request for more staff or budget. You can begin implementing these steps with your current resources if you are willing to make some trade-offs. For example, you may need to re-assign some staff or reduce the number of projects. At the end of the day, you may have to decide if you want to do better or if you want to do more. My suggestion is to consider doing better rather than doing more, which in the longer run is actually the best way to also do more. Show your company’s senior leaders you are an outstanding manager of the resources they have given you and they are likely to give you more. So, manage learning with business-like discipline by knowing your mission, aligning your learning priorities to that mission, setting specific & measurable goals, and using monthly reports to ensure that planned results are delivered.

Best wishes to all in this great profession in the New Year!

 

 


The Move to Learner-Centric or Self-Directed Learning: Don’t Be Fooled by the Current Hype!

By Dave Vance

In just the last month I have seen more and more written about the wonderful new world of learner-centric or self-directed learning, and it is really beginning to worry me. Article after article talks about the need for companies to stop trying to direct employee learning and instead shift to a role of curator and enabler to make it easier for employees to find whatever learning interests them. Here is what concerns me.

First, the history is wrong. Each article begins by stating that in the “old days” corporate L&D departments directed all the learning for all the employees. This is patently false. At Caterpillar we certainly never dreamed of directing, prescribing or controlling ALL of the learning for ALL of the employees. And I have never met a CLO or L&D department head who believed it was their mission to control all the learning in their company. So, what are these authors talking about? Yes, working with senior company leaders we did develop learning to address particular needs and help accomplish company goals. Yes, we did have orientation programs where we provided helpful learning to new employees who otherwise would not have a clue about what they needed to know. Yes, we had compliance-related learning. And, yes, we provided a number of general education courses and suites of online learning for employees to access to further their own development.  And we also facilitated the creation of individual development plans using required, suggested and discretionary formal and informal learning. But we never even entertained the notion that we, as a corporate university, would prescribe all the formal learning for all the employees or that we would control all the informal learning. The concept itself is ridiculous. So, let’s begin by getting our history right.

Second, the notion that employees always know best and should be left to choose all their own learning is equally ridiculous. Many employees new to a company or position simply don’t know what they need to know. Why wouldn’t the business want to provide direction? A needs analysis may even indicate that experienced employees lack some important skills, knowledge or capability to perform at the desired level. Why wouldn’t the business want to provide that learning to the appropriate employees? Now, it should go without saying that L&D professionals should of course use all the tools available and appropriate to meet the need. So, the “old days” of reliance solely on instructor-led learning should be behind us, as should the use of only formal learning. Even 10 years ago at Caterpillar we had 35,000 employees actively engaged in 2,000 communities of practice, and we always considered performance support tools. So, by all means, make full use of all the new learning openly available on the internet, both in your business-directed learning and in your employee-directed learning. But don’t stop providing business-directed learning to employees where it makes sense.

Third, the “new role” for L&D is a move in the wrong direction and it is clearly a demotion. Some now suggest that L&D should play a more passive role and simply respond to the desires of employees. Basically, L&D leaders should find out what employees want and give them more of it. This is fine if the mission of L&D is simply to help boost employee engagement. But our mission should be much, much more. Yes, we can help increase employee engagement by providing or facilitating more learning opportunities, but we are also in the best position to help our companies achieve their business goals. This requires strategic partnerships with the businesses resulting in business-centric or employer-directed learning. More than any other department in a company, L&D is in a unique position to help the company achieve most of its goals by becoming a valued, trusted, strategic business partner. Advocates of self-directed learning appear to be going in the opposite direction, focusing only on supporting the goal of engagement or other HR-related goals like retention. HR goals are not the same as business goals like increasing sales, reducing costs, or improving quality. L&D leaders should strive to support BOTH business goals and HR goals – not simply HR goals. So, it would be a huge mistake for L&D to move away from a very active role supporting the company’s business goals to a more passive role supporting only HR goals. I would also suspect that the budget and staffing for an L&D department that moves away from actively supporting business goals will eventually be cut 50%-75% in line with its new nonstrategic, passive role focused only on one HR goal.

Let’s get the discussion going on this topic. Let’s identify where self-directed learning makes sense and where business-directed learning makes sense. Let’s not get carried away by the current hype.


Two Surprises at CLO Symposium

By Dave Vance

 

I was surprised by two things at the Fall CLO Symposium in Austin. First, I was surprised by the L&D mission statements of many speakers. Second, I was surprised by what is apparently an industry-wide and unchallenged swing to a learner-centric approach. The two surprises may be related.

My first surprise occurred as I listened to three speakers in a row start their talk by stating the purpose of their L&D group. I applaud them for starting with a mission because I believe a good, clear mission statement is vital. My surprise came as each one stated their mission in a way that was not directly tied to their business. The first and third said the purpose of their function was to increase employee engagement. Don’t get me wrong, I firmly believe that providing employees with learning opportunities will generally increase their engagement. Furthermore, highly engaged employees are likely to work harder and provide more discretionary effort, both of which should translate into better corporate results. And the more engaged they are, the less likely they are to leave, which avoids the considerable cost of replacing them.

So, engagement is a great thing, but should that really be their primary purpose? In industries with very high turnover, perhaps it will be a top business goal. For most organizations, however, there are a number of business goals which the L&D department should be supporting like increasing revenue and productivity, decreasing operating costs, improving quality and safety, enhancing customer or patient satisfaction, and sparking innovation. Not a single mention of any of these. It seems to me that the primary purpose of L&D should be to help the organization achieve its goals. This should be done by aligning programs to the business goals so you can impact those directly AND by finding ways to increase engagement which will contribute indirectly (but importantly) to the business goals. In other words, I think your CEO would like to hear about more than engagement.

The second speaker said their mission was to “leverage technology”. Period. This reflects an absolute confusion between the means and the end. Leveraging technology is a means to an end – it will never be an end in itself. Why do you want to leverage technology? To accomplish what? The “what” would give us insight into the mission. Of course you should leverage technology, but you need to be clear upfront on why. So, three out of three speakers did not provide a compelling reason for the L&D function to exist. And we wonder why L&D gets cut first?

My second surprise may be related to the first. For the first day and a half, it seemed every speaker I heard on learning (I did miss some, so consider this an imperfect sample) talked about the wonderful move to a learner-centric or self-directed approach. Some companies have already reorganized their departments around this approach. Common elements in support of this approach are trying to find out what learners want and giving them more of it. There was much talk about the need to move away from creating content or teaching classes. L&D’s new role will be that of curator. Some at the conference even told me that company goals and learning needs are changing so rapidly now that there is no role for any business-centric or company-directed learning. Worse, I didn’t hear anybody challenge these assertions. If there was a debate over the last couple of years I missed it, and it seems this is now the accepted wisdom in the field. Pardon me, but I object!

We need to have a good discussion around this topic before our profession does some serious damage to itself. Like so much else, it seems to me there needs to be a middle ground here. It is wonderful that employees want to learn and that there are more ways than ever before for them to do just that. The potential to connect with and learn from others has never been greater, especially when you think about social media and all the online learning available. L&D departments should certainly find ways to help employees find the learning they need and do what they can to encourage informal learning. But that does not mean L&D departments should stop providing business-centric learning which we will define as learning planned to help achieve business goals. Senior leaders and experienced employees have wisdom. Why wouldn’t you want to take advantage of that wisdom and let new employees know what they will need to be successful or what experienced employees need to take their performance to the next level? True, a good employee with enough time will probably figure out what they need to know and may be able to secure that knowledge informally, but why not speed the process up? Plus, I think many employees would appreciate the guidance. So, it seems to me there will always be a role for business-directed learning, not just for compliance but to help employees more quickly be as productive as possible in pursuing the business goals.

I think the learner-centric approach does make perfect sense when the learning is not aligned to business goals but in support of an effort to increase employee engagement. Many employees want opportunities to grow and develop outside the requirements to be successful at their current job. Self-directed learning is perfect in this case since each employee will have unique needs, and L&D departments should help facilitate this process by making as many learning opportunities available as possible.

So, my two surprises may be related after all. If the mission of L&D is simply to increase employee engagement, a learner-centric approach makes perfect sense. However, if the mission of L&D is to help the organization achieve its business goals, then there is most definitely still an important role for business-centric learning. Blind acceptance of the current industry trend away from business-centric learning will put an end to our quest to be valued, strategic business partners. If we continue in this direction we will not deserve a seat at the table and L&D budgets should be dramatically reduced.


Is L&D Fundamentally Different Than Other HR Disciplines?

By Dave Vance

I just attended my first “HR” conference. I normally attend L&D-focused conferences like ATD, CLO Symposium, and Skillsoft Perspectives. This conference was 2.5 days, and while it had an L&D track, the primary focus of the other four tracks and all the main session topics was on other aspects of HR like talent acquisition.

The conference certainly broadened my understanding of the challenges and issues facing our colleagues in other HR departments. It also provided a much better basis for answering the perennial question of whether L&D “belongs” in HR. I used to answer that question somewhat ambiguously by saying that HR was probably the best fit for most (but not all) L&D functions. (Note: At Caterpillar the L&D function was within the Human Services Division.) My first impression after the 2.5 days is that L&D may indeed be fundamentally different and may not be best positioned in HR.

Here is what struck me and I wonder if you have seen the same in any HR conferences you may have attended. I did not hear a single speaker talk about HR as a direct contributor to achieving the annual business goals of the organization, like increasing sales by 10% or reducing operating costs by 5%. They talked about how HR can help achieve long-run goals like developing a more diverse workforce and ensuring that the right workforce will be in place in five years. They talked about the important role of HR in managing the leadership pipeline and improving the performance management process. Don’t misunderstand me, these are all important goals but they will contribute INDIRECTLY to achieving the long-term business goals and even more indirectly to this year’s business goals.

Speakers also talked about big data and talent analytics and how these can help address current business issues like finding the best candidates for open positions or the best place to locate a new office. Again, very important and issues that HR is now better positioned than ever before to address. But there was no mention of current business plan goals like increasing sales by 10% and how HR might directly contribute.

Using the language of TDRp, the measures in support of these HR initiatives were primarily efficiency measures like number of hires, diversity of the workforce, and cost per hire. There were a few effectiveness measures, like bench strength and quality of hire. But there were no outcome measures, which we define as the direct impact of HR initiatives on organization (business) goals. When speakers talked about goals, they were really talking about efficiency or effectiveness goals – not business goals. So, they talked about how HR initiatives would help accomplish HR goals.

Now, contrast this with L&D’s ability to directly impact business or organizational goals. Think about the common business goals like increasing sales, reducing costs, improving quality or productivity, and reducing injuries. In most organizations, L&D can and typically does contribute directly to achieving these goals. Consequently, we have outcome measures for L&D to measure our impact on these goals. (Of course, L&D can often also contribute to HR goals like improving employee engagement and leadership.)

So, perhaps L&D is fundamentally different than other HR disciplines because we can directly impact many, if not all, business goals. Initiatives from other HR functions play an important role in enabling organizational success in the current year and in the future, but often are not direct contributors. More thought is required here.


Want to Have a Greater Impact?

By Dave Vance

Learning and development (L&D) has the potential to significantly contribute to your organization’s results. What steps are you taking to maximize your impact? Here are four for your consideration.

First, know your organization’s goals and be sure you are doing everything possible to support the CEO’s highest-priority goals. This sounds so simple and yet many L&D functions do not do this. To be clear, we are not talking about HR goals, we are talking about your organization’s “real” goals like increasing revenue, reducing operating cost, or improving productivity. Now, if your CEO’s top five goals include improving leadership, increasing retention or boosting employee engagement, then by all means focus on these as well. If you are not currently supporting these “real” goals, what programs could you design that would help achieve them? If you are supporting them, are there any new programs which might do an even better job? As a self-check, what percentage of your total L&D spend is for learning directly aligned to your organization’s top five goals? Most CEOs would want you to do everything you can to help achieve their top goals.

Second, speak the language of business. Senior leaders should not have to learn our language. We need to learn theirs. After all, we exist to support them; they do not exist to support us. So, no talking about competencies and competency models unless they bring it up. No talking about level 1 or level 2. Do not use the words “pedagogy” or “learning modalities”. The language of senior leaders is generally money, results, impact, priorities and trade-offs. (See the 2010 research by Jack Phillips on what CEOs want from L&D, where the two highest-rated requests are impact and ROI.) They are busy people and they want you to get to the point. How can learning help them achieve their goals? What impact will it have? How much will it cost? They don’t need a lot of technical detail. Start with the big picture and talk in business terms. They will ask for more detail if they need it.

Third, given the focus of senior leaders described above, get agreement upfront on the planned impact of your strategic learning initiatives. If you and senior leaders believe learning could help achieve a goal to increase sales by 10%, how much of a difference might learning make? Here we are talking about a plan for the isolated impact of your learning initiatives on sales. This will be just a plan, and there is no guarantee it will be realized, just like there is no guarantee that sales will go up 10% next year because that is the plan. While planning of this sort may be new to you, it is not new to your senior leaders, who have to make plans all the time in a world full of uncertainties. So, for example, could your learning initiative, properly designed, delivered, and reinforced, be expected to deliver a 2% increase in sales? What would it take from you and from the head of sales for learning to have this impact? You want to be on the same page as senior leaders upfront on planned impact and on roles and responsibilities, and then you need to hold yourselves mutually accountable for delivering results.

Fourth, now that you are speaking the language of business and have a plan to contribute to the organization’s top goals, you need to execute it with discipline. This means you need to generate monthly reports showing progress towards plan or goal. The report should show the plan, year-to-date (YTD) results, and a forecast for how the year is likely to end if no special actions are taken. This report will be used by the program manager and department head to manage the learning initiatives to come as close as possible to achieving the agreed-upon impact. Each month the report should be used to ask whether you are on plan, and if not, why not and what can be done to get back on plan. Or, you may be on plan but the forecast shows you falling below plan by year end. If so, then you need to discuss why and what steps can be taken to stay on plan. Bottom line, you want to know where you stand each month, and you want to know as soon as possible if it appears plan may not be achieved so you can take some action to get back on plan.
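The monthly management logic described above can be sketched as a simple check (all figures and measure names are hypothetical): compare year-to-date results against the year-to-date plan, compare the year-end forecast against the full-year plan, and flag anything that needs management action.

```python
# Hypothetical program measures for a monthly report: full-year plan,
# year-to-date (YTD) actual, YTD plan, and a forecast of where the year
# will end if no special actions are taken.
report = [
    # (measure, full-year plan, YTD actual, YTD plan, year-end forecast)
    ("Sales impact of training (%)", 2.0, 0.8, 1.0, 1.7),
    ("Participants trained",         500, 260, 250, 510),
]

def status(plan, ytd_actual, ytd_plan, forecast, tolerance=0.05):
    """Flag a measure when YTD trails its plan or the forecast falls short."""
    behind_ytd = ytd_actual < ytd_plan * (1 - tolerance)
    short_fcst = forecast < plan * (1 - tolerance)
    if behind_ytd or short_fcst:
        return "ACTION NEEDED"  # discuss why and how to get back on plan
    return "on plan"

for name, plan, ytd, ytd_plan, fcst in report:
    print(f"{name}: {status(plan, ytd, ytd_plan, fcst)}")
```

The 5% tolerance here is an arbitrary illustrative threshold; in practice the program manager and goal owner would agree on how much variance from plan warrants a discussion.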

These four steps will help you have the greatest impact possible on the goals that are most important to your CEO. Following them will also make you a much better business partner and increase the likelihood of additional resources.


Steps in Creating a Measurement Strategy

By Dave Vance

 

Many learning professionals have been tasked with creating a measurement strategy. If you are among the chosen, here is some advice.

First, meet with your department head. They are the key to a successful and impactful measurement strategy. After all, the measurement strategy should support their management needs so it really is all about them. Unfortunately, many department heads do not understand that measurement is a means to the end and the end is their good management of the department. Here are some questions to get the discussion going:

  • Why do you want a measurement strategy?
  • How will you use the measures? How frequently do you want to see them?
  • Who else will use these measures?
  • What do you think of our current strategy or the measures we currently collect?
  • What measures do we currently collect?
  • What do our current reports look like? Who receives them? Who actually uses them? What do they do with them?
  • What strategic company goals are we supporting this year or next?
  • What are your key initiatives for the department? What goals do you have to improve effectiveness or efficiency? Do you have any measures in mind for these?

Answers to these questions should provide good insight on the current state of measurement.

Second, create three lists of measures: one for effectiveness measures, one for efficiency measures, and one for outcome measures (if you have any – most do not). Your measurement strategy should include measures in each list, so it is important to have a balance. Add any additional measures gleaned from your discussion with the department head. You may also want to meet with other senior leaders in the department to get their input.

Third, reflect on the department’s initiatives in support of the strategic goals. You will need an outcome measure for each company goal you are supporting. You will also want some accompanying efficiency measures (like number of participants, completion date and cost) and effectiveness measures (participant and sponsor satisfaction, amount learned and application rate). You will want to talk with the program directors to learn more about the initiatives and agree on the appropriate measures. Ultimately, the program directors will need to work closely with the goal owners or sponsors (like the VP of sales) to agree on the final list of measures and a plan or target for each.

Fourth, reflect on the department head’s plans for department-level initiatives. These may include leadership programs or programs to increase employee engagement or improve retention which are important but are not strategic company goals. These also include initiatives aimed solely at improving efficiency (for example, improving cycle time or increasing the ratio of web-based to instructor-led learning) or improving effectiveness (e.g., increasing the application rate of learning for the top ten programs or increasing the enterprise participant satisfaction rating). You will need efficiency and effectiveness measures for all of these and you will need outcome measures for your major initiatives like improving leadership. Add any additional measures from your discussions with other department leaders.

Fifth, meet with the department head and senior L&D leaders to share your preliminary thinking about the measures and how they flow from the key initiatives. Undoubtedly, the department leaders will have some additional thoughts and suggest more measures, but hopefully they appreciate the measurement framework and the alignment of measures to their important initiatives. Again, the measurement strategy should be all about helping the department head and senior L&D leaders better manage their initiatives. It should provide the measures that they need and want to see every single month to know how they are doing.

Last, the department head and senior L&D leaders need to decide which measures they want to actively manage versus passively monitor. Any leader can only actively manage 10-20 measures with their direct reports – so what are your vital few measures? Although you have been tasked with creating a measurement strategy, it is your leaders’ responsibility to decide which of the many measures will be managed. And they need to establish a plan or target for each measure to be managed and then review progress against plan every month, so work is required here. Once they decide which to manage, you can then create reports like those in Talent Development Reporting Principles (TDRp) to show the measures, plan and year-to-date results.
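As a loose illustration (all measure names and targets below are hypothetical), the manage-versus-monitor decision amounts to selecting a vital few measures and attaching a plan or target to each, while the rest are simply tracked:

```python
# Hypothetical candidate measures drawn from the three lists. Measures
# chosen for active management get a target (plan); measures without a
# target are passively monitored.
candidates = {
    "application_rate_pct":    75,    # effectiveness: manage, target 75%
    "sponsor_satisfaction":    4.5,   # effectiveness: manage, target 4.5/5
    "cost_per_learning_hour":  28.0,  # efficiency: manage, target $28
    "sales_impact_pct":        2.0,   # outcome: manage, target 2% lift
    "courses_offered":         None,  # monitor only: no target set
    "catalog_utilization_pct": None,  # monitor only: no target set
}

managed = sorted(name for name, target in candidates.items() if target is not None)
monitored = sorted(name for name, target in candidates.items() if target is None)

print("Actively managed (reviewed against plan monthly):", managed)
print("Passively monitored:", monitored)
```

The point of the split is discipline: only the managed list, which should stay at 10-20 measures, gets a monthly plan-versus-actual review with direct reports.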

I hope this provides some helpful structure to the measurement strategy process. It is very much an iterative process and generally requires multiple meetings until everyone agrees on the final list of measures, and particularly the final list of measures to be managed. However, it will be time well spent, and your role in pulling together the measurement strategy will give you great insight into the department and your leaders.



The Right Conversation

By Dave Vance

In Chief Learning Officer’s June issue on the 2015 Learning Elite winners, Kimo Kippen, CLO for Hilton Worldwide and winner of Third Place in the Elite, talks about the importance of having the right conversation with senior leaders. He says the discussion is about “driving better business results and not simply justifying L&D’s existence.” He goes on to comment, “I never, ever got questions as to why learning was important…. I was never asked, ‘Why are we doing this?’”

Why is the right conversation so important, what is it about, and who is it with? First, the right conversation needs to be with the right person, and the right person is the senior business leader responsible for delivering a company goal like higher sales or improved guest satisfaction. Typically these leaders will be SVPs and direct reports to the CEO. In other words, they will be the top leaders in your organization. Now, the discussion about learning and its impact on business results may have started much lower in the senior leader’s organization but ultimately the right conversation will be with the SVP.

Second, the discussion must ultimately be about learning’s ability to help drive business results, which is why it needs to be with the owner of the business goal (for example, increase sales by 10%). Both the senior business leader (like the SVP of Sales) and the L&D leader (like the CLO) must agree that learning has a role to play, and they must agree on the specifics of the planned initiative, like target audience, learning modality, location, duration, objectives, and cost. More than this, though, they must also agree on the planned impact and other measures of success so they both know what is expected from this investment in learning. And last, they must agree on their mutual roles and responsibilities. What must each do to deliver the planned impact? Neither one can do it alone. The senior business leader must rely on the learning professional’s expertise to ensure that learning can contribute to achievement of the goal and, if it can, to design and deliver the right learning. On the other hand, since the employees report into the senior leader’s organization, only the senior business leader can provide the appropriate reinforcement and leadership to ensure her employees actually apply the new knowledge or behaviors which will result in the intended impact. (You can see why this discussion must be with the senior business leader… a lower-level leader in her organization simply does not have the power to compel the needed behavior by employees and leaders in the organization.)

Hopefully, by now you have a sense of why this conversation is so important and transformative. The conversation is about learning’s role in helping the senior business leaders accomplish their goals. Done properly, both parties (business and L&D) will walk away in agreement on the need for learning, the particulars, the planned impact and other measures of success, and what each must do if, together, they are going to deliver the planned business results. As Kimo said, after this conversation, there is no question about learning’s role or importance. You are now a business partner with an important role to play in achieving your company’s goals. Notice, too, the purpose of this discussion was not to ask for resources or justify L&D’s existence. It is a business discussion. If learning has a role to play and both parties agree upfront on the particulars and planned impact, the discussion about cost is straightforward. It will cost $X to achieve the planned impact. If that seems like a good investment, proceed. If not, don’t. No begging. If the available budget is less, you can scale back the program AND the planned impact.
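The budget logic above can be made concrete with a hypothetical back-of-the-envelope sketch. The proportional-scaling assumption (impact shrinking in step with the funded portion of the program) is my simplification for illustration, not a claim from the article; the dollar and impact figures are invented.

```python
# Hypothetical sketch of scaling back a program AND its planned impact
# when the available budget is below the proposed cost. Assumes impact
# scales roughly with the share of the program that gets funded
# (a deliberate simplification).

def scale_program(planned_cost, planned_impact, available_budget):
    if available_budget >= planned_cost:
        return planned_cost, planned_impact  # fund as proposed
    fraction = available_budget / planned_cost
    return available_budget, planned_impact * fraction

cost, impact = scale_program(planned_cost=500_000,
                             planned_impact=0.10,    # e.g., a 10% sales-lift target
                             available_budget=300_000)
print(f"Funded cost: ${cost:,.0f}, scaled impact target: {impact:.1%}")
```

Either way, the decision stays a business decision: fund the full plan and full impact, or fund less and agree upfront on the smaller impact.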

Bottom line, these types of discussions are required if you are to become a valued business partner. More than that, they are just very interesting and fun. You will learn a lot from your senior leaders and gain a lot of insight. If you are already having these “right conversations,” keep up the good work and share your experiences with others. If you are not having these today, resolve to move in this direction going forward. You will be amazed at the difference they make.

Don’t Sell Training to Senior Leaders


By Dave Vance

In the last blog I talked about marketing your training to participants. I concluded that while marketing may be appropriate for general learning, it would not be appropriate for learning strategically aligned to your organization’s goals, since the goal owner is responsible for identifying the specific target audience and ensuring that they take the training. So, while you may help the owner communicate the benefits and expectations of the training, there is no need for L&D to market it in the traditional sense of marketing. (The owner, however, may very well want to market it to their organization, especially if it is voluntary.)

Today I want to talk about a related topic and that is selling your training to senior leaders. Here the general rule is much simpler: Don’t do it! Now I know that you may be tempted to sell the training because you firmly believe it will be of value. You have completed your needs analysis, and you know that training, specifically the training you are recommending, would close an important gap in employees’ knowledge, skill or behavior resulting in greater organizational capability and better results. Since you are convinced of this it is only natural that you want to convince the leader as well so their employees and your company can benefit from the training.

Here is the problem. At the end of the day, you can design and deliver the very best learning which meets all of your carefully researched and planned objectives. The participants may love it and give you high satisfaction ratings. They may demonstrate that they learned the material through a test. But if they do not apply it, it is scrap learning, a waste of time and company resources. So, who is in a position to ensure that the participants actually apply your great learning? Sadly, it is not you. Only the senior leader is in a position to compel their employees to apply the learning. You can (and should) provide advice to the leader about steps they can take to ensure the highest level of application. You can advise them how to kick off the training, how to let their supervisors and employees know what their expectations are for this training, and how to devise a reinforcement strategy of positive and negative consequences. (The Kirkpatricks rightly focus on the importance of this step.)

If the senior leader does not have the time or inclination to undertake these steps to ensure application, chances are the application rate will be very low, resulting in low impact and almost certainly a negative ROI. This is why hard selling training to senior leaders simply will not work. They have to believe in it. They have to want it. They have to be committed to it enough to dedicate the time to make it successful. You can share your proposal for training with them and outline the expected benefits, but no hard sells. If they are not convinced you might try explaining the benefits one more time, but if they still do not seem convinced, walk away. These should be business discussions about how you can help them achieve their goals through training. And you need to be clear about what will be required from them in terms of their time and commitment. If they don’t see the benefits, or if they do but don’t have the time or energy to do their part to achieve them, then let it go and don’t take it personally. Respect their judgment.

The danger here is that you go into hard sell mode and you succeed. They give in and you proceed to deliver the training. But since they never really believed in the benefits, or they do not have the time to do their part, the training is not properly positioned or reinforced, and consequently the application rate is low. It is a waste of time and resources. You succeeded in selling it, but the training failed to have impact. And all of this was predictable. Thus, my advice: No hard sells. It simply won’t work because the leaders have to be fully committed, and it shouldn’t take a hard sell to get that commitment. They will realize the training is in their best interest or they won’t, and this can be accomplished through a business-like discussion of benefits and costs. Don’t try to force them to do training when they don’t really believe in it or are not fully committed to it. In other words, be more of a consultant and business partner and realize that this means they will not always take your advice. And that is okay.

Should We Be “Marketing” Training to our Employees?


By Dave Vance

Recently the issue of marketing training to employees came up. Some suggested that learning leaders need to market training like any other organization markets its products and services, and they offered a number of excellent suggestions to create awareness and demand for the courses. While I don’t have any objections to these suggestions, I don’t think it is a good model for learning and development (L&D) in general. Here is why.

The assumption behind this model is that L&D exists to offer a variety of courses just as a company exists to sell its products. Furthermore, success will be defined by how many sales (participants) we get. Thus, to be successful L&D has to market just like a company to get employees (our customers) to take our training (product). However, I don’t believe this is why L&D exists. I believe L&D exists, first and foremost, to help our organizations achieve their goals. Consequently, L&D should align and prioritize its efforts around the organization’s highest priority goals. This means finding out what the CEO’s top goals are and then working with senior leaders to determine if learning has a role to play and, if it does, then working with goal owners to develop learning initiatives in direct support of those goals. The learning and its ultimate impact on the goals will be a result of close collaboration and partnership between L&D and the goal owner and stakeholders in the owner’s organization.

In this model, why would L&D be marketing or selling the learning? The goal owner, like the SVP of Sales, has agreed that some of her salespeople need the training. You have worked with her to identify the appropriate target audience. Presumably, everyone in the target audience will be required to take the training or to demonstrate the desired proficiency. The SVP of Sales, with your help, will need to convey the expected benefit of the training, and the SVP will need to communicate the appropriate positive incentives for applying the desired behaviors (and/or negative consequences for not doing so), but this is not marketing. If the SVP of Sales decides the learning will simply be recommended, then the two of you will need to communicate its expected benefits, but this still should not represent an all-out selling effort for the course.

So, for learning aligned to high-priority organization goals, L&D leaders should not have to “sell” the training, although they may need to help the goal owner communicate the expectations and benefits. Compliance-related learning will be mandatory, as will learning for basic skills required to do a job, so no selling required. That leaves the rest of the learning offerings which are unaligned to important goals of the organization. This learning, like team building and effective communication, is important but much less important than the other learning which is aligned or mandatory. Consequently, it just doesn’t make sense to spend a lot of effort selling this learning to employees. I agree that good marketing will lead to more employees taking these courses, but that shouldn’t be the goal of L&D. The goal is to help your company achieve its goals by offering learning aligned to your company’s goals and by providing all the required compliance and basic skill training. Spend your time and resources here, not on marketing the unaligned learning.


The Challenge of Transforming Our Profession

By Dave Vance, Executive Director, Center for Talent Reporting

 

Readers of this blog know that I am passionate about running learning like a business. Simply put, this means starting with a business plan for learning which contains SMART goals and then executing that plan with discipline through monthly meetings using reports showing progress against plan. The plan would contain your planned contribution toward company goals and planned improvements in effectiveness and efficiency measures. For example, the L&D leadership team, working with the CEO and owners of company goals, may have decided that learning could contribute to four of the six highest-priority goals and, by working closely with those owners, agreed on a reasonable planned impact from L&D and the steps required to achieve it. Furthermore, the L&D leadership team would have agreed on opportunities for improvement within the department and set plans for improvement, which might include improving utilization rates, reducing development time, or improving participant satisfaction scores.

To many of us this approach (start with a plan and then execute it) simply seems like common sense. It is what most of our colleagues in other departments are already expected to do. We knew most practitioners had not been exposed to this approach at university or on the job, so it would be new and there would be the usual resistance to anything new. What has surprised us, though, is the amount of resistance, or perhaps just plain lack of interest, from many department heads and HR leaders. Generally speaking, staff, especially in measurement and evaluation, have been very receptive. They see the value immediately, but they often struggle to get their leaders on board. When we talk with these leaders we hear several common themes. First, “No one is asking me to do this so why should I?” Second, “I have succeeded so far in my career without using an approach like this so why should I change now?” Last, “Why would I commit to targets or plans when I may not be able to achieve them?”

These comments highlight the challenge we face transforming our profession. We are suggesting a higher level of transparency and accountability, which scares many. This approach requires real, active management of the function, which some don’t believe is necessary for success. Furthermore, adopting this approach now in their career raises the question of why they weren’t always managing this way. Many leaders see only a downside from adopting this approach since it will require more work and will expose them to greater scrutiny. And it is certainly true that some plans will not be realized despite hard work and good intentions, so they may have to explain why the plan was not achieved. The upside, however, is that they will become better managers and L&D will have a greater impact on the organization. They will have a greater impact by doing a better job of strategically aligning their initiatives to company goals, agreeing upfront on measures of success, establishing plans with specific, measurable goals to improve effectiveness and efficiency, and then executing the plan with discipline. They will become the highly valued, strategic business partners they aspire to be.

Bottom line, while there is more resistance from senior leaders than we had expected, we are more convinced than ever that the profession needs this type of business discipline. The transformation may take longer than expected, but it will come. And if it does not come from those currently in L&D leadership roles, it will come from the next generation of leaders who are already adopting these principles. It will also come from leaders outside L&D or HR who are moved into L&D precisely to bring much needed business discipline. In either case, it will come.

Objections to TDRp


By Dave Vance

Human Capital Media just published a Special Report on Metrics and Measurement which featured TDRp. The report included comments from some early adopters as well as from a number who were not convinced that TDRp or standards were necessary. In the last blog I shared the basics of TDRp for those who were not familiar with it. In this blog I want to address some of the objections and questions. Let’s see what you think.

Objection 1: Senior leaders don’t need to see hundreds of measures. We absolutely agree! TDRp recommends focusing on the vital few which will of course be different for each organization. Department heads might focus on just 10-15 key measures and you might share just 5-10 key measures with senior leaders like the CEO, CFO and a governing body. In each case the senior leaders and department head would agree on the list of select measures.

Objection 2: All organizations are different so standard measures won’t work. True, all organizations are different and therefore they will use different sets of measures. But wouldn’t it be nice if we had a common language for those measures, common reports to put them in, and, someday, standard names and definitions? The analogy with accounting is instructive here. Every organization has some version of an income statement and balance sheet, but the financial measures will not be the same for every organization. However, the names and definitions will be.

Objection 3: We don’t need to worry about alignment or outcome measures so that part of TDRp is not relevant for us. We agree if your focus is entirely on basic skill building and compliance. If all of your programs are to convey the skills needed by employees to achieve a minimum level of competence or meet compliance objectives, then just concentrate on meeting these objectives as efficiently and effectively as possible. On the other hand, if you are also charged with helping the organization accomplish its goals (like increase sales or improve quality or reduce costs), then you need to align your initiatives to these goals, agree on outcome measures with the owner of the goal, and agree on roles and responsibilities for each of you to deliver the agreed-upon outcome.

Objection 4: We don’t need TDRp guidance or reports to help us manage. We know there are many outstanding L&D leaders who have developed excellent processes and reports for managing their key initiatives and their department. For these, TDRp may not offer any improvement. But we believe the majority of L&D leaders could use some guidance. For example, we know that most are not having rich, proactive discussions with the owners of company goals to identify outcome measures. Most are not distinguishing between the few measures to actively manage and the many they may just wish to monitor. And very few are setting a plan or target for the key measures and actively managing them on a monthly basis. So, unfortunately, most would benefit from some guidance.

Objection 5: We don’t need to prove our value. It is true that TDRp processes and reports will help establish you as a valued, strategic business partner and will help convey your alignment to important company goals and your impact on those goals. If you are already there, you should be proud of what you have accomplished. TDRp, though, might still provide valuable guidance on selecting measures, setting plans, and managing throughout the year. At Caterpillar, I was never asked to demonstrate our alignment or prove our value, but our CEO and senior leaders always appreciated seeing how we were aligned to their goals and what type of contribution we could make.

Bottom line, we believe most practitioners could benefit significantly from TDRp. And if you have some better ways to measure, report, or manage human capital, please share them with us. We are on a journey to improve the profession and your ideas are always welcome.

The Discussion over TDRp and Standards: What Does TDRp Really Offer?


By Dave Vance, Executive Director, CTR

Human Capital Media just published a Special Report on Metrics and Measurement which featured TDRp. The report included comments from some early adopters as well as from a number who were not convinced that TDRp or standards were necessary. I believe some in the latter category were not actually familiar with TDRp so in this blog I will share the basics and then in my next blog I will address some of the objections. Let’s see what you think.

At its core, TDRp recommends a standard language for our field and a framework for measuring, reporting and managing human capital. A partial analogy is provided by accounting which groups measures into four broad categories (revenue, expense, assets and liabilities), names and defines the measures, and creates three standard statements which contain the measures (income statement, balance sheet, and cash flow statement). TDRp, however, goes beyond just the reporting to recommend a standard process for how human capital should be managed and in this respect is more analogous to business. We believe our profession would benefit from a common language and framework and from standard processes, just as the accounting and business disciplines have. Benefits include guidance for the measuring, reporting, and management of L&D, reduced time spent debating names and definitions, enhanced clarity in communication, and improved benchmarking. In part, this is what defines a “profession” – that everyone in that profession shares a common language and approach, and it is what enables universities to teach it.

So, what does TDRp suggest for our profession (L&D in particular but all human capital more broadly)? First, we should adopt a common language including three broad categories of measures: outcome, effectiveness, and efficiency. Second, we should move towards using standard names and definitions for our measures. ATD has been using standard names and definitions for years so let’s start with existing best practices. Third, we should align our most important, discretionary initiatives to our company’s goals. (No need to align basic skills training or compliance related training – just do these as effectively and efficiently as possible.) Fourth, we should decide what is most important for the department to accomplish in the coming year. This will likely be a combination of initiatives to provide basic skills and help the company achieve its goals, and initiatives by the department head to improve effectiveness and efficiency. Fifth, select measures tied to these goals and set a plan or target for the few key measures we will manage to ensure our success. Last, use three types of reports (Operations, Program, and Summary) to manage our progress throughout the year. Each report should show the plan or target for the measure to be managed as well as the year-to-date progress and the forecast for the year.
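The three measure categories and the plan / year-to-date / forecast structure of the reports can be sketched roughly as follows. This is my own illustrative mock-up; the measure names, numbers, and layout are hypothetical and do not reproduce the official TDRp report formats.

```python
# Hypothetical sketch of a TDRp-style Program report: each measure carries
# a category (outcome, effectiveness, or efficiency), a plan, year-to-date
# actuals, and a full-year forecast. All names and numbers are illustrative.

report = [
    # (category,       measure,                    plan, ytd, forecast)
    ("Outcome",       "Sales increase (%)",          10,   6,   9),
    ("Effectiveness", "Application rate (%)",        70,  63,  68),
    ("Efficiency",    "Cost per participant ($)",   400, 420, 410),
]

def by_category(rows, category):
    """Select the report rows belonging to one measure category."""
    return [r for r in rows if r[0] == category]

for cat in ("Outcome", "Effectiveness", "Efficiency"):
    for _, measure, plan, ytd, fcst in by_category(report, cat):
        print(f"{cat:13s} | {measure:26s} | plan {plan:>4} | "
              f"YTD {ytd:>4} | forecast {fcst:>4}")
```

The design point is simply that every managed measure is read against its plan all year long, with a forecast kept current, rather than being tallied once after the fact.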

That’s it. Adopt common language, standard processes, and standard reports. Most other professions have already done this. TDRp provides the guidance for us to do so as well, for us to become more “professional” and have even greater impact on our organizations.