Latest Resources

TDRp Message Shared with the Legal Profession

Dave Vance shared the TDRp message with learning professionals in the legal profession on December 1st, speaking on Day 1 of the Professional Development Institute’s two-day annual conference in Washington, D.C. Three hundred attended the conference, which is dedicated to the professional development of attorneys and legal firm staff members. Dave was asked to speak by several attendees who had heard about TDRp and the Center for Talent Reporting and wondered whether the principles would translate to the legal field. Dave worked with them over the summer and fall to adapt TDRp to the legal environment, creating a Program report and a Summary report reflecting typical legal firm goals and initiatives. The presentation is titled Strategic Talent Development Collaboration, Management, and Reporting.

Presentation Slides available for download

http://www.centerfortalentreporting.org/download/washington-conference-slides/

What CEOs Know (and You Should Too)

By Dave Vance

Some in our profession have not had the opportunity to work closely with senior leaders like a CEO or CFO. With that in mind I would like to share some observations about senior leaders from my experiences at Caterpillar (first as the Chief Economist and later as the CLO) and at other companies since then. Of course it goes without saying that my sample size is small and thus may not be representative of all CEOs, but I believe most will share the traits discussed below.

First, despite their very lofty position, they are still just people. Now I know that some CEOs believe they have transcended to a whole new level of existence, but in my experience most were down to earth and you could just talk with them. This doesn’t mean you can waste their time, because it is incredibly valuable, so you need to be prepared and be clear about what you want from them. At the same time, though, you don’t need to memorize a script or be scared to death about the meeting. It is okay to ask questions and, with most, it is okay not to have all the answers. Because they are people, don’t be surprised if they ask questions about your family or interests in addition to work. They really do want to know more about you, and some personal connection time breaks up a day otherwise filled with 10-12 hours of work-related issues. So, relax a little bit and enjoy the time you get with them. And be prepared to be surprised at what you might learn about them and the organization.

Second, they have a very high tolerance for uncertainty and ambiguity. Their tolerance will almost certainly be much greater than yours because they have to deal with both every single day. Having said that, some context is needed: they will not have a high tolerance for your being unprepared, acting unprofessionally, or making excuses. They will, however, understand that a plan for next year is just that: a plan. They know from experience that plans are usually wrong because something unexpected typically happens. Risk cannot be eliminated, given fundamental uncertainties about the future, but it can be managed, and that starts with ensuring you have a good process for planning and that you have talked with the right people. So, expect questions about how you came up with the plan and who you talked to. If you talked to the right people (like the Sales SVP, to agree on an outcome measure) and made reasonable assumptions, then they are going to feel good about the plan. Put differently, there is nothing they can do to guarantee a perfect plan, so instead they will focus on what they can influence (process and people). Be humble in setting and communicating plans, and don’t worry that they are not perfect.

Third, they know that you are going to make some mistakes along the way. They are still making mistakes themselves, as are other senior leaders, so it is only natural that you will, too. (Unless you believe you are far superior to them!) Many in our profession are afraid that the CEO is going to discover they are not perfect, and thus are afraid to share TDRp-type reports which may show that not every measure is on track to meet plan. News flash: your CEO already knows you are not perfect, and they don’t expect you to achieve or exceed plan on every single measure. In fact, any good leader would be very skeptical of someone who claims to always achieve or exceed plan, because that is not how the world works. Unless you set incredibly easy targets, you are going to achieve some, exceed some, and fall short on the rest. This is what your CEO is expecting. I met with our Board of Governors (CEO and other senior leaders) each quarter, and I never reported that we were on plan to achieve or exceed all measures. Likewise, as an economist it was impossible to get all the forecasts right. Since your CEO already knows this (even if you do not), they will be looking to see if you are honest and transparent, willing to share the bad news as well as the good. They will then want to see if you can address the areas where you are behind without being defensive or blaming anyone else. They want to know that you are on top of the issues and are doing everything that makes sense to deliver the planned results. So, don’t worry about letting the CEO see you are not perfect. Instead, show them that you can manage L&D like a business.

If you have not already discovered the above for yourself, I hope this makes you a little more confident and at ease about meeting and interacting with your CEO and other senior leaders. At the end of the day they are just people just like you, planning in the face of uncertainty, making mistakes, and moving forward despite it all.

Running Learning Like a Business: What Does It Mean and Is It Still Relevant?

By Dave Vance

Running learning like a business simply means applying business discipline to the learning field. More precisely, it means creating a plan with specific, measurable goals and then executing that plan with discipline throughout the year. That’s it. Most organizations already do this at a high level. They create a business plan for the year and then create and use reports on a monthly basis to manage their activities to come as close to plan as possible. At a lower level many other departments already do this as well. Think about a typical sales, quality, or manufacturing department. They establish goals at the start of the year and then they compare their progress each month against those goals, taking appropriate actions as necessary to stay on plan or get back on plan if they are behind. The point is they have a plan and they manage to it.

It is my experience that fewer than 10% of learning departments manage this way. Most would say they work very hard and accomplish a great deal. And they do work hard and they do accomplish a lot. (Most also produce activity reports, showing the number of courses, participants, and so on, to prove just how busy they are.) But they could have an even greater impact on their organization if they worked smarter. They might not deliver more courses or reach more employees, but they will make more of a difference in their organization’s performance. So, what does it mean to work smarter? It means that you plan your actions more carefully and then you execute your plan with discipline. The upfront planning means that you think carefully and critically about what you want to accomplish, why you want to accomplish it, what you will need in terms of resources to accomplish it, and what measures you will use. Invariably this will involve tradeoffs and prioritization since resources are always limited. So, what is more important for you to work on this coming year? Where can you make the greatest difference or add the most value? Once you have some ideas (and they may be different from what you have been working on!), the next task is to settle on a reasonable and achievable goal, which means you must think carefully about what resources will be required. It simply may not be possible to accomplish the goal with your resources, in which case you need to set a more realistic goal. And this will force you to think about the measures to use in setting the plan and lining up your work effort.

While this does require work, I would argue that a disciplined planning process culminating in specific, measurable goals will help ensure you are focused on the right goals for the right reasons with the right resources. In essence this is a learning process, and if you don’t plan this way you deprive yourself of a very important learning opportunity. I think many leaders would say they do think about goals and resources, but they stop short of setting specific, measurable goals. In that case they have not thought critically about the level of resources required (without a specific, measurable plan, how do you know what resources will be required?), nor have they created reports with a column for plan to use in managing the learning each month. Theoretically, it would be possible for these leaders to accomplish the same as those with plans, but practically speaking it is far less likely. After all, why do you think your colleagues in other departments go through all this work? They wouldn’t if it didn’t lead to better results. Simply put, running learning like a business will ensure you are focused on the right learning with the right goals, measures, and resources to deliver the greatest value to the organization.

While I believe more learning leaders are running learning like a business, most are not. One objection I hear is that this focus on planning and executing is no longer relevant. Some tell me that their organization’s goals change every two or three months and thus it makes no sense to plan. This is ridiculous. I don’t think the company’s goals change every few months and, if they do, I would question senior leadership’s ability to manage the company. It may be that priorities shift during the year, but goals like increasing sales, quality, and employee engagement are not going to change every few months. (They may mean that their own projects or work assignments change frequently, which may actually be a result of not having a good business plan for learning.) Others say they have stable goals but their company’s products (like cell phones) change once per quarter and thus there is no way to plan. Of course there is. Focus first on the goal of increasing sales and let this be your guide in planning programs. True, new models will come out and you will have to design training programs around them, but you don’t need all the specifics at the start of the year. You have an idea of what is required for each product launch, so use that to plan. Your specific, measurable plans for reaction, learning, and application are probably the same for all your product-related learning, and you have ideas about the size of the target audience. So, plan based on what you know and refine those plans as you get better information through the year.

Bottom line, running learning like a business is simply applying time-tested basic business skills to learning. Outside of learning, it is nothing new. And, outside of learning it is still very much relevant. So, let’s move that percentage up from 10% and see what a difference it makes to manage learning this way. Give it a try and see what results you get. I think you will be pleasantly surprised.

Four Approaches to Break the “No One Is Asking” Cycle

By Peggy Parskey

In the last 18 months, I have noticed a concerning shift in the sentiment of learning professionals about ramping up the quality of their measurement and in particular their reporting. This sentiment had been quite prevalent 10-12 years ago, but abated for a while. Now it’s back with a vengeance.
The essence of the sentiment is “Business leaders aren’t asking for this information, so there is no need for us to provide it.”  Most L&D practitioners don’t state it quite like that of course. They say, “Our business leaders only want business impact data.” Sometimes the stated reason is that business leaders consider L&D self-report data to be useless. Regardless of how they say it, the message is clear: “Business leaders aren’t asking for it (or worse, don’t want it), so I’m not going to bother.”
I’m not imagining this shift. Brandon Hall Group recently published a study entitled “Learning Measurement 2016: Little Linkage to Performance.” In the study, they found that the pressure to measure learning comes mostly from within the learning function itself. In addition, of the 367 companies studied, 13% said there was no pressure at all to measure learning.

What’s concerning about this situation?

L&D has been saying for years that it wants to be viewed as a strategic partner and get a seat at the proverbial table. Yet the sentiment that “no one is asking for measurement data” is equivalent to saying, “Hey, I’m just an order taker. No orders, no product or service.” If we are going to break free of an order-taking mentality, then we need to think strategically about measurement and reporting, not just the solutions L&D creates.

The sad reality is that often business leaders do not ask for L&D data because they don’t know what data is available or they don’t understand its value.  Many business leaders assume L&D is only capable of reporting activity or reaction (L1) data. Many still pejoratively refer to L&D evaluations as Happy Sheets or Smile Sheets.  If they believe that’s all L&D can do, then of course they won’t ask or exert any pressure to do more.

Equally importantly, innovation would come to a standstill if organizations only produced products and services that resulted from a client request. Henry Ford famously said, “If I had asked people what they wanted, they would have said faster horses.”  Steve Jobs talked about the dangers of group-think as a product development strategy when he said, “It’s really hard to design products by focus groups. A lot of times, people don’t know what they want until you show it to them.”  Simply because your internal client cannot envision what you can provide does not mean you shouldn’t provide it and then iterate to get it right.

The big problem with a “no one is asking” mentality is that it is a self-fulfilling prophecy and a downward spiral. At some point, someone needs to break the cycle. It might as well be L&D. (See the Infographic.)

What can you do differently right now?

Fortunately, as an individual learning practitioner you can break this cycle. Here are four approaches (singly or together) you should consider:

  1. Reframe the problem: If truly “no one is asking,” stop and ask yourself, “Why aren’t leaders asking? Is it because this information has no value, or because they don’t understand the value? Is it because they don’t know what L&D can provide?” Based on your answers to these questions, develop a game plan to demonstrate the value of the data you can provide (and then provide it to them).
  2. Find a friendly: If you want to break the ‘negative reinforcing loop’, find a business leader who is willing to work with you, a ‘friendly.’  You can spot a friendly fairly easily. She is interested in how L&D can add value to her business. He views you as a trusted advisor, brainstorming how to build new skills and capabilities within his team. She is innovative and willing to try new approaches even if they might not succeed the first time out. If you have one or two such business leaders with whom you work, seek them out and discuss what data you could provide to answer their questions.
  3. Report on data that matters to the business leader: From a measurement and reporting standpoint, L&D still puts too much of its energy into gathering and reporting data that matters to L&D but is not compelling to the business. The number of employees who attended courses or the results of your Level 1 evaluation are simply not important to the business. Look beyond your current measures and educate yourself on best practices in L&D measurement. Integrate questions about learning application and support into your evaluation instruments. This is data business leaders will care about if you show them how it affects them and their organization.
  4. Tell a compelling story: Do you remember the Magic Eye picture-within-a-picture phenomenon? If you held the picture up to your nose, you might see the constellation Orion buried inside the picture. (I never saw anything.) If you believe your data is meaningful and can help the business, don’t use the Magic Eye approach. Don’t expect your business partner to find the meaning in the data. Rather, tell the story behind the charts and graphs through dialogue. Help your business partner connect the dots; help her understand the consequences of not acting on the data and the benefits if she does.

A real life example

The ability of employees to apply training in the workplace depends on several conditions, many of them outside the control of the L&D department. Factors such as the quality of the training, the motivation of the employee, the opportunity to apply the training to real work, and the reinforcement of the direct manager all affect the extent to which training is applied.

A few years ago, I worked with a company that sold complex financial software. They re-engineered their implementation process to simplify the client experience, reduce implementation time and accelerate revenue recognition. The business leaders identified project management (PM) skills as critical to the success of this new approach.

The Process Transformation Team identified an initial group of employees to attend the PM training and pilot the new approach with several clients. When they reviewed the pilot results they were disappointed. Implementation time had not declined appreciably and the client felt that the process was more complex than expected. The Team Leaders investigated and found that the employees’ managers were not reinforcing the training or directing them to support resources when they struggled to apply the PM methodology in a real life setting. They also found that the job aids L&D had created were too cumbersome and not designed to be used in a dynamic client setting.

Imagine you are the L&D partner of the Implementation Transformation Business Leader. What data could you have provided to demonstrate that his people were not building and honing skills in this critical discipline?

This leader needed a regular stream of L&D data on actual application on the job and barriers to application. He needed data on employees’ perceptions of how the training would affect the business outcomes of simplification, implementation time, and reduced time to revenue recognition. He needed to understand the barriers to application and whether the business or L&D was accountable for addressing each issue. Notably, he did not yet need incontrovertible proof that the training was improving business outcomes.

Data from L&D was essential for this leader to take action and address issues that affected his ability to successfully transform a key client process.  Unfortunately, the business leader didn’t realize that L&D could help him get ahead of this issue and didn’t think to ask. After L&D and the business leader started talking, sharing data and insights, the Leader not only acted, but worked with L&D to develop regular business-oriented reports.

Final thoughts

As an L&D practitioner, you can break the negative reinforcing cycle. Why not regularly provide this type of data and use it to create a dialogue about what other insights you can provide?  Why not take the first step to dispel the belief that L&D has no useful data or insights to offer?  It’s in your hands.

Have you had a “No One is Asking” moment? I would love to hear from you. Follow me on Twitter @peggyparskey or connect with me on LinkedIn at https://www.linkedin.com/in/peggy-parskey-11634

It’s Time to Solve “The Last Mile Problem” in L&D

The telecommunications industry has had to confront a challenge referred to as “The Last Mile Problem.” This phrase describes the disproportionate effort and cost of connecting the broader infrastructure and communications network to the customer who resides in the ‘last mile.’

Learning Measurement also has a “Last Mile” problem. We have a wealth of data, automated reporting and data analysts who attempt to make sense of the data. Yet, as I observe and work with L&D organizations, I continually witness “Last Mile Problems” where the data doesn’t create new insights, enable wise decisions or drive meaningful action. If we want our investments in measurement to pay off, we need first to recognize that we have a problem and then second, to fix it.

Do you have a “Last Mile” problem?  Here are six indicators:

  1. One-size-fits-all reporting
  2. Mismatch between reporting and decision-making cadence
  3. Lack of context for assessing performance
  4. Poor data visualizations
  5. Little attention to the variability of the data
  6. Insufficient insight that answers the “so what?” question

Let’s explore each indicator with a brief discussion of the problem and the consequences for learning measurement.

1.    One-size-fits-all reporting

While L&D organizations generate a plethora of reports, not enough of them tailor their reports to the needs of their various audiences. Each audience, from senior leaders to program managers to instructors, has a unique perspective and requires different data to inform decisions.

In an age of personally targeted ads on Facebook (creepy as they may be), we each want a customized, “made for me” experience. A single report that attempts to address the needs of all users will prove frustrating to everyone and meet the needs of no one. Ultimately, the one-size-fits-all approach will lead users to request ad hoc reports, create their own shadow process or simply resort to gut feel since the data provided wasn’t very useful.

2.    Mismatch between Reporting and Decision Making Cadence

Every organization has a cadence for making decisions, and that cadence will vary based on the stakeholder and the decisions he or she will make. Instructors and course owners need data as soon as a course has ended. Course owners will also need monthly data to compare performance across the courses in their portfolio. The CLO will need monthly data but may also want a report whenever a particular measure is above or below a specific threshold.

In a world of high velocity decision making, decision makers need the right information at the right time for them. When the reporting cycle is mismatched with the timing of decision-making, the reports become ineffective as decision-making tools. The result? See #1: shadow processes or ad hoc reporting to address decision needs.

3.    Lack of context to assess performance

Many L&D reports present data without any context for what ‘good’ looks like.  The reports display the data, perhaps comparing across business units or showing historical performance. However, too many reports do not include a goal, a performance threshold or a benchmark.

Leaders don’t manage to trends nor do they manage to comparative data. Without specific performance goals or thresholds, L&D leaders lose the ability to motivate performance on the front end and have no basis for corrective action on the back end. These reports may be interesting, but ultimately produce little action.

4.    Poor data visualization

Data visualization is a hot topic and there is no shortage of books, blogs and training courses available. Despite the abundance of information, L&D’s charts and graphs appear not to have received the memo on the importance of following data display best practices.

Look at the reports you receive (or create). Do your visuals include unnecessary flourishes (also known as chart junk)? Do they contain 3-D charts? Do they present pie charts with more than five slices? Do your annotations simply describe the data on the chart or graph? If you answered “yes” to any of these questions, then you are making the last mile rockier and more challenging for your clients than it needs to be.

No one wants to work hard to figure out what your chart means. They don’t want to struggle to discern if that data point on the 3D chart is hovering near 3.2 (which is how it looks) or if the value is really 3.0.  They won’t pull out a magnifying glass to see which slice of the pie represents their Business Unit.

Poor visualizations fail to illuminate the story.  You have made it more difficult for your users to gain new insights or make decisions.  Over time, your readers shut down and opt to get their information elsewhere.

5.    Little attention to the variability of the data

As the L&D community increasingly embraces measurement and evaluation, it has also ratcheted up its reporting. Most L&D organizations publish reports with activity data, effectiveness scores, test results, cost data and often business results.

What we rarely consider, however, is the underlying variability of the data.  Without quantifying the variance, we may over-react to changes that are noise or under-react to changes that look like noise but in fact are signals.  We are doing our users a disservice by not revealing the normal variability of the data that helps to guide their decisions.

6.    No answers to the “So-what” question

This last mile problem is perhaps the most frustrating. Assume you have addressed the other five issues. You have role-relevant reports. The reports are synchronized with decision cycles. You have included goals and exception thresholds based on data variability. Your visualizations are top-notch.

Unfortunately, we all too often present data “as is” without any insights, context, or meaningful recommendations. In an attempt to add something that looks intelligent, we may add text that describes the graph or chart (e.g. “This chart shows that our learning performance declined in Q2.”). Perhaps we think our audience knows more than we do about what is driving the result. Often, we are rushed to publish the report and don’t have time to investigate underlying causes. We rationalize that it’s better to get the data out there as is than publish it late or not at all.

Amanda Cox, the New York Times graphics editor, once said: “Nothing really important is headlined, ‘Here is some data. Hope you find something interesting.’”

If we are going to shorten the last mile, then we have an obligation to highlight the insights for our users. Otherwise, we have left them standing on a lonely road hoping to hitch a ride home.

Recommendations to solve the “Last Mile” problems

There is a lot you can do to address the “Last Mile Problem” in L&D. Here are six suggestions that can help:

  • Create a reporting strategy. Consider your audiences, the decisions they need to make and how to present the data to speed time to insight. Include in your reporting strategy the frequency of reports and data to match the decision-making cadence of each role.
  • Identify a performance threshold for every measure on your report or dashboard that you intend to actively manage.  In the beginning, use historical data to set a goal or use benchmarks if they are available. Over time, set stretch goals to incent your team to improve its performance.
  • Learn about data visualization best practices and apply them.  Start with Stephen Few’s books. They are fun to read and even if you only pick up a few tips, you will see substantive improvements in your graphs and charts.
  • Periodically review the variability of your data to see if its behavior has changed. Use control charts (they are easy to create in Excel) for highly variable data to discern when your results are above or below your control limits.
  • Add meaningful annotations to your graphs and charts. Do your homework before you present your data. If a result looks odd, follow up and try to uncover the cause of the anomaly.
  • For senior leaders, in particular, discuss the data in a meeting (virtual or face-to-face). Use the meeting to explore and understand the findings and agree on actions and accountability for follow up. Employ these meetings to create a deeper understanding of how you can continually improve the quality of your reporting.
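
The control-chart recommendation above can be sketched in a few lines. This is a minimal illustration with hypothetical monthly reaction scores (not data from any real report), using the common mean plus or minus three standard deviations rule to separate noise from signal:

```python
# Minimal control-chart check for a monthly L&D measure.
# The scores below are hypothetical Level 1 (reaction) averages, for illustration only.

def control_limits(history):
    """Return (lower, upper) control limits at mean +/- 3 standard deviations."""
    n = len(history)
    mean = sum(history) / n
    variance = sum((x - mean) ** 2 for x in history) / (n - 1)  # sample variance
    sigma = variance ** 0.5
    return mean - 3 * sigma, mean + 3 * sigma

def flag_signals(history, new_points):
    """Label each new monthly value as normal noise or a signal worth acting on."""
    lower, upper = control_limits(history)
    return [(x, "signal" if x < lower or x > upper else "noise") for x in new_points]

# Twelve months of history, then three new monthly observations.
history = [4.1, 4.3, 4.2, 4.0, 4.2, 4.1, 4.3, 4.2, 4.1, 4.2, 4.0, 4.3]
print(flag_signals(history, [4.25, 3.4, 4.05]))
# 4.25 and 4.05 fall inside the limits (noise); 3.4 falls below them (signal)
```

With limits in hand, only the 3.4 warrants investigation; reacting to every wiggle in the other months would be chasing noise.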

If you take these recommendations to heart, your last mile could well shrink to a short, manageable distance. Have you experienced a “Last Mile” problem? I’d love to hear from you.

Follow me on Twitter @peggyparskey or subscribe to my blog at  parskeyconsulting.blogspot.com/

CTR Is Excited to Announce the Release of a Significantly Updated TDRp Measures Library

CTR is excited to announce the release of a significantly updated version of the TDRp Measures Library. This update focuses on learning and development measures, expanding the library from 111 to 183 learning-related measures. The additions include measures related to informal and social learning as well as a more detailed breakdown of the cost measures. This version of the library also includes measures defined by the CEB Corporate Leadership Council, along with updated references to the HCMI Human Capital Metrics Handbook and to measures defined by ATD, Kirkpatrick, and Phillips.

If you are a CTR member, you have access to the updated version at no additional charge: https://www.centerfortalentreporting.org/download/talent-acquisition/

If you are not a member, join CTR for $299/year.

The Missing Component of L&D Management

By Dave Vance

The L&D field is more exciting than ever, particularly with the advent of smart systems, mobile learning, big data, and greater access to content than ever before. Even predictive analytics is beginning to move beyond workforce planning, retention, and recruiting to some applications in the learning arena. What is missing, however, is an improvement in how we manage our programs and departments. Our management practices have not kept pace and often have not improved at all over the last twenty years. Sadly, most master’s and Ph.D. programs in organizational or human capital development do not include even a single course in management, and there is often little opportunity to learn good management practices on the job.

Now, I know what some of you are thinking. You would tell me that you do manage. You hire people into the department, set goals for them, and have performance reviews. You create an annual budget. You implement new systems, you roll out new courses and update existing content, and you might even track where staff spend their time. You address issues when they arise and you may have a long-term strategy. All of these activities are important and, yes, are a part of the overall management of the department. But there is a component of management that is missing in most organizations, and it is a critical component – one that has the potential to drive significant value creation. In fact, in most organizations it would drive much greater improvement than adopting a new LMS, implementing a mobile strategy, or doubling the amount of e-learning available.

What is this magical, powerful management component? It is the disciplined process of creating a plan and then executing it. Sounds simple and obvious, right? It is not, which is why very few L&D departments manage this way today. Let’s put the concept in concrete terms. It means the department head and senior department leaders need to decide what they want to do for the coming year and then create SMART goals for each item. Do you want to improve participant satisfaction or the application rate? Great, set a specific, measurable goal like a 5-percentage-point increase in the application rate. Do you want to shift your mix of learning from instructor-led to online? Then set a specific, measurable goal like increasing the percentage of online learning from 25% to 45%. Of course, you won’t accomplish these goals simply because you wrote them down on paper. You need to create an action plan for each one, detailing all that will be necessary to achieve the desired improvement, including any additional staff or budget. You will need to assign responsibility for each goal, and everyone will need a clear understanding of their roles and responsibilities.

And that is just the start. Once the plan containing all these SMART goals is complete and the year is underway, you will need to actively manage these initiatives to ensure their success. You will need to use reports every month to see whether you are on plan. If you are not, or if the forecast shows you falling short of plan by year-end, then you need to take management action to get back on plan. So, the department head should hold a dedicated one- or two-hour meeting each month (meaning the only agenda item is management of the department) with her direct reports to review progress against goals and decide on actions to get back on plan. (At Caterpillar, this meeting was the second Tuesday of every month, and we always used the full two hours.)
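The monthly on-plan check described above can be sketched in a few lines. This is a minimal illustration, not part of the article: the measure names, plan values, forecast figures, and tolerance are all hypothetical.

```python
# Hedged sketch of a monthly plan-vs-forecast review.
# Measure names, plan values, and forecast figures are illustrative only.

def off_plan(plan, forecast, tolerance_pct=5.0):
    """Return the measures whose year-end forecast falls short of plan
    by more than tolerance_pct percent -- candidates for management action."""
    flagged = []
    for measure, target in plan.items():
        shortfall_pct = (target - forecast[measure]) / target * 100
        if shortfall_pct > tolerance_pct:
            flagged.append(measure)
    return flagged

# SMART goals for the year (e.g., raise the application rate to 45%)
# alongside the current year-end forecast for each measure.
plan = {"application_rate_pct": 45, "online_mix_pct": 45, "participants": 5000}
forecast = {"application_rate_pct": 41, "online_mix_pct": 46, "participants": 4800}

print(off_plan(plan, forecast))  # -> ['application_rate_pct']
```

A report like this is only the input; the point of the monthly meeting is to decide what management action closes each flagged gap.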

This, then, is the essence of good management. You must spend the time upfront to craft a good plan with specific, measurable goals, and then you must execute that plan with discipline. Very few L&D departments do this today. Nearly all can tell you how many courses were delivered last year to how many participants, and they can tell you what participant satisfaction was with those courses. But very few created a business plan before the year started with SMART goals, and fewer yet produced the reports necessary to manage throughout the year. Almost none have a monthly meeting dedicated to using these reports to manage the key programs and initiatives of the department. So, conceptually, this all sounds pretty straightforward, and colleagues in other departments would assume L&D already operates this way. After all, many of them do and have done so for a long time. (Can you imagine a sales department with no SMART goals and no monthly meetings to compare progress to those goals?) We need to adopt the same disciplined approach to management. It will result in our achieving significantly more than the alternative approach, which is basically that everyone works hard and we achieve whatever we achieve. Given the current state of management in L&D, I am convinced that the discipline of creating a good plan and then executing it will deliver far greater value in most organizations than any other single change, program, or initiative.


Corporate Learning Week Conference, Dave Vance to Speak: CTR Exclusive Discount

Corporate Learning Week
November 7-10, 2016
Renaissance Downtown, Dallas, Texas
www.clnweek.com

Join us on LinkedIn: https://www.linkedin.com/groups/7052790

Theme: Disrupting Ourselves: L&D Innovation at the Speed of Change

Corporate Learning Week 2016 will focus on powerful new tools and capabilities to help you transform your learning organization to align with your organization’s core needs and core values, to strike the balance between best practices and next practices that will engage today’s dynamic workforce.

New for 2016, we have confirmed two exciting site tours:

– Brinker International (Chili’s & Maggiano’s)

– American Airlines Training Center

Corporate Learning Week 2016 is designed for L&D leaders who are:

– Taking charge of the end-to-end talent operations

– Innovating their learning ecosystems to keep pace with the speed of change

– Rethinking learning measurement across the board

– Looking for ways to accelerate leadership development

 

Have some fun while you’re away from the office and build stronger relationships within your team by taking part in our featured team building activities:

– Keynote Meet n Greet

– Team Scavenger Hunt

– Dallas Foodie Crawl

– Dedicated Team Bonding

 

Featured Speakers:

Diane August, Chief Learning Architect, Nationwide
Brad Samargya, Chief Learning Officer, Ericsson
Lisa Cannata, Chief Learning Officer, Orlando Health
Jon Kaplan, CLO & VP of L&D, Discover Financial
Karen Hebert-Maccaro, Chief Learning Officer, AthenaHealth
Katica Roy, Global VP, Learning Analytics, SAP
Tom Spahr, Vice President of L&D, The Home Depot
Marc Ramos, Head of Education for Google Fiber, Google
Matt Smith, Vice President, Talent & Learning, Fairmont Hotels & Resorts
Mary Andrade, Head of Learning Design & Delivery Center of Excellence, McKinsey
Corey Rewis, SVP of Learning & Leadership Development, Bank of America
Nathan Knight, Director of L&D, Sony
Casper Moerck, Head of Learning Technology – Americas, Siemens
Kate Day, VP Workforce Transformation, MetLife
Mong Sai, Director, Learning and Organizational Development, Newegg
Sarah Mineau, AVP Learning & Development, State Farm Insurance
Sheryl Black, Head of Training Operations, American Airlines
Stacy Henry, VP L&D Americas, Bridgestone
Namrata Yadav, Senior Vice President, Head of Diversity & Inclusion Learning, Bank of America
James Woolsey, President, DAU
Sanford Gold, Sr. Director, Global Leadership & Development, ADP
Donna Simonetta, Senior Director, Commercial Services Learning, Hilton Hotels
Melissa Carlson, Director, Leadership Development, Quintiles
Paul Rumsey, Chief Learning Officer, Parkland Health & Hospital System
Priscilla Gill, SPHR, PCC, Leadership and Organization Development, Mayo Clinic
Shveta Miglani, Learning and Development Director, SanDisk
Chenier Mershon, Vice President Operations and Training, La Quinta Inns & Suites
Al Cornish, Chief Learning Officer – Norton University, Norton Healthcare
Angel Green, Director of Talent & Learning, Coca-Cola Beverages Florida
Jeremy Hunter, Creator of The Executive Mind @ Drucker Graduate School of Management
Josh Davis, Director of Research, NeuroLeadership Institute
James Mitchell, CLO & Head of Global Talent Development, Rackspace
Dom Perry, VP of L&D, Brinker (Chili’s & Maggiano’s)
Leah Minthorn, Director of Operations Learning – North America, Iron Mountain
Emmanuel Dalavai, Global Program Manager, Training & Leadership Development, Aviall, A Boeing Company
David Vance, President, Center for Talent Reporting
Sarah Otley, Next Generation Learning Lab Director, Capgemini University, Capgemini

Discounts 

CTR EXCLUSIVE: Save 20% with code 2016CLW_CTR. Register now!

It’s Time to Solve the “Last Mile Problem” in L&D by Peggy Parskey

The telecommunications industry has had to confront a challenge referred to as “The Last Mile Problem.” This phrase describes the disproportionate effort and cost of connecting the broader infrastructure and communications network to the customer who resides in the ‘last mile.’

Learning Measurement also has a “Last Mile” problem. We have a wealth of data, automated reporting, and data analysts who attempt to make sense of the data. Yet, as I observe and work with L&D organizations, I continually witness “Last Mile Problems” where the data doesn’t create new insights, enable wise decisions, or drive meaningful action. If we want our investments in measurement to pay off, we need first to recognize that we have a problem and then to fix it.

Do you have a “Last Mile” problem?  Here are six indicators:

  1. One-size-fits-all reporting
  2. Mismatch between reporting and decision-making cadence
  3. Lack of context for assessing performance
  4. Poor data visualizations
  5. Little attention to the variability of the data
  6. Insufficient insight that answers the “so what?” question

Let’s explore each indicator with a brief discussion of the problem and its consequences for learning measurement.

1.    One-size-fits-all reporting

While L&D organizations generate a plethora of reports, few tailor those reports to the needs of their various audiences. Each audience, from senior leaders to program managers to instructors, has a unique perspective and requires different data to inform decisions.

In an age of personally targeted ads on Facebook (creepy as they may be), we each want a customized, “made for me” experience. A single report that attempts to address the needs of all users will prove frustrating to everyone and meet the needs of no one. Ultimately, the one-size-fits-all approach will lead users to request ad hoc reports, create their own shadow process or simply resort to gut feel since the data provided wasn’t very useful.

2.    Mismatch between Reporting and Decision Making Cadence

Every organization has a cadence for making decisions, and that cadence will vary based on the stakeholder and the decisions he/she will make. Instructors and course owners need data as soon as a course has concluded. Course owners will also need monthly data to compare performance across the courses in their portfolio. The CLO will need monthly data but may also want a report when a particular measure is above or below a specific threshold.

In a world of high velocity decision making, decision makers need the right information at the right time for them. When the reporting cycle is mismatched with the timing of decision-making, the reports become ineffective as decision-making tools. The result? See #1: shadow processes or ad hoc reporting to address decision needs.

3.    Lack of context to assess performance

Many L&D reports present data without any context for what ‘good’ looks like.  The reports display the data, perhaps comparing across business units or showing historical performance. However, too many reports do not include a goal, a performance threshold or a benchmark.

Leaders don’t manage to trends, nor do they manage to comparative data. Without specific performance goals or thresholds, L&D leaders lose the ability to motivate performance on the front end and have no basis for corrective action on the back end. These reports may be interesting, but ultimately produce little action.

4.    Poor data visualization

Data visualization is a hot topic and there is no shortage of books, blogs and training courses available. Despite the abundance of information, L&D’s charts and graphs appear not to have received the memo on the importance of following data display best practices.

Look at the reports you receive (or create). Do your visuals include unnecessary flourishes (also known as chart junk)? Do they contain 3-D charts? Do they present pie charts with more than five slices? Do your annotations simply describe the data on the chart or graph? If you answered “yes” to any of these questions, then you are making the last mile rockier and more challenging for your clients than it needs to be.

No one wants to work hard to figure out what your chart means. They don’t want to struggle to discern if that data point on the 3D chart is hovering near 3.2 (which is how it looks) or if the value is really 3.0.  They won’t pull out a magnifying glass to see which slice of the pie represents their Business Unit.

Poor visualizations fail to illuminate the story.  You have made it more difficult for your users to gain new insights or make decisions.  Over time, your readers shut down and opt to get their information elsewhere.

5.    Little attention to the variability of the data

As the L&D community increasingly embraces measurement and evaluation, it has also ratcheted up its reporting. Most L&D organizations publish reports with activity data, effectiveness scores, test results, cost data and often business results.

What we rarely consider, however, is the underlying variability of the data.  Without quantifying the variance, we may over-react to changes that are noise or under-react to changes that look like noise but in fact are signals.  We are doing our users a disservice by not revealing the normal variability of the data that helps to guide their decisions.

6.    No answers to the “So-what” question

This last mile problem is perhaps the most frustrating. Assume you have addressed the other five issues. You have role-relevant reports. The reports are synchronized with decision cycles. You have included goals and exception thresholds based on data variability. Your visualizations are top-notch.

Unfortunately, we all too often present data “as is” without any insights, context, or meaningful recommendations. In an attempt to add something that looks intelligent, we may add text that describes the graph or chart (e.g. “This chart shows that our learning performance declined in Q2.”). Perhaps we think our audience knows more than we do about what is driving the result. Often, we are rushed to publish the report and don’t have time to investigate underlying causes. We rationalize that it’s better to get the data out there as is than publish it late or not at all.

Amanda Cox, the New York Times graphics editor, once said: “Nothing really important is headlined, ‘Here is some data. Hope you find something interesting.’”

If we are going to shorten the last mile, then we have an obligation to highlight the insights for our users. Otherwise, we have left them standing on a lonely road hoping to hitch a ride home.

Recommendations to solve the “Last Mile” problem

There is a lot you can do to address the “Last Mile Problem” in L&D. Here are six suggestions that can help:

  • Create a reporting strategy. Consider your audiences, the decisions they need to make and how to present the data to speed time to insight. Include in your reporting strategy the frequency of reports and data to match the decision-making cadence of each role.
  • Identify a performance threshold for every measure on your report or dashboard that you intend to actively manage.  In the beginning, use historical data to set a goal or use benchmarks if they are available. Over time, set stretch goals to incent your team to improve its performance.
  • Learn about data visualization best practices and apply them.  Start with Stephen Few’s books. They are fun to read and even if you only pick up a few tips, you will see substantive improvements in your graphs and charts.
  • Periodically review the variability of your data to see if its behavior has changed. Use control charts (they are easy to create in Excel) for highly variable data to discern when your results are above or below your control limits.
  • Add meaningful annotations to your graphs and charts. Do your homework before you present your data. If a result looks odd, follow up and try to uncover the cause of the anomaly.
  • For senior leaders, in particular, discuss the data in a meeting (virtual or face-to-face). Use the meeting to explore and understand the findings and agree on actions and accountability for follow up. Employ these meetings to create a deeper understanding of how you can continually improve the quality of your reporting.
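The control-chart suggestion above is easy to prototype outside Excel as well. A minimal individuals (XmR) chart needs only the mean and the average moving range; the sketch below is illustrative, and the monthly application rates shown are hypothetical data, not results from any organization.

```python
# Minimal XmR (individuals) control chart sketch.
# 2.66 is the standard XmR constant for individuals charts;
# the application-rate data below are illustrative only.

def control_limits(values):
    """Return (mean, lower, upper): mean +/- 2.66 x average moving range."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def signals(values):
    """Return (index, value) pairs outside the limits -- likely signals, not noise."""
    _, lower, upper = control_limits(values)
    return [(i, v) for i, v in enumerate(values) if v < lower or v > upper]

# Monthly application rates (%): only the final jump falls outside the limits,
# so it warrants investigation; the earlier wiggles are ordinary noise.
rates = [62, 64, 63, 65, 61, 64, 78]
print(signals(rates))  # -> [(6, 78)]
```

Points inside the limits are the “normal variability” discussed in indicator 5; reacting to them is over-reaction, while points outside them deserve follow-up before the report goes out.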

If you take these recommendations to heart, your last mile could well shrink to a short, manageable distance. Have you experienced a “Last Mile” problem? I’d love to hear from you.

Follow me on Twitter @peggyparskey or subscribe to my blog at  parskeyconsulting.blogspot.com/

Please Welcome Jeff Carpenter to the CTR Board

Jeff Carpenter is a Senior Principal at Caveo Learning. Jeff works with clients to bridge performance gaps by addressing process improvements as well as front-line knowledge and skill development programs for Fortune 1000 companies.

Jeff has worked in entrepreneurial environments as a senior leader, building internal organizational structures and business processes while leading teams at many Fortune 500 clients to solve some of their most pressing performance and process issues.

We are excited to welcome Jeff to the CTR Board. His knowledge and expertise will enhance our board and make CTR even greater. We have worked closely with Jeff in our past conferences and he is a great supporter of CTR and TDRp.

Please Welcome Joshua Craver to the CTR Board

Joshua Craver is a values-based and results-oriented HR executive. In March 2012 he joined Western Union as their global HR Chief of Staff. In January 2013, Joshua took on a new role as Head of Western Union University and VP of Talent Management. Prior to this, he lived and worked in India, Mexico, and Argentina for over seven years in various HR leadership roles. Based on these experiences, he is well versed in growth-market strategy and execution.

Joshua also worked at the strategy consulting firm Booz Allen Hamilton. Companies he has consulted for include The World Bank, Georgetown University Hospital, GE, CIBA, Scotia Bank, Qwest, Farmers Insurance, Electronic Arts, Citibank, Agilent Technologies, Cigna, DuPont, Nissan, Lowes, Chevron, and Cisco. He has also conducted business in over 40 countries.

CTR is happy and honored to have Joshua join our CTR Board. Josh has been one of our greatest supporters. We are excited to have his expertise, energy, and insight.

Are You Spending At Least 20% of Your Time Reinforcing Learning?

By Dave Vance

Training may not be rocket science, but it is a lot more complicated than it first appears. This is especially true when it comes to reinforcing the learning so that it will be applied and have the planned impact. After all, if the learning is not actually applied, there will be no impact and it will be “scrap learning”, regardless of how well it is designed or brilliantly delivered. So, your learning can be well received (high level 1 participant reaction) and the target audience can demonstrate mastery of the content (high level 2), but if it is not applied (level 3) there will be no impact (level 4) and the ROI will be negative (level 5). Unfortunately, all too often we don’t even measure level 3 to find out if the learning was applied. We deliver the training, hope for the best, and move on to the next need. We need to do better.

Our greatest opportunity here is to help the sponsors (who requested the learning for their employees) understand their very important role in managing and reinforcing the learning. Learning professionals can provide advice but cannot do this for them. Sponsors need to understand their role in communicating the need for and benefits of the training to their employees before the course is delivered. (This will require clarity upfront on exactly what new behaviors or results the sponsor expects to see.) Ideally, the employee’s supervisor will also communicate expectations immediately preceding the course. The sponsor needs to ensure that the intended target audience takes the course and follow up with those who did not. Once employees have taken the course, the sponsor and direct supervisor need to reinforce the training to make sure the new knowledge, skills, or behaviors are being used. They need to again clearly state what they expect as a result of the training and let the employees know what they are looking for. The employees (individually or as a group) should meet with their supervisor to discuss the training and confirm application plans. The sponsor will need to be ready with both positive and negative consequences to elicit the desired results. All of this constitutes a robust, proactive reinforcement plan, which should ideally be in place before the training is provided.

As you can tell, this takes time and effort. How much effort? My belief is that it will take at least 20% of the time you dedicate to planning, developing, and delivering to do this well. So, if it takes five weeks to plan, develop, and deliver a course, are you planning to spend at least 40 hours on reinforcement? Many sponsors and supervisors have no clue about the importance of reinforcement and will not do it unless you can help them understand why it matters. Many think that once they have engaged L&D, their job is done and training will take care of everything else. However, this cannot possibly be the case. The target audience employees don’t work for L&D, and L&D professionals cannot make employees in sales, quality, or manufacturing do anything. L&D certainly cannot compel these employees to apply their learning – only the leaders who asked for the training can do this. So, you need to convince the sponsor that they have a very important role in the training. L&D can do the needs analysis, design and develop the training (with help from the SMEs), and deliver it, but only the sponsor and leaders in the sponsor’s organization can make their employees apply it. Impact and results depend on both L&D and the sponsor – neither one can do it alone. On this we must stand firm as learning professionals. In fact, if a sponsor tells you that they don’t have time for these reinforcement tasks, you need to respectfully decline to provide the learning because it is likely to be a complete waste of time and money. And don’t let them shift the burden of reinforcement to you. It is your responsibility to assist them, but it is their responsibility to do it.

With this understanding, are you dedicating enough time to reinforcement? At Caterpillar we found that we had to redirect resources from design and delivery to reinforcement. In other words, you will likely have to make a trade-off. We decided it was better to deliver less learning but do it well (meaning reinforced so it was applied) rather than deliver more learning which would be scrap. What is your strategy?

The Promising State of Human Capital (Report)

CTR is happy to provide the Promising State of Human Capital Report sponsored by CTR, i4cp, and ROI Institute.

This valuable document is available for download thanks to CTR and our partnerships with i4cp and ROI Institute. The report ties in with the ROI Institute webinar of the same name, posted on our site for download. https://www.centerfortalentreporting.org/the-promising-state-of-human-capital-anayltics-webinar-by-roi-sponsored-by-ctr/

Partnership with the Corporate Learning Network

The Center for Talent Reporting is pleased to announce a partnership with the Corporate Learning Network (CLN), an online resource for corporate learning leaders and academic professionals. The Corporate Learning Network believes the future of learning will be created through multi-disciplinary approaches and peer-led exchange. With live conferences, community webinars, and virtual forums, they bring together stakeholders across the L&D spectrum to help you realize your plans for improved learning outcomes and organizational success.

Learn more about CLN at http://www.corporatelearningnetwork.com/

CLN’s goals are similar to CTR’s, and we believe this partnership will further change and growth in HR and L&D.

Management: The Missing Link in L&D Today by Dave Vance

Despite great progress in so many areas of L&D, there is one area which has not seen much progress. This is the business-like management of the L&D department and L&D programs. Yes, department heads work to implement new LMS’s on time, and program managers and directors work to roll out new programs on time but there is still an opportunity to dramatically improve our management. Let’s look at programs first and then the department as a whole.

At the program level, a really good manager would work with the goal owner or sponsor to reach upfront agreement on measures of success for the program, like the planned impact on the goal. A really good manager would also work with the goal owner and stakeholders to identify plans or targets for all the key efficiency and effectiveness measures that must be achieved to have the desired impact on the goal. Examples of efficiency measures include the number of participants to be put through the program, completion dates for the development or purchase of the content, completion dates for delivery, and costs. Examples of effectiveness measures include level 1 (participant reaction), level 2 (knowledge check, if appropriate), and level 3 (application of learned behaviors or knowledge). A really good program manager would also work with the goal owner upfront to identify and reach agreement on roles and responsibilities for both parties, including a robust reinforcement plan to ensure the goal owner’s employees actually apply what they have learned. Today, many program managers do set plans for the number of participants and completion dates. Few, however, set plans for any effectiveness measures, and few work with the goal owner to reach agreement on roles and responsibilities, including a good reinforcement plan. Virtually none use monthly reports which show the plan and year-to-date results for each measure, and thus they are not actively managing their program for success in the same way as their colleagues in sales or manufacturing.

At the department level, a really good department head or Chief Learning Officer would work with the senior leaders in the department to agree on a handful of key efficiency and effectiveness measures to improve for the coming year. Then the team would agree on a plan or target for each, as well as an implementation plan for each, including the name of the person responsible (or at least the lead person) and the key action items. Examples of efficiency measures to manage at the department level include the number of employees reached by L&D, the percentage of courses completed and delivered on time, the mix of learning modalities (like reducing the percentage of instructor-led learning in favor of online, virtual, and performance support), utilization rate (of e-learning suites, instructors, or classrooms), and cost. Examples of effectiveness measures to be managed at the department level include level 1 participant and sponsor reaction, level 2 tests, and level 3 application rates. Both the efficiency and effectiveness measures would be managed at an aggregated level, with efficiency measures summed across the enterprise and effectiveness measures averaged across the enterprise. A really good CLO would use monthly reports to compare year-to-date progress with plan to see where the department is on plan and where it is falling short. A really good department head would use these reports in regularly scheduled monthly meetings to actively manage the department and ensure successful accomplishment of the plans set by the senior department leadership team. Today, very few department heads manage in this disciplined fashion, with plans for their key measures, monthly reports which compare progress against plan, and monthly meetings dedicated solely to the management of the department, where the reports are used to identify where management action must be taken to get back on plan.

In conclusion, there is a great opportunity to improve our management, which in turn would enable us to deliver even greater results. This requires getting upfront agreement on the key measures, on the plan for each one, and on the action items required to achieve each plan. For the outcome measures, it also requires reaching agreement with the goal owner on mutual roles and responsibilities. Once the year is underway, good management also requires using reports in a regular meeting to identify problem areas and take corrective actions. Our colleagues in other departments have been doing this for a long time and with good reason. Let’s get on board and manage L&D like we mean it.

The Promising State of Human Capital Analytics (Webinar by ROI, Co-Sponsored by CTR)

Talk with any human capital executive about the field of human capital analytics and they will generally agree: the best is yet to come. That doesn’t mean that many companies aren’t already performing incredible feats with people data—a few are profiled in this report—rather, the statement is a testament to the opportunity that most can see in this burgeoning field. And it’s a testament to the constant new and innovative ways professionals are using people-related data to impact their organizations. This study surveyed analytics professionals in organizations of all sizes worldwide and asked very pointed questions on how those organizations are using human capital analytics today, if at all. The results were more affirming than they were surprising:

  • Budgets for human capital analytics are increasing along with executive commitment.
  • Relatively few companies are using predictive analytics now, but expect to in the future.
  • Most are using analytics to support strategic planning and organizational effectiveness.

Successful companies tend to be those that purposefully use data to anticipate and prepare rather than to react to daily problems.

It’s clear in both the data from the survey and the follow-up interviews that were conducted that the future focus of professionals in the human capital analytics field will increasingly be on using analytics to guide strategic decisions and affect organizational performance. To sum up the state of human capital analytics in one word: Promising.

Objectives

This webinar introduces the new research study that demonstrates the progress that continues to be made with human capital analytics. Participants will learn:

  • Four key findings on the state of human capital analytics
  • How high-performance organizations are building leading human capital analytics teams
  • What Google, HSBC, LinkedIn, and Intel are doing to drive business performance through analytics
  • What you can do to move your analytics practice forward

We want to thank Patti Phillips and Amy Armitage for the opportunity to co-sponsor this webinar. The recording and PPT have been made available for anyone who missed the webinar.

PPT: The Promising State of HCA (available for download)

Recording https://roievents.webex.com/roievents/lsr.php?RCID=30bd391d319d4709b4203d48f6b7e4c9

Webinar: Innovation in L&D: Building a Modern Learning Culture

Tuesday, July 19, 11:00 a.m.–12:00 p.m. CT

Join Caveo Learning CEO Jeff Carpenter and a panel of forward-thinking learning leaders from Microsoft, McDonald’s, and Ford as they explore innovation in L&D.

More and more, stakeholders throughout the business are bypassing the learning function to create learning outside the learning & development organization. To win back the hearts of these stakeholders (and win a bigger share of the organizational budget), learning leaders must deliver solutions that are exciting, cutting-edge, efficient—in a word, innovative.

By pushing beyond their comfort zone to embrace new ideas, concepts, and technologies, L&D organizations ensure their continued relevance and enhanced ability to deliver tangible business results.

In this webinar, you’ll learn how to:

  • Foster a culture of innovation and creativity in your learning organization
  • Reexamine and reconfigure outdated training through a lens of strategic innovation
  • Develop innovative training and eLearning programs within the confines of business processes and templates

Register today!

Webinar Presenters

Rich Burton, Group Project Manager at Microsoft

Jeff Carpenter, CEO of Caveo Learning (CTR Advisory Council Member)

Gale Halsey, Chief Learning Officer of Ford Motor Company

Rob Lauber, Chief Learning Officer of McDonald’s Corporation


Who Should Attend

Chief Learning Officers (CLOs), VPs of Training, Training Directors and Managers, Human Resources VPs and Directors, CEOs, and COOs.

Webinar: ROI from a TDRp Perspective: Plan, Deliver, and Demonstrate Business Value

August 17, 11am CT

ROI Institute LOGO TRUE

Join Patti Phillips, CEO of ROI Institute, for an introduction to ROI and to learn how ROI and TDRp work together to help you plan, deliver, and demonstrate business value. She will share the steps for a successful program implementation, starting with the upfront planning and continuing through the calculation of actual ROI at the end. You will come away with a better appreciation of both ROI and TDRp, as well as how they can help you deliver better business value.

Register today: http://bit.ly/21imWZ8

Dr. Patti Phillips is president and CEO of the ROI Institute, Inc., the leading source of ROI competency building, implementation support, networking, and research. A renowned expert in measurement and evaluation, she helps organizations implement the ROI Methodology in 50 countries around the world.

Since 1997, following a 13-year career in the electric utility industry, Phillips has embraced the ROI Methodology by committing herself to ongoing research and practice. To this end, she has implemented ROI in private sector and public sector organizations. She has conducted ROI impact studies on programs such as leadership development, sales, new-hire orientation, human performance improvement, K-12 educator development, and educators’ National Board Certification mentoring.

Phillips teaches others to implement the ROI Methodology through the ROI Certification process, as a facilitator for ASTD’s ROI and Measuring and Evaluating Learning Workshops, and as professor of practice for The University of Southern Mississippi Gulf Coast Campus Ph.D. in Human Capital Development program. She also serves as adjunct faculty for the UN System Staff College in Turin, Italy, where she teaches the ROI Methodology through their Evaluation and Impact Assessment Workshop and Measurement for Results-Based Management. She serves on numerous doctoral dissertation committees, assisting students as they develop their own research on measurement, evaluation, and ROI.

Phillips’s academic accomplishments include a Ph.D. in International Development and a master’s degree in Public and Private Management. She is certified in ROI evaluation and has been awarded the designations of Certified Professional in Learning and Performance and Certified Performance Technologist.

 

 

New Corporate Membership Benefits

CTR is happy to announce additional benefits for Corporate Members.

  • A free Q&A with Dave Vance to answer your TDRp, measurement, or reporting questions
  • A free review of your list of measures or reports
  • A free private Intro to TDRp webinar for your team
  • A $100 discount on the CTR Conference
  • A $200 discount on the Basics Workshop
  • A $500 discount on a Custom Workshop

Please contact Andy Vance at avance@centerfortalentreporting.org for more details.

Our Newest CTR Advisory Council Member: Todd Harrison

Todd Harrison, Ed.D.

Director, Talent Solutions

CTR is pleased to announce Todd Harrison as our newest Advisory Council Member. Todd will be taking over for Kendell Kerekes. We are thrilled to have such an experienced and renowned individual added to our ranks.

Background

Dr. Harrison joined CEB Metrics that Matter (MTM) in 2012, after nearly 15 years of corporate experience in various learning and development leadership roles, during which he was an active practitioner of the MTM system. At MTM, he is accountable for the MTM Professional Services team of approximately 40 people and provides strategic leadership to the MTM Client Advisory and Consulting teams within this group. These two teams have primary responsibility for delivering ongoing support services to MTM technology clients, as well as accountability for the development and delivery of learning and human capital measurement consulting solutions, respectively. Specifically, the Professional Services team helps organizations measure the impact and effectiveness of their learning and development programs, with the aim of unlocking the potential of organizations and leaders by advancing the science and practice of talent management. Dr. Harrison’s business responsibilities include oversight of a portfolio of more than 600 clients and an annual global P&L goal of nearly $5M.

Dr. Harrison has extensive knowledge and expertise in several areas of talent management, including:

  • Succession Planning
  • Leadership Development
  • Talent Analytics
  • Learning Strategies
  • Employee Engagement
  • Competency Design
  • New Hire Onboarding
  • Performance Management
  • Organization Development

 

Education

  • Doctorate in Organizational Leadership (2016) – Indiana Wesleyan University
  • Master of Arts in Human Resource Development (1995) – Webster University
  • Bachelor of Science in Journalism (1986) – Murray State University

Professional Experience

  • Director, Talent Solutions, Metrics that Matter: CEB, Chicago, IL (2015 – present)
  • Director, Consulting Services, Metrics that Matter, Chicago, IL (2014 – 2015)
  • Senior Consultant, Metrics that Matter, Chicago, IL (2012 – 2014)
  • Director, Global Leadership & Organizational Development, Stryker, Kalamazoo, MI (2010 – 2012)
  • Director, Leadership & Associate Development, Anthem Blue Cross/Blue Shield, Indianapolis, IN (2002 – 2010)
  • Vice President, Human Resources, Total eMed, Franklin, TN (1999 – 2001)
  • Director, Leadership & Organizational Development, American Rehability Services, Brentwood, TN (1997 – 1999)
  • Lieutenant Colonel (Retired), United States Army (1984 – 2005)

Professional Affiliations

  • Association for Talent Development (1993 – Present)

CEB Platinum Sponsor

CEB is a best practice insight and technology company. In partnership with leading organizations around the globe, we develop innovative solutions to drive corporate performance. CEB equips leaders at more than 10,000 companies with the intelligence to effectively manage talent, customers, and operations. CEB is a trusted partner to nearly 90% of the Fortune 500 and FTSE 100, and more than 70% of the Dow Jones Asian Titans. More at cebglobal.com.

CEB is a Platinum sponsor of CTR. With their continued support, CTR has continued its mission to bring standards and measures to the HR community. CEB was a major participant in the February 2016 CTR Conference, helping to draw many attendees and make it our largest conference yet. We look forward to continuing our relationship with CEB.

For information on sponsoring CTR, please visit https://www.centerfortalentreporting.org/sponsorship-opportunities/

Join CEB for a selection of webinars in June.


All times are CT.

 

Addressing Pay Equity (June 7th, 11am–12pm)
Join CEB on June 7 for their complimentary webinar as they examine pay equity concerns for organizations with U.S. workforces. Register now at http://ceburl.com/1osc

Keep Quality through Onboarding (June 9th, 11am–12pm)
Best-practice onboarding can maintain and actually improve quality of hire by up to 12%. Hear how the best companies prioritize employee integration in onboarding during CEB’s complimentary webinar June 9. Register now at http://ceburl.com/1ov5

Organizing HR to Lead Enterprise Change (June 9th, 9am-13:30am)
Only 21% of organizations are able to initiate change as soon as the need arises. Learn how to better influence change implementation where it happens during CEB’s complimentary webinar June 9. Register now at http://ceburl.com/1ov6

The Talent Within (June 23rd, 11am–12pm)
74% of organizations want to increase their internal hiring this year. Hear how the best companies improve internal mobility and yield greater quality of hire during CEB’s complimentary webinar June 23. Register now at http://ceburl.com/1ov7

Four Imperatives to Increase the Representation of Women in Leadership Positions (June 28th, 12pm–1pm)
Increasing representation of women in leadership positions has a real, tangible impact on an organization’s financial and talent outcomes. Join CEB on June 28 as they discuss the common myths surrounding women in leadership and how to engage and retain a more gender-balanced workforce. Register now at http://ceburl.com/1ov8

 


 

 

Big Data, Analytics, and L&D By Dave Vance

Big data and analytics continue to generate a lot of excitement in the human capital field. It seems like a new conference is announced almost monthly on the topic. So, what does it all mean for L&D?

Short answer: Not much yet.

Don’t get me wrong. I like data and I like analytics, and the new applications in HR are very exciting, but the buzz is primarily around using statistical tools like regression and correlation to discover relationships among measures (variables) outside of the learning field. Some of the most often cited analytical work is the determination of the factors that explain retention, which allows organizations to take proactive steps to retain desirable employees. There is also work to better explain the factors behind employee engagement, although even this is often in service of increasing retention. I have not seen any conference descriptions or magazine articles where big data and analytics have been applied to learning.

There are certainly applications of analytics to L&D, but most of these have been around for a while and are not receiving much current attention. Let’s begin, though, with some definitions. “Big data” generally refers to the ability to capture more measures at an individual employee level than ever before and, once captured, store them in a database that is easy to manipulate for analysis. Think of data on employee engagement scores, ratings, individual development plans, past and current assignments, projects, supervisors, referrals, past employment, participation in company clubs and activities, use of benefits, number of sick days, number of vacation days taken, etc., by name for each employee. The “analytics” part is the application of statistical techniques to tease out relationships between these measures, which may have been expected or, in some cases, totally unexpected (no one was looking for them). You can see the potential of putting big data together with analytics. For example, you might discover that a leading indicator of regrettable turnover is high use of sick days, lack of involvement in company activities, or lack of opportunities to participate in projects. With this type of knowledge you could identify those at risk and intervene early to keep them from leaving.
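
As a toy illustration of the big-data-plus-analytics idea, the sketch below fabricates employee-level records in which turnover risk rises with sick-day usage and falls with engagement, then scans candidate measures for correlation with attrition. All numbers and relationships here are invented for the example; a real analysis would use actual HR records and more careful statistics.

```python
import numpy as np

# Fabricated employee-level data (purely illustrative, not real records).
rng = np.random.default_rng(42)
n = 500
sick_days = rng.poisson(4, n)            # sick days taken this year
engagement = rng.normal(3.5, 0.6, n)     # survey score on a 1-5 scale

# Build in the relationships the article describes: turnover risk rises
# with sick-day usage and falls with engagement.
logit = 0.4 * sick_days - 1.2 * engagement
left = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# "Analytics" here is just a correlation scan across candidate measures,
# looking for leading indicators of regrettable turnover.
for name, x in [("sick_days", sick_days), ("engagement", engagement)]:
    r = np.corrcoef(x, left)[0, 1]
    print(f"{name}: correlation with attrition = {r:+.2f}")
```

In practice the payoff comes from measures whose relationship to turnover was not anticipated; a scan like this is simply the most basic version of that discovery step.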

How might this be applied to L&D? You can imagine using big data to show that application and impact are lower for employees with poor supervisors who don’t reinforce the learning. But we already know this. You can imagine using big data to show the importance of identifying the right target audience, but we already know this, too. I am sure there are opportunities to discover some new relationships, but I believe L&D is more mature than other HR disciplines and thus we already have discovered many important relationships.

While big data is not being used much in L&D today, we have been using “little data” quite effectively for some time. We routinely measure number of participants, courses, hours, costs, completion dates, participant reaction, and amount learned. We measure application rates, but not nearly as often as we should. Typically we do not put by-name employee measures in a database that also has information about the supervisor, employee engagement scores, and other potentially interesting measures, so we end up with “little data” instead of “big data.”

And we have been using analytics for some time. We have been using the intended application rate as a leading indicator of actual application for years. We also use the other “smart” questions from the post-event survey as leading indicators for application and impact. Patti and Jack Phillips have helped many organizations use regression analysis to isolate the impact of learning from other factors. This effort has really matured lately with the advent of organizations like Vestrics, which use general equilibrium modeling (sets of regression equations estimated simultaneously) to isolate the impact of ALL the factors with a high degree of statistical certainty. So, analytics, including high-level analytics, does have a place in L&D.
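
To make the “isolate the impact” idea concrete, here is a minimal synthetic sketch: sales lift is generated from training hours plus a second driver, and an ordinary least-squares fit recovers the per-hour contribution of training while holding the other factor constant. The data and coefficients are fabricated, and this shows only the underlying statistical idea, not the Phillips or Vestrics methodologies themselves.

```python
import numpy as np

# Synthetic data: sales lift driven by training hours AND ad spend,
# plus noise. All numbers are invented for the illustration.
rng = np.random.default_rng(7)
n = 200
training_hours = rng.uniform(0, 20, n)
ad_spend = rng.uniform(0, 50, n)
sales_lift = 2.0 * training_hours + 0.5 * ad_spend + rng.normal(0, 5, n)

# Regress sales lift on both drivers at once; the training coefficient
# estimates the impact of one extra training hour, ad spend held constant.
X = np.column_stack([np.ones(n), training_hours, ad_spend])
coef, *_ = np.linalg.lstsq(X, sales_lift, rcond=None)
print(f"estimated impact per training hour: {coef[1]:.2f}")
```

In real data the drivers are usually correlated, so omitting one would bias the training estimate; including all the candidate factors in one regression is what “isolating” the impact of learning means.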

In summary, in L&D we have not made much use of big data, but we have been using little data for a long time. We have not made extensive use of analytics, but we have used leading indicators for years, and some are using regression to isolate the impact of learning. In part, the lack of application to L&D currently reflects our relative maturity. We have a better understanding of the basic relationships among measures than our colleagues in talent acquisition or total rewards because we have been measuring longer, and thus there is not as compelling a case to be made for trying to discover new relationships. That said, there are opportunities going forward, especially in terms of isolating the impact of learning and demonstrating to nonbelievers the power of strong sponsorship and reinforcement.

CTR Accepts the Community Partnership Award

Dave Vance was present and honored to accept the Community Partnership Award, awarded to CTR by the University of Southern Mississippi Department of Human Capital Development, one of the few departments in the country to offer a Ph.D. in Human Capital Development. CTR was selected in recognition of our inviting USM Ph.D. students to our conference in Dallas and for our leadership in the profession through TDRp. The award was presented Friday, April 29, at the Third Annual Awards Ceremony held at the Gulf Park Campus. Six other awards were given, and graduating master’s and Ph.D. students were recognized.

Dave Vance, Executive Director of the Center for Talent Reporting

Dave is a frequent speaker at learning conferences and association meetings. He also conducts workshops and simulations on managing the learning function. Dave is the former President of Caterpillar University, which he founded in 2001. Dave received his Bachelor of Science degree in political science from M.I.T. in 1974, a Master of Science degree in business administration from Indiana University (South Bend) in 1983, and a Ph.D. in economics from the University of Notre Dame in 1988. Dave was named 2006 CLO of the Year by Chief Learning Officer magazine. He was also named 2004 Corporate University Leader of the Year by the International Quality and Productivity Center in its annual CUBIC (Corporate University Best in Class) Awards. Dave’s current research focuses on bringing economic and business rigor to the learning field as well as the development of computer-based simulations to help learning professionals increase their effectiveness and efficiency. His first book, The Business of Learning: How to Manage Corporate Training to Improve Your Bottom Line, was published in October 2010.

Setting Standards for our Profession: Three Standard Reports By Dave Vance

Last month we talked about the need for standards and introduced three types of measures which would serve to organize our measures just as accountants use four types of measures in their profession. This month we will introduce three standard reports for L&D which contain the three types of measures. These three reports, used together, provide a holistic and comprehensive view of learning activity just as the three standard accounting reports (income statement, balance sheet, and cash flow) do for financial activity.

The three reports are Operations, Program, and Summary. While a version of these reports (called detailed reports) contains just historical information (like actual results by month or quarter), we will focus on reports for executive reporting which in addition to last year’s results include the plan or target for the current year, year-to-date (YTD) progress against plan, and, ideally, a forecast of how the year will end if no special actions are taken (like extra staffing, budget or effort beyond what is currently planned). The reports are meant to be used by the department head and program managers in monthly meetings to actively manage the function to deliver results as close to plan as possible. Of course, before the year starts, L&D leaders will need to decide what measures are important for the year and what reasonable plans or targets would be for those measures.

We will start with the Operations Report which contains effectiveness and efficiency measures. Recall from our discussion last month that effectiveness measures are about the quality of the program (like levels 1-5) and efficiency measures are about how many, at what cost, etc. (like number of participants, hours, utilization rate, and cost). The Operations Report is meant to capture the 5-10 most important effectiveness and efficiency measures at an aggregated level which may be for the enterprise, region, business unit or product group depending on the L&D department’s scope. So, for example, the Operations Report might show last year’s actual, current year plan, YTD results, and forecast for unique and total participants for all the courses being offered. It might also show the average level 1, 2 and 3 for all courses being offered. The department head would use this report monthly to see if the department is on track to meet plan for the year. If it appears the department is not on track, then the senior L&D leaders need to understand why and agree on what actions to take to get back on plan.
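
As a purely hypothetical mock-up (every figure below is invented), the Operations Report view described above can be sketched as a small plan-versus-forecast table:

```python
# Minimal mock-up of an Operations Report (all figures invented for the sketch).
# Columns follow the article: last year's actual, the current-year plan,
# year-to-date results, and a forecast for how the year will end.
rows = [
    ("Total participants",        12000, 13000, 6400, 12800),
    ("Unique participants",        7500,  8000, 4100,  8100),
    ("Avg Level 1 (% favorable)",    82,    85,   84,    84),
]

print(f"{'Measure':<28}{'Last Yr':>9}{'Plan':>7}{'YTD':>7}{'Fcst':>7}  On track?")
for measure, last_yr, plan, ytd, fcst in rows:
    # A measure is "on track" when the forecast meets or beats the plan;
    # the misses are what the monthly management meeting should focus on.
    on_track = "yes" if fcst >= plan else "NO"
    print(f"{measure:<28}{last_yr:>9}{plan:>7}{ytd:>7}{fcst:>7}  {on_track}")
```

The same columns roll down to the Program Report; only the level of aggregation changes.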

The Program Report also contains effectiveness and efficiency measures but its focus is at the program or initiative level rather than the enterprise level. It is designed to show the key programs in support of a single company goal like increasing sales by 10% or reducing operating cost by 5%. Under each identified program would be the most important effectiveness and efficiency measures. The Program Report also would contain an outcome measure showing the planned impact or contribution from learning on the goal. So, the report pulls together the key programs and measures required to achieve the desired impact on the goal. The owner of the goal (like the SVP of sales) and the L&D leaders would agree on the programs; outcome, effectiveness and efficiency measures; and plans or targets for the measures before the programs begin. The goal owner and L&D also need to agree on their mutual roles and responsibilities, including how the owner plans to reinforce the learning. Program Reports would be used by the department head and program managers each month to ensure everything is on plan to deliver the agreed-upon results.

While the Operations and Program Reports are used to manage the function each month, the Summary Report is designed to show the alignment and impact of learning as well as its effectiveness and efficiency. The Summary Report starts by listing the company’s top 5-7 goals in the CEO’s priority order and then shows the key learning programs aligned to each. (In some cases there will be no planned learning.) Where learning does have a role to play in helping achieve the company goal, the report ideally will include an outcome measure or some other measure of success. Next, the report will share three to four key effectiveness measures and three to four key efficiency measures. The target audience for this report is the CEO, CFO, SVP of HR, and other senior leaders, as well as employees of the L&D department. The report lets the senior company leaders know four important things: 1) L&D knows what the company’s goals and priorities are; 2) L&D leaders have talked with goal owners, and the two parties have agreed on whether learning has a role to play and, if it does, what kind of contribution may be expected from learning; 3) L&D leaders are focused on improving the effectiveness and efficiency of learning; and 4) L&D leaders know how to run learning like a business by setting plans and then managing each month in a disciplined way to deliver those plans.

Use of these three standard reports and the processes associated with creating them will dramatically improve the impact, effectiveness, efficiency, stature, and credibility of L&D. Along with adopting the three types of measures, it is also a major step toward L&D becoming a true profession with our own standards and reports.