Latest Resources

Corporate L&D’s Share of Learning Declining: Should We be Concerned?

By Dave Vance, Executive Director, Center for Talent Reporting

There is concern in the industry that a decline in formal learning is leading to a smaller share of the “learning market” for corporate L&D departments. The analogy is a marketing department’s concern over a declining share of the consumer’s wallet as other products or services displace their own. In our profession the worry is driven by a move toward employee-directed learning, where employees are bypassing formal corporate training programs in favor of online learning opportunities offered by outside organizations, as well as self-directed learning on the internet and informal learning through their own networks. Several studies in the past two years have confirmed the shift, and it is certainly a growing topic for discussion in the profession.

So, should we be concerned about this shift? In general, I don’t think so. First, for the sake of those new to our profession, let’s be clear that it was never the goal of any corporate L&D department to control 100% of the learning for the organization’s employees. Nor was it ever a reality. There has always been informal learning going on outside the reach of a training department. Although I am not a fan of the 70:20:10 principle, it certainly reminds us that formal training plays an important but usually limited role. With regard to control, most organizations have always offered discretionary or elective learning opportunities, including tuition assistance, where employees were free to choose their own learning.

Now, back to the question. I think the answer really depends on the mission of the learning department. If the mission is narrowly defined to focus on providing consistent, uniform basic skills to employees, and if those employees are now getting inconsistent and perhaps incorrect information on their own, which detracts from their performance, then, yes, we should be concerned. Think of basic skills training in industries like manufacturing, banking, or food services, where the goal is for employees to perform tasks in certain defined ways. Similarly, if your focus is primarily on compliance, you don’t want employees getting a lot of incorrect information from the internet. In these cases the amount of formal training provided has probably not decreased, but informal learning has increased, so the share of formal learning has declined. More to the point, you may have to work harder or redesign your formal learning to take into account what employees are picking up through their own channels.

If the mission is to provide learning to improve performance through additional training beyond the basics, then I think there is less reason for concern. Think of formal learning to improve the performance of marketing employees, or most leadership development programs. As employees learn more about marketing or leadership on their own, the share of the knowledge they have gained on these topics from your formal training will decrease, but this is not necessarily a bad thing. In fact, if they are actively looking to learn from outside resources, that shows a high level of engagement. As in the cases above, you may need to take these new sources of learning into account when you design your programs. You could incorporate the good sources that are consistent with your approach and point out the issues or problems with the approaches that are not consistent with your formal teaching. In either case, the employees are probably going to learn more and be better prepared for the challenges they face on the job.

Last, consider a mission which is primarily to increase employee engagement. Here there is little reason for concern if employees are migrating to more learning opportunities outside your formal learning framework. Yes, your share of their total learning is decreasing but the good news is that they (hopefully) are finding what they want. In fact, in this case you would want to do all you can to help them find and access great learning – whatever the source. The easier you can make it for them, the more engaged they will be. Mission accomplished.

I think this shift toward a decreased share of learning for corporate L&D departments is indeed permanent, and the trend is likely to accelerate as more outside resources become available and as organizations make it easier for employees to access them. Some caution is required depending on the department’s mission and type of offerings, and the shift is likely to force changes in how formal learning is designed, which is fine. I think the future remains bright.

Where Does Reporting Fall on the L&D Maturity Spectrum?

By Dave Vance, Executive Director, Center for Talent Reporting

Maturity models show a journey from just beginning to collect data all the way through the use of advanced and sophisticated practices and processes. The models can be very useful in conveying the journey that must be followed from immature to mature and in helping organizations assess both where they are today and where they may wish to go in the future.

Unfortunately, since most practitioners don’t really understand what is required to run learning like a business, most of these models show reporting as the second or third step on what is often a five- or six-rung maturity ladder. Properly understood, reporting should be the next-to-highest rung. Let’s see why.

First, we need to define terms, which is where the problem begins. For most, reporting means the creation of monthly dashboards or scorecards which show ONLY actual results. And typically the measures are low-level efficiency and effectiveness measures like number of courses, hours and participants, and level 1 scores. You have all seen these. They may show monthly or quarterly data and may contain bar charts or line graphs. Dashboards may show a speedometer. If this is how we define reporting, then I agree it belongs on the second rung, since it is merely capturing the most elementary data from the first rung and using it only to discern how the measure is trending. Authors of these maturity models often go to great lengths to contrast this low-level “reporting” with higher-level predictive analytics, which these days is almost always the highest rung. So, the model encourages you to move beyond elementary reporting (which is good!) to higher-level measures (like levels 3-5, which is also very good!) to big data and predictive analytics (which may or may not be so good), all the while never mentioning the use of reports for management of the department or program.

If we instead define reporting in line with Talent Development Reporting principles (TDRp), the whole model changes. In TDRp executive reports are NEVER simply a compilation of actual results (history). A report must include the plan (target or goal) for that measure, year-to-date (YTD) results, a comparison of YTD results to plan, and ideally a forecast of how the year is likely to end if no special (unplanned) action is taken. The sole purpose of the report is to answer the two basic questions every manager should ask for every important measure: (1) Are we on plan year to date? and (2) Are we going to end the year on plan? Trend for the year and comparison to previous years are interesting but largely irrelevant after the plan has been set for the year. Trend and history would have been used to set an achievable, reasonable plan, but after the plan is set, all that matters is whether you can achieve it. So, these reports are absolutely indispensable to managing key programs and the department as a whole. A good manager simply cannot manage without them.
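The structure of a TDRp-style report row can be sketched in code. The following is a minimal illustration, not anything prescribed by TDRp itself: the measure, the numbers, and the 5% tolerance are hypothetical assumptions chosen to show how a plan, YTD results, and a forecast answer the two questions above.

```python
# Minimal sketch of one row in a TDRp-style executive report.
# All measure names, values, and the 5% tolerance are hypothetical.
from dataclasses import dataclass

@dataclass
class MeasureRow:
    name: str
    annual_plan: float   # target agreed before the year began
    plan_ytd: float      # portion of the plan expected by this point in the year
    actual_ytd: float    # results achieved so far
    forecast: float      # projected year-end result if no unplanned action is taken

    def on_plan_ytd(self, tolerance: float = 0.05) -> bool:
        """Question 1: are we on plan year to date (within tolerance)?"""
        return self.actual_ytd >= self.plan_ytd * (1 - tolerance)

    def on_plan_year_end(self, tolerance: float = 0.05) -> bool:
        """Question 2: are we going to end the year on plan (within tolerance)?"""
        return self.forecast >= self.annual_plan * (1 - tolerance)

# Hypothetical example: application rate (level 3) for a sales training program.
row = MeasureRow("Application rate (%)", annual_plan=70,
                 plan_ytd=65, actual_ytd=60, forecast=68)
print(row.on_plan_ytd())       # False: 60 is more than 5% below the planned 65 YTD
print(row.on_plan_year_end())  # True: forecast of 68 is within 5% of the 70 plan
```

Note that trend and history appear nowhere in the row: once the plan is set, the comparisons that matter are actual-to-plan and forecast-to-plan, which is exactly the point made above.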

In this light, good reporting should be near the top of the maturity model since it supports the active, disciplined management of the function which, in my opinion, is the top rung. I believe that this active management supported by good reporting will deliver FAR greater results and impact than big data and predictive analytics. In fact, it is not even close. Predictive analytics is typically used to discover relationships among measures which is great but usually impacts a small number of measures or projects. In contrast, the active management of all your key programs, by definition, will affect everything important you do for the year. In other words, if a CLO is trying to decide between investing in predictive analytics and disciplined management of the department using good reporting, my strong advice is to get your disciplined management in place first.  This will provide far more bang for the buck and has the potential to advance your career in management (versus in analytics).

Please take a second look the next time you see a maturity model to understand how the authors have defined the components. I bet they will have a “dumbed-down” definition of reporting and consequently will have allocated it to a low rung. If you are constructing your own maturity model or journey chart, I challenge you to aspire to great management of the entire department as your highest rung.

Is Learning a Real Profession?

By Dave Vance, Executive Director, Center for Talent Reporting

One of the hallmarks of a profession is a standard language, an agreed-upon framework for measures, and common processes. Think of accounting. The profession has a common language and agreed-upon measures, statements, and processes. For example, accountants have four basic types or categories to organize their hundreds of measures. You will recognize these as revenue (or income), expense (or cost), assets, and liabilities. Most accounting measures can be grouped into one of these categories. Accountants place these measures into three standard statements, each with a different purpose. The three are the income statement (or profit & loss, P&L for short), the balance sheet, and the cash flow statement. Taken together, the three provide a holistic view of an organization’s financial position and can be used to manage the organization throughout the year. Last, the profession has a host of practices taught at university, like how to define and use the measures, how to create and use the statements, and how to use measures to analyze issues and solve problems (for example, ratio analysis, the DuPont model, and the audit process).

What does learning have that is comparable? Unfortunately, very little. Ed Trolley and David van Adelsberg suggested two categories of measures (effectiveness and efficiency) in their groundbreaking 1999 book Running Training like a Business, but the profession did not fully embrace them. We started with their work in 2010 and broke outcome measures out of effectiveness measures for Talent Development Reporting principles (TDRp) to highlight their importance. While more are adopting the three TDRp categories, you can pick up any professional publication and find the terms used inconsistently. This does not happen in accounting. Accountants have standard names and definitions for all their measures. We in learning do not. People define and calculate the same measures differently. There is no single source of truth, no profession-wide measures library if you will. There is in accounting.
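To make the three TDRp categories concrete, here is a small sketch that groups measures into efficiency, effectiveness, and outcome buckets. The specific measures and their placement are illustrative assumptions for this example only, not an official TDRp measures library, which, as noted above, does not yet exist profession-wide.

```python
# Illustrative grouping of learning measures into the three TDRp categories.
# Measure names and their category assignments are assumptions for this sketch.
TDRP_CATEGORIES = {
    "efficiency": [          # how much, how many, at what cost
        "number of courses",
        "number of participants",
        "cost per learning hour",
    ],
    "effectiveness": [       # how well the learning worked (e.g., levels 1-3)
        "level 1 participant reaction",
        "level 2 learning",
        "level 3 application rate",
    ],
    "outcome": [             # impact on the organization's goals
        "contribution to sales growth",
        "contribution to quality improvement",
    ],
}

def category_of(measure: str) -> str:
    """Return the TDRp category a measure belongs to, or 'uncategorized'."""
    for category, measures in TDRP_CATEGORIES.items():
        if measure in measures:
            return category
    return "uncategorized"

print(category_of("level 3 application rate"))  # effectiveness
print(category_of("number of courses"))         # efficiency
```

A shared table like this, agreed upon across the profession, is essentially what accounting already has and learning lacks.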

Unlike accounting, prior to TDRp we had no standard reports whatsoever. Period. So there has been no agreement on what measures go into what reports, how many to include, or what to do with the measures and reports once they are created. Yes, we have a number of dashboards that typically show just actual results for measures like number of participants, hours, courses developed and delivered, and utilization rates, but these really don’t rise to the level of standard reports, and there is no agreement in any case on what should be included or how it should be displayed. Last, we have few standard processes. We have some for instructional design and performance consulting, like ADDIE, SAM, SCORM, and xAPI, but not much agreement as a profession on most other processes. For example, although it is recommended by all authors and leading practitioners, we don’t always start with the end in mind, we don’t always meet proactively with our CEOs to discover the coming year’s goals, we don’t always meet proactively with the goal owners to reach upfront agreement on the impact of learning and on mutual roles and responsibilities, we don’t always align our learning to the organization’s most important goals, we don’t always create SMART goals for our most important measures, and we don’t always create reports with plan and year-to-date results to let us manage our programs to deliver the agreed-upon goals. Compare this with the audit process, where all accountants know how to do an audit and where there are standard expectations for how an audit is to be conducted.

So, if we want to be a true profession, we need standards just like the accountants have. TDRp is designed to help us get there but we are just at the beginning of this long, but very important, journey. To succeed we all need to work together. Please start using the three categories of measures in your workplace. For your key programs and measures, please start using formats like those in the three TDRp reports. Let us know where you believe we need to change, refine or improve this framework. Tell us if you think there is another category which should become part of the standard or if we need a different type of report. Recently, several have asked for some example dashboards that would be consistent with the TDRp framework so we will work on this. Bottom line, learning is still a very immature field so this is a work in progress. No one has all the answers, but it is time for us to grow up and take the next important step to become a true profession.

Second Annual BUSINESS WRITERS CONFERENCE

Renaissance Birmingham Ross Bridge Golf Resort & Spa

Birmingham, Alabama

Click To Download: Second Annual Business Writers Conference Brochure 2017

* Build a business around your book.
* Use your book to build your business.

A Conference with Keynote Sessions Designed Specifically for Business and Professional Writers

“I Couldn’t Put It Down!” How To Write a Book That People Will Buy, Read, and Love with Karl Weber

You’re an expert in an important subject; you’ve got a treasure trove of useful information and colorful stories to share; and friends have been telling you, “You’ve got to write a book!” That’s great—but it’s not enough. To write a book that will captivate readers from page one, hold their interest until the end, and inspire them to recommend you to their friends and colleagues, you need to know the writing techniques that best-selling authors use. Karl Weber is a writer, editor, consultant, and coach who has helped scores of authors turn their ideas into successful books, including many bestsellers.  He’ll share tips on how to plan, organize, draft, and revise your book to make it clear, fascinating, and persuasive.  He’ll also illustrate the pitfalls many authors fall into and show you exactly how to avoid them. You’ll leave this session with a deeper understanding of how to connect with your audience, sell plenty of books, and have a powerful, lasting impact on readers.

Build A Business Around Your Book or Use Your Book To Drive Your Business with Kevin Oakes, Patti Phillips, and Jack Phillips

For business writers, the key motivation is building a business around your book and that’s the focus of this conference. In other situations, books help drive a business that’s already been created. In this session, Kevin Oakes, CEO of the Institute for Corporate Productivity describes how he uses books to help drive the business. Jack and Patti Phillips, of Business Writers Exchange and ROI Institute, will be joining him to describe how they have built businesses around their books. This session will show the variety of ways in which authors can build businesses around books or use books to support their business.

So You Want to Write a Book . . . with Elaine Biech

There is nothing like the feeling of holding your own book in your hands. But where do you start? Will anyone read it? It’s a big commitment—why do authors write? What can you gain by writing a book? Being a published author is a sign of credibility and expertise. Writing a book is a way to reach more people with your message. Your book is your legacy. It is an extension of yourself to a world full of readers. Elaine will share insight about why and how she writes. With 65 books to her name and counting, she relates her beliefs and those of other successful authors to inspire and encourage you to begin your book today.

Do You Have a Book Idea? Check Your Elevator Pitch First with Neal Maillet

Most authors don’t understand the critical factors that attract an editor’s eye. The elevator pitch in the first sentence of your proposal will determine your success more than anything else—including a detailed table of contents or any recitation of content. Why is this? I’ll discuss the passion and urgency that any editor hopes to see in a book presentation, and will lead the audience in an exercise to help shape the “big idea” that will set your book on the right path.

New Digital Tools: Make Your Social Media Marketing Easier and More Effective with Fauzia Burke

Social media marketing allows us to find influencers and create communities. However, as we all know, it is a time-consuming and labor-intensive activity. This session will present new and established digital tools that can help make the process of managing online marketing easier and more effective. After testing many tools, Burke will present the ones that are most helpful to authors and publishers alike. In a “show-and-tell” session, we’ll review tools that help find influencers and discussions, create visual content and data, collaborate with teams, measure engagement, and more.

The Business of Your Words and The Glory of Others: How a Successful Co-author Is Born with Bob Andelman

Bob Andelman, author or co-author of 16 books, including New York Times and Businessweek best-sellers such as Built From Scratch, The Profit Zone, and Mean Business, will talk about his experiences over the last 30 years as a business writer and a co-author/ghostwriter. He will take attendees behind the scenes of his relationships with co-authors, share publishing horror stories, and offer ideas on how to drive a successful, enduring career helping professionals tell their stories in long form.

Writing in the Color of Kaleidoscope with Chip Bell

Business books have changed. Today’s customers are accustomed to the pace of a tweet and the compactness of a blog, and they expect the excitement of a Cirque du Soleil performance. Writing for them requires coupling compelling inspiration with pragmatic instruction. And that’s just the authoring part. The “new normal” book-buying customer has also altered the platform for effective book promotion. Yet authoring a well-known business book can transform your brand from being another “me too” to one of distinction and success. Best-selling author Chip Bell will share secrets for creating, promoting, and integrating business writing into your go-to-market competitive strategy.

How to Establish Your Brand Across All Media Platforms with David Hahn

Three crucial components of an author’s media platform will be examined in this presentation: traditional media coverage (TV, Radio, and Print/Print.com), online placements with bloggers, podcasters, and large platform sites, and a robust social media platform on Twitter, Facebook, and LinkedIn. Advanced techniques for landing interviews with TV and radio stations will be taught along with strategies for developing relationships with online and print editors. In addition, important features of an author’s website will be discussed.

The Lighthouse of Legacy with Scott Mautz

In this keynote, the audience will be engaged, head and heart, to be intentional about leaving a legacy. They’ll learn about The Five Footprints of Legacy: the ways in which, social science teaches us, we tend to leave a lasting impact behind, and how each footprint can be imprinted remarkably well by those authoring a book. If we are made aware of the “buckets” of ways in which we can leave footprints, then we can begin working each day to fill each bucket through our writing or how we live our lives. The audience will also get a “double click” on just how powerful it is when we commit to being intentional about articulating and pursuing our specific desired legacy. This inspiring keynote uses adult learning principles and a mixture of media to captivate the audience. The audience will leave inspired and informed on how exactly to be intentional about leaving a legacy of results and resonance (through their writing, work, and life). Accordingly, their relationship with their writing and work may well change.

Karl Weber

Founder of Karl Weber Literary; author, editor, and collaborator on dozens of successful business books, including many national best-sellers

 

Kevin Oakes

CEO and founder of the Institute for Corporate Productivity (i4cp), co-author of The Executive Guide to Integrated Talent Management

 

Patti Phillips

President and CEO of ROI Institute, author of over 50 business books including the award-winning book, The Bottomline on ROI

 

Jack Phillips

Chairman of ROI Institute, former bank president, and author of over 75 business books including Show Me the Money

 

Elaine Biech

President of Ebb Associates, author of over 65 business books including A Coach’s Guide to Exemplary Leaders

 

Neal Maillet

Editorial Director at Berrett-Koehler, one of America’s most respected and innovative business publishers

 

Maureen Orey

Founder and President of the Workplace Learning and Performance Group, author of Stay Afloat

 

Dottie DeHart

Book guru, President of DeHart & Company, key contributor to the best-selling For Dummies series

 

Tanya Hall

CEO of Greenleaf Book Group

Chris Kennedy

Self-publishing expert, author of 10 books, including the #1 bestselling self-help book, Self-Publishing for Profit

 

Bob Andelman

Editor and publisher of Mr. Media Books, author or co-author of 16 books, including Built From Scratch, The Profit Zone, and Mean Business

 

Fauzia Burke

Founder and president of FSB Associates, author of Online Marketing for Busy Authors

 

Chip Bell

Renowned keynote speaker and customer loyalty consultant, author of 22 books, eight of which have been national and international best sellers, including Sprinkles: Creating Awesome Experiences Through Innovative Service

 

David Hahn

Managing Director of MEDIA CONNECT, a division of Finn Partners that specializes in powerful book publicity

 

Scott Mautz

CEO of Profound Performance, author of the best-selling book Make It Matter: How Managers Can Motivate by Creating Meaning

 

Andrew Mueller

Business Development Manager, LID Publishing

 

Sara Taheri

Editorial Director, LID Publishing

 

Register Today at www.business-writers-exchange.com

 

TDRp Message Shared with the Legal Profession

Dave Vance shared the TDRp message with learning professionals in the legal profession on December 1st. He spoke on Day 1 of the Professional Development Institute’s two-day annual conference in Washington, D.C. Three hundred attended the conference, which is dedicated to the professional development of attorneys and legal firm staff members. Dave was asked to speak by several who had heard about TDRp and the Center for Talent Reporting and wondered if the principles would translate to the legal field. Dave worked with them over the summer and fall to adapt TDRp to the legal environment, creating a Program report and a Summary report reflecting typical legal firm goals and initiatives. The presentation is titled Strategic Talent Development Collaboration, Management, and Reporting.

Presentation Slides available for download

http://www.centerfortalentreporting.org/download/washington-conference-slides/

What CEOs Know (and You Should Too)

By Dave Vance

Some in our profession have not had the opportunity to work closely with senior leaders like a CEO or CFO. With that in mind I would like to share some observations about senior leaders from my experiences at Caterpillar (first as the Chief Economist and later as the CLO) and at other companies since then. Of course it goes without saying that my sample size is small and thus may not be representative of all CEOs, but I believe most will share the traits discussed below.

First, despite their very lofty position, they are still just people. Now I know that some CEOs believe they have transcended to a whole new level of existence, but in my experience most were down to earth and you could just talk with them. This doesn’t mean you can waste their time, because it is incredibly valuable, so you need to be prepared and be clear about what you want from them. At the same time, though, you don’t need to memorize a script or be scared to death about the meeting. It is okay to ask questions, and with most it is okay not to have all the answers. Because they are people, don’t be surprised if they ask questions about your family or interests in addition to work. They really do want to know more about you, and some personal connection time breaks up a day which is otherwise filled with 10-12 hours of work-related issues. So, relax a little bit and enjoy the time you get with them. And be prepared to be surprised at what you might learn about them and the organization.

Second, they have a very high tolerance for some uncertainty and ambiguity. Their tolerance will almost certainly be much greater than yours because they have to deal with it every single day. Having said that, though, we need to provide some context. They will not have a high tolerance for your being unprepared or acting unprofessionally or making excuses. They will, however, understand that a plan for next year is just that: a plan. They know from experience that plans are usually wrong because something unexpected typically happens. So, risk cannot be eliminated because of fundamental uncertainties about the future but it can be managed and that starts with ensuring you have a good process for planning and that you have talked with the right people. So, expect questions about how you came up with the plan and who you talked to. If you talked to the right people (like the Sales SVP to agree on an outcome measure) and made reasonable assumptions, then they are going to feel good about the plan. Put differently, there is nothing they can do to guarantee a perfect plan so instead they will focus on what they can influence (process and people). So, just be humble in setting and communicating plans. Don’t worry that they are not perfect.

Third, they know that you are going to make some mistakes along the way. They themselves are still making mistakes, and other senior leaders are still making mistakes, so it is only natural that you will, too. (Unless you believe you are far superior to them!) Many in our profession are afraid that the CEO is going to discover they are not perfect, and thus are afraid to share TDRp-type reports which may show that not every measure is on track to meet plan. News flash: your CEO already knows you are not perfect, and they don’t expect you to achieve or exceed plan on every single measure. In fact, any good leader would be very skeptical of someone who says they always achieve or exceed plan, because that is not how the world works. Unless you set incredibly easy targets, you are going to achieve some, exceed some, and fall short on the rest. This is what your CEO is expecting. I met with our Board of Governors (CEO and other senior leaders) each quarter, and I never reported that we were on plan to achieve or exceed all measures. Likewise, as an economist it was impossible to get all the forecasts right. Since your CEO already knows this (even if you do not), they will be looking to see if you are honest and transparent, willing to share the bad news as well as the good news. They will then want to see if you can address the areas where you are behind without being defensive or blaming anyone else. They want to know that you are on top of the issues and are doing everything that makes sense to deliver the planned results. So, don’t worry about letting the CEO see you are not perfect. Instead, show them that you can manage L&D like a business.

If you have not already discovered the above for yourself, I hope this makes you a little more confident and at ease about meeting and interacting with your CEO and other senior leaders. At the end of the day they are just people just like you, planning in the face of uncertainty, making mistakes, and moving forward despite it all.

Running Learning Like a Business: What Does It Mean and Is It Still Relevant?

By Dave Vance

Running learning like a business simply means applying business discipline to the learning field. More precisely, it means creating a plan with specific, measurable goals and then executing that plan with discipline throughout the year. That’s it. Most organizations already do this at a high level. They create a business plan for the year and then create and use reports on a monthly basis to manage their activities to come as close to plan as possible. At a lower level many other departments already do this as well. Think about a typical sales, quality, or manufacturing department. They establish goals at the start of the year and then they compare their progress each month against those goals, taking appropriate actions as necessary to stay on plan or get back on plan if they are behind. The point is they have a plan and they manage to it.

It is my experience that fewer than 10% of learning departments manage this way. Most would say they work very hard and accomplish a great deal. And they do work hard, and they do accomplish a lot. (Most also produce activity reports, showing numbers of courses, participants, and the like, to demonstrate just how busy they are.) But they could have an even greater impact on their organization if they worked smarter. They might not deliver more courses or reach more employees, but they will make more of a difference in their organization’s performance. So, what does it mean to work smarter? It means that you plan your actions more carefully and then execute your plan with discipline. The upfront planning means that you think carefully and critically about what you want to accomplish, why you want to accomplish it, what you will need in terms of resources to accomplish it, and what measures you will use. Invariably this will involve tradeoffs and prioritization, since resources are always limited. So, what is more important for you to work on this coming year? Where can you make the greatest difference or add the most value? Once you have some ideas (and they may be different from what you have been working on!), the next task is to settle on a reasonable and achievable goal, which means you must think carefully about what resources will be required. It simply may not be possible to accomplish the goal with your resources, in which case you need to set a more realistic goal. And this will force you to think about the measures to use in setting the plan and lining up your work effort.

While this does require work, I would argue that a disciplined planning process culminating in specific, measurable goals will help ensure you are focused on the right goals for the right reasons with the right resources. In essence this is a learning process, and if you don’t plan this way you deprive yourself of a very important learning opportunity. I think many leaders would say they do think about goals and resources, but they stop short of setting specific, measurable goals. In this case they have not thought critically about the level of resources required (without a specific, measurable plan, how do you know how many resources will be required?), nor have they created reports with a column for plan to use in managing the learning each month. Theoretically, it would be possible for these leaders to accomplish as much as those with plans, but practically speaking it is far less likely. After all, why do you think your colleagues in other departments go through all this work? They wouldn’t if it didn’t lead to better results. Simply put, running learning like a business will ensure you are focused on the right learning with the right goals, measures, and resources to deliver the greatest value to the organization.

While I believe more learning leaders are running learning like a business, most are not. One objection I hear is that this focus on planning and executing is no longer relevant. Some tell me that their organization’s goals change every two or three months and thus it makes no sense to plan. This is ridiculous. I don’t think a company’s goals change every few months and, if they do, I would question senior leadership’s ability to manage the company. Priorities may shift during the year, but goals like increasing sales, quality, and employee engagement are not going to change every few months. (These leaders may mean that their own projects or work assignments change frequently, which may actually be a result of not having a good business plan for learning.) Others say they have stable goals but their company’s products (like cell phones) change once per quarter, and thus there is no way to plan. Of course there is. Focus first on the goal of increasing sales and let it be your guide in planning programs. True, new models will come out and you will have to design training programs around them, but you don’t need all the specifics at the start of the year. You have an idea of what is required for each product launch, so use that to plan. Your specific, measurable plans for reaction, learning and application are probably the same for all your product-related learning, and you have ideas about the size of the target audience. So, plan based on what you know and refine your plans as you get better information through the year.

Bottom line, running learning like a business is simply applying time-tested basic business skills to learning. Outside of learning, it is nothing new. And, outside of learning it is still very much relevant. So, let’s move that percentage up from 10% and see what a difference it makes to manage learning this way. Give it a try and see what results you get. I think you will be pleasantly surprised.

Four Approaches to Break the “No One Is Asking” Cycle By Peggy Parskey

 


In the last 18 months, I have noticed a concerning shift in the sentiment of learning professionals about ramping up the quality of their measurement and in particular their reporting. This sentiment had been quite prevalent 10-12 years ago, but abated for a while. Now it’s back with a vengeance.
The essence of the sentiment is “Business leaders aren’t asking for this information, so there is no need for us to provide it.”  Most L&D practitioners don’t state it quite like that of course. They say, “Our business leaders only want business impact data.” Sometimes the stated reason is that business leaders consider L&D self-report data to be useless. Regardless of how they say it, the message is clear: “Business leaders aren’t asking for it (or worse, don’t want it), so I’m not going to bother.”
I’m not imagining this shift. Brandon Hall Group recently published a study entitled “Learning Measurement 2016: Little Linkage to Performance.” In the study, they found that the pressure to measure learning comes mostly from within the learning function itself. In addition, of the 367 companies studied, 13% said there was no pressure at all to measure learning.

 

What’s concerning about this situation?

L&D has been saying for years that it wants to be viewed as a strategic partner and get a seat at the proverbial table. Yet, the sentiment that “no one is asking for measurement data” is equivalent to saying, “Hey, I’m just an order taker. No orders, no product or service.” If we are going to break free of an order-taking mentality, then we need to think strategically about measurement and reporting, not just the solutions L&D creates.

 

The sad reality is that often business leaders do not ask for L&D data because they don’t know what data is available or they don’t understand its value.  Many business leaders assume L&D is only capable of reporting activity or reaction (L1) data. Many still pejoratively refer to L&D evaluations as Happy Sheets or Smile Sheets.  If they believe that’s all L&D can do, then of course they won’t ask or exert any pressure to do more.

 

Equally importantly, innovation would come to a standstill if organizations only produced products and services that resulted from a client request. Henry Ford famously said, “If I had asked people what they wanted, they would have said faster horses.”  Steve Jobs talked about the dangers of group-think as a product development strategy when he said, “It’s really hard to design products by focus groups. A lot of times, people don’t know what they want until you show it to them.”  Simply because your internal client cannot envision what you can provide does not mean you shouldn’t provide it and then iterate to get it right.

 

The big problem with a “no one is asking” mentality is that it is a self-fulfilling prophecy and a downward spiral. At some point, someone needs to break the cycle. It might as well be L&D. (See the Infographic)

 

What can you do differently right now?

Fortunately, as an individual learning practitioner you can break this cycle. Here are four approaches (singly or together) you should consider:

  1. Reframe the problem: If truly “no one is asking,” stop and ask yourself, “Why aren’t leaders asking? Is it because this information has no value or because they don’t understand the value? Is it because they don’t know what L&D can provide?” Based on your answers to these questions, develop a game plan to demonstrate the value of the data you can provide (and then provide it to them).
  2. Find a friendly: If you want to break the ‘negative reinforcing loop’, find a business leader who is willing to work with you, a ‘friendly.’  You can spot a friendly fairly easily. She is interested in how L&D can add value to her business. He views you as a trusted advisor, brainstorming how to build new skills and capabilities within his team. She is innovative and willing to try new approaches even if they might not succeed the first time out. If you have one or two such business leaders with whom you work, seek them out and discuss what data you could provide to answer their questions.
  3. Report on data that matters to the business leader: From a measurement and reporting standpoint, L&D still puts too much of its energy into gathering and reporting data that matters to L&D but is not compelling to the business. The number of employees who attended courses or the results of your Level 1 evaluation are simply not important to the business. Look beyond your current measures and educate yourself on best practices in L&D measurement. Integrate questions about learning application and support into your evaluation instruments. This is data business leaders will care about if you show them how it affects them and their organization.
  4. Tell a compelling story: Do you remember the Magic Eye picture-within-a-picture phenomenon? If you held the picture up to your nose, you might see the constellation Orion buried inside the picture. (I never saw anything.) If you believe your data is meaningful and can help the business, don’t use the Magic Eye approach. Don’t expect your business partner to find the meaning in the data. Rather, tell the story behind the charts and graphs through dialogue. Help your business partner connect the dots; help her understand the consequences of not acting on the data and the benefits if she does.

A real life example

The ability of employees to apply training in the workplace depends on several conditions, many of them outside the control of the L&D department. Factors such as the quality of the training, the motivation of the employee, the opportunity to apply the training to real work and the reinforcement from the direct manager all affect the extent to which training is applied.

 

A few years ago, I worked with a company that sold complex financial software. They re-engineered their implementation process to simplify the client experience, reduce implementation time and accelerate revenue recognition. The business leaders identified project management (PM) skills as critical to the success of this new approach.

 

The Process Transformation Team identified an initial group of employees to attend the PM training and pilot the new approach with several clients. When they reviewed the pilot results they were disappointed. Implementation time had not declined appreciably and the client felt that the process was more complex than expected. The Team Leaders investigated and found that the employees’ managers were not reinforcing the training or directing them to support resources when they struggled to apply the PM methodology in a real life setting. They also found that the job aids L&D had created were too cumbersome and not designed to be used in a dynamic client setting.

 

Imagine you are the L&D partner of the Implementation Transformation Business Leader. What data could you have provided to demonstrate that his people were not building and honing skills in this critical discipline?

 

This leader needed a regular stream of L&D data on actual application on the job and barriers to application. He needed data on employees’ perceptions of how the training would affect the business outcomes of simplification, shorter implementation time and reduced time to revenue recognition. He needed to understand the barriers to application and whether the business or L&D was accountable for addressing each issue. Notably, he did not yet need incontrovertible proof that the training was improving business outcomes.

 

Data from L&D was essential for this leader to take action and address issues that affected his ability to successfully transform a key client process.  Unfortunately, the business leader didn’t realize that L&D could help him get ahead of this issue and didn’t think to ask. After L&D and the business leader started talking, sharing data and insights, the Leader not only acted, but worked with L&D to develop regular business-oriented reports.

 


Final thoughts

As an L&D practitioner, you can break the negative reinforcing cycle. Why not regularly provide this type of data and use it to create a dialogue about what other insights you can provide?  Why not take the first step to dispel the belief that L&D has no useful data or insights to offer?  It’s in your hands.

 

Have you had a “No One is Asking” moment? I would love to hear from you. Follow me on Twitter @peggyparskey or connect with me on LinkedIn at https://www.linkedin.com/in/peggy-parskey-11634

 

It’s Time to Solve the “Last Mile Problem” in L&D By Peggy Parskey

 

The telecommunications industry has had to confront a challenge referred to as “The Last Mile Problem.” This phrase describes the disproportional effort and cost to connect the broader infrastructure and communications network to the customer who resides in the ‘last mile.’

Learning Measurement also has a “Last Mile” problem. We have a wealth of data, automated reporting and data analysts who attempt to make sense of the data. Yet, as I observe and work with L&D organizations, I continually witness “Last Mile Problems” where the data doesn’t create new insights, enable wise decisions or drive meaningful action. If we want our investments in measurement to pay off, we need first to recognize that we have a problem and then second, to fix it.

Do you have a “Last Mile” problem?  Here are six indicators:

  1. One-size-fits-all reporting
  2. Mismatch between reporting and decision-making cadence
  3. Lack of context for assessing performance
  4. Poor data visualizations
  5. Little attention to the variability of the data
  6. Insufficient insight that answers the “so what?” question

Let’s explore each indicator with a brief discussion of the problem and the consequences for learning measurement.

1.    One size fits all reporting

While L&D organizations generate a plethora of reports, not enough of them tailor their reports to the needs of their various audiences. Each audience, from senior leaders to program managers to instructors, has a unique perspective and requires different data to inform decisions.

In an age of personally targeted ads on Facebook (creepy as they may be), we each want a customized, “made for me” experience. A single report that attempts to address the needs of all users will prove frustrating to everyone and meet the needs of no one. Ultimately, the one-size-fits-all approach will lead users to request ad hoc reports, create their own shadow process or simply resort to gut feel since the data provided wasn’t very useful.

2.    Mismatch between Reporting and Decision Making Cadence

Every organization has a cadence for making decisions and the cadence will vary based on the stakeholder and the decisions he/she will make. Instructors and course owners need data as soon as the course has completed. Course owners will also need monthly data to compare performance across the courses in their portfolio. The CLO will need monthly data but may also want a report when a particular measure is above or below a specific threshold.

In a world of high velocity decision making, decision makers need the right information at the right time for them. When the reporting cycle is mismatched with the timing of decision-making, the reports become ineffective as decision-making tools. The result? See #1: shadow processes or ad hoc reporting to address decision needs.
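The threshold-triggered report mentioned above can be mechanically simple: check each measure against its agreed limits and surface only the breaches. The sketch below is purely illustrative; the measure names, thresholds, and values are invented, not drawn from any real scorecard:

```python
# Hypothetical exception report: surface only the measures that have
# crossed their agreed thresholds, instead of resending the full scorecard.
# All names and numbers below are invented for illustration.

thresholds = {
    # measure: (minimum acceptable, maximum acceptable); None = no limit
    "participant satisfaction": (4.0, None),
    "application rate (%)":     (50, None),
    "cost per learning hour":   (None, 30),
}

latest = {
    "participant satisfaction": 4.3,
    "application rate (%)":     46,
    "cost per learning hour":   34,
}

exceptions = []
for measure, (lo, hi) in thresholds.items():
    value = latest[measure]
    # Flag the measure only if it falls below its floor or above its ceiling
    if (lo is not None and value < lo) or (hi is not None and value > hi):
        exceptions.append((measure, value))

for measure, value in exceptions:
    print(f"ALERT: {measure} = {value}")
```

Run on a cadence that matches each stakeholder, this kind of filter sends the CLO a short alert only when something needs attention, rather than another monthly wall of numbers.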

3.    Lack of context to assess performance

Many L&D reports present data without any context for what ‘good’ looks like.  The reports display the data, perhaps comparing across business units or showing historical performance. However, too many reports do not include a goal, a performance threshold or a benchmark.

Leaders don’t manage to trends nor do they manage to comparative data. Without specific performance goals or thresholds, L&D leaders lose the ability to motivate performance on the front end and have no basis for corrective action on the back end. These reports may be interesting, but ultimately produce little action.

4.    Poor data visualization

Data visualization is a hot topic and there is no shortage of books, blogs and training courses available. Despite the abundance of information, L&D’s charts and graphs appear not to have received the memo on the importance of following data display best practices.

Look at the reports you receive (or create). Do your visuals include unnecessary flourishes (also known as chart junk)? Do they contain 3-D charts? Do they present pie charts with more than five slices? Do your annotations simply describe the data on the chart or graph? If you answered “yes” to any of these questions, then you are making the last mile rockier and more challenging for your clients than it needs to be.

No one wants to work hard to figure out what your chart means. They don’t want to struggle to discern if that data point on the 3D chart is hovering near 3.2 (which is how it looks) or if the value is really 3.0.  They won’t pull out a magnifying glass to see which slice of the pie represents their Business Unit.

Poor visualizations fail to illuminate the story.  You have made it more difficult for your users to gain new insights or make decisions.  Over time, your readers shut down and opt to get their information elsewhere.

5.    Little attention to the variability of the data

As the L&D community increasingly embraces measurement and evaluation, it has also ratcheted up its reporting. Most L&D organizations publish reports with activity data, effectiveness scores, test results, cost data and often business results.

What we rarely consider, however, is the underlying variability of the data.  Without quantifying the variance, we may over-react to changes that are noise or under-react to changes that look like noise but in fact are signals.  We are doing our users a disservice by not revealing the normal variability of the data that helps to guide their decisions.
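One simple way to quantify that variability is an individuals control chart: compute the mean and standard deviation of historical results and flag only points that fall outside the mean ± 3 sigma limits. The sketch below uses made-up monthly effectiveness scores purely for illustration:

```python
# Sketch of a simple control chart check (individuals chart with
# mean +/- 3 sigma control limits). The history below is a set of
# invented monthly effectiveness scores, used only for illustration.
from statistics import mean, stdev

history = [4.1, 4.0, 4.2, 4.1, 3.9, 4.0, 4.2, 4.1, 4.0, 4.1]

center = mean(history)
sigma = stdev(history)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # upper/lower control limits

def classify(score):
    """Flag only points outside the limits; everything else is noise."""
    if lcl <= score <= ucl:
        return "common-cause noise"
    return "signal - investigate"

print(f"control limits: {lcl:.2f} to {ucl:.2f}")
print(classify(4.15))  # within limits: ordinary month-to-month variation
print(classify(3.3))   # outside limits: a real change worth investigating
```

With limits like these on the chart, a reader can see at a glance whether this month’s dip is ordinary variation or a genuine signal, which is exactly the distinction the report should make for them.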

6.    No answers to the “So-what” question

This last mile problem is perhaps the most frustrating. Assume you have addressed the other five issues. You have role-relevant reports. The reports are synchronized with decision cycles. You have included goals and exception thresholds based on data variability. Your visualizations are top-notch.

Unfortunately, we all too often present data “as is” without any insights, context, or meaningful recommendations. In an attempt to add something that looks intelligent, we may add text that describes the graph or chart (e.g. “This chart shows that our learning performance declined in Q2.”). Perhaps we think our audience knows more than we do about what is driving the result. Often, we are rushed to publish the report and don’t have time to investigate underlying causes. We rationalize that it’s better to get the data out there as is than publish it late or not at all.

Amanda Cox, the New York Times Graphics Editor, once said: “Nothing really important is headlined, ‘Here is some data. Hope you find something interesting.’”

If we are going to shorten the last mile, then we have an obligation to highlight the insights for our users. Otherwise, we have left them standing on a lonely road hoping to hitch a ride home.

Recommendations to solve the “Last Mile” problems

There is a lot you can do to address the “Last Mile Problem” in L&D. Here are six suggestions that can help:

  • Create a reporting strategy. Consider your audiences, the decisions they need to make and how to present the data to speed time to insight. Include in your reporting strategy the frequency of reports and data to match the decision-making cadence of each role.
  • Identify a performance threshold for every measure on your report or dashboard that you intend to actively manage.  In the beginning, use historical data to set a goal or use benchmarks if they are available. Over time, set stretch goals to incent your team to improve its performance.
  • Learn about data visualization best practices and apply them.  Start with Stephen Few’s books. They are fun to read and even if you only pick up a few tips, you will see substantive improvements in your graphs and charts.
  • Periodically review the variability of your data to see if its behavior has changed. Use control charts (they are easy to create in Excel) for highly variable data to discern when your results are above or below your control limits.
  • Add meaningful annotations to your graphs and charts. Do your homework before you present your data. If a result looks odd, follow up and try to uncover the cause of the anomaly.
  • For senior leaders, in particular, discuss the data in a meeting (virtual or face-to-face). Use the meeting to explore and understand the findings and agree on actions and accountability for follow up. Employ these meetings to create a deeper understanding of how you can continually improve the quality of your reporting.

If you start taking these recommendations to heart, your last mile could very well shrink to a very small and manageable distance. Have you experienced a “Last Mile” problem? I’d love to hear from you.

Follow me on Twitter @peggyparskey or subscribe to my blog at  parskeyconsulting.blogspot.com/

CTR Is Excited to Announce the Release of a Significantly Updated TDRp Measures Library

CTR is excited to announce the release of a significantly updated version of the TDRp Measures Library. This update focuses on Learning and Development measures, which have expanded from 111 to 183 learning-related measures. The additions include measures related to informal and social learning as well as a more detailed breakdown of the cost measures. This version of the library also includes measures defined by the CEB Corporate Leadership Council, along with updated references to the HCMI Human Capital Metrics Handbook and to measures defined by ATD, Kirkpatrick and Phillips.

If you are a CTR member, you have access to the updated version at no additional charge. https://www.centerfortalentreporting.org/download/talent-acquisition/ .

If you are not a member, join CTR for $299/year.

The Missing Component of L&D Management By Dave Vance

The L&D field is more exciting than ever, particularly with the advent of smart systems, mobile learning, big data, and greater access to content than ever before. Even predictive analytics is beginning to move beyond workforce planning, retention and recruiting to some applications in the learning arena. What is missing, however, is an improvement in how we manage our programs and departments. Our management practices have not kept pace and often have not improved at all over the last twenty years. Sadly, most masters and Ph.D. programs in organizational or human capital development do not include even a single course in management, and there is often little opportunity to learn good management practices on the job.

Now, I know what some of you are thinking. You would tell me that you do manage. You hire people into the department, set goals for them, and have performance reviews. You create an annual budget. You implement new systems, you roll out new courses and update existing content, and you might even track where staff spend their time. You address issues when they arise and you may have a long-term strategy. All of these activities are important and, yes, are a part of the overall management of the department. But there is a component of management that is missing in most organizations, and it is a critical component – one that has the potential to drive significant value creation. In fact, in most organizations it would drive much greater improvement than adopting a new LMS, implementing a mobile strategy, or doubling the amount of e-learning available.

What is this magical, powerful management component? It is the disciplined process of creating a plan and then executing it. Sounds simple and obvious, right? It is not, which is why very few L&D departments manage this way today. Let’s put this concept in concrete terms. It means the department head and senior department leaders need to decide what they want to do for the coming year and then create SMART goals for each item. Do you want to improve participant satisfaction or the application rate? Great, set a specific, measurable goal like a 5 percentage point increase in the application rate. Do you want to shift your mix of learning from instructor-led to online? You need to set a specific, measurable goal like increasing the percentage of online learning from 25% to 45%. Of course, you won’t accomplish these goals simply because you wrote them down on paper. You need to create action plans for each one, detailing all that will be necessary to achieve the desired improvement, including any additional staff or budget. You will need to assign someone responsibility for it and everyone will need a clear understanding of their roles and responsibilities.

And that is just the start. Once the plan is complete containing all these SMART goals and the year is underway, you will need to actively manage these initiatives to ensure their success. You will need to use reports every month to see if you are on plan or not. If you are not on plan, or if the forecast shows you falling short of plan by year-end, then you need to take management action to get back on plan. So, the department head should have a dedicated (meaning the only agenda item is management of the department) one or two-hour meeting each month with her direct reports to review progress against goals and decide on actions to get back on plan. (At Caterpillar, this meeting was the second Tuesday of every month, and we always used the full two hours.)
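The monthly on-plan check described above reduces to two comparisons: year-to-date actuals against the year-to-date plan, and the year-end forecast against the year-end goal. The sketch below is a hypothetical illustration of that rule; the measures, figures, and 5% tolerance are invented, not Caterpillar’s actual practice:

```python
# Hypothetical monthly plan-vs-actual check for an L&D business plan.
# All measure names, figures, and the tolerance are illustrative only.

def status(plan_ytd, actual_ytd, forecast_eoy, plan_eoy, tolerance=0.05):
    """Classify a measure as on plan, off plan, or at risk by year-end."""
    # Off plan if year-to-date actuals deviate from plan beyond tolerance
    off_plan = abs(actual_ytd - plan_ytd) / plan_ytd > tolerance
    # At risk if the year-end forecast falls short of the year-end goal
    at_risk = forecast_eoy < plan_eoy
    if off_plan:
        return "off plan - take corrective action"
    if at_risk:
        return "on plan YTD but forecast short - act now"
    return "on plan"

measures = {
    # measure: (plan YTD, actual YTD, year-end forecast, year-end plan)
    "application rate (%)": (52, 48, 54, 58),
    "online mix (%)":       (30, 31, 45, 45),
}

for name, figures in measures.items():
    print(f"{name}: {status(*figures)}")
```

A one-page report built this way gives the monthly management meeting its agenda: each measure arrives already classified, so the discussion can go straight to the corrective actions.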

This, then, is the essence of good management. You must spend the time upfront to craft a good plan with specific, measurable goals, and then you must execute that plan with discipline. Very few L&D departments do this today. Nearly all can tell you how many courses were delivered last year to how many participants, and they can tell you what the participant satisfaction was with those courses. But very few created a business plan before the year started with SMART goals, and fewer yet produced the reports necessary to manage throughout the year. Almost none have a monthly meeting dedicated to using these reports to manage the key programs and initiatives of the department.

Conceptually, this all sounds pretty straightforward, and colleagues in other departments would assume L&D already operates this way. After all, many of them do and have done so for a long time. (Can you imagine a sales department with no SMART goals and no monthly meetings to compare progress against those goals?) We need to adopt the same disciplined approach to management. It will result in our achieving significantly more than the alternative approach, which is basically that everyone works hard and we achieve whatever we achieve. Given the current state of management in L&D, I am convinced that the discipline of creating a good plan and then executing it will deliver far greater value in most organizations than any other single change, program or initiative.

 

 

 

Corporate Learning Week Conference, Dave Vance to Speak: CTR Exclusive Discount


Corporate Learning Week
November 7-10, 2016
Renaissance Downtown, Dallas, Texas
www.clnweek.com

Join us on LinkedIn: https://www.linkedin.com/groups/7052790

Theme: Disrupting Ourselves: L&D Innovation at the Speed of Change

Corporate Learning Week 2016 will focus on powerful new tools and capabilities to help you transform your learning organization to align with your organization’s core needs and core values, to strike the balance between best practices and next practices that will engage today’s dynamic workforce.

New for 2016, we have confirmed two exciting site tours:

– Brinker International (Chili’s & Maggiano’s)

– American Airlines Training Center

Corporate Learning Week 2016 is designed for L&D leaders who are:

– Taking charge of the end-to-end talent operations

– Innovating their learning ecosystems to keep pace with the speed of change

– Rethinking learning measurement across the board

– Looking for ways to accelerate leadership development

 

Have some fun while you’re away from the office and build stronger relationships within your team by taking part in our featured team building activities:

– Keynote Meet n Greet

– Team Scavenger Hunt

– Dallas Foodie Crawl

– Dedicated Team Bonding

 

Featured Speakers:

Diane August, Chief Learning Architect, Nationwide
Brad Samargya, Chief Learning Officer, Ericsson
Lisa Cannata, Chief Learning Officer, Orlando Health
Jon Kaplan, CLO & VP of L&D, Discover Financial
Karen Hebert-Maccaro, Chief Learning Officer, AthenaHealth
Katica Roy, Global VP, Learning Analytics, SAP
Tom Spahr, Vice President of L&D, The Home Depot
Marc Ramos, Head of Education for Google Fiber, Google
Matt Smith, Vice President, Talent & Learning, Fairmont Hotels & Resorts
Mary Andrade, Head of Learning Design & Delivery Center of Excellence, McKinsey
Corey Rewis, SVP of Learning & Leadership Development, Bank of America
Nathan Knight, Director of L&D, Sony
Casper Moerck, Head of Learning Technology – Americas, Siemens
Kate Day, VP Workforce Transformation, MetLife
Mong Sai, Director, Learning and Organizational Development, Newegg
Sarah Mineau, AVP Learning & Development, State Farm Insurance
Sheryl Black, Head of Training Operations, American Airlines
Stacy Henry, VP L&D Americas, Bridgestone
Namrata Yadav, Senior Vice President, Head of Diversity & Inclusion Learning, Bank of America
James Woolsey, President, DAU
Sanford Gold, Sr. Director, Global Leadership & Development, ADP
Donna Simonetta, Senior Director, Commercial Services Learning, Hilton Hotels
Melissa Carlson, Director, Leadership Development, Quintiles
Paul Rumsey, Chief Learning Officer, Parkland Health & Hospital System
Priscilla Gill, SPHR, PCC, Leadership and Organization Development, Mayo Clinic
Shveta Miglani, Learning and Development Director, SanDisk
Chenier Mershon, Vice President Operations and Training, La Quinta Inns & Suites
Al Cornish, Chief Learning Officer – Norton University, Norton Healthcare
Angel Green, Director of Talent & Learning, Coca-Cola Beverages Florida
Jeremy Hunter, Creator of The Executive Mind @ Drucker Graduate School of Management
Josh Davis, Director of Research, NeuroLeadership Institute
James Mitchell, CLO & Head of Global Talent Development, Rackspace
Dom Perry, VP of L&D, Brinker (Chili’s & Maggiano’s)
Leah Minthorn, Director of Operations Learning – North America, Iron Mountain
Emmanuel Dalavai, Global Program Manager, Training & Leadership Development, Aviall, A Boeing Company
David Vance, President, Center for Talent Reporting
Sarah Otley, Next Generation Learning Lab Director, Capgemini University, Capgemini

Discounts 

CTR EXCLUSIVE: Save 20% with code 2016CLW_CTR. Register now!

It’s Time to Solve the “The Last Mile Problem” in L&D By Peggy Parskey

The telecommunications industry has had to confront a challenge referred to as “The Last Mile Problem.” This phrase describes the disproportional effort and cost to connect the broader infrastructure and communications network to the customer who resides in the ‘last mile.’

Learning Measurement also has a “Last Mile” problem. We have a wealth of data, automated reporting and data analysts who attempt to make sense of the data. Yet, as I observe and work with L&D organizations, I continually witness “Last Mile Problems” where the data doesn’t create new insights, enable wise decisions or drive meaningful action. If we want our investments in measurement to pay off, we need first to recognize that we have a problem and then second, to fix it.

Do you have a “Last Mile” problem?  Here are six indicators:

  1. One-size-fits-all reporting
  2. Mismatch between reporting and decision-making cadence
  3. Lack of context for assessing performance
  4. Poor data visualizations
  5. Little attention to the variability of the data
  6. Insufficient insight that answers the “so what?” question

Let’s explore each indicator with a brief discussion of the problem and the consequences for learning measurement

1.    One size fits all reporting

While L&D organizations generate a plethora of reports, not enough of them tailor their reports to the needs of their various audiences. Each audience, from senior leaders to program managers to instructors has a unique perspective and requires different data to inform decisions.

In an age of personally targeted ads on Facebook (creepy as they may be), we each want a customized, “made for me” experience. A single report that attempts to address the needs of all users will prove frustrating to everyone and meet the needs of no one. Ultimately, the one-size-fits-all approach will lead users to request ad hoc reports, create their own shadow process or simply resort to gut feel since the data provided wasn’t very useful.

2.    Mismatch between Reporting and Decision Making Cadence

Every organization has a cadence for making decisions and the cadence will vary based on the stakeholder and the decisions he/she will make. Instructors and course owners need data as soon as the course has completed. Course owners will also need monthly data to compare performance across the courses in their portfolio. The CLO will need monthly data but may also want a report when a particular measure is above or below a specific threshold.

In a world of high velocity decision making, decision makers need the right information at the right time for them. When the reporting cycle is mismatched with the timing of decision-making, the reports become ineffective as decision-making tools. The result? See #1: shadow processes or ad hoc reporting to address decision needs.

3.    Lack of context to assess performance

Many L&D reports present data without any context for what ‘good’ looks like.  The reports display the data, perhaps comparing across business units or showing historical performance. However, too many reports do not include a goal, a performance threshold or a benchmark.

Leaders don’t manage to trends, nor do they manage to comparative data. Without specific performance goals or thresholds, L&D leaders lose the ability to motivate performance on the front end and have no basis for corrective action on the back end. These reports may be interesting, but ultimately produce little action.

4.    Poor data visualization

Data visualization is a hot topic and there is no shortage of books, blogs and training courses available. Despite the abundance of information, L&D’s charts and graphs appear not to have received the memo on the importance of following data display best practices.

Look at the reports you receive (or create). Do your visuals include unnecessary flourishes (also known as chart junk)? Do they contain 3-D charts? Do they present pie charts with more than five slices? Do your annotations simply describe the data on the chart or graph? If you answered “yes” to any of these questions, then you are making the last mile rockier and more challenging for your clients than it needs to be.

No one wants to work hard to figure out what your chart means. They don’t want to struggle to discern if that data point on the 3D chart is hovering near 3.2 (which is how it looks) or if the value is really 3.0.  They won’t pull out a magnifying glass to see which slice of the pie represents their Business Unit.

Poor visualizations fail to illuminate the story.  You have made it more difficult for your users to gain new insights or make decisions.  Over time, your readers shut down and opt to get their information elsewhere.

5.    Little attention to the variability of the data

As the L&D community increasingly embraces measurement and evaluation, it has also ratcheted up its reporting. Most L&D organizations publish reports with activity data, effectiveness scores, test results, cost data and often business results.

What we rarely consider, however, is the underlying variability of the data.  Without quantifying the variance, we may over-react to changes that are noise or under-react to changes that look like noise but in fact are signals.  We are doing our users a disservice by not revealing the normal variability of the data that helps to guide their decisions.
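To make this concrete, here is a minimal, hypothetical sketch (the measure and numbers are invented for illustration, not drawn from any L&D report) of quantifying normal variability with classic 3-sigma control limits, so a new result can be classified as noise or signal:

```python
# Hypothetical sketch: flag report values that fall outside normal variability.
# The scores are invented monthly level-1 averages on a 5-point scale.
from statistics import mean, stdev

history = [4.1, 4.3, 4.2, 4.0, 4.2, 4.1, 4.3, 4.2]  # past months (baseline)

center = mean(history)
sigma = stdev(history)
upper, lower = center + 3 * sigma, center - 3 * sigma  # 3-sigma control limits

def classify(value):
    """Label a new month's result as signal (outside limits) or noise."""
    return "signal" if value > upper or value < lower else "noise"

print(classify(4.25))  # within normal variability -> "noise"
print(classify(3.2))   # far below the lower limit -> "signal"
```

A reader armed with those limits knows when a dip is worth investigating and when it is just the data breathing.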

6.    No answers to the “So-what” question

This last mile problem is perhaps the most frustrating. Assume you have addressed the other five issues. You have role-relevant reports. The reports are synchronized with decision cycles. You have included goals and exception thresholds based on data variability. Your visualizations are top notch.

Unfortunately, we all too often present data “as is” without any insights, context, or meaningful recommendations. In an attempt to add something that looks intelligent, we may add text that describes the graph or chart (e.g. “This chart shows that our learning performance declined in Q2.”). Perhaps we think our audience knows more than we do about what is driving the result. Often, we are rushed to publish the report and don’t have time to investigate underlying causes. We rationalize that it’s better to get the data out there as is than publish it late or not at all.

Amanda Cox, the New York Times Graphics Editor, once said: “Nothing really important is headlined, ‘Here is some data. Hope you find something interesting.’”

If we are going to shorten the last mile, then we have an obligation to highlight the insights for our users. Otherwise, we have left them standing on a lonely road hoping to hitch a ride home.

Recommendations to solve the “Last Mile” problems

There is a lot you can do to address the “Last Mile Problem” in L&D. Here are six suggestions that can help:

  • Create a reporting strategy. Consider your audiences, the decisions they need to make and how to present the data to speed time to insight. Include in your reporting strategy the frequency of reports and data to match the decision-making cadence of each role.
  • Identify a performance threshold for every measure on your report or dashboard that you intend to actively manage.  In the beginning, use historical data to set a goal or use benchmarks if they are available. Over time, set stretch goals to incent your team to improve its performance.
  • Learn about data visualization best practices and apply them.  Start with Stephen Few’s books. They are fun to read and even if you only pick up a few tips, you will see substantive improvements in your graphs and charts.
  • Periodically review the variability of your data to see if its behavior has changed. Use control charts (they are easy to create in Excel) for highly variable data to discern when your results are above or below your control limits.
  • Add meaningful annotations to your graphs and charts. Do your homework before you present your data. If a result looks odd, follow up and try to uncover the cause of the anomaly.
  • For senior leaders, in particular, discuss the data in a meeting (virtual or face-to-face). Use the meeting to explore and understand the findings and agree on actions and accountability for follow up. Employ these meetings to create a deeper understanding of how you can continually improve the quality of your reporting.

If you start taking these recommendations to heart, your last mile could very well shrink to a very small and manageable distance. Have you experienced a “Last Mile” problem? I’d love to hear from you.

Follow me on Twitter @peggyparskey or subscribe to my blog at parskeyconsulting.blogspot.com/

Please Welcome Jeff Carpenter to the CTR Board


Jeff Carpenter is a Senior Principal at Caveo Learning. Jeff works with clients to bridge performance gaps by addressing process improvements as well as front-line knowledge and skill development programs for Fortune 1000 companies.

Jeff has worked in entrepreneurial environments as a senior leader, building internal organizational structures and business processes while leading teams at many Fortune 500 clients to solve some of their most pressing performance and process issues.

We are excited to welcome Jeff to the CTR Board. His knowledge and expertise will enhance our board and make CTR even greater. We have worked closely with Jeff in our past conferences and he is a great supporter of CTR and TDRp.

Please Welcome Joshua Craver to the CTR Board


Joshua Craver is a values-based and results-oriented HR executive. In March 2012, he joined Western Union as its global HR Chief of Staff. In January 2013, Joshua took on a new role as Head of Western Union University and VP of Talent Management. Prior to this, he lived and worked in India, Mexico and Argentina for over seven years in various HR leadership roles. Based on these experiences, he is well versed in growth-market strategy and execution.

Joshua also worked at the strategy consulting firm Booz Allen Hamilton. Companies he has consulted with include, but are not limited to, The World Bank, Georgetown University Hospital, GE, CIBA, Scotiabank, Qwest, Farmers Insurance, Electronic Arts, Citibank, Agilent Technologies, Cigna, DuPont, Nissan, Lowe's, Chevron and Cisco. He has also conducted business in over 40 countries.

CTR is happy and honored to have Joshua join our CTR Board. Josh has been one of our greatest supporters. We are excited to have his expertise, energy, and insight.

Are You Spending At Least 20% of Your Time Reinforcing Learning?

By Dave Vance

Training may not be rocket science, but it is a lot more complicated than it first appears. This is especially true when it comes to reinforcing the learning so that it will be applied and have the planned impact. After all, if the learning is not actually applied, there will be no impact and it will be “scrap learning”, regardless of how well it is designed or brilliantly delivered. So, your learning can be well received (high level 1 participant reaction) and the target audience can demonstrate mastery of the content (high level 2), but if it is not applied (level 3) there will be no impact (level 4) and the ROI will be negative (level 5). Unfortunately, all too often we don’t even measure level 3 to find out if the learning was applied. We deliver the training, hope for the best, and move on to the next need. We need to do better.
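The arithmetic behind that chain of levels is simple, and a hypothetical sketch (the dollar figures are invented for illustration) shows why zero application drives ROI to -100%:

```python
# Hypothetical sketch with invented numbers: ROI collapses when learning isn't applied.
def roi(benefit_if_applied, application_rate, cost):
    """Percent ROI = (realized benefit - cost) / cost * 100."""
    realized = benefit_if_applied * application_rate  # level 4 impact scales with level 3
    return (realized - cost) / cost * 100

# A program that would return $150k if fully applied, at a cost of $100k:
print(roi(150_000, 1.0, 100_000))  # fully applied -> 50.0 (% ROI)
print(roi(150_000, 0.0, 100_000))  # scrap learning -> -100.0 (% ROI)
```

However well the course scores at levels 1 and 2, an application rate of zero means the entire cost is lost.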

Our greatest opportunity here is to help the sponsors (who requested the learning for their employees) understand their very important role in managing and reinforcing the learning. Learning professionals can provide advice but cannot do this for them. Sponsors need to understand their role in communicating the need and benefits of the training to their employees before the course is delivered. (This will require clarity upfront on exactly what new behaviors or results the sponsor expects to see.) Ideally, the employee’s supervisor will also communicate expectations immediately preceding the course. The sponsor needs to ensure that the intended target audience takes the course and follow up with those who did not. Once employees have taken the course, the sponsor and direct supervisor need to reinforce the training to make sure the new knowledge, skills or behaviors are being used. They need to again clearly state what they expect as a result of the training and let the employees know what they are looking for. The employees (individually or as a group) should meet with their supervisor to discuss the training and confirm application plans. The sponsor will need to be ready with both positive and negative consequences to elicit the desired results. All of this constitutes a robust, proactive reinforcement plan which should ideally be in place before the training is provided.

As you can tell, this takes time and effort. How much effort? My belief is that it will take at least 20% of the time you dedicate to planning, developing and delivering to do this well. So, if it takes five weeks to plan, develop and deliver a course, are you planning to spend at least 40 hours on reinforcement? Many sponsors and supervisors have no clue about the importance of reinforcement and will not do it unless you can make them understand the importance. Many think that once they have engaged L&D, their job is done and training will take care of everything else. However, this cannot possibly be the case. The target audience employees don’t work for L&D, and L&D professionals cannot make employees in sales or quality or manufacturing do anything. L&D certainly cannot compel these employees to apply their learning – only the leaders who asked for the training can do this. So, you need to convince the sponsor that they have a very important role in the training. L&D can do the needs analysis, design and develop the training (with help from the SMEs), and deliver it, but only the sponsor and leaders in the sponsor’s organization can make their employees apply it. Impact and results depend on both L&D and on the sponsor – neither one can do it alone. On this we must stand firm as learning professionals. In fact, if a sponsor tells you that they don’t have time for these reinforcement tasks, you need to respectfully decline to provide the learning because it is likely to be a complete waste of time and money. And don’t let them shift the burden of reinforcement to you. It is your responsibility to assist them but it is their responsibility to do it.

With this understanding, are you dedicating enough time to reinforcement? At Caterpillar we found that we had to redirect resources from design and delivery to reinforcement. In other words, you will likely have to make a trade-off. We decided it was better to deliver less learning but do it well (meaning reinforced so it was applied) rather than deliver more learning which would be scrap. What is your strategy?


The Promising State of Human Capital (Report)

CTR is happy to provide the Promising State of Human Capital Report sponsored by CTR, i4cp, and ROI Institute.

This valuable document is available for download thanks to CTR and our partnerships with i4cp and ROI Institute. This report ties in with the ROI Institute webinar of the same name, posted on our site for download. https://www.centerfortalentreporting.org/the-promising-state-of-human-capital-anayltics-webinar-by-roi-sponsored-by-ctr/

Partnership with the Corporate Learning Network

The Center for Talent Reporting is pleased to announce a partnership with the Corporate Learning Network (CLN), an online resource for corporate learning leaders and academic professionals. The Corporate Learning Network believes the future of learning will be created through multi-disciplinary approaches and peer-led exchange. With live conferences, community webinars and virtual forums, they bring together stakeholders across the L&D spectrum to help you realize your plans for improved learning outcomes and organizational success.

Learn more about CLN at http://www.corporatelearningnetwork.com/

CLN's goals are similar to CTR's, and we believe this partnership will further change and growth in HR and L&D.

Management: The Missing Link in L&D Today by Dave Vance

Despite great progress in so many areas of L&D, there is one area which has not seen much progress. This is the business-like management of the L&D department and L&D programs. Yes, department heads work to implement new LMSs on time, and program managers and directors work to roll out new programs on time, but there is still an opportunity to dramatically improve our management. Let’s look at programs first and then the department as a whole.

At the program level a really good manager would work with the goal owner or sponsor to reach upfront agreement on measures of success for the program, like the planned impact on the goal. A really good manager would also work with the goal owner and stakeholders to identify plans or targets for all the key efficiency and effectiveness measures that must be achieved to have the desired impact on the goal. Examples of efficiency measures include the number of participants to be put through the program, completion dates for the development or purchase of the content, completion dates for the delivery, and costs. Examples of effectiveness measures include levels 1 (participant reaction), 2 (knowledge check if appropriate), and 3 (application of learned behaviors or knowledge). A really good program manager would also work with the goal owner upfront to identify and reach agreement on roles and responsibilities for both parties, including a robust reinforcement plan to ensure the goal owner’s employees actually apply what they have learned. Today, many program managers do set plans for the number of participants and completion dates. Few, however, set plans for any effectiveness measures, and few work with the goal owner to reach agreement on roles and responsibilities, including a good reinforcement plan. Virtually none use monthly reports which show the plan and year-to-date results for each measure, and thus they are not actively managing their program for success in the same way as their colleagues in sales or manufacturing.

At the department level, a really good department head or Chief Learning Officer would work with the senior leaders in the department to agree on a handful of key efficiency and effectiveness measures to improve for the coming year. Then the team would agree on a plan or target for each as well as an implementation plan for each, including the name of the person responsible (or at least the lead person) and the key action items. Examples of efficiency measures to manage at the department level include number of employees reached by L&D, percentage of courses completed and delivered on time, mix of learning modalities (like reducing the percentage of instructor-led in favor of online, virtual, and performance support), utilization rate (of e-learning suites, instructors or classrooms), and cost. Examples of effectiveness measures to be managed at the department level include level 1 participant and sponsor reaction, level 2 tests, and level 3 application rates. Both the efficiency and effectiveness measures would be managed at an aggregated level, with efficiency measures summed up across the enterprise and effectiveness measures averaged across the enterprise. A really good CLO would use monthly reports to compare year-to-date progress with plan to see where the department is on plan and where it is falling short of plan. A really good department head would use these reports in regularly scheduled monthly meetings to actively manage the department to ensure successful accomplishment of the plans set by the senior department leadership team. Today, very few department heads manage in this disciplined fashion with plans for their key measures, monthly reports which compare progress against plan, and monthly meetings dedicated to just the management of the department where the reports are used to identify where management action must be taken to get back on plan.

In conclusion, there is a great opportunity to improve our management which in turn would enable us to deliver even greater results. This requires getting upfront agreement on the key measures, on the plan for each one, and the action items required to achieve each plan. For the outcome measures it also requires reaching agreement with the goal owner on mutual roles and responsibilities. Once the year is underway, good management also requires using reports in a regular meeting to identify problem areas and take corrective actions. Our colleagues in other departments have been doing this for a long time and with good reason. Let’s get onboard and manage L&D like we mean it.

The Promising State of Human Capital Analytics (Webinar by ROI Co-Sponsored by CTR)

Talk with any human capital executive about the field of human capital analytics and they will generally agree: the best is yet to come. That doesn’t mean that many companies aren’t already performing incredible feats with people data—a few are profiled in this report—the statement is a testament to the opportunity that most can see in this burgeoning field. And it’s a testament to the constant new and innovative ways professionals are using people-related data to impact their organizations. This study surveyed analytics professionals in organizations of all sizes worldwide, and asked very pointed questions on how those organizations are using human capital analytics today, if at all. The results were more affirming than they were surprising:

  • Budgets for human capital analytics are increasing along with executive commitment.
  • Relatively few companies are using predictive analytics now, but expect to in the future.
  • Most are using analytics to support strategic planning and organizational effectiveness.

Successful companies tend to be those that purposefully use data to anticipate and prepare rather than to react to daily problems.

It’s clear in both the data from the survey and the follow-up interviews that were conducted that the future focus of professionals in the human capital analytics field will increasingly be on using analytics to guide strategic decisions and affect organizational performance. To sum up the state of human capital analytics in one word: promising.

Objectives

This webinar introduces the new research study that demonstrates the progress that continues to be made with human capital analytics. Participants will learn:

  • Four key findings on the state of human capital analytics
  • How high-performance organizations are building leading human capital analytics teams
  • What Google, HSBC, LinkedIn and Intel are doing to drive business performance through analytics
  • What you can do to move your analytics practice forward

We want to thank Patti Phillips and Amy Armitage for the opportunity to co-sponsor this webinar. The recording and PPT have been made available for anyone who missed the webinar.

PPT: The Promising State of HCA

Recording https://roievents.webex.com/roievents/lsr.php?RCID=30bd391d319d4709b4203d48f6b7e4c9

Webinar: Innovation in L&D: Building a Modern Learning Culture

Tuesday, July 19 @ 11:00 a.m.–12:00 p.m. CST


Join Caveo Learning CEO Jeff Carpenter and a panel of forward-thinking learning leaders from Microsoft, McDonald’s, and Ford as they explore innovation in L&D.

More and more, stakeholders throughout the business are bypassing the learning function to create learning outside the learning & development organization. To win back the hearts of these stakeholders (and win a bigger share of the organizational budget), learning leaders must deliver solutions that are exciting, cutting-edge, efficient—in a word, innovative.


By pushing beyond their comfort zone to embrace new ideas, concepts, and technologies, L&D organizations ensure their continued relevance and enhanced ability to deliver tangible business results.

In this webinar, you’ll learn how to:

  • Foster a culture of innovation and creativity in your learning organization
  • Reexamine and reconfigure outdated training through a lens of strategic innovation
  • Develop innovative training and eLearning programs within the confines of business processes and templates

Register today!

Webinar Presenters

Rich Burton, Group Project Manager at Microsoft

Jeff Carpenter, CEO of Caveo Learning (CTR Advisory Council Member)

Gale Halsey, Chief Learning Officer of Ford Motor Company

Rob Lauber, Chief Learning Officer of McDonald’s Corporation


Who Should Attend

Chief Learning Officers (CLOs), VPs of Training, Training Directors and Managers, Human Resources VPs and Directors, CEOs, and COOs.

Webinar: ROI from a TDRp Perspective: Plan, Deliver, and Demonstrate Business Value

August 17 @ 11:00 a.m. CT


Join Patti Phillips, CEO of ROI Institute, for an introduction to ROI and to learn how ROI and TDRp work together to help you plan, deliver and demonstrate business value. She will share the steps for a successful program implementation, starting with the upfront planning and continuing through the calculation of actual ROI at the end. You will come away with a better appreciation of both ROI and TDRp as well as how they can both help you deliver better business value.

Register today: http://bit.ly/21imWZ8

Dr. Patti Phillips is president and CEO of the ROI Institute, Inc., the leading source of ROI competency building, implementation support, networking, and research. A renowned expert in measurement and evaluation, she helps organizations implement the ROI Methodology in 50 countries around the world.

Since 1997, following a 13-year career in the electric utility industry, Phillips has embraced the ROI Methodology by committing herself to ongoing research and practice. To this end, she has implemented ROI in private sector and public sector organizations. She has conducted ROI impact studies on programs such as leadership development, sales, new-hire orientation, human performance improvement, K-12 educator development, and educators’ National Board Certification mentoring.

Phillips teaches others to implement the ROI Methodology through the ROI Certification process, as a facilitator for ASTD’s ROI and Measuring and Evaluating Learning Workshops, and as professor of practice for The University of Southern Mississippi Gulf Coast Campus Ph.D. in Human Capital Development program. She also serves as adjunct faculty for the UN System Staff College in Turin, Italy, where she teaches the ROI Methodology through their Evaluation and Impact Assessment Workshop and Measurement for Results-Based Management. She serves on numerous doctoral dissertation committees, assisting students as they develop their own research on measurement, evaluation, and ROI.

Phillips’s academic accomplishments include a Ph.D. in International Development and a master’s degree in Public and Private Management. She is certified in ROI evaluation and has been awarded the designations of Certified Professional in Learning and Performance and Certified Performance Technologist.


New Corporate Membership Benefits

CTR is happy to announce additional benefits for Corporate Members.

  • A Free Q&A with Dave Vance to answer your TDRp, Measurement, or Reporting Questions
  • A Free Review of your List of Measures or Reports
  • A Free Private Intro to TDRp Webinar for Your Team
  • A $100 Discount on the CTR Conference
  • A $200 discount for the Basics Workshop
  • A $500 discount for a Custom Workshop

Please contact Andy Vance at avance@centerfortalentreporting.org for more details.

Our Newest CTR Advisory Council Member Todd Harrison

Todd Harrison, Ed.D.

Director, Talent Solutions

CTR is pleased to announce Todd Harrison as our newest Advisory Council Member. Todd will be taking over for Kendell Kerekes. We are thrilled to have such an experienced and renowned individual added to our ranks.

Background

Dr. Harrison joined CEB Metrics that Matter (MTM) in 2012, after nearly 15 years of corporate experience in various learning and development leadership roles, where he was an active practitioner of the MTM system. At MTM, he is accountable for the MTM Professional Services team of approximately 40 people, and provides strategic leadership to the MTM Client Advisory and Consulting teams within this group. These two teams have primary responsibility for delivering ongoing support services to MTM technology clients, as well as accountability for the development and delivery of learning and human capital consulting measurement solutions, respectively. Specifically, the Professional Services team helps organizations measure the impact and effectiveness of their learning and development programs, with the aim of unlocking the potential of organizations and leaders by advancing the science and practice of talent management. Dr. Harrison’s business responsibilities include oversight of a portfolio of more than 600 clients and an annual global P&L goal of nearly $5M.

Dr. Harrison has extensive knowledge and expertise in several areas of talent management, including:

  • Succession Planning
  • Leadership Development
  • Talent Analytics
  • Learning Strategies
  • Employee Engagement
  • Competency Design
  • New Hire Onboarding
  • Performance Management
  • Organization Development

Education

  • Doctorate in Organizational Leadership (2016) – Indiana Wesleyan University
  • Master of Arts in Human Resource Development (1995) – Webster University
  • Bachelor of Science in Journalism (1986) – Murray State University

Professional Experience

  • Director, Talent Solutions, Metrics that Matter: CEB, Chicago, IL (2015 – present)
  • Director, Consulting Services, Metrics that Matter, Chicago, IL (2014 – 2015)
  • Senior Consultant, Metrics that Matter, Chicago, IL (2012 – 2014)
  • Director, Global Leadership & Organizational Development, Stryker, Kalamazoo, MI (2010 – 2012)
  • Director, Leadership & Associate Development, Anthem Blue Cross/Blue Shield, Indianapolis, IN (2002 – 2010)
  • Vice President, Human Resources, Total eMed, Franklin, TN (1999 – 2001)
  • Director, Leadership & Organizational Development, American Rehability Services, Brentwood, TN (1997 – 1999)
  • Lieutenant Colonel (Retired), United States Army (1984 – 2005)

Professional Affiliations

  • Association for Talent Development (1993 – Present)

CEB Platinum Sponsor

CEB is a best practice insight and technology company. In partnership with leading organizations around the globe, we develop innovative solutions to drive corporate performance. CEB equips leaders at more than 10,000 companies with the intelligence to effectively manage talent, customers, and operations. CEB is a trusted partner to nearly 90% of the Fortune 500 and FTSE 100, and more than 70% of the Dow Jones Asian Titans. More at cebglobal.com.

CEB is a Platinum sponsor of CTR. With their continued support, CTR has continued its mission to bring standards and measures to the HR community. CEB was a big participant in the February 2016 CTR Conference, helping to draw many attendees and making it our largest conference yet. We look forward to continuing our relationship with CEB.

For information on sponsoring CTR, please visit https://www.centerfortalentreporting.org/sponsorship-opportunities/