Latest Resources

Corporate Learning Week Conference, Dave Vance to Speak: CTR Exclusive Discount






Corporate Learning Week
November 7-10, 2016
Renaissance Downtown, Dallas, Texas

Join us on LinkedIn:

Theme: Disrupting Ourselves: L&D Innovation at the Speed of Change

Corporate Learning Week 2016 will focus on powerful new tools and capabilities to help you transform your learning organization, align it with your organization’s core needs and core values, and strike the balance between best practices and next practices that will engage today’s dynamic workforce.

New for 2016, we have confirmed two exciting site tours:

– Brinker International (Chili’s & Maggiano’s)

– American Airlines Training Center

Corporate Learning Week 2016 is designed for L&D leaders who are:

– Taking charge of the end-to-end talent operations

– Innovating their learning ecosystems to keep pace with the speed of change

– Rethinking learning measurement across the board

– Looking for ways to accelerate leadership development


Have some fun while you’re away from the office and build stronger relationships within your team by taking part in our featured team-building activities:

– Keynote Meet n Greet

– Team Scavenger Hunt

– Dallas Foodie Crawl

– Dedicated Team Bonding


Featured Speakers:

Diane August, Chief Learning Architect, Nationwide
Brad Samargya, Chief Learning Officer, Ericsson
Lisa Cannata, Chief Learning Officer, Orlando Health
Jon Kaplan, CLO & VP of L&D, Discover Financial
Karen Hebert-Maccaro, Chief Learning Officer, AthenaHealth
Katica Roy, Global VP, Learning Analytics, SAP
Tom Spahr, Vice President of L&D, The Home Depot
Marc Ramos, Head of Education for Google Fiber, Google
Matt Smith, Vice President, Talent & Learning, Fairmont Hotels & Resorts
Mary Andrade, Head of Learning Design & Delivery Center of Excellence, McKinsey
Corey Rewis, SVP of Learning & Leadership Development, Bank of America
Nathan Knight, Director of L&D, Sony
Casper Moerck, Head of Learning Technology – Americas, Siemens
Kate Day, VP Workforce Transformation, MetLife
Mong Sai, Director, Learning and Organizational Development, Newegg
Sarah Mineau, AVP Learning & Development, State Farm Insurance
Sheryl Black, Head of Training Operations, American Airlines
Stacy Henry, VP L&D Americas, Bridgestone
Namrata Yadav, Senior Vice President, Head of Diversity & Inclusion Learning, Bank of America
James Woolsey, President, DAU
Sanford Gold, Sr. Director, Global Leadership & Development, ADP
Donna Simonetta, Senior Director, Commercial Services Learning, Hilton Hotels
Melissa Carlson, Director, Leadership Development, Quintiles
Paul Rumsey, Chief Learning Officer, Parkland Health & Hospital System
Priscilla Gill, SPHR, PCC, Leadership and Organization Development, Mayo Clinic
Shveta Miglani, Learning and Development Director, SanDisk
Chenier Mershon, Vice President Operations and Training, La Quinta Inns & Suites
Al Cornish, Chief Learning Officer – Norton University, Norton Healthcare
Angel Green, Director of Talent & Learning, Coca-Cola Beverages Florida
Jeremy Hunter, Creator of The Executive Mind @ Drucker Graduate School of Management
Josh Davis, Director of Research, NeuroLeadership Institute
James Mitchell, CLO & Head of Global Talent Development, Rackspace
Dom Perry, VP of L&D, Brinker (Chili’s & Maggiano’s)
Leah Minthorn, Director of Operations Learning – North America, Iron Mountain
Emmanuel Dalavai, Global Program Manager, Training & Leadership Development, Aviall, A Boeing Company
David Vance, President, Center for Talent Reporting
Sarah Otley, Next Generation Learning Lab Director, Capgemini University, Capgemini


CTR EXCLUSIVE: Save 20% with discount code 2016CLW_CTR. Register now!

It’s Time to Solve “The Last Mile Problem” in L&D, by Peggy Parskey

The telecommunications industry has had to confront a challenge referred to as “The Last Mile Problem.” This phrase describes the disproportionate effort and cost required to connect the broader infrastructure and communications network to the customer who resides in the ‘last mile.’

Learning Measurement also has a “Last Mile” problem. We have a wealth of data, automated reporting and data analysts who attempt to make sense of the data. Yet, as I observe and work with L&D organizations, I continually witness “Last Mile Problems” where the data doesn’t create new insights, enable wise decisions or drive meaningful action. If we want our investments in measurement to pay off, we need first to recognize that we have a problem and then second, to fix it.

Do you have a “Last Mile” problem?  Here are six indicators:

  1. One-size-fits-all reporting
  2. Mismatch between reporting and decision-making cadence
  3. Lack of context for assessing performance
  4. Poor data visualizations
  5. Little attention to the variability of the data
  6. Insufficient insight that answers the “so what?” question

Let’s explore each indicator with a brief discussion of the problem and its consequences for learning measurement.

1.    One-size-fits-all reporting

While L&D organizations generate a plethora of reports, too few tailor those reports to the needs of their various audiences. Each audience, from senior leaders to program managers to instructors, has a unique perspective and requires different data to inform decisions.

In an age of personally targeted ads on Facebook (creepy as they may be), we each want a customized, “made for me” experience. A single report that attempts to address the needs of all users will prove frustrating to everyone and meet the needs of no one. Ultimately, the one-size-fits-all approach will lead users to request ad hoc reports, create their own shadow process or simply resort to gut feel since the data provided wasn’t very useful.

2.    Mismatch between Reporting and Decision Making Cadence

Every organization has a cadence for making decisions, and the cadence will vary based on the stakeholder and the decisions he/she will make. Instructors and course owners need data as soon as the course has been completed. Course owners will also need monthly data to compare performance across the courses in their portfolio. The CLO will need monthly data but may also want a report when a particular measure is above or below a specific threshold.

In a world of high velocity decision making, decision makers need the right information at the right time for them. When the reporting cycle is mismatched with the timing of decision-making, the reports become ineffective as decision-making tools. The result? See #1: shadow processes or ad hoc reporting to address decision needs.

3.    Lack of context to assess performance

Many L&D reports present data without any context for what ‘good’ looks like.  The reports display the data, perhaps comparing across business units or showing historical performance. However, too many reports do not include a goal, a performance threshold or a benchmark.

Leaders don’t manage to trends, nor do they manage to comparative data. Without specific performance goals or thresholds, L&D leaders lose the ability to motivate performance on the front end and have no basis for corrective action on the back end. These reports may be interesting, but ultimately produce little action.

4.    Poor data visualization

Data visualization is a hot topic and there is no shortage of books, blogs and training courses available. Despite the abundance of information, L&D’s charts and graphs appear not to have received the memo on the importance of following data display best practices.

Look at the reports you receive (or create). Do your visuals include unnecessary flourishes (also known as chart junk)? Do they contain 3-D charts? Do they present pie charts with more than five slices? Do your annotations simply describe the data on the chart or graph? If you answered “yes” to any of these questions, then you are making the last mile rockier and more challenging for your clients than it needs to be.

No one wants to work hard to figure out what your chart means. They don’t want to struggle to discern if that data point on the 3D chart is hovering near 3.2 (which is how it looks) or if the value is really 3.0.  They won’t pull out a magnifying glass to see which slice of the pie represents their Business Unit.

Poor visualizations fail to illuminate the story.  You have made it more difficult for your users to gain new insights or make decisions.  Over time, your readers shut down and opt to get their information elsewhere.

5.    Little attention to the variability of the data

As the L&D community increasingly embraces measurement and evaluation, it has also ratcheted up its reporting. Most L&D organizations publish reports with activity data, effectiveness scores, test results, cost data and often business results.

What we rarely consider, however, is the underlying variability of the data.  Without quantifying the variance, we may over-react to changes that are noise or under-react to changes that look like noise but in fact are signals.  We are doing our users a disservice by not revealing the normal variability of the data that helps to guide their decisions.

6.    No answers to the “So-what” question

This last mile problem is perhaps the most frustrating. Assume you have addressed the other five issues. You have role-relevant reports. The reports are synchronized with decision cycles. You have included goals and exception thresholds based on data variability. Your visualizations are top notch.

Unfortunately, we all too often present data “as is” without any insights, context, or meaningful recommendations. In an attempt to add something that looks intelligent, we may add text that describes the graph or chart (e.g. “This chart shows that our learning performance declined in Q2.”). Perhaps we think our audience knows more than we do about what is driving the result. Often, we are rushed to publish the report and don’t have time to investigate underlying causes. We rationalize that it’s better to get the data out there as is than publish it late or not at all.

Amanda Cox, the New York Times graphics editor, once said: “Nothing really important is headlined, ‘Here is some data. Hope you find something interesting.’”

If we are going to shorten the last mile, then we have an obligation to highlight the insights for our users. Otherwise, we have left them standing on a lonely road hoping to hitch a ride home.

Recommendations to solve the “Last Mile” problems

There is a lot you can do to address the “Last Mile Problem” in L&D. Here are six suggestions that can help:

  • Create a reporting strategy. Consider your audiences, the decisions they need to make and how to present the data to speed time to insight. Include in your reporting strategy the frequency of reports and data to match the decision-making cadence of each role.
  • Identify a performance threshold for every measure on your report or dashboard that you intend to actively manage.  In the beginning, use historical data to set a goal or use benchmarks if they are available. Over time, set stretch goals to incent your team to improve its performance.
  • Learn about data visualization best practices and apply them.  Start with Stephen Few’s books. They are fun to read and even if you only pick up a few tips, you will see substantive improvements in your graphs and charts.
  • Periodically review the variability of your data to see if its behavior has changed. Use control charts (they are easy to create in Excel) for highly variable data to discern when your results are above or below your control limits; see the sketch after this list.
  • Add meaningful annotations to your graphs and charts. Do your homework before you present your data. If a result looks odd, follow up and try to uncover the cause of the anomaly.
  • For senior leaders, in particular, discuss the data in a meeting (virtual or face-to-face). Use the meeting to explore and understand the findings and agree on actions and accountability for follow up. Employ these meetings to create a deeper understanding of how you can continually improve the quality of your reporting.
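
To make the control-chart bullet concrete, here is a minimal sketch in Python rather than Excel (the measure name and numbers are illustrative assumptions, not real data). It derives three-sigma control limits from historical monthly scores and flags whether the newest value looks like signal or noise:

```python
import statistics

def control_limits(history, sigmas=3):
    """Mean and upper/lower control limits from historical values
    (a simple Shewhart-style individuals chart)."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return mean - sigmas * sd, mean, mean + sigmas * sd

# Illustrative monthly level 1 (participant reaction) averages on a 5-point scale
history = [4.1, 4.3, 4.2, 4.0, 4.2, 4.4, 4.1]
latest = 3.4

lcl, mean, ucl = control_limits(history)
if latest < lcl or latest > ucl:
    print(f"Signal: {latest} falls outside [{lcl:.2f}, {ucl:.2f}] -- investigate")
else:
    print(f"Noise: {latest} is within normal variability around {mean:.2f}")
```

The same logic maps directly onto an Excel control chart: a center line at the historical mean, with limit lines three standard deviations above and below it.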

If you start taking these recommendations to heart, your last mile could very well shrink to a very small and manageable distance. Have you experienced a “Last Mile” problem? I’d love to hear from you.

Follow me on Twitter @peggyparskey or subscribe to my blog at

Please Welcome Jeff Carpenter to the CTR Board







Jeff Carpenter is a Senior Principal at Caveo Learning. Jeff works with clients to bridge performance gaps by addressing process improvements as well as front-line knowledge and skill development programs for Fortune 1000 companies.

Jeff has worked as a senior leader in entrepreneurial environments, building internal organizational structures and business processes while leading teams at many Fortune 500 clients to solve some of their most pressing performance and process issues.

We are excited to welcome Jeff to the CTR Board. His knowledge and expertise will enhance our board and make CTR even greater. We have worked closely with Jeff in our past conferences and he is a great supporter of CTR and TDRp.

Please Welcome Joshua Craver to the CTR Board







Joshua Craver is a values-based and results-oriented HR executive. In March 2012 he joined Western Union as their global HR Chief of Staff. In January 2013, Joshua took on a new role as the Head of Western Union University and VP of Talent Management. Prior to this, he lived and worked in India, Mexico, and Argentina for over 7 years in various HR leadership roles. Based on these experiences he is well versed in growth-market strategy and execution.

Joshua also worked at the strategy consulting firm Booz Allen Hamilton. Companies he has consulted with include, but are not limited to, The World Bank, Georgetown University Hospital, GE, CIBA, Scotia Bank, Qwest, Farmers Insurance, Electronic Arts, Citibank, Agilent Technologies, Cigna, DuPont, Nissan, Lowes, Chevron, and Cisco. He has also conducted business in over 40 countries.

CTR is happy and honored to have Joshua join our CTR Board. Josh has been one of our greatest supporters. We are excited to have his expertise, energy, and insight.

Are You Spending At Least 20% of Your Time Reinforcing Learning?

By Dave Vance

Training may not be rocket science, but it is a lot more complicated than it first appears. This is especially true when it comes to reinforcing the learning so that it will be applied and have the planned impact. After all, if the learning is not actually applied, there will be no impact and it will be “scrap learning,” regardless of how well it is designed or how brilliantly it is delivered. So, your learning can be well received (high level 1 participant reaction) and the target audience can demonstrate mastery of the content (high level 2), but if it is not applied (level 3) there will be no impact (level 4) and the ROI will be negative (level 5). Unfortunately, all too often we don’t even measure level 3 to find out if the learning was applied. We deliver the training, hope for the best, and move on to the next need. We need to do better.

Our greatest opportunity here is to help the sponsors (who requested the learning for their employees) understand their very important role in managing and reinforcing the learning. Learning professionals can provide advice but cannot do this for them. Sponsors need to understand their role in communicating the need for and benefits of the training to their employees before the course is delivered. (This will require clarity upfront on exactly what new behaviors or results the sponsor expects to see.) Ideally, the employee’s supervisor will also communicate expectations immediately preceding the course. The sponsor needs to ensure that the intended target audience takes the course and follow up with those who did not. Once employees have taken the course, the sponsor and direct supervisor need to reinforce the training to make sure the new knowledge, skills, or behaviors are being used. They need to again clearly state what they expect as a result of the training and let the employees know what they are looking for. The employees (individually or as a group) should meet with their supervisor to discuss the training and confirm application plans. The sponsor will need to be ready with both positive and negative consequences to elicit the desired results. All of this constitutes a robust, proactive reinforcement plan, which should ideally be in place before the training is provided.

As you can tell, this takes time and effort. How much effort? My belief is that it will take at least 20% of the time you dedicate to planning, developing, and delivering to do this well. So, if it takes five weeks to plan, develop, and deliver a course, are you planning to spend at least 40 hours on reinforcement? (Five weeks at 40 hours per week is 200 hours of effort, and 20% of 200 hours is 40 hours.) Many sponsors and supervisors have no clue about the importance of reinforcement and will not do it unless you can make them understand the importance. Many think that once they have engaged L&D, their job is done and training will take care of everything else. However, this cannot possibly be the case. The target audience employees don’t work for L&D, and L&D professionals cannot make employees in sales, quality, or manufacturing do anything. L&D certainly cannot compel these employees to apply their learning – only the leaders who asked for the training can do this. So, you need to convince the sponsor that they have a very important role in the training. L&D can do the needs analysis, design and develop the training (with help from the SMEs), and deliver it, but only the sponsor and leaders in the sponsor’s organization can make their employees apply it. Impact and results depend on both L&D and on the sponsor – neither one can do it alone. On this we must stand firm as learning professionals. In fact, if a sponsor tells you that they don’t have time for these reinforcement tasks, you need to respectfully decline to provide the learning because it is likely to be a complete waste of time and money. And don’t let them shift the burden of reinforcement to you. It is your responsibility to assist them, but it is their responsibility to do it.

With this understanding, are you dedicating enough time to reinforcement? At Caterpillar we found that we had to redirect resources from design and delivery to reinforcement. In other words, you will likely have to make a trade-off. We decided it was better to deliver less learning but do it well (meaning reinforced so it was applied) rather than deliver more learning which would be scrap. What is your strategy?



The Promising State of Human Capital (Report)

CTR is happy to provide the Promising State of Human Capital Report sponsored by CTR, i4cp, and ROI Institute.

This valuable document is available for download thanks to CTR and our partnerships with i4cp and ROI Institute. This report ties in with the ROI Institute webinar of the same name, posted on our site for download.

Partnership with the Corporate Learning Network

The Center for Talent Reporting is pleased to announce a partnership with the Corporate Learning Network. CLN is an online resource for corporate learning leaders and academic professionals. The Corporate Learning Network believes the future of learning will be created through multi-disciplinary approaches and peer-led exchange. With live conferences, community webinars, and virtual forums, they bring together stakeholders across the L&D spectrum to help you realize your plans for improved learning outcomes and organizational success.

Learn more about CLN at

CLN’s goals are similar to CTR’s, and we believe this partnership will further change and growth in HR and L&D.

Management: The Missing Link in L&D Today by Dave Vance

Despite great progress in so many areas of L&D, there is one area which has not seen much progress: the business-like management of the L&D department and L&D programs. Yes, department heads work to implement new LMSs on time, and program managers and directors work to roll out new programs on time, but there is still an opportunity to dramatically improve our management. Let’s look at programs first and then the department as a whole.

At the program level a really good manager would work with the goal owner or sponsor to reach upfront agreement on measures of success for the program, like the planned impact on the goal. A really good manager would also work with the goal owner and stakeholders to identify plans or targets for all the key efficiency and effectiveness measures that must be achieved to have the desired impact on the goal. Examples of efficiency measures include the number of participants to be put through the program, completion dates for the development or purchase of the content, completion dates for the delivery, and costs. Examples of effectiveness measures include levels 1 (participant reaction), 2 (knowledge check if appropriate), and 3 (application of learned behaviors or knowledge). A really good program manager would also work with the goal owner upfront to identify and reach agreement on roles and responsibilities for both parties, including a robust reinforcement plan to ensure the goal owner’s employees actually apply what they have learned. Today, many program managers do set plans for the number of participants and completion dates. Few, however, set plans for any effectiveness measures, and few work with the goal owner to reach agreement on roles and responsibilities, including a good reinforcement plan. Virtually none use monthly reports which show the plan and year-to-date results for each measure, and thus they are not actively managing their programs for success in the same way as their colleagues in sales or manufacturing.

At the department level, a really good department head or Chief Learning Officer would work with the senior leaders in the department to agree on a handful of key efficiency and effectiveness measures to improve for the coming year. Then the team would agree on a plan or target for each, as well as an implementation plan for each, including the name of the person responsible (or at least the lead person) and the key action items. Examples of efficiency measures to manage at the department level include the number of employees reached by L&D, percentage of courses completed and delivered on time, mix of learning modalities (like reducing the percentage of instructor-led learning in favor of online, virtual, and performance support), utilization rate (of e-learning suites, instructors, or classrooms), and cost. Examples of effectiveness measures to be managed at the department level include level 1 participant and sponsor reaction, level 2 tests, and level 3 application rates. Both the efficiency and effectiveness measures would be managed at an aggregated level, with efficiency measures summed across the enterprise and effectiveness measures averaged across the enterprise. A really good CLO would use monthly reports to compare year-to-date progress with plan to see where the department is on plan and where it is falling short. A really good department head would use these reports in regularly scheduled monthly meetings to actively manage the department to ensure successful accomplishment of the plans set by the senior department leadership team. Today, very few department heads manage in this disciplined fashion, with plans for their key measures, monthly reports which compare progress against plan, and monthly meetings dedicated to just the management of the department, where the reports are used to identify where management action must be taken to get back on plan.

In conclusion, there is a great opportunity to improve our management, which in turn would enable us to deliver even greater results. This requires getting upfront agreement on the key measures, on the plan for each one, and on the action items required to achieve each plan. For the outcome measures it also requires reaching agreement with the goal owner on mutual roles and responsibilities. Once the year is underway, good management also requires using reports in a regular meeting to identify problem areas and take corrective actions. Our colleagues in other departments have been doing this for a long time and with good reason. Let’s get onboard and manage L&D like we mean it.

The Promising State of Human Capital Analytics (Webinar by ROI Institute, Co-Sponsored by CTR)

Talk with any human capital executive about the field of human capital analytics and they will generally agree: the best is yet to come. That doesn’t mean that many companies aren’t already performing incredible feats with people data—a few are profiled in this report—rather, the statement is a testament to the opportunity that most can see in this burgeoning field. And it’s a testament to the constantly new and innovative ways professionals are using people-related data to impact their organizations. This study surveyed analytics professionals in organizations of all sizes worldwide, and asked very pointed questions about how those organizations are using human capital analytics today, if at all. The results were more affirming than they were surprising:

  • Budgets for human capital analytics are increasing along with executive commitment.
  • Relatively few companies are using predictive analytics now, but many expect to in the future.
  • Most are using analytics to support strategic planning and organizational effectiveness.

Successful companies tend to be those that purposefully use data to anticipate and prepare rather than to react to daily problems.

It’s clear in both the survey data and the follow-up interviews that were conducted that the future focus of professionals in the human capital analytics field will increasingly be on using analytics to guide strategic decisions and affect organizational performance. To sum up the state of human capital analytics in one word: promising.


This webinar introduces the new research study that demonstrates the progress that continues to be made with human capital analytics. Participants will learn:

  • Four key findings on the state of human capital analytics
  • How high-performance organizations are building leading human capital analytics practices
  • What Google, HSBC, LinkedIn, and Intel are doing to drive business performance through human capital analytics
  • What you can do to move your analytics practice forward

We want to thank Patti Phillips and Amy Armitage for the opportunity to co-sponsor this webinar. The recording and PPT have been made available to anyone who missed the webinar.


The Promising State of HCA



Webinar: Innovation in L&D: Building a Modern Learning Culture

Tuesday, July 19 @ 11:00 a.m.–12:00 p.m. CT




Join Caveo Learning CEO Jeff Carpenter and a panel of forward-thinking learning leaders from Microsoft, McDonald’s, and Ford as they explore innovation in L&D.

More and more, stakeholders throughout the business are bypassing the learning function to create learning outside the learning & development organization. To win back the hearts of these stakeholders (and win a bigger share of the organizational budget), learning leaders must deliver solutions that are exciting, cutting-edge, efficient—in a word, innovative.










By pushing beyond their comfort zone to embrace new ideas, concepts, and technologies, L&D organizations ensure their continued relevance and enhanced ability to deliver tangible business results.

In this webinar, you’ll learn how to:

  • Foster a culture of innovation and creativity in your learning organization
  • Reexamine and reconfigure outdated training through a lens of strategic innovation
  • Develop innovative training and eLearning programs within the confines of business processes and templates

Register today!

Webinar Presenters

Rich Burton, Group Project Manager at Microsoft

Jeff Carpenter, CEO of Caveo Learning (CTR Advisory Council Member)

Gale Halsey, Chief Learning Officer of Ford Motor Company

Rob Lauber, Chief Learning Officer of McDonald’s Corporation

Who Should Attend

Chief Learning Officers (CLOs), VPs of Training, Training Directors and Managers, Human Resources VPs and Directors, CEOs, and COOs.

Webinar: ROI from a TDRp Perspective: Plan, Deliver, and Demonstrate Business Value

August 17, 11am CT


Join Patti Phillips, CEO of ROI Institute, for an introduction to ROI and to learn how ROI and TDRp work together to help you plan, deliver, and demonstrate business value. She will share the steps for a successful program implementation, starting with the upfront planning and continuing through the calculation of actual ROI at the end. You will come away with a better appreciation of both ROI and TDRp, as well as how they can both help you deliver better business value.
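
For readers new to level 5, the commonly cited Phillips ROI formula is net program benefits divided by fully loaded program costs, expressed as a percentage. A minimal sketch with purely illustrative numbers:

```python
def roi_percent(monetary_benefits, fully_loaded_costs):
    """Level 5 ROI: net program benefits divided by program costs, as a percent."""
    return (monetary_benefits - fully_loaded_costs) / fully_loaded_costs * 100

# Purely illustrative numbers: $120,000 in isolated, money-converted benefits
# against $80,000 in fully loaded program costs
print(f"ROI = {roi_percent(120_000, 80_000):.0f}%")  # ROI = 50%
```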

Register today!

Dr. Patti Phillips is president and CEO of the ROI Institute, Inc., the leading source of ROI competency building, implementation support, networking, and research. A renowned expert in measurement and evaluation, she helps organizations implement the ROI Methodology in 50 countries around the world.

Since 1997, following a 13-year career in the electric utility industry, Phillips has embraced the ROI Methodology by committing herself to ongoing research and practice. To this end, she has implemented ROI in private sector and public sector organizations. She has conducted ROI impact studies on programs such as leadership development, sales, new-hire orientation, human performance improvement, K-12 educator development, and educators’ National Board Certification mentoring.

Phillips teaches others to implement the ROI Methodology through the ROI Certification process, as a facilitator for ASTD’s ROI and Measuring and Evaluating Learning Workshops, and as professor of practice for The University of Southern Mississippi Gulf Coast Campus Ph.D. in Human Capital Development program. She also serves as adjunct faculty for the UN System Staff College in Turin, Italy, where she teaches the ROI Methodology through their Evaluation and Impact Assessment Workshop and Measurement for Results-Based Management. She serves on numerous doctoral dissertation committees, assisting students as they develop their own research on measurement, evaluation, and ROI.

Phillips’s academic accomplishments include a Ph.D. in International Development and a master’s degree in Public and Private Management. She is certified in ROI evaluation and has been awarded the designations of Certified Professional in Learning and Performance and Certified Performance Technologist.



New Corporate Membership Benefits

CTR is happy to announce additional benefits for Corporate Members.

  • A free Q&A with Dave Vance to answer your TDRp, measurement, or reporting questions
  • A free review of your list of measures or reports
  • A free private Intro to TDRp webinar for your team
  • A $100 discount on the CTR Conference
  • A $200 discount on the Basics Workshop
  • A $500 discount on a Custom Workshop

Please contact Andy Vance at for more details

Our Newest CTR Advisory Council Member Todd Harrison

Todd Harrison, Ed.D.

Director, Talent Solutions

CTR is pleased to announce Todd Harrison as our newest Advisory Council Member. Todd will be taking over for Kendell Kerekes. We are thrilled to have such an experienced and renowned individual added to our ranks.


Dr. Harrison joined CEB Metrics that Matter (MTM) in 2012, after nearly 15 years of corporate experience in various learning and development leadership roles, where he was an active practitioner of the MTM system. At MTM, he is accountable for the MTM Professional Services team of approximately 40 people and provides strategic leadership to the MTM Client Advisory and Consulting teams within this group. These two teams have primary responsibility for delivering ongoing support services to MTM technology clients, as well as accountability for the development and delivery of learning and human capital measurement consulting solutions, respectively. Specifically, the Professional Services team helps measure the impact and effectiveness of learning and development programs, with the aim of unlocking the potential of organizations and leaders by advancing the science and practice of talent management. Dr. Harrison’s business responsibilities include oversight of a portfolio of more than 600 clients and an annual global P&L goal of nearly $5M.

Dr. Harrison has extensive knowledge and expertise in several areas of talent management, including:

  • Succession Planning
  • Leadership Development
  • Talent Analytics
  • Learning Strategies
  • Employee Engagement
  • Competency Design
  • New Hire Onboarding
  • Performance Management
  • Organization Development



Education

  • Doctorate in Organizational Leadership (2016) – Indiana Wesleyan University
  • Master of Arts in Human Resource Development (1995) – Webster University
  • Bachelor of Science in Journalism (1986) – Murray State University

Professional Experience

  • Director, Talent Solutions, Metrics that Matter: CEB, Chicago, IL (2015 – present)
  • Director, Consulting Services, Metrics that Matter, Chicago, IL (2014 – 2015)
  • Senior Consultant, Metrics that Matter, Chicago, IL (2012 – 2014)
  • Director, Global Leadership & Organizational Development, Stryker, Kalamazoo, MI (2010 – 2012)
  • Director, Leadership & Associate Development, Anthem Blue Cross/Blue Shield, Indianapolis, IN (2002 – 2010)
  • Vice President, Human Resources, Total eMed, Franklin, TN (1999 – 2001)
  • Director, Leadership & Organizational Development, American Rehability Services, Brentwood, TN (1997 – 1999)
  • Lieutenant Colonel (Retired), United States Army (1984 – 2005)

Professional Affiliations

  • Association for Talent Development (1993 – Present)

CEB Platinum Sponsor

CEB is a best practice insight and technology company. In partnership with leading organizations around the globe, we develop innovative solutions to drive corporate performance. CEB equips leaders at more than 10,000 companies with the intelligence to effectively manage talent, customers, and operations. CEB is a trusted partner to nearly 90% of the Fortune 500 and FTSE 100, and more than 70% of the Dow Jones Asian Titans. More at

CEB is a Platinum sponsor of CTR. With their continued support, CTR has continued its mission to bring standards and measures to the HR community. CEB was a big participant in the February 2016 CTR Conference, helping to draw many participants and making this conference our largest yet. We look forward to continuing our relationship with CEB.

For information on sponsoring CTR, please visit

Join CEB for a selection of webinars in June.


All times are in CT.


Addressing Pay Equity
June 7th, 11am–12pm
Join CEB on June 7 for their complimentary webinar as they examine pay equity concerns for organizations with U.S. workforces. Register now at

Keep Quality through Onboarding
June 9th, 11am–12pm
Best-practice onboarding can maintain and actually improve quality of hire by up to 12%. Hear how the best companies prioritize employee integration in onboarding during CEB’s complimentary webinar June 9. Register now at
Organizing HR to Lead Enterprise Change
June 9th, 9am–1:30pm
Only 21% of organizations are able to initiate change as soon as the need arises. Learn how to better influence change implementation where it happens during CEB’s complimentary webinar June 9. Register now at
The Talent Within
June 23rd, 11am–12pm
74% of organizations want to increase their internal hiring this year. Hear how the best companies improve internal mobility and yield greater quality of hire during CEB’s complimentary webinar June 23. Register now at

Four Imperatives to Increase the Representation of Women in Leadership Positions
June 28th, 12pm–1pm
Increasing representation of women in leadership positions has a real, tangible impact on an organization’s financial and talent outcomes. Join CEB on June 28 as they discuss the common myths surrounding women in leadership and how to engage and retain a more gender-balanced workforce. Register now at






Big Data, Analytics, and L&D By Dave Vance



Big data and analytics continue to generate a lot of excitement in the human capital field. It seems like a new conference is announced almost monthly on the topic. So, what does it all mean for L&D?

Short answer: Not much yet.

Don’t get me wrong. I like data and I like analytics, and the new applications in HR are very exciting, but the buzz is primarily around using statistical tools like regression and correlation to discover relationships among measures (variables) outside of the learning field. Some of the most often cited analytical work is the determination of the factors that explain retention which will allow organizations to take proactive steps to retain desirable employees. There is also work to better explain the factors behind employee engagement, although even this is often in service of increasing retention. I have not seen any conference descriptions or magazine articles where big data and analytics have been applied to learning.

There are certainly applications of analytics to L&D, but most of these have been around for a while and are not receiving much current attention. Let’s begin, though, with some definitions. “Big data” generally refers to the ability to capture more measures at an individual employee level than ever before and, once captured, store them in databases that are easy to manipulate for analysis. Think of data on employee engagement scores, ratings, individual development plans, past and current assignments, projects, supervisors, referrals, past employment, participation in company clubs and activities, use of benefits, number of sick days, number of vacation days taken, etc., by name for each employee. The “analytics” part is the application of statistical techniques to tease out relationships between these measures, which may have been expected or, in some cases, totally unexpected (no one was looking for them). You can see the potential of putting big data together with analytics. For example, you might discover that a leading indicator of regrettable turnover is high use of sick days, lack of involvement in company activities, or lack of opportunities to participate in projects. With this type of knowledge you could identify those at risk and intervene early to keep them from leaving.
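
As a hedged illustration of that turnover example (all feature names and data below are synthetic assumptions, not anyone’s actual method), a simple logistic regression can score turnover risk from big-data-style signals:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Hypothetical per-employee signals pulled from a "big data" HR store
sick_days     = rng.poisson(4, n)        # sick days taken last year
in_activities = rng.integers(0, 2, n)    # involved in company clubs/activities?
projects      = rng.poisson(2, n)        # project assignments last year

# Synthetic ground truth for the demo: risk rises with sick days,
# falls with involvement and project opportunities
logit = 0.4 * sick_days - 1.0 * in_activities - 0.5 * projects - 1.0
left_company = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([sick_days, in_activities, projects])
model = LogisticRegression().fit(X, left_company)

# Score everyone and flag the highest-risk employees for early intervention
risk = model.predict_proba(X)[:, 1]
print(f"Flagged for outreach: {(risk > 0.5).sum()} of {n} employees")
```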

How might this be applied to L&D? You can imagine using big data to show that application and impact are lower for employees with poor supervisors who don’t reinforce the learning. But we already know this. You can imagine using big data to show the importance of identifying the right target audience, but we already know this, too. I am sure there are opportunities to discover some new relationships, but I believe L&D is more mature than other HR disciplines and thus we already have discovered many important relationships.

While big data is not being used much in L&D today, we have been using “little data” quite effectively for some time. We routinely measure number of participants, courses, hours, costs, completion dates, participant reaction, and amount learned. We measure application rates, but not nearly as often as we should. Typically we do not put by-name employee measures in a database which also has information about the supervisor, employee engagement scores, and other potentially interesting measures, so we end up with “little data” instead of “big data”.

And we have been using analytics for some time. We have been using the intended application rate as a leading indicator of actual application for years. We also use the other “smart” questions from the post event survey as leading indicators for application and impact. Patti and Jack Phillips have helped many organizations use regression analysis to isolate the impact of learning from other factors. This effort has really matured lately with the advent of organizations like Vestrics which use general equilibrium modeling (sets of regression equations estimated simultaneously) to isolate the impact of ALL the factors with a high degree of statistical certainty. So, analytics, including high-level analytics, does have a place in L&D.
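
To show the regression idea in its simplest form (a generic sketch, not the Phillips or Vestrics methodology; the variables and numbers are synthetic), ordinary least squares can estimate training’s contribution to sales while controlling for other factors:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Hypothetical salesperson-level data: training is one of several factors
trained      = rng.integers(0, 2, n)      # completed the sales program?
tenure_years = rng.uniform(0, 10, n)
territory    = rng.normal(100, 20, n)     # territory potential index

# Synthetic sales built with a known +8-unit training effect, for illustration
sales = 50 + 8 * trained + 2 * tenure_years + 0.3 * territory + rng.normal(0, 5, n)

# Ordinary least squares with an intercept; the coefficient on `trained`
# estimates training's impact while holding the other factors constant
X = np.column_stack([np.ones(n), trained, tenure_years, territory])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(f"Estimated training impact on sales: {coef[1]:+.1f} units")
```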

In summary, in L&D we have not made much use of big data, but we have been using little data for a long time. We have not made extensive use of analytics, but we have used leading indicators for years, and some are using regression to isolate the impact of learning. In part, the lack of application to L&D currently reflects our relative maturity. We have a better understanding of the basic relationships among measures than our colleagues in talent acquisition or total rewards because we have been measuring longer, and thus there is not as compelling a case to be made for trying to discover new relationships. That said, there are opportunities going forward, especially in terms of isolating the impact of learning and demonstrating to nonbelievers the power of strong sponsorship and reinforcement.

CTR Accepts the Community Partnership Award

Dave Vance was present and honored to accept the Community Partnership Award, awarded to CTR by the University of Southern Mississippi Department of Human Capital Development, one of the few departments in the country to offer a PhD in Human Capital Development. CTR was selected in recognition of our inviting USM PhD students to our conference in Dallas and for our leadership in the profession through TDRp. It was awarded Friday, April 29 at the Third Annual Awards Ceremony held at the Gulf Park Campus. Six other awards were given, and graduating Masters and PhD students were recognized.

Dave Vance, Executive Director of the Center for Talent Reporting

Dave is a frequent speaker at learning conferences and association meetings. He also conducts workshops and simulations on managing the learning function. Dave is the former President of Caterpillar University, which he founded in 2001. Dave received his Bachelor of Science degree in political science from M.I.T. in 1974, a Master of Science degree in business administration from Indiana University (South Bend) in 1983, and a Ph.D. in economics from the University of Notre Dame in 1988. Dave was named 2006 CLO of the Year by Chief Learning Officer magazine. He was also named 2004 Corporate University Leader of the Year by the International Quality and Productivity Center in its annual CUBIC (Corporate University Best in Class) Awards. Dave’s current research focuses on bringing economic and business rigor to the learning field as well as the development of computer-based simulations to help learning professionals increase their effectiveness and efficiency. His first book, The Business of Learning: How to Manage Corporate Training to Improve Your Bottom Line, was published in October 2010.

Setting Standards for our Profession: Three Standard Reports By Dave Vance

Last month we talked about the need for standards and introduced three types of measures which would serve to organize our measures just as accountants use four types of measures in their profession. This month we will introduce three standard reports for L&D which contain the three types of measures. These three reports, used together, provide a holistic and comprehensive view of learning activity just as the three standard accounting reports (income statement, balance sheet, and cash flow) do for financial activity.

The three reports are Operations, Program, and Summary. While a version of these reports (called detailed reports) contains just historical information (like actual results by month or quarter), we will focus on reports for executive reporting which in addition to last year’s results include the plan or target for the current year, year-to-date (YTD) progress against plan, and, ideally, a forecast of how the year will end if no special actions are taken (like extra staffing, budget or effort beyond what is currently planned). The reports are meant to be used by the department head and program managers in monthly meetings to actively manage the function to deliver results as close to plan as possible. Of course, before the year starts, L&D leaders will need to decide what measures are important for the year and what reasonable plans or targets would be for those measures.

We will start with the Operations Report which contains effectiveness and efficiency measures. Recall from our discussion last month that effectiveness measures are about the quality of the program (like levels 1-5) and efficiency measures are about how many, at what cost, etc. (like number of participants, hours, utilization rate, and cost). The Operations Report is meant to capture the 5-10 most important effectiveness and efficiency measures at an aggregated level which may be for the enterprise, region, business unit or product group depending on the L&D department’s scope. So, for example, the Operations Report might show last year’s actual, current year plan, YTD results, and forecast for unique and total participants for all the courses being offered. It might also show the average level 1, 2 and 3 for all courses being offered. The department head would use this report monthly to see if the department is on track to meet plan for the year. If it appears the department is not on track, then the senior L&D leaders need to understand why and agree on what actions to take to get back on plan.
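
As a rough illustration of the shape of such an executive Operations Report (the measure names and figures are invented for the example, not TDRp prescriptions), a few lines of Python can render the last-year / plan / YTD / forecast comparison:

```python
# Measure -> (last year actual, plan, YTD result, year-end forecast);
# every name and number below is invented for illustration
operations_report = {
    "Total participants":        (9500, 11000, 6200, 10400),
    "Unique participants":       (4100,  5000, 2900,  4800),
    "Avg level 1 (reaction)":    ( 4.1,   4.3,  4.2,   4.2),
    "Avg level 3 (application)": (0.52,  0.60, 0.55,  0.56),
}

print(f"{'Measure':<28}{'Last Yr':>9}{'Plan':>9}{'YTD':>9}{'Forecast':>10}")
for measure, (last_year, plan, ytd, forecast) in operations_report.items():
    flag = "  <-- forecast below plan" if forecast < plan else ""
    print(f"{measure:<28}{last_year:>9}{plan:>9}{ytd:>9}{forecast:>10}{flag}")
```

A department head scanning this each month can see at a glance which measures are forecast to miss plan and focus the discussion there.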

The Program Report also contains effectiveness and efficiency measures but its focus is at the program or initiative level rather than the enterprise level. It is designed to show the key programs in support of a single company goal like increasing sales by 10% or reducing operating cost by 5%. Under each identified program would be the most important effectiveness and efficiency measures. The Program Report also would contain an outcome measure showing the planned impact or contribution from learning on the goal. So, the report pulls together the key programs and measures required to achieve the desired impact on the goal. The owner of the goal (like the SVP of sales) and the L&D leaders would agree on the programs; outcome, effectiveness and efficiency measures; and plans or targets for the measures before the programs begin. The goal owner and L&D also need to agree on their mutual roles and responsibilities, including how the owner plans to reinforce the learning. Program Reports would be used by the department head and program managers each month to ensure everything is on plan to deliver the agreed-upon results.

While the Operations and Program Reports are used to manage the function each month, the Summary Report is designed to show the alignment and impact of learning as well as its effectiveness and efficiency. The Summary Report starts by listing the company’s top 5-7 goals in the CEO’s priority order and then shows the key learning programs aligned to each. (In some cases there will be no planned learning.) Where learning does have a role to play in helping achieve the company goal, the report ideally will include an outcome measure or some measure of success. Next, the report will share three to four key effectiveness measures and three to four key efficiency measures. The target audience for this report is the CEO, CFO, SVP of HR, and other senior leaders, as well as employees of the L&D department. The report lets the senior company leaders know four important things: 1) L&D knows what the company’s goals and priorities are, 2) L&D leaders have talked with goal owners and the two parties have agreed on whether learning has a role to play and, if it does, what kind of contribution may be expected from learning, 3) L&D leaders are focused on improving the effectiveness and efficiency of learning, and 4) L&D leaders know how to run learning like a business by setting plans and then managing each month in a disciplined way to deliver those plans.

Use of these three standard reports and the processes associated with creating them will dramatically improve the impact, effectiveness, efficiency, stature, and credibility of L&D. Along with adopting the three standard measures, it will also be a major step towards L&D becoming a true profession with our own standards and reports.

Dave to Talk at the ATD International Conference and Exposition.

Dave will be talking at the 2016 International Conference and Exposition in Denver on Wednesday, May 25 from 10:00 to 11:00 a.m. His topic is Talent Development Reporting Principles: Your Guide to Measurement and Reporting for L&D.

Hope to see you there.

2016 CTR Conference Slides

Please visit for access to the CTR conference slides from 2016 and 2014.


Bringing Standards to the L&D Profession: Three Types of Standard Measures, by Dave Vance


Accounting has four types of measures (revenue, expense, assets, and liabilities) and three standard statements (income or profit & loss, balance sheet, and cash flow). What do we have in L&D which is similar? Five years ago a group of industry leaders came together to consider a standard framework for L&D. The group included leaders like Kent Barnett, then at Knowledge Advisors; Tamar Elkeles, then at Qualcomm; Laurie Bassi from McBassi and Company; Jack Phillips from the ROI Institute; Jac Fitz-enz from Human Capital Source; Josh Bersin from Bersin & Associates; Cedric Coco from Lowe’s; Karen Kocher from Cigna; Rob Brinkerhoff from Western Michigan University; Kevin Oakes from i4cp; Carrie Beckstrom from ADP; Lou Tedrick from Verizon; and a host of others – 30 in all, including myself. After 24 revisions, we all agreed on a framework which we believed would begin to provide the type of standards that accountants have enjoyed for some time and which would help us become more professional.

These leaders recommended a framework consisting of three types of standard measures and three standard reports. So three and three versus the accountant’s four and three. The three types or categories of standard measures are effectiveness, efficiency, and outcomes. The three types of reports are Operations, Program, and Summary. The goal was to keep the framework simple and easy to use. We also wanted to build on the excellent work done in our profession over the last 70 years, especially that done by the Association for Talent Development (ATD, then called ASTD).

Let’s look at the measures first. Effectiveness measures are those that address the quality of the learning program or initiative. In learning we are fortunate to have the four levels popularized by Don Kirkpatrick and the fifth level (ROI) popularized by Jack Phillips. These five levels all speak to the quality of the learning. Level 1 measures the participant’s or sponsor’s satisfaction with or reaction to the learning – certainly an initial measure of quality. Level 2 measures the amount learned or transference of skills or knowledge, and level 3 measures the application of that knowledge or change in behavior. If they didn’t learn anything or if they don’t apply what they did learn, I think we can all agree that we don’t have a quality program. (This may reflect a lack of engagement or reinforcement by the sponsor, but we still have a problem.) Level 4 measures impact or results and, since this was the reason for undertaking the learning to begin with, if there are no results it is hard to argue we had a quality program. Last, ROI provides a return on our investment, a final confirmation of quality assuming we have properly aligned the learning to our organization’s needs and achieved high quality on the previous four levels. Most organizations have level 1 and 2 measures but relatively few measure at the higher levels.

Efficiency measures are about the number of courses, participants, and classes, as well as about utilization rates, completion rates, costs, and reach – to name only the most common. Typically these measures by themselves do not tell us whether our programs are efficient or not. Rather, we need to compare them to something else, which may be last year’s numbers, the historical trend, benchmark data, or the plan (target) we put in place at the beginning of the program. Now we have a basis to decide if we are efficient and if there is room to improve, to become more efficient. All organizations have efficiency measures, with the most common being number of participants, classes, and hours, as well as some cost measures like cost per learner or cost per hour.

That leaves outcome measures. Unlike effectiveness and efficiency measures, most organizations are not measuring outcomes and few even talk about them. That is unfortunate because outcome measures are the most important of the three types, especially to senior leaders who make the funding decisions for L&D. In accounting this would be like reporting expense and liabilities but never talking about revenue or assets. No one would have a complete picture of what we do, and it would be hard for anyone to understand why we have the expenses and why we incur the liabilities. So, what are these all-important outcome measures? Simply put, outcomes represent the impact or results that learning has on your organization’s goals or needs. Suppose a needs analysis indicated that your salesforce would benefit from a consultative selling skills program and product features training. And suppose that you and the head of sales agree that it makes sense, and the two of you further agree on program specifics, including mutual roles and responsibilities, especially how the sponsor will reinforce the learning and hold her own employees accountable. Now, what impact on sales can we expect from this training? How much of a difference can training make? A lot? A little? Enough to be worthwhile? This is the realm of outcome measures which will sometimes be subjective (but not always) but very important nonetheless. Sometimes the level 4 impact or results measure from the list of effectiveness measures will do double duty as an outcome measure. That is okay and the same happens sometimes in accounting. Or other measures will be selected. In any case, with outcome measures we are at last focused on how we align with corporate goals or needs and what type of impact we can have, and this is what your senior leaders have been waiting for.

Next month we will look at the three reports and how the three types of measures populate them. In the meantime, as a profession, let’s start talking about these three types of measures. It would be a big step forward if we could just adopt a common language, which, by the way, is a precondition for being a true profession.


CTR Conference A Great Success


Our 2016 CTR Conference held in Dallas last month was a great success! We had 134 participants, almost double the number at our last conference and far more than we had hoped for.  If you were not there, you really missed a great gathering. Everyone there was interested in measurement, reporting and management so there was a tremendous amount of focused energy, sharing and interaction. Cushing Anderson from IDC and Ed Trolley from NIIT provided the keynotes to start off each day, and they were not shy about sharing their thoughts! We had a great panel on Disruptive Ideas and a very nice tribute to Jac Fitz-enz, the father of human capital analytics. There were 16 breakout sessions hosted by industry thought leaders over the two days and our participants often struggled to decide which session to attend. Unlike many conferences, most of our speakers came for more than just their session and even asked questions of their colleagues during other sessions which livened things up considerably. Finally, we followed the 1.5-day conference with two half-day workshops: CLO for a Day and Strategic Alignment. Attendees really had fun with CLO for a Day which is the only computer-based simulation for our profession.  We are starting to plan next year’s conference so watch for the Save the Date message coming soon.

Conference survey results  

What were the most beneficial aspects of the CTR Conference?

What topics or speakers should we consider for the 2017 CTR Conference?


All Roads Lead to ROI

Have you ever thought about ways to prove the value of your work and show that it was worth the investment?

If you have, you’re not alone.

As part of the business world, we all want to know we’re spending money for a good reason.

This one-hour webinar will present real-life case studies detailing what is being accomplished and how it is being used to make human capital analytics work within organizations.

Five types of projects will be discussed:

  • Converting data to money
  • Showing relationships and causation
  • Applying predictive models
  • Conducting impact and ROI analysis
  • Forecasting ROI

It’s time to make human capital analytics work for you! 

Register Now

Develop Your L&D Team’s Competencies to Deliver on Business Goals Webinar Recording

In this webinar, Caveo’s Strategic Learning Partner Alwyn Klein uses real-world case studies and proven concepts to show how to develop your L&D team to become proactive and high performing, balanced with the appropriate mix of roles and competencies. You’ll learn to…

      • Leverage the L&D Compass for your team’s journey
      • Plan team development strategies and tactics
      • Determine future competencies and roles
      • Maintain a focus on current and future business alignment
      • Deliver consistent and measurable business value

Please visit for access to the recording.

Develop Your L&D Team’s Competencies to Deliver on Business Goals (Webinar)

Tuesday, March 15 @ 11:00 a.m.–12:00 p.m. CT


More than half of training spend is done outside of learning & development organizations, and a big reason is the limitations of L&D team members’ competencies.

Strategic thinking, a performance improvement mindset, ROI and financial acumen, effective instructional design, executable program management, and proficient technology skills are just a few competencies that are required for your L&D organization to gain credibility and deliver on business goals.

By taking an inward focus on the development of their teams, learning leaders can solve these common pain points:

  • Team members that lack key L&D competencies
  • Lack of L&D “readiness” to support the next key business strategic initiative
  • Ineffective execution of learning initiatives to meet business goals
  • L&D team members that have “business” experience, but have underdeveloped performance improvement and instructional design experience, or vice versa

In this webinar, Caveo Strategic Learning Partner Alwyn Klein will show you how to develop your L&D team to become proactive and high performing, balanced with the appropriate mix of roles and competencies.

We’ll share real-world case studies and proven concepts to help develop your L&D team. You’ll learn how to:

  • Leverage the L&D Compass for your team’s journey of development
  • Plan team development strategies and tactics
  • Determine future competencies and roles
  • Leverage expert and industry resources
  • Maintain a focus on current and future business alignment
  • Deliver consistent and measurable business value

Register today and take the first step toward building an L&D team that drives business value!

Who Should Attend

Chief Learning Officers (CLOs), VPs of Training, Training Directors and Managers, Human Resources VPs and Directors, CEOs, and COOs.

Webinar Presenter: Alwyn Klein

Alwyn Klein is a popular conference presenter and facilitator with over 20 years’ experience in the L&D field, having previously headed up a team of over 60 learning professionals. He is a Certified Performance Technologist with the International Society for Performance Improvement and a Certified Professional in Learning and Performance with the Association for Talent Development. In 2014, Klein won South Africa’s prestigious Chief Learning and Development Officer of the Year Award. He is constantly working to implement innovative, people-focused training initiatives that are strongly aligned to organizational goals.