Where Does Reporting Fall on the L&D Maturity Spectrum?

Maturity models show a journey from just beginning to collect data all the way through the use of advanced and sophisticated practices and processes. The models can be very useful in conveying the journey that must be followed from immature to mature and in helping organizations assess both where they are today and where they may wish to go in the future.

Unfortunately, since most practitioners don’t really understand what is required to run learning like a business, most of these models show reporting as the second or third step on what is often a five- or six-rung maturity ladder. Properly understood, reporting should be the next-to-highest rung. Let’s see why.

First, we need to define terms, which is where the problem begins. For most, reporting means the creation of monthly dashboards or scorecards which show only actual results. And typically the measures are low-level efficiency and effectiveness measures like number of courses, hours and participants, and level 1. You have all seen these. They may show monthly or quarterly data and may contain bar charts or line graphs. Dashboards may show a speedometer. If this is how we define reporting, then I agree it belongs on the second rung, since it is merely capturing the most elementary data from the first rung and using it only to discern how the measure is trending. Authors of these maturity models often go to great lengths to contrast this low-level “reporting” with higher-level predictive analytics, which these days is almost the highest rung. So, the model encourages you to move beyond elementary reporting (which is good!) to higher-level measures (like levels 3-5, which is also very good!) to big data and predictive analytics (which may or may not be so good), all the while never mentioning the use of reports for management of the department or program.

If we instead define reporting in line with Talent Development Reporting Principles (TDRp), the whole model changes. In TDRp, executive reports are never simply a compilation of actual results (history). A report must include the plan (target or goal) for that measure, year-to-date (YTD) results, a comparison of YTD results to plan, and ideally a forecast of how the year is likely to end if no special (unplanned) action is taken. The sole purpose of the report is to answer the two basic questions every manager should ask about every important measure: (1) Are we on plan year to date? and (2) Are we going to end the year on plan? Trend for the year and comparison to previous years are interesting but largely irrelevant after the plan has been set for the year. Trend and history would have been used to set an achievable, reasonable plan, but after the plan is set, all that matters is whether you can achieve it. So, these reports are absolutely indispensable to managing key programs and the department as a whole. A good manager simply cannot manage without them.
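To make the structure concrete, here is a minimal sketch of one row of such a report. The measure name, figures, and field names are hypothetical illustrations of the idea (not taken from TDRp itself), and it assumes a measure where higher is better; the point is simply that each row carries plan, YTD actual, YTD plan, and forecast, so it can directly answer the two questions:

```python
from dataclasses import dataclass

@dataclass
class MeasureReport:
    """One row of a TDRp-style executive report (illustrative sketch only)."""
    name: str
    plan: float          # annual plan (target or goal) for the measure
    ytd_actual: float    # year-to-date result
    ytd_plan: float      # what the plan calls for by this point in the year
    forecast: float      # projected year-end result if no unplanned action is taken

    def on_plan_ytd(self) -> bool:
        """Question 1: are we on plan year to date?"""
        return self.ytd_actual >= self.ytd_plan

    def on_plan_year_end(self) -> bool:
        """Question 2: are we going to end the year on plan?"""
        return self.forecast >= self.plan

# Hypothetical example: a level 3 application-rate measure running behind plan
row = MeasureReport(name="Application rate (level 3)", plan=0.60,
                    ytd_actual=0.52, ytd_plan=0.55, forecast=0.58)
print(row.on_plan_ytd(), row.on_plan_year_end())  # prints: False False
```

A row like this tells the manager at a glance that the measure is behind plan YTD and, absent unplanned action, will miss the annual target; note that trend and prior-year history appear nowhere in it.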

In this light, good reporting should be near the top of the maturity model, since it supports the active, disciplined management of the function which, in my opinion, is the top rung. I believe that this active management supported by good reporting will deliver FAR greater results and impact than big data and predictive analytics. In fact, it is not even close. Predictive analytics is typically used to discover relationships among measures, which is great but usually impacts a small number of measures or projects. In contrast, the active management of all your key programs, by definition, will affect everything important you do for the year. In other words, if a CLO is trying to decide between investing in predictive analytics and disciplined management of the department using good reporting, my strong advice is to get your disciplined management in place first. This will provide far more bang for the buck and has the potential to advance your career in management (versus in analytics).

Please take a second look the next time you see a maturity model to understand how the authors have defined the components. I bet they will have a “dumbed-down” definition of reporting and consequently will have allocated it to a low rung. If you are constructing your own maturity model or journey chart, I challenge you to aspire to great management of the entire department as your highest rung.

Comments

  1. Dave,
    I agree with your observations and in fact for that reason I set out to find a maturity model that better demonstrated the progression of an organization implementing detailed measures of actual output and outcomes.

    I have found that the Capability Maturity Model (CMM), a process improvement model, applies best to what we are experiencing. This model places “detailed measures of the processes and output quality collected” at level 4 out of 5, and “continuous process improvement is enabled by quantitative feedback of the process and from piloting innovative new ideas and technologies” at level 5.

    This may not be fully representative of a talent management model, but I think it aligns better from a measurement perspective.

    Warmly,

    Karina Izaguirre, SPHR
    Global HR Leader People & Organization Development
    EthosEnergy
