I was scrolling through my Facebook feed last week and shrieked in horror when I noticed that a friend of mine was spreading this nonsense:
I understand that sharing things like this comes from a good place, but just because things have numbers in them doesn’t mean those numbers mean anything. In this particular case, these numbers are simply made up (don’t get me started on the topic of “overhead” and nonprofit funding). In many other cases, especially in the world of learning and development, the numbers may not be made up, but they need to be combined with a few more data points to be useful.
Let’s take a big one, ripped straight from ATD’s 2016 State of the Industry Report: Employees averaged 33.5 hours of training. Is this a good number? It’s up from 32.4 hours the previous year, and it does represent an increase in organizations’ investments in L&D.
Of course, the answer is that this could be a good thing, provided employees are doing something with the training: growing more efficient in what they do, reducing the number of workplace accidents, increasing the diversity of their workforce, reducing behaviors that lead to hostile work environments, increasing sales.
If they’re not doing any of these things, it’s actually a problem: 33.5 hours times the average hourly wage at your organization times the total number of employees adds up to a big waste.
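To see how fast that multiplies out, here’s a rough back-of-the-envelope sketch in Python. The wage and headcount figures are hypothetical placeholders, not numbers from the ATD report or from any real organization; plug in your own.

```python
# Back-of-the-envelope cost of employee time spent in training.
# The wage and headcount below are hypothetical placeholders;
# substitute your own organization's figures.

avg_training_hours = 33.5  # ATD 2016 State of the Industry average
avg_hourly_wage = 30.00    # hypothetical fully loaded hourly wage
total_employees = 500      # hypothetical headcount

training_time_cost = avg_training_hours * avg_hourly_wage * total_employees
print(f"Annual cost of employee time in training: ${training_time_cost:,.2f}")
# -> Annual cost of employee time in training: $502,500.00
```

And that’s only the time side of the ledger; it doesn’t yet include the cost of developing or delivering the training itself.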
I was recently talking about this with my colleague Todd Hudson from the Maverick Institute. Todd applies lean principles to learning and development problems. He shared this spreadsheet with me (download the Excel file here).
Anyone who has spent time in the learning and development space knows that you’ll never get a direct ROI on every training program. New Employee Orientation, for example, may not lead directly to an increase in sales, but if it’s effective it may improve other metrics, including employee engagement, which can in turn impact retention, which in turn supports a pipeline of strong, tenured sales staff. The point here is that Level 0 metrics (“butts-in-seats” measures such as the number of attendees or the number of hours spent in training per employee per year) need some other data points alongside them.
I like Todd’s “Training Burden Estimate” worksheet because it’s a useful template for gauging the annual cost of training at your organization, whether for individual training programs, organization-wide learning initiatives, or by delivery format (elearning, instructor-led). The only two columns I might add to this worksheet, sketched below, are:
- What is “success” for this training program?
- What has been done as a result of this investment?
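To make that concrete, here’s a minimal sketch of what one row of such a worksheet might capture, with those two extra columns included. The field names and the example figures are my own invention for illustration; they are not Todd’s actual column headings or data.

```python
from dataclasses import dataclass

@dataclass
class TrainingProgramBurden:
    """One row of a hypothetical training-burden worksheet.

    Field names are illustrative, not Todd's actual column headings.
    """
    program_name: str
    training_format: str     # e.g. "elearning" or "instructor-led"
    hours_per_employee: float
    employees_trained: int
    avg_hourly_wage: float
    # The two columns proposed above:
    success_definition: str  # what "success" looks like for this program
    observed_result: str     # what has been done as a result of the investment

    @property
    def burden(self) -> float:
        """Estimated annual cost of employee time spent in this program."""
        return self.hours_per_employee * self.employees_trained * self.avg_hourly_wage


# A hypothetical example row:
orientation = TrainingProgramBurden(
    program_name="New Employee Orientation",
    training_format="instructor-led",
    hours_per_employee=8.0,
    employees_trained=120,
    avg_hourly_wage=30.00,
    success_definition="New hires reach full productivity within 90 days",
    observed_result="Ramp-up time down from 120 to 95 days year over year",
)
print(f"{orientation.program_name}: ${orientation.burden:,.2f}")
# -> New Employee Orientation: $28,800.00
```

The point of pairing the burden number with the last two fields is that the cost figure alone can’t tell you whether the spend was worthwhile; only the definition of success and the observed result give it context.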
Without multiple data points, and certainly without a definition of “success” to provide context and justification for the investment in training, all we have are meaningless numbers.
Would you use this “Training Burden” worksheet? What would you add or change to make it useful for your own training program analysis?