Learning & Performance is Longitudinal

When you speak of building optimal performance in the workplace, what do you think of?

Did you think of a holistic programme of work that engaged with grassroots concerns on an ongoing basis, partnering closely with those doing the work and ultimately responsible for measuring the impact?

Or did you think of a project run with content that supported the initial ‘push’, usually from a higher-up decision maker, that ‘sets people up’ and is then forgotten within weeks, if not days?

Whilst many of us talk the good talk with the former, inevitably it is the latter which holds sway in the vast majority of cases. Especially when it comes to the role of learning in the performance improvement landscape.

Does this exchange look familiar?

Me: ‘So, how are we measuring the results of this intervention?’

Project Lead: ‘Oh, we’ll be running an LMS report at the end of the course, or handing out some sort of satisfaction sheet.’

Me: ‘Are you not measuring things at the point of work then, or through the individual’s performance review process?’

Project Lead: [cue blank/panicked look] ‘We don’t have the time to do that’ — or some variant of it — is inevitably the answer.


Where did this ‘one shot’ attitude come from?

From the hangover of running one day ‘courses’ perhaps?

From time pressures?

From external pressure from senior leaders?

From system limitations that only measure a completion score?

From a faulty understanding of how people learn?

From simple, good ol’ fashioned laziness?

Probably a mix of all of the above!

Or possibly from the fact that learning and development, in my experience, has been and in many cases continues to be detached from the performance landscape as a whole. Not only that, but in the vast majority of instances it is seen as an ‘add-on’ to projects run in other areas of the business.

I’ve been on quite a few projects where the learning team has only been brought in after the project has started, then ‘dropped in it’ to sort out the mess that approach created, and finally criticised for ‘not delivering’!

But have we made our own beds to lie in here?

Likewise, I still see a persistent belief in many individuals, even within the learning trade, that we need to stay in ‘materials delivery’ mode.

We need to get over the view that learning = materials delivery.

Learning is not an afterthought! Learning is longitudinal, and yes…performance is longitudinal. And let’s be clear here, the two are inevitable bedfellows!

Seems a bit obvious to say, but it’s startling how few organisations I’ve worked in see a connection between ‘running trainings’ and the overall performance process. Not only that, but I have yet to see a single organisation deal with performance reviews in a reasonable, evidence-based and rational way that is tracked beyond lip service.

This isn’t just a ‘learning problem’; this is a systemic performance improvement/management problem.

The whole structure of performance evaluation, fallacious as it often is, is simply not up to par, and in developing learning systems we are struggling against these systemic failures.

Let’s face it: the systems, processes and people-management mechanisms we have in place in most organisations are woefully inadequate, shallow, and tied to some very shaky organisational objective or generic ‘criteria’ for performance, often thrown together in a couple of hours and then slapped on a system! Is it any wonder that performance improvement fails to gain traction in so many cases?

BUT this is not a time for hand-wringing! It’s an opportunity!

Instead of prescribing how to do it, I recommend asking the following questions, both of yourself and of the others on your project/program, because every program is different:

Questions to ask yourself:

  • Do you see this learning as a one-shot affair or continual? Have you asked yourself why it is seen this way?
  • What are the KPIs used to measure not just learning but also transfer?
  • Are your KPIs agreed upon by all involved?
  • How will information/data on these KPIs be gathered?
  • In the gathering of information are you relying on ‘remote’ methods only (using only the systems you have) or are you mixing your data collection methods?
  • If using a mixed method, who will do the measuring? How will they feed back?
  • How are the end opinions/feelings of the targeted groups measured and not just the skills/knowledge?
  • Are you going beyond feedback on ‘materials’/’materials delivery’ alone?
  • How is data stored and used? What will it tell you once all is said and done?
  • How often will you collect data over a given period?
  • Is your method standalone or integrated with other mechanisms and data sets?
  • What sample size did you engage to understand the baseline need?
  • Is this an ask coming from the top or the bottom? Have you joined up the dots between the two?
  • In measuring performance, how are you building a case for how different teams contribute to the bottom-line value this program is supposed to deliver?
  • Do you really, really believe this program is necessary/valuable? If not, then stop and evaluate why you believe this — you might not be the only one!

In summary, here’s some obvious stuff: people need to be supported over a transition period with learning materials, and yes, in many cases this will mean more than just measuring some abstract learning metric somewhere.

This will mean a mix of in-person support from a real person who knows what the overall objectives of the intervention are and yes, who buys into those objectives, and had a role in creating them!

If they don’t, we shouldn’t be looking to change their attitudes by designing newer, fancier ways of developing materials to be shoved down their throats!

Let’s start looking at the above questions with more than a ‘delivery’ hat on. Let’s start thinking beyond content, beyond curation, beyond all that stuff, which is just noise. Ultimately none of it matters if we haven’t established the performance need at root, and don’t have a mechanism for unified, longitudinal measurement that will anchor those materials to reality!

What do you think?

Agree/disagree? Have a challenge to the above? Get in touch!
