I just attended the LT2020 conference in London, and it was great to see the trade making real headway into a brave new world of learning, one that makes genuine use of technology, tools and the information that underpins them.
As I walked around I made a general observation: in more-or-less every slogan across the stands you’ll see words like ‘simple’, ‘easy’ and ‘quick’.
Certainly some things can be very simple, and it’s great that tasks that used to be complex and difficult to achieve (creating video or publishing eLearning, for example) are now much easier, quicker and simpler. Win!
But for anything beyond the production level, surely it can’t all be that easy!?
Of course we all love simplicity, business especially. Complexity takes time to unpick and understand, and is even more difficult to communicate effectively.
As the famous quote attributed to Blaise Pascal, in a letter to a colleague, goes:
“I apologise for the length of this letter; had I more time, I would have written a shorter one.”
It strikes me that inherent complexity and difficulty are a necessary part of understanding! This makes sense at an individual level, certainly, as we know that true learning only comes with effort, but it is even more true of organisational learning! My feeling is that we have to be careful we’re not taking a ‘tools first’ approach, which may bypass the level of effort needed to really get to grips with the underlying need.
Let’s take data and reporting as a prime example to explain what I mean, as it’s arguably one of the most complex bits of the learning picture.
As we likely already know, we need to be very aware of the multitude of factors that underpin effective, relevant, meaningful learning, in order to get to grips with the groups of learners and workers we need to reach, and to help them not only understand but apply what we need them to learn. To do this we need to look at things not just in terms of content consumption (which is what most systems measure), but in terms of how that content is retained, applied, forgotten and valued, not just at an individual level but also in groups and at increasingly wider scales and demographics; that is, if we really want or need to diagnose our learning culture…
Putting it simply and crudely: perhaps we should ask ourselves whether a line graph showing ‘videos Bob watched this week’ really meets the above criteria for understanding a complex learning landscape.
I did have some very good and engaging discussions with some of the vendors on data and dashboards, and there are some really cool developments afoot; I believe that in some instances the suppliers understand the above complexity quite well. However, my thesis here is that no matter how slick your dashboards, or how well your supplier understands the landscape, there’s no getting away from it: the data, principles and algorithms that underpin a truly empirical approach should be understood by L&Ders.
So here goes….
Some key questions for you to ask yourself, and your vendor:
What are the variables that you really want to measure?
What does that pie chart/bar chart really represent? Are there multiple variables that provide an input to it or just one?
When/how is the data collected?
What algorithms underpin a field/entry/chart?
How is bias accounted for (if at all)?
Is there a weighting etc.?
Was the data triangulated in any way?
Do you have a control group/benchmark of any sort?
How are different populations accounted for? Are there demographic groupings for example?
IS THE DATA USEFUL?
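To make the weighting question above concrete, here’s a minimal sketch in Python. Everything in it is invented for illustration (the variable names, the weights, the numbers are not from any real product): it simply shows how one headline dashboard score can be a weighted blend of several hidden inputs, and how the choice of weights decides who looks like the ‘better’ learner.

```python
# Hypothetical example: a single "engagement score" that is secretly a
# weighted average of several input variables, each on a 0-1 scale.
# All names, weights and values are made up for illustration.

def engagement_score(metrics, weights):
    """Weighted average of the input variables named in `weights`."""
    total_weight = sum(weights.values())
    return sum(metrics[name] * w for name, w in weights.items()) / total_weight

# Two learners: Bob watches lots of videos; Ann watches few, but does
# well on a later quiz (a crude proxy for retention) and discusses the
# topic with peers (a crude proxy for application).
bob = {"videos_watched": 0.9, "quiz_retention": 0.2, "peer_discussion": 0.1}
ann = {"videos_watched": 0.2, "quiz_retention": 0.9, "peer_discussion": 0.8}

# Weighting consumption heavily makes Bob look like the stronger learner...
consumption_weights = {"videos_watched": 0.8, "quiz_retention": 0.1,
                       "peer_discussion": 0.1}
# ...while weighting retention and application tells the opposite story.
learning_weights = {"videos_watched": 0.1, "quiz_retention": 0.5,
                    "peer_discussion": 0.4}

for name, person in [("Bob", bob), ("Ann", ann)]:
    print(name,
          round(engagement_score(person, consumption_weights), 2),
          round(engagement_score(person, learning_weights), 2))
```

Same people, same data, opposite conclusions; which is exactly why it matters to ask what sits underneath the chart.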
Now I’m not saying you need to start running linear regressions and calculating standard deviations by hand, but interpreting data is often more of an art than the ‘simple, easy, 1-2-3 step process’ that many providers would have you believe.
We may not need to be hardcore scientists or data analysts, but we do need to understand the trade and talk the language. Caveat emptor!
It’s a space we’re going to have to get used to if we want to make valid, important and valuable decisions, not only about the service providers we use, but also about what we need to consider when trying to meet our strategic objectives.
This may not necessitate a fancy system; in some instances your good old basic survey or a face-to-face conversation may yield far more insight than a generic tool dashboard (though I’ve seen some really good examples of providers using longitudinal data as part of their offering).
Whilst I’ve been talking about data as an example of underlying complexity, I think that in the wider scheme of things we need to accept that learning, at both an individual and an organisational level, is complex. There’s no getting around it, boys and girls.
There’s no easy fix.
No magic bullet.
No single simple system to fix all that ails you.
Systems are just tools, and how we use and understand our tools is part of the job. A (forgive the parlance) ‘dumb user’ will find themselves replaced very quickly in the future of work; there is a need to go beyond the dashboard and into the deeper ‘why’. Why are we collecting this information? What will it tell us? That’s where the value lies!
It’s an exciting challenge! We have some learning and hard work to do to understand what needs to be known at these levels. Complexity is part of the job; in many ways it’s what I love about it. But we can’t reduce complexity easily without some relatively advanced techniques and deep thinking about the landscapes in which we work.
It’s fine to love simplicity; it has its place. Communicating to learners or stakeholders needs to be clean and concise, and visual aids are AWESOME for getting a complex topic across. BUT the caveat is that we need to keep an eye on the big picture to make sure we’re not being reductionist. Otherwise we run the risk of operating on some faulty assumptions!
I’ve boiled all of that down into three short sentences, which I think capture the essence of it.
I’ll leave you with that thought and will perhaps explore more on the above in later posts.
Get in touch if you’ve got thoughts/opinions or abuse on the above!