Adding performance support to the trainer’s toolbox

The way that people work in any organisation is influenced by several factors. When I conduct a performance analysis I look at factors such as expectations, capacity, incentives, feedback, tools and skills.

If you put a group of people in a room, ask them to describe performance problems at work, and then have them classify those problems, you will find that only about 15% are due to a lack of skills & knowledge. I’ve seen this on several occasions, and my own experience with workplace performance analysis bears it out as well.

Training is an effective instrument for addressing a lack of skills and knowledge, but not for the other performance factors. That means that, at best, training helps with less than one fifth of an organisation’s human performance issues. Performance support tools, on the other hand, can address a lack of information resources. Just by adding performance support (non-instructional interventions) to a training designer’s toolbox, you are likely doubling your value to your organisation or your clients.

My own performance toolbox is a good place to start learning more, and over the years I’ve also drawn on some basic reference books.

DIF Analysis

Previously, I had mentioned DIF (difficulty, importance, frequency) Analysis as a tool that I used in the military to determine if job tasks required training. I finally got around to creating the expanded model in a digital format, so here it is.

[Image: expanded DIF analysis model]
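To make the basic idea concrete, here is a minimal sketch of a DIF screen in Python. The rating scale, thresholds and verdicts are my own illustrative assumptions, not the expanded model shown in the chart; a real analysis would use the full decision logic.

```python
# Hedged sketch of a DIF (difficulty, importance, frequency) screen for
# job tasks. Scale and verdicts are illustrative assumptions only.

def dif_recommendation(difficulty: int, importance: int, frequency: int) -> str:
    """Rate each dimension 1 (low) to 3 (high); return a rough verdict."""
    if importance == 1:
        return "no training -- task is not important enough to justify it"
    if difficulty == 1:
        return "no training -- provide a job aid or simple instructions"
    if frequency >= 2:
        return "train -- frequent, difficult and important; practise on the job"
    return "train plus job aid -- infrequent tasks fade, so support recall"

# Example: an important, difficult task that is performed only rarely
print(dif_recommendation(difficulty=3, importance=3, frequency=1))
```

The point of the sketch is the shape of the decision, not the thresholds: importance gates everything, difficulty separates job aids from training, and frequency decides whether trained skills need ongoing support.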

In making these tools available online, I’ve been asked whether I’m giving away trade secrets. I don’t think so: these are fairly basic tools that I’ve been using for over a decade and that many others use as well. Besides, the world of work is getting to a point where this kind of performance improvement may not be the best approach. In knowledge-intensive workplaces, procedures and tasks can’t be easily quantified. Tools like DIF analysis only work when similar jobs are done by several people; they won’t help in a creative work environment like a design shop.

My own interest is to develop new tools and methods, beyond human performance technology and instructional design. Methods like online personal knowledge mastery are of current interest.

Job Aids & Performance Support

I’m currently working on a project that requires me to get back to some performance and training analysis. Of course, my initial outlook is that training is often a solution looking for a problem.

I had to review the basics and decided to read Rossett & Schaffer’s Job Aids & Performance Support. This is a good introduction to performance support, and more up to date than Gery’s classic EPSS. The section on when performance support is appropriate is a good reminder for everyone in our field:

  • When performance is infrequent
  • When the situation is complex
  • When the consequence of errors is intolerable
  • When performance depends on a large body of information
  • When performance is dependent on knowledge or information that changes frequently
  • When performance can be improved through self-assessment
  • When there is a high turnover rate
  • When there is little time or money for training

Sound like any workplace you know?
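Rossett & Schaffer’s conditions work well as a quick screening checklist. Here is a small Python sketch of that idea; the simple count and the threshold of three conditions are my own assumptions for illustration, not anything from the book.

```python
# Hedged sketch: screen a job against Rossett & Schaffer's conditions
# for performance support. The scoring (a simple count, threshold of 3)
# is an illustrative assumption, not the authors' method.

CONDITIONS = [
    "performance is infrequent",
    "the situation is complex",
    "the consequence of errors is intolerable",
    "performance depends on a large body of information",
    "the knowledge or information changes frequently",
    "performance can be improved through self-assessment",
    "there is a high turnover rate",
    "there is little time or money for training",
]

def performance_support_fit(answers: dict) -> str:
    """answers maps condition text to True/False for the job in question."""
    hits = [c for c in CONDITIONS if answers.get(c)]
    verdict = "good candidate" if len(hits) >= 3 else "weak candidate"
    return f"{verdict}: {len(hits)}/{len(CONDITIONS)} conditions met"

# Example: a job with high turnover and a large, volatile body of product info
print(performance_support_fit({
    "there is a high turnover rate": True,
    "the knowledge or information changes frequently": True,
    "performance depends on a large body of information": True,
}))
```

Even a crude screen like this forces the question of whether a job aid, rather than a course, is the cheaper and more durable intervention.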

There is an excellent sidebar in the book by Marc Rosenberg, author of Beyond e-Learning:

This is our challenge when we blend interventions to solve performance problems. We must recognize that relying solely on blending instructional solutions is not always the best way to meet the economic worth test for long-term, sustainable and valued performance improvement. Including performance support in the mix lowers overall investment, reduces time to competence, and makes the solution more durable over time …

I’m still amazed that performance support is not seen as a standard intervention for all training and learning organisations. The data are there; it works.

What do you want people to do?

I’ve been looking at some training documentation, and it seems that when we get into complicated (not complex) cases with lots of material to examine, we miss the forest for the trees. Dave sums it all up quite nicely:

Even if the client’s model of training involves only lectures and PowerPoint, “What do you want people to do?” shifts the focus to the reason they’re on the job – the results they’re supposed to accomplish. (If the client focuses only on how they perform, you can ask about the results they produce – whatever’s left over when the workers go home.)

There are lots of tools to help see the forest in my toolbox.

Performance Analysis

In my continuing series of adding more stuff to my Toolbox, here’s another way of looking at the performance analysis process:

[Image: performance analysis process]

Often, analysis work starts with a big blob of information and unrelated facts plus a few pressing issues tossed in for a sense of urgency. The actual work consists mostly of clumping and dividing, in an effort to find patterns. This graphic represents a particular view of that process.

First, you look at the organisational context to see what the big issues are. Then you try to determine the main factors affecting work performance. Usually you find that the real problems and challenges are not quite what you were told when you started. That’s where these kinds of charts come in handy for explaining the process to your clients.

This chart also shows that many performance factors can only be addressed by non-learning interventions. In other words, there’s more to performance improvement than just training. On top of that, even some of the learning interventions don’t necessarily require training & education solutions (aka “the course”).
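The chart’s main point can be sketched in a few lines of Python, using the factor list from the performance analysis discussion earlier (expectations, capacity, incentives, feedback, tools, skills). The mapping below is illustrative only; a real analysis would be far more nuanced than a lookup table.

```python
# Hedged sketch: map performance factors to intervention families.
# The mapping is an illustrative assumption -- note that only one of
# the six factors points to a learning intervention.

FACTOR_INTERVENTIONS = {
    "expectations": "non-learning (clarify goals and standards)",
    "capacity": "non-learning (selection, job redesign)",
    "incentives": "non-learning (rewards, consequences)",
    "feedback": "non-learning (better performance feedback)",
    "tools": "non-learning (job aids, performance support, equipment)",
    "skills & knowledge": "learning (training, practice, coaching)",
}

def suggested_interventions(factors: list) -> dict:
    """Return an intervention family for each factor found in the analysis."""
    return {f: FACTOR_INTERVENTIONS.get(f, "unknown -- analyse further")
            for f in factors}

for factor, action in suggested_interventions(list(FACTOR_INTERVENTIONS)).items():
    print(f"{factor}: {action}")
```

However crude, the table makes the ratio visible: most of what drives performance never touches the classroom.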

Anyway, it’s a simple model that doesn’t tell the entire story (what model does?), but I’ve found it useful from time to time.

My Performance Toolbox

I’ve just created a new page that will list the practical job aids for workplace performance that I’ve developed. With over a thousand posts on this blog, they were getting lost in the crowd.

You can now find all of them listed in my Performance Toolbox. I’ve put a permanent link in the Consulting section. Of course, everything is licensed for sharing, using a Creative Commons Attribution – Non-Commercial license.

Workplace Performance Analysis Job Aid

In much of my work I’m asked to help out with situations that appear to be rather complex. For instance, we had a situation that required skills development for hundreds of staff preparing to implement a new nursing care methodology, all to be done in a very short time. An initial analysis, conducted in one week, showed where there could be major cost savings by focusing only on the important areas of performance.

I’ve used performance analysis methods for several projects and have found it to be a good way to separate the important signals from just the noise that surrounds many large organisations, especially those in transition. To learn more on how to conduct a performance and cause analysis, I’ve attached a job aid that I use for myself and to communicate with clients. This is one of the tools that I use to help see patterns in the chaos of daily work.

Download: jarche_analysis_process.pdf

Process improvement is bad for innovation

I’ve had this feeling for a while and now there is evidence that process improvement, like Six Sigma, stifles innovation. Oligopoly Watch feels that, “The management moves that cheer stockholders and financial analysts, when taken too far, can lead to the long-term decline of the company in question.” Their article today reports that Six Sigma process improvement has resulted in less innovation at 3M, a company renowned for its innovative products, like the Post-It Note:

But, according to the article, 3M is hurting this year. Its operations are far more efficient, but this is a company that has thrived on having a variety of new and sometimes breakthrough products coming to market. No longer. Financial results are down, and the general sense is that 3M is doing everything more efficiently except innovating. Six Sigma is great for speeding up assembly lines or minimizing errors, but fails at producing new ideas.

About ten years ago I became immersed in Human Performance Technology (HPT), another process improvement method, though not as lucrative as Six Sigma or Lean Manufacturing. The tools and perspectives were beneficial, but that is all they are – tools. Process improvement is a tool set, not an overarching or unifying concept for an organisation. It is a means, not an end in itself, and this seems to be the trap that 3M fell into.

I left the HPT fold about a year ago, when I realized that being a Certified Performance Technologist was not an achievable end but a costly merry-go-round that just kept spinning. I have learned a lot from HPT, but you cannot look at things one way, to the exclusion of all others. The fundamental problem with all of these process improvement methodologies is that they make you myopic. It seems that 3M is learning this lesson as well.

The all too real effects of artificial structures

Stephen Downes says that teams are a fiction that purport to represent everyone when in fact they reflect only a select subset of opinions [such as the team leader?].

Liong Huai Yu highlights this quote in his review of Dave Weinberger’s Everything is Miscellaneous: “The world is too diverse for any single classification system to work for everyone in every culture at every time.”

Classification systems, like teams, are artificial structures. Liong goes on to compare Weinberger’s premises with education:

To bring the discussion further from what is discussed in the book, what artificial structures and organisational methods have we put in our schools? Artificial subject segregation, timetabling and even teacher specialisation. As we move forward facing new challenges, fighting regional and global competition, we may have to re-examine the structures we have in place, as most of the time these structures were created for the world of last century. Also, are they benefiting the users (both students and teachers) the way they were meant to?

Any change initiative or attempt at systems improvement has little chance of success if you don’t take the time and effort to really examine the underlying structures. All of our management models and organisational structures are artificial structures and we have the collective intelligence to change them. Usually what is standing in the way are the vested interests of those with power and the all too powerful ingrained culture that we take for granted.

Remembering that it’s all artificial may be a good first step in seeing with new eyes.

Training, for all that ails you

“Canadian companies aren’t spending enough on training,” said the announcer on the radio this morning. My first thought was that we would never hear the news that we weren’t spending enough on bandages in our healthcare system. Once again, the mass media and the so-called experts get it wrong. It makes you wonder if there’s a training industry lobby out there.

According to the Conference Board of Canada:

“Canadian organizations are under increasing pressure, due to a tight labour market and competitive demands, to renew and upgrade workers’ skills. Building workers’ skills through training, learning and development is one way for organizations to compete. Yet, TLD spending in Canada is stagnant,” said Michael Bloom, Vice-President, Organizational Effectiveness and Learning.

Read in its entirety, this makes sense, as TLD is only one way to improve performance. There are many other ways and usually training is the most expensive method. I’ve noticed that many large organisations have a tendency to slap on the training bandaid once any problem has been labelled a human performance issue. It seems that the media and research institutes reinforce this behaviour. However, training that is not directly related to developing specific skills and knowledge wastes time, bores workers and costs money.

This is not the first, nor the second, but the third time that I have heard our national broadcaster report the unfounded notion that training can solve unrelated performance problems. This is the same as prescribing medication without a diagnosis. Of course I don’t really blame the CBC, because it is getting this misinformation from our training and learning “experts”. The snake oil salesmen have jumped on the Conference Board report and are demanding that companies spend more on training. That would be a costly mistake.

I also noticed from the Conference Board’s report that informal learning is actually being mentioned:

Informal learning, which is not well tracked or monitored, may be occurring more frequently. Respondents said 42 per cent of all learning occurs informally.

I get the sinking feeling that informal learning will soon be commoditized by the TLD industry and sold like training currently is – as a solution looking for a problem.

To read the complete report you would have to spend $975 to find out what many of us already know. Training is a means (one of several, not limited to learning & development), while performance is the real goal.