We need simulation!

The background to this story, explaining the difficulties I had in trying to establish a methodology for selecting simulation in support of training programs, is here: L&D Outside the Box. That story started in 1994 and ended in 2013. I do not know what has transpired since then, but I hope that the training field has developed an informed process for selecting and using simulation to support learning. Somehow, I have doubts, and I would love to be proven wrong.

In that article I concluded that L&D professionals have to master their own field as well as the business they support. In addition, they have to understand that few outside L&D think that what they do is important. It is a big challenge, and learning is becoming critical to all businesses. It is up to L&D to be part of this by developing science-based and practice-based methods.

When I looked at how simulation was being developed in 2007, I learned that there was no proven framework for selecting tasks and fidelity levels when determining appropriate simulation, in this case for aircraft maintenance. After months of study, the question remained: how do we specify the optimal level of simulation fidelity? At that time, there appeared to be no specific method other than informed estimates based on the experience of those who work and instruct in the field.

In Evolutional Authoring Support Environment (2004), the authors identified the following limitations of using simulation devices to meet training needs at the time (is there new evidence now?):

  • Limited instructional strategies are available for this type of device;
  • There is limited student modelling that can be done; and
  • These devices are mostly for procedural skills.

In The Use of Work Domain Analysis for the Design of Training Systems (2000) the authors stated that the only trustworthy means of resolving fidelity issues was to rely on transfer relationships that have been demonstrated in scientific research.

  • The layout of instruments and controls must correspond closely to the layout found in the mission system.

  • Information must have the same format and should correspond in legibility to that found in the mission system.

  • The physical implementation of devices (e.g., displays and controls) need not be of high fidelity.

  • As a general rule, information that supports operator decision, judgement and response must be accurate but there is no need to mimic characteristics of the mission system that are not informative within the operator’s perception-cognition-action cycle.

  • Implementations of functional requirements should not offer information that encourages an inappropriate control strategy.

The core concepts in my studies were two types of ‘fidelity’, though there are several others.

  1. Physical — Spatial, tactile, and appearance
  2. Functional — Format, content, and response

I created a draft heuristic guide to be tested [it never was].

  1. Physical Fidelity
    • Low — Does not work mechanically
    • Medium — Works with limited effect
    • High — Works with effect
  2. Functional Fidelity
    • Low — Absent
    • Medium — 2D
    • High — 3D

I suggested five options for using simulation; a short sketch of these fidelity combinations follows the list.

  1. Low Physical + High Functional — May be acceptable for certain tasks
  2. Medium Physical + High Functional — Desirable
  3. High Physical + Low Functional — May be acceptable for certain tasks
  4. High Physical + Medium Functional — Desirable
  5. High Physical + High Functional — Desirable, but cost may be an issue
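
To make the heuristic concrete, here is a minimal sketch in Python of the fidelity levels and the five combinations above, expressed as a lookup table. The level names and recommendation wording follow the lists in this post; the function and variable names are my own illustration, not part of any validated tool.

```python
# A sketch of the draft fidelity heuristic as a lookup table.
# Names here are illustrative; the heuristic itself was never tested.

from enum import Enum


class Level(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


# (physical fidelity, functional fidelity) -> suggested recommendation
RECOMMENDATIONS = {
    (Level.LOW, Level.HIGH): "May be acceptable for certain tasks",
    (Level.MEDIUM, Level.HIGH): "Desirable",
    (Level.HIGH, Level.LOW): "May be acceptable for certain tasks",
    (Level.HIGH, Level.MEDIUM): "Desirable",
    (Level.HIGH, Level.HIGH): "Desirable, but cost may be an issue",
}


def recommend(physical: Level, functional: Level) -> str:
    """Return the suggested option for a fidelity combination,
    or flag combinations the heuristic does not cover."""
    return RECOMMENDATIONS.get(
        (physical, functional),
        "Not covered by the heuristic; likely unsuitable or needs review",
    )


if __name__ == "__main__":
    print(recommend(Level.HIGH, Level.MEDIUM))  # Desirable
    print(recommend(Level.LOW, Level.LOW))      # Not covered by the heuristic ...
```
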

But how would we select tasks for simulation? One option was to use DIF Analysis [Difficulty, Importance, Frequency]. If a task was 1) Difficult and 2) Important, but not 3) Frequent, then it would be highly suitable for simulation. For example, it is not safe to practise a tail rotor failure in the air in a helicopter, but it could be practised safely in a simulator, provided the fidelity was good enough to transfer skills for use in the aircraft. A tail rotor failure does not happen frequently, but when it does, it is critical to react properly. I know because a colleague of mine had a real one.

In addition, tasks selected for simulation could be assessed against two organizational criteria (a combined sketch of the DIF and organizational screens follows this list).

  1. Will simulation reduce instructor load?
  2. Will simulation decrease training time?
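
Here is a minimal sketch of how the DIF screen and the two organizational questions might be combined into a single pass over a task list. The field names, and the rule that a candidate must pass the DIF test and offer at least one organizational benefit, are my own assumptions for illustration, not a validated instrument.

```python
# A sketch of a DIF-based screening pass combined with the two organizational
# questions above. Field names and the combination rule are illustrative
# assumptions.

from dataclasses import dataclass


@dataclass
class Task:
    name: str
    difficult: bool               # D: hard to learn or perform correctly
    important: bool               # I: consequences of failure are serious
    frequent: bool                # F: occurs often on the job
    reduces_instructor_load: bool
    decreases_training_time: bool


def simulation_candidate(task: Task) -> bool:
    """Flag tasks that are difficult and important but rarely encountered
    on the job, and that also offer an organizational payoff."""
    dif_case = task.difficult and task.important and not task.frequent
    org_case = task.reduces_instructor_load or task.decreases_training_time
    return dif_case and org_case


if __name__ == "__main__":
    tail_rotor = Task(
        name="Respond to a tail rotor failure",
        difficult=True,
        important=True,
        frequent=False,
        reduces_instructor_load=True,
        decreases_training_time=False,
    )
    print(simulation_candidate(tail_rotor))  # True: a strong simulation candidate
```
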

My final conclusions and suggestions were simple.

  • Select high priority tasks for training.
  • Develop a short list of priority tasks for simulation.
  • For at least one task, develop a prototype with high physical and medium functional fidelity and for the SAME task develop another prototype with high functional and medium physical fidelity.
  • Compare the results.

For aircrew training and other regulated occupations, simulation requirements continue to be mandated by regulatory agencies. Learning and development professionals, for the most part, do not inform this process. Much of the literature in the field refers to Work Domain Analysis (2008), which could inform L&D professionals.

In my experience with tender evaluation, training system designs are assessed for compliance with specified requirements, and the winning tender is selected on the basis of an unintegrated series of compliance assessments. There is no systematic way of determining whether the tendered solution actually fulfils the training functions of the work domain.

Work domain analysis provides a framework for tender evaluation that is both functional and integrative because sets of specifications may be easily related to the training functions of the work domain. Such an approach moves assessment beyond whether a tenderer complies with the specifications of a training system to whether a tenderer fulfils the training functions of the work domain.

As more computer simulations become available, there is an increasing demand to use them, but few professionals know how they influence learning. When these systems can cost millions of dollars, training professionals need to gain a deep understanding of how they help, or even hinder, learning. The aviation industry may have some knowledge that could inform other industries.


Source: Proceedings of the Human Factors and Ergonomics Society 52nd Annual Meeting, 2008
