Abstract

This paper introduces a new formal approach to find potential mode confusion situations in shared-control systems such as service and ...
Both state machines are modeled as two distinct sets of processes in Hoare's CSP language. CSP processes engage in events (e.g., communication with other processes, or signals) and proceed from one state to another. The processes comprise disjoint sets of states (modeled as subprocesses), disjoint sets of internal communication channels and variables (e.g., the user's perception of the current speed of the wheelchair may differ from that of the automation), and a shared set of external observables, such as whether or not the wheelchair is at a standstill.
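The split between shared external observables and private internal state can be illustrated with a small simulation. This is a hypothetical sketch (invented state names, events, and speed values), not the paper's actual CSP model:

```python
# Two state machines that synchronize only on external observables
# ("standstill", "driving"); each keeps its own internal variables.

automation = {
    "state": "standstill",
    "speed": 0,            # internal: actual commanded speed
}

user_model = {
    "state": "standstill",
    "perceived_speed": 0,  # internal: may diverge from automation["speed"]
}

def automation_step(event):
    """Automation reacts to a user input and emits only its mode."""
    if event == "push_joystick":
        automation["speed"] = 5
        automation["state"] = "driving"
    return automation["state"]   # the only information the user sees

def user_step(observable):
    """The user updates the mental model from the observable alone."""
    user_model["state"] = observable
    if observable == "driving":
        user_model["perceived_speed"] = 3   # a guess, not the true speed

obs = automation_step("push_joystick")
user_step(obs)

# The modes agree via the shared observable...
assert automation["state"] == user_model["state"]
# ...but the disjoint internal variables can silently diverge,
# which is the seed of mode confusion.
assert automation["speed"] != user_model["perceived_speed"]
```

The point of the sketch is that agreement on the observables does not imply agreement on the hidden variables, which is why the analysis below compares the two processes formally rather than by inspection.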
The FDR model checking tool is used to carry out refinement checks in the so-called failures model. Here, each process is represented by the set of finite event sequences it can perform (its traces) and by its refusals (sets of events the process cannot engage in after having performed a certain trace). If a process P refines a process Q in the failures model, every behavior of P can also be observed for Q, with regard to the traces as well as to the refusal sets. If the process U representing the user's mental model of the automation and the process A representing the implementation of the automation are equal in the failures model, the user will never lose track of the automation's behavior, and the automation will always accept any action that the user considers adequate in a specific situation.
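In practice FDR performs this check on CSP scripts, but the underlying idea can be sketched in a few lines of Python on toy deterministic transition systems. The state names and events below are invented for illustration (they are not the paper's model); the automation performs an extra steer-back move after avoiding an obstacle, which the user's mental model does not anticipate:

```python
ALPHABET = {"obstacle", "clear", "steer_back"}

# Hypothetical toy transition systems: state -> {event: successor}.
automation = {
    "drive":    {"obstacle": "avoiding"},
    "avoiding": {"clear": "steering"},
    "steering": {"steer_back": "drive"},   # move the user does not expect
}
mental_model = {
    "drive":    {"obstacle": "avoiding"},
    "avoiding": {"clear": "drive"},
}

def failures(lts, start, depth):
    """(trace, maximal refusal) pairs up to a given trace length.
    In a deterministic LTS the maximal refusal after a trace is
    everything in the alphabet that is not currently enabled."""
    out = set()
    frontier = [(start, ())]
    for _ in range(depth + 1):
        nxt = []
        for state, trace in frontier:
            out.add((trace, frozenset(ALPHABET - set(lts[state]))))
            for ev, succ in lts[state].items():
                nxt.append((succ, trace + (ev,)))
        frontier = nxt
    return out

def refines(impl, spec, depth=3):
    """True iff every failure of impl is also a failure of spec.
    Comparing against maximal refusals suffices here because
    refusal sets are subset-closed."""
    spec_f = {}
    for tr, ref in failures(spec, "drive", depth):
        spec_f.setdefault(tr, set()).add(ref)
    return all(
        tr in spec_f and any(ref <= allowed for allowed in spec_f[tr])
        for tr, ref in failures(impl, "drive", depth)
    )

print(refines(mental_model, mental_model))  # True: a process refines itself
print(refines(automation, mental_model))    # False: mode confusion detected
```

After the trace ⟨obstacle, clear⟩ the automation can only do steer_back, an event the mental model refuses at that point, so the refinement (and hence failures equality) fails; this is exactly the kind of divergence the analysis is designed to surface. FDR performs the same comparison symbolically and scales to far larger state spaces.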
5 Results and Future Work

So far, the mode confusion analysis has shown that the human operator cannot track the behavior of the automation if the obstacle avoidance module tries to steer back to the original path after an avoidance maneuver. To avoid such mode awareness problems in the future, the human-machine interface will be improved. By means of a speech module, the wheelchair will indicate mode changes. As a consequence, confusing situations in which, for instance, the driving assistant tries to circumvent an obstacle that cannot be seen by the human operator will occur less often. The formal approach briefly sketched in section 4.3 will be generalized to suit a wide range of application domains similar to the one presented here.
Acknowledgments This work was supported by the Deutsche Forschungsgemeinschaft through the priority program “Spatial Cognition”.
References

Butler, R. et al. (1998). A Formal Methods Approach to the Analysis of Mode Confusion. In: Proc. of the 17th Digital Avionics Systems Conference. Bellevue, Washington.

Formal Systems (Europe) Ltd. (2000). Failures Divergence Refinement - FDR2 User Manual. Oxford, UK.

Hoare, C.A.R. (1985). Communicating Sequential Processes. Prentice-Hall International, Englewood Cliffs, New Jersey.

Lankenau, A. et al. (1998). Safety in Robotics: The Bremen Autonomous Wheelchair. In: Proc. of AMC'98, Fifth Int. Workshop on Advanced Motion Control. Coimbra, Portugal. 524-529.

Lankenau, A., Röfer, T. (2001). The Bremen Autonomous Wheelchair - A Versatile and Safe Mobility Assistant. In: Intelligent Wheelchairs in Europe, Special Issue of the IEEE Robotics and Automation Magazine. To appear in March 2001.

Leveson, N. et al. (1997). Analyzing Software Specifications for Mode Confusion Potential. In: Workshop on Human Error and System Development (Proc.). Glasgow, UK.

Röfer, T., Lankenau, A. (1999). Ensuring Safe Obstacle Avoidance in a Shared-Control System. In: J.M. Fuertes (Ed.): Proc. of the 7th Int. Conf. on Emerging Technologies and Factory Automation. 1405-1414.

Röfer, T., Lankenau, A. (2000). Architecture and Applications of the Bremen Autonomous Wheelchair. In: Wang, P. (Ed.): Information Sciences Journal 126:1-4. Elsevier Science BV. 1-20.

Rushby, J. et al. (1999). An Automated Method to Detect Potential Mode Confusions. In: Proc. of the 18th AIAA/IEEE Digital Avionics Systems Conference. St. Louis, USA.

Sarter, N., Woods, D. (1995). How in the World Did We Ever Get into That Mode? Mode Error and Awareness in Supervisory Control. In: Human Factors, Vol. 37.