The person approach remains the dominant tradition in medicine, as elsewhere. From some perspectives, it has much to commend it. Blaming individuals is emotionally more satisfying than targeting institutions. People are viewed as free agents capable of choosing between safe and unsafe modes of behavior. If something goes wrong, it seems obvious that a person (or group) must have been responsible. Seeking as much as possible to uncouple a person's unsafe acts from any institutional responsibility is clearly in the interests of managers. It is also legally more convenient, at least in Britain.

Nevertheless, the person approach has serious shortcomings and is ill-suited to the medical domain. Indeed, continued adherence to this approach is likely to thwart the development of safer health care institutions. Although some unsafe acts in any sphere are egregious, most are not. In aviation maintenance, a hands-on activity similar in many respects to medical practice, about 90% of quality lapses were judged blameless.2 Effective risk management depends crucially on establishing a reporting culture.3 Without a detailed analysis of mishaps, incidents, near misses, and "free lessons," we have no way of uncovering recurrent error traps or of knowing where the edge is until we fall over it. The complete absence of such a reporting culture within the Soviet Union contributed crucially to the Chernobyl disaster.4 Trust is a key element of a reporting culture, and this, in turn, requires the existence of a just culture: one possessing a collective understanding of where the line should be drawn between blameless and blameworthy actions.5 Engineering a just culture is an essential early step in creating a safe culture.

Another serious weakness of the person approach is that, by focusing on the individual origins of error, it isolates unsafe acts from their system context. As a result, two important features of human error tend to be overlooked. First, it is often the best people who make the worst mistakes; error is not the monopoly of an unfortunate few. Second, far from being random, mishaps tend to fall into recurrent patterns. The same set of circumstances can provoke similar errors, regardless of the people involved. The pursuit of greater safety is seriously impeded by an approach that does not seek out and remove the error-provoking properties within the system at large.

THE "SWISS CHEESE" MODEL OF SYSTEM ACCIDENTS

Defenses, barriers, and safeguards occupy a key position in the system approach. High-technology systems have many defensive layers: some are engineered (alarms, physical barriers, automatic shutdowns), others rely on people (surgeons, anesthetists, pilots, control room operators), and yet others depend on procedures and administrative controls. Their function is to protect potential victims and assets from local hazards. They are mostly effective at this, but there are always weaknesses.

In an ideal world, each defensive layer would be intact. In reality, they are more like slices of Swiss cheese, having many holes, although, unlike in the cheese, these holes are continually opening, shutting, and shifting their location. The presence of holes in any one "slice" does not normally cause a bad outcome; harm results only when the holes in many layers momentarily line up to permit a trajectory of accident opportunity, bringing a hazard into damaging contact with victims.
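To make the model's logic concrete, here is a minimal simulation sketch. It is illustrative only: the four layers, their 10% local failure rates, and the assumption that holes open independently on each trial are invented for the example, not taken from the article.

```python
import random

def trajectory_penetrates(layer_hole_probs):
    """Simulate one hazard trajectory through a stack of defensive layers.

    Each layer independently has a hole at the trajectory's location with
    the given probability. Because the holes continually open, shut, and
    shift, every trial redraws them. Harm occurs only if the holes in ALL
    layers happen to line up at that moment.
    """
    return all(random.random() < p for p in layer_hole_probs)

# Hypothetical numbers: four layers (engineered, human, procedural,
# administrative), each with a hole at the hazard's path 10% of the time.
layers = [0.10, 0.10, 0.10, 0.10]

trials = 1_000_000
accidents = sum(trajectory_penetrates(layers) for _ in range(trials))
print(f"Accident rate: {accidents / trials:.6f}")  # ~0.0001, i.e. 0.1 ** 4
```

Under these invented numbers, any single layer fails one time in ten, yet the four-layer stack lets a hazard through only about once in ten thousand trials; that multiplicative effect is why a hole in any one slice rarely causes harm on its own. The independence assumption is the sketch's main simplification: when shared conditions correlate the holes across layers, the protection degrades toward that of a single slice.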