Authors: Steven G. Mandis
Appendix A
Goldman and Organizational Drift
Ship captains set out an intended course and use sophisticated tools for navigation to constantly revise their speed and direction based on an analysis of the external conditions under which they’re sailing. They know that otherwise, no matter how carefully they aim the prow of the ship when they leave port, the cumulative effect of ocean currents and other external factors over long periods will cause the ship to veer off course.
Organizational drift is akin to that deviation from an intended course; it’s the slow, steady uncoupling of practice from original procedures, principles, processes, and standards or protocols that can lead to disasters like the space shuttle Challenger explosion or the Black Hawk shoot-down incident in northern Iraq. As Harvard Business School professor Scott Snook argues, detecting organizational drift requires a sensitivity to the passage of time; single snapshots won’t do.[1]
To understand what’s happened to Goldman since the writing of the business principles in 1979, then, we have to look back to its performance over time, how its interpretation of the principles has changed, and the conditions in which it operates. But before we get to that analysis, it’s worth developing a deeper understanding of organizational drift and its implications.
Drift into Failure
Sidney Dekker, a professor who specializes in understanding human error and safety, has used complexity theory and systems thinking to better understand how complex systems “drift into failure” over an extended period of time. His theories are worth exploring because they, and the ideas of other researchers, provide us with a clear understanding of how systems interact and drift away from intended goals. In some cases, this can end in disaster. In the case of Goldman, it means that there’s a distinct gap between the principles by which the firm purports to steer itself and what it’s actually doing in the world.
Earlier theories, Dekker argues, have been tripped up by their tendency to explain instances of failure in complex environments by blaming flawed components rather than the workings of the organizational system as a whole.[2]
Dekker concludes, by contrast, that failure emerges opportunistically, nonrandomly, from the very webs of relationships that breed success and that are supposed to protect organizations from disaster. Dekker also observes that systems tend to drift in the direction of failure, gradually reducing the safety margin and taking on more risk, because of pressures to optimize the system in order to be more efficient and competitive.
We are able to build complex things—deep-sea oil rigs, spaceships, collateralized debt obligations—all of whose properties we can understand in isolation. But with complex systems in competitive, regulated societies—like most organizations—failure is often primarily due to unanticipated interactions and interdependencies of components and factors or forces outside the system, rather than failure of the components themselves. The interactions are unanticipated, and the signals are missed. Dekker points out that empirical studies show that reliable organizations with low failure rates tend to have a distinct set of traits: safety objectives and a safety culture promoted by leadership, appropriate internal diversity to enable looking at things from multiple perspectives and deploying a variety of responses to disturbances, redundancy in physical and human components, decentralization of safety decision making to enable quick responses by people close to the action, and the ability to continually and systematically learn from experience and adapt. The attitudes in such a safety culture include a preoccupation with avoiding failure, reluctance to simplify, deference to expertise (while recognizing the limits of expertise), sensitivity to operations (with vigilant monitoring to detect problems early), and suspicion of quiet periods. Last, Dekker recommends that when investigating why a failure occurred, we need to remember that what appears clearly wrong in hindsight appeared normal, or at least reasonable, at the time, and that abnormal data can be rationalized away by participants. As investigators of system failure we need to put ourselves in the shoes of the people who were involved.
Dekker argues that drift is marked by small steps. He puts it like this: “Constant organizational and operational adaptation around goal conflicts, competitive pressure, and resource scarcity produces small, step-wise normalization. Each next step is only a small deviation from the previously accepted norm, and continued operational success is relied upon as a guarantee of future safety.”[3]
Of course, not all small steps are bad. They allow complex systems to adapt to their environment, producing new interpretations and behaviors. It’s important to remember, too, that the steps are small. “Calling on people to reflect on smaller steps probably does not generate as much defensive posturing as challenging their momentous decision.”[4]
Dekker explains that organizations also drift due to uncertainty and competition in their environments. Organizations adapt because of a need to balance resource scarcity and cost pressure with safety. Resources to achieve organizational goals can be scarce because their nature limits supply, because of the activities of regulators and competitors, and because others make them scarce.[5]
What about the protective infrastructure that is designed to ensure against failure? Dekker warns, “Complex systems, because of the constant transaction with their environment (which is essential for their adaption and survival), draw on the protective structure that is supposed to prevent them from failing. This is often the regulator, or the risk assessor, or the rating agency. The protective structure (even those inside an organization itself) that is set up and maintained to ensure safety is subject to its interactions and interdependencies with the operation it is supposed to control and protect.”[6]
Interestingly, the protective infrastructure, with uncertain and incomplete knowledge, constraints, and deadlines, can contribute to drift—as well as failing to function when it should.[7] Its functioning or lack thereof is legitimized.
Practical Drift
You can see exactly this kind of drift in complex systems in Scott Snook’s analysis of the accidental shoot-down of two US Army Black Hawk helicopters over northern Iraq in 1994 by US Air Force F-15 fighters. All twenty-six UN peacekeepers onboard were killed. With almost twenty years in uniform and a PhD in organizational behavior, Lieutenant Colonel Snook, now a Harvard Business School professor, uses sociological analysis to thoroughly examine individual, group, and organizational accounts of the accident. Using his practical experience, combined with his academic training, he concludes that what happened was what he calls “practical drift”—the slow, steady uncoupling of practice from written procedure.[8]
Snook describes the potential pitfalls of organizational complacency that every executive should take to heart. He insightfully applies several key sociological theories of organizational behavior, structure, and change to analyze how bad things can happen to good organizations. His resultant theory of practical drift provides dramatic insight into how such seemingly impossible events can be expected to occur in complex organizations. He describes how a practical drift of local adaptations and procedures can lead to a widening gap between safety regulations and practical operations. Individually, the adaptations can be inconsequential. But over a longer period, the accumulated drift results in a vulnerable system. Snook uses the word “practical” because he is looking at everyday practices. Culture and practice are interrelated. Also, Snook examines local practices and how a subunit changes and creates practices to adhere to its own norms but doesn’t necessarily coordinate with the dominant practices of the units it must connect with. So the expected coordinated action is not coordinated. The Black Hawk accident happened because, or perhaps in spite of, everyone behaving just the way we would expect them to behave, just the way rational theory would predict. Snook also points out that, depending on one’s perspective, the slow, steady uncoupling can seem random and dangerous in hindsight.[9]
Social Normalization
Why don’t those involved in organizational drift see what’s going on and correct for it? When norms for behavior shift within an organization, members of that organization can become so accustomed to deviant behavior that they don’t consider it abnormal, despite the fact that it strays far from their own codified rules or principles. The firm’s mission, its reason for existence, is assumed to be implicit in its business principles; it is those principles, together with the firm’s strategic decisions, that drive the actions of employees. Eventually, the execution of the strategy adapts to match incremental changes in the interpretation and meaning of the principles, but the perception remains one of business as usual. It is a complex process with a certain level of organizational acceptance: outsiders recognize the change, whereas insiders, having grown accustomed to it, do not. And new members accept it as standard practice.
A number of sociologists have offered explanations as to why organizations drift: scarce resources, misaligned incentives, system complexity, multiple goal conflicts, and more. There rarely is a single moment or event that one can point to as that moment when change occurs, or a single individual that one can point to as the responsible party. Rather, multiple small steps occur over an extended period of time. Therefore, the effects go unnoticed, and norms are continuously and subtly redefined; a “new normal” is established with each incremental step. This is how a succession of small, everyday decisions can produce breakdowns that can be massive in scale.
Man-Made Disasters, by Barry Turner, originally published in 1978, suggested the possibility of systematically analyzing the causes of a wide range of disasters. The working subtitle of the book was The Failure of Foresight, and in it, Turner gives a very good description of the successive stages that lead to failure, particularly what he calls the “incubation period,” during which the preconditions for disaster develop but go unnoticed.[10]
The sequence of events is as follows: (1) A notionally normal starting point where culturally accepted beliefs about the world and its hazards are regulated by precautionary norms set out in rules and codes of practice; (2) An incubation period where the accumulation of an unnoticed set of events is at odds with accepted beliefs about norms; (3) Precipitating events bring attention to themselves and transform general perceptions of the incubation period; (4) The immediate consequence of the collapse of cultural elements becomes evident; (5) The change is recognized, and there is an ad hoc rescue and salvage mission to make adjustments; and (6) Real cultural readjustment occurs, where a serious inquiry and assessment is carried out and cultural elements are adjusted.[11]
Lisa Redfield Peattie, a now-retired anthropology professor at MIT, uses the extreme example of the Nazi death camps to describe normalization. (And, no, I’m not comparing Goldman to the Nazis in any way.) The two key mechanisms through which normalization occurs, she states, are “division of labor, which separates, in understanding and potential for collective organization, what it makes interdependent in functioning” and “the structure of rewards and incentives which makes it to individuals’ personal and familial interest to undermine daily, in countless small steps, the basis of common existence.”[12]
Peattie describes how work and daily routines, right down to scrubbing the cobblestones in the crematorium yard, were normalized for long-term prisoners and personnel alike by the division of labor, affording a degree of distance from personal responsibility for the atrocities being committed. People were simply doing their normal jobs and carrying out the normal routines established by those in charge; they had lost sight of all aspects of the big picture.
Edward S. Herman, a retired professor of finance at the Wharton School who specialized in regulation, borrowed the term “normalizing the unthinkable” from Peattie to describe how once unthinkable acts become routine. Herman explains, “Normalization of the unthinkable comes easily when money, status, power, and jobs are at stake. Companies and workers can always be found to manufacture poison gases, napalm, or instruments of torture, and intellectuals will be dredged up to justify their production and use.”[13]
Obviously these are extreme examples, but they are reminders that complicity, obscured by the routines of the work, the division of labor, and distance from the results, is possible even in the most egregious of acts. Herman goes on, “The rationalizations are hoary with age: government knows best; ours is a strictly defensive effort; or, if it wasn’t me, somebody else would do it. There is also the retreat to ignorance—real, cultivated, or feigned. Consumer ignorance of process is important.”[14]
Diane Vaughan’s now classic investigation of the decision making that led to the launch of the space shuttle Challenger in 1986 focuses on organizational factors. Vaughan learned that work groups typically develop a concept of “acceptable risk” that becomes part of the culture.[15]
Managers pay attention to only a few parameters and do not develop a formal, systematic definition of what is acceptable risk for the organization. More troublesome, it is difficult for them to see the full implications of their actions. Deviance from the norm starts to become institutionalized, resulting in “an incremental descent into poor judgment.”[16]
The acceptance of risk sets a precedent, and it is repeated and becomes the norm. This process mushrooms as the organization becomes larger and more complex. To the outside world, what is going on in the organization may look deviant, but to the work group everything is normal, and people believe they are adhering to what the organization expects of them. Once produced, deviant interpretations are reinforced and maintained by what Vaughan describes as an “institutionalized belief system that shapes interpretation, meaning, and action at the local level.”[17]
Initially, the regrettable decision to launch the Challenger and the ensuing tragedy appeared to be a case of individuals—NASA managers—who, under competitive pressure, violated rules in order to meet the launch schedule. But what at first appeared to be a clear case of misconduct proved to be something entirely different: Vaughan discovered that the managers had not violated rules at all, but had actually conformed to all NASA requirements. Her work revealed, however, that they were also conforming to NASA’s need to meet schedules, which ended up affecting engineering rules about what constituted acceptable risks in space flight technologies and the decisions that were made regarding those risks. She discovered that NASA managers could set up rules that conformed to the basic engineering principles yet allowed them to accept more and more risk. A social normalization of that deviance occurred, meaning that once they accepted the first technical anomaly, they continued to accept more and more with each launch, because to do so was not deviant to them. In their view, they were conforming to engineering and organizational principles. As with practical drift, the normalization of deviance concerns practices.[18]
Thus, the first time the O-rings were found to be damaged, the engineers found a solution and decided the shuttle could fly with acceptable risk. The second time that damage occurred, they thought the trouble came from something else. Believing that they had fixed the newest problem, they again defined it as an acceptable risk and just kept monitoring the situation. As they observed the problem recurring with no consequences, they got to the point that flying with the flaw was normal and acceptable. Of course, after the accident, they were horrified.
Something similar happened at Goldman. There was a social normalization of the change as the culture slowly shifted; the changes were so subtle that everything seemed normal. On a virtually day-to-day basis, normal was redefined. Once the partners normalized a given deviation—a shift in policies related to recruiting, promotion, compensation, underwriting, client relations, risk management—the deviation became compounded. Goldman was experiencing the norm shift described by Diane Vaughan: “When the achievement of the desired goals receives strong cultural emphasis, while much less emphasis is placed on the norms regulating the means, these norms will tend to lose their power to regulate behavior.”[19]
Vaughan explains why people within organizations do not pick up on the fact that drift has occurred: “Secrecy is built into the very structure of organizations. As organizations grow large, actions that occur in one part of the organization are, for the most part, not observable in others. Divisions of labor between subunits, hierarchy, and geographic dispersion segregate knowledge about tasks and goals. Distance—both physical and social—interferes with the efforts of those at the top to ‘know’ the behavior of others in the organization, and vice versa. Specialized knowledge further inhibits knowing. The language associated with a different task, even within the same organization, can conceal rather than reveal.”[20]
She explains how “structural secrecy” develops within organizations, defining it as “the way that patterns of information, organizational structure, processes, and transactions and the structure of regulatory relations systematically undermine the attempt to know [the extent of the danger]” and the ability of people to make decisions on the basis of that knowledge to manage risk.[21]
Structural secrecy helps explain how people failed to notice the signs of impending disaster that were present during the incubation period first described by Turner, and it supports Peattie’s and Herman’s ideas regarding the normalization of the unthinkable. Vaughan explains that in the Challenger case, information was filtered as it moved up the chain of command, so people at the higher levels were largely unaware of events that may have been dealt with as “acceptable risk.”[22]
The managers missed signals because some signals were mixed, some were weak, and some seemed routine. Vaughan concludes that deep-rooted structural factors were responsible for the tragic decision to launch the Challenger. It is important to note that in the case of the Challenger disaster, the cause Vaughan identifies is not deviance from rules and norms, but conformity with them. In the case of Goldman, I have identified organizational drift from the principles, with some signals of that drift having been missed as people conformed to the incremental changes and rationalized their behavior.