Bridges Too Far? Peter Rabins on What Causality Can Tell Us About Bridge Collapses

“A correct causal analysis is much more than assigning blame. It provides guidance for preventing future failures and can drive the accumulation of new knowledge.”—Peter Rabins

We conclude our week-long feature on Peter Rabins’s The Why of Things: Causality in Science, Medicine, and Life with an essay by the author that examines the idea of causality through two recent bridge collapses:

In his 1974 book A Bridge Too Far, Cornelius Ryan detailed the failed World War II Allied attempt to invade Germany by capturing several bridges crossing the Rhine River, particularly the bridge at Arnhem in the Netherlands. Ryan makes a strong case that the causes of the doomed operation included poor planning, interpersonal conflict among senior leadership, and hubris, and that a need for the Allies to appear united inhibited those who opposed it from expressing their reservations directly.

In The Why of Things, I present a model for the often challenging task of identifying the cause or causes of events. Two relatively recent bridge collapses illustrate the challenges of assigning cause to the failure of a human construction endeavor.

The first, the collapse of a section of the Interstate 5 bridge over the Skagit River, about an hour north of Seattle, occurred after a truck carrying an oversized load struck a girder, causing a section of the bridge to collapse immediately into the river. Fortunately, no one was killed.

As required by regulation, the truck had been preceded by a forward car with blinking lights, a sign warning of an “oversized load,” and a mechanism to measure whether the load was likely to strike something on the highway; there was, however, no following car. The driver of the truck reported that he had moved to the right because of a passing semi-trailer truck.

Causality would seem easy to assign here. The precipitating event was clearly the truck’s striking the bridge. A predisposing element was the bridge’s design. However, other factors also need to be considered, including the passing semi-trailer truck, the experience of the driver, and the rule that required a preceding warning car but not a trailing one. There are also issues that relate to the societal context of public expenditure. The use of economical design approaches that do not include redundant failure-prevention features (jet planes with multiple engines, for example, are designed to fly on only one engine in the event of engine failure while aloft) is not a failure but a choice that society makes in weighing costs and benefits. I refer to it as a programmatic cause because it is not only a contributor to the event but also a part of the larger context of society that relates to more than construction design.

The primary method guiding this analysis is empirical, or scientific. The vulnerabilities of this bridge design can be expressed mathematically, and the failure was inevitable once the truck struck the girder. The narrative method ties it all together: the truck driver’s perception that he needed to pull to the right because of the passing truck, the regulation requiring a forward but not a rear guide car, and the design features.

The second relatively recent bridge collapse involved the Interstate 35W bridge in Minneapolis in 2007. Thirteen people were killed and 145 injured, according to AP writer Frederick J. Frommer. The precipitating event was the failure of one of the large “gusset,” or supporting, plates. Because this bridge, too, had a “fracture critical” design, the failure of this one element made collapse inevitable. The cause of the gusset plate failure was both the plate’s manufacture (it was produced at only half the designed thickness) and the heavy load placed on the bridge by concrete resurfacing equipment that had been parked on it in preparation for repairs. Why was the plate manufactured at the incorrect thickness? I was not able to find an explanation. Possibilities include a purposeful, fraudulent method of increasing profit, an error in transcribing the plans, or a recalculation of what was necessary. Evidence for any of these would be empirical and categorical.

Perhaps the best-known and best-studied bridge failure in the U.S. was the Tacoma Narrows Bridge collapse in 1940. In his readable and illuminating book To Engineer Is Human, Henry Petroski makes the general point that engineers learn from failure and that advances in knowledge that would otherwise not have occurred have repeatedly been the result of such failures. He identifies two design elements that contributed to the Tacoma Narrows Bridge failure. First, the bridge had an inherent vibrational frequency induced by wind, such that one half of the span moved up while the other moved down in a self-reinforcing fashion when the wind blew. A second aerodynamic force also led to the bridge’s flexing, or vibrating from side to side, a possibility that bridge designers had never before considered.

Caltech professor Theodore von Kármán calculated that a 40-mile-an-hour wind, the speed at which the wind was blowing that day, would generate the forces needed to induce the vibration that caused the bridge’s failure. Thus, the wind was the precipitating cause, and the inherent aerodynamic instability was a predisposing cause (had the wind never blown at that speed, the collapse would not have occurred). But the inevitability of unknown elements of design, described by Petroski as “unplanned experiments that can teach one how to make the next design better,” is better conceptualized as a programmatic cause, inherent in the limited nature of human knowledge.

There were also unique features of the Tacoma Narrows Bridge design. It was only two lanes wide and used a new design in which solid plates replaced the lattice-beam trusses that had been used in older suspension bridges. These two elements made the bridge lighter and less costly, but they also increased the ease with which a 40-mile-an-hour wind could induce the aerodynamic instability. These steps were taken to lower the cost of the bridge’s construction, but in retrospect they contributed to its failure, and they are part of the programmatic context of designing costly structures.

Each of these four stories, the failed Rhine crossing and the I-5, I-35W, and Tacoma Narrows collapses, illustrates the complexity of causal analysis as well as the success that a multifaceted analysis can bring to the task of determining causality. There are instances in which a straightforward, single cause can be identified, and others in which different levels of analysis, the use of specific analytic methods, and a particular conceptual model best identify cause. In The Why of Things, I have tried to balance the complexity of this approach with the belief that identifying specific causes is usually possible.

As Petroski emphasizes, a correct causal analysis is much more than assigning blame. It provides guidance for preventing future failures and can drive the accumulation of new knowledge.