Hurricane Katrina's impact on Gulf Coast oil refineries highlighted the risk of relying on a concentrated supply source. Professor Awi Federgruen explains what governments and private-sector companies can do to manage supply chain risk.
When Homeland Security Secretary Michael Chertoff fielded questions from reporters five days after Katrina hit the Gulf Coast, he asserted that government planners could not have predicted that a disaster of this magnitude could ever occur: “There will be plenty of time to go back and say we should hypothesize ever more apocalyptic combinations of catastrophes. Be that as it may, I’m telling you this is what the planners had in front of them. They were confronted with a second wave that they did not have built into the plan.”
In fact, scientists, government officials and journalists had written extensively about the possibility — indeed, high likelihood — of various catastrophic scenarios associated with hurricanes, including the one Katrina precipitated — and worse. In 2002, New Orleans’s main newspaper, the Times-Picayune, ran a five-part series of articles describing scenarios resulting in massive floods, the destruction of homes and oil refineries and the loss of thousands of lives.
Hurricane and public-health research centers had predicted the likely advent of hurricanes of categories 4 and 5. The Army Corps of Engineers was aware, meanwhile, that the levees were only capable of withstanding hurricanes up to category 3 at best; computer models had shown the potential for major havoc with even lesser storms. Last year, 40 government agencies joined in simulating an imaginary storm in the New Orleans region in which half a million buildings were destroyed and the evacuation of one million residents was required.
If the logistics of a mass evacuation plan and the associated coordination challenges among local, state and federal emergency responders have proven daunting, what about the government’s ability, and responsibility, to safeguard supplies vital to the national economy?
Oil is arguably the commodity most critical to the functioning of the American economy. Its supply is primarily constrained by existing refinery capacity. In the past 20 years, as real U.S. gross domestic product grew by 86.5 percent, the number of refineries in the country decreased by more than 50 percent. Economies of scale in the cost structure drove smaller refineries out of the market, while other refineries identified various additional benefits of pooling capacity, for example, statistical economies of scale resulting from the pooling of demand risks. In July and August of 2004, U.S. refineries were operating at 97 percent of available capacity.
Moreover, in the event of a domestic supply disruption, little recourse can be expected from overseas refineries: this year’s run-up in oil prices — even before Katrina — to record-high levels is generally attributed to a lack of global refinery capacity. At the beginning of 2005, the Department of Energy predicted that current “financial, environmental and legal considerations make it unlikely that new refineries will be built in the United States.”
In a research paper I coauthored in June with Nan Yang, one of my doctoral students, we stated what appeared to be self-evident: “Most ominously, close to half of our capacity is located in a relatively small region on the Gulf Coast; disruption of its refinery and distribution process could have a crippling effect on our economy.” Katrina has given us but a small indication of this potential: The temporary disruption of approximately 10 percent of the national refinery capacity caused the price of gasoline to increase, overnight, by approximately 40 percent beyond the record levels it had reached before Katrina made its brutal appearance. Few expect the equilibrium price to decrease significantly in the foreseeable future.
In the private sector, planning for disaster has become one of the foci of supply chain planning, representing, for example, a major theme in the Longitudes 2004 conference, whose participants included CEOs, academic leaders, and former government officials and heads of state. In some industries, the ability to manage supply risks effectively has developed into a major competitive advantage.
The cellular phone industry offers one such example. Ericsson and Nokia, which in 2000 were among the leading global competitors, adopted very different approaches to the threat of supply disruptions. Ericsson relied on a single supplier for several critical chips and had no contingency plans in place. Nokia, by contrast, while using the same manufacturer as its primary supplier, identified alternative sources to mitigate risk and had an elaborate contingency strategy in place. When a fire broke out in the chip plant, Ericsson suffered major and long-lasting losses in profits and market share. Nokia, on the other hand, was fully prepared and, in fact, picked up some of Ericsson’s market share. Today, companies such as Cisco and Hewlett-Packard consider measures of supply risk along with traditional criteria like cost and quality when selecting suppliers for any given component.
I’ve recently begun developing, again with doctoral candidate Yang, planning models to aid in supplier selection, incorporating measures of supply risk along with various cost and capacity measures. The models employ probabilistic descriptions of the demand that needs to be met as well as of each supply risk: each potential supplier’s yield factor, defined as the percentage of an order actually completed to specification. The planning models help in selecting which of a given set of suppliers to retain and how much to order from each so as to minimize aggregate costs while ensuring that the uncertain demand is met with a given probability. The total procurement cost consists, for each participating supplier, of variable costs plus a fixed cost incurred irrespective of that supplier’s supply level. The costs of carrying inventories and of lost sales may be included as well.
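The two quantities the models trade off for a fixed ordering plan, the probability that uncertain demand goes unmet and the total procurement cost, can be sketched in a short Monte Carlo simulation. This is a hypothetical setup, not the paper’s model: the normal demand distribution and normal yields clipped to [0, 1] are illustrative assumptions.

```python
import random

def evaluate_portfolio(portfolio, demand_mean, demand_std,
                       n_trials=100_000, seed=0):
    """Monte Carlo sketch: estimate the shortfall probability and compute
    the procurement cost of a fixed plan.

    portfolio: list of (order_qty, mean_yield, std_yield, fixed_cost, unit_cost).
    Distribution choices are assumptions for illustration only.
    """
    rng = random.Random(seed)
    shortfalls = 0
    for _ in range(n_trials):
        demand = max(0.0, rng.gauss(demand_mean, demand_std))
        # Each supplier delivers its order times a random yield factor.
        delivered = sum(q * min(1.0, max(0.0, rng.gauss(mu, sigma)))
                        for q, mu, sigma, _, _ in portfolio)
        if delivered < demand:
            shortfalls += 1
    # Fixed and variable costs are incurred on the quantities ordered,
    # so the cost of a fixed plan is deterministic in this sketch.
    cost = sum(fixed + unit * q for q, _, _, fixed, unit in portfolio)
    return shortfalls / n_trials, cost
```

A plan that orders far more than mean demand from a reliable supplier drives the estimated shortfall probability toward zero, at the price of a larger cost; the selection problem is to find the cheapest plan whose shortfall probability stays below the permitted level.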
The models provide, in addition, several general insights into the supplier selection challenge: First, even when the potential suppliers have ample capacity (in the absence of supply disruptions), whether a set of suppliers allows for a feasible solution depends not just on how many of them there are but also on each supplier’s predictability, as measured by the ratio of the mean to the standard deviation of the supplier’s yield factor. (This measure is closely related to the so-called Sharpe ratio for portfolios of financial instruments.) The square of this predictability ratio gives each potential supplier’s number of so-called Base Supplier Equivalents.
A set of suppliers is feasible (or not) depending on how its total number of Base Supplier Equivalents compares to a critical number, given by a simple function of the permitted shortfall probability only. In particular, whether a set of suppliers is feasible does not depend on the shape of the demand distribution, its mean and standard deviation included. The number of suppliers required can be reduced by improving the suppliers’ reliability; moreover, the benefits of predictability improvements become progressively larger, giving support to management philosophies like Six Sigma. The allocation scheme, which splits the aggregate order in proportion to the suppliers’ mean-to-variance ratios of their yield distributions, has the best chance of enabling feasibility: if a feasible solution fails to exist under this scheme, it fails to exist under any.
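The feasibility test and the proportional allocation scheme can be sketched as follows. Taking the critical number to be the squared standard normal quantile of the service level is an assumption made here for illustration, consistent with a normal approximation; it is not the paper’s exact formula.

```python
from statistics import NormalDist

def base_supplier_equivalents(mean_yield, std_yield):
    # Square of the predictability ratio (mean over standard deviation
    # of the supplier's yield factor).
    r = mean_yield / std_yield
    return r * r

def feasible(suppliers, shortfall_prob):
    """Sketch of the feasibility test: total Base Supplier Equivalents must
    exceed a critical number depending only on the permitted shortfall
    probability. The critical number z**2 is an illustrative assumption.

    suppliers: list of (mean_yield, std_yield) pairs.
    """
    z = NormalDist().inv_cdf(1 - shortfall_prob)
    total_bse = sum(base_supplier_equivalents(m, s) for m, s in suppliers)
    return total_bse > z * z

def proportional_allocation(total_order, suppliers):
    # Split the aggregate order in proportion to each supplier's
    # mean-to-variance ratio of its yield distribution.
    weights = [m / (s * s) for m, s in suppliers]
    total = sum(weights)
    return [total_order * w / total for w in weights]
```

Note that, as in the text, the check depends only on the suppliers’ yield statistics and the permitted shortfall probability, not on the demand distribution, and the more predictable supplier receives the larger share of the order.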
When minimizing variable procurement costs, additional suppliers always help to reduce the total cost when the expected cost per effectively delivered unit is identical for all. When these cost rates differ, one faces a real tradeoff between solutions involving fewer and less expensive suppliers but a larger aggregate order to provide adequate protection against supply risks, and solutions with additional, more expensive suppliers but reduced aggregate orders. In any case, the optimal set of suppliers always includes a certain number of the least expensive suppliers. Interested readers may consult our working papers for additional insight.
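The tradeoff between fewer suppliers with a larger aggregate order and more suppliers with a smaller one can be made concrete under a deliberately simple setting: identical suppliers, an equal split of the order, and a normal approximation of total effective supply. All of these are assumptions for illustration, not the paper’s analysis.

```python
import math
from statistics import NormalDist

def required_order(n_suppliers, mean_yield, std_yield, demand, shortfall_prob):
    """Smallest aggregate order Q, split equally across n identical suppliers,
    such that total effective supply covers demand with the required
    probability under a normal approximation (an illustrative assumption).

    With Q split equally, effective supply has mean Q*mu and standard
    deviation (Q/sqrt(n))*sigma, so we solve
        Q*mu - z*(Q/sqrt(n))*sigma >= demand.
    """
    z = NormalDist().inv_cdf(1 - shortfall_prob)
    denom = mean_yield - z * std_yield / math.sqrt(n_suppliers)
    if denom <= 0:
        return float("inf")  # infeasible at this service level
    return demand / denom
```

Spreading the same order across more suppliers pools the yield risk, so the aggregate order needed to hit the service level shrinks; when the extra suppliers charge more per effectively delivered unit, that saving must be weighed against their higher rates, which is the tradeoff described above.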
Should the government be expected to safeguard the supply of such critical commodities as heating oil, gasoline or vaccines and medications against potential disruptions by natural or terrorist-induced disasters? Since the 1950s, all U.S. administrations, whether Republican or Democratic, have taken the position that the government needs to intervene in the oil market to mitigate the impact of sudden supply problems. To this end, the government maintains a stockpile of strategic reserves, but the stockpile consists largely of crude oil, even though refinery capacity has become the system’s bottleneck. Refining is also the most vulnerable part of the oil supply chain, since repairs to refinery equipment can take months or even years.
In addition to managing the demand for oil via the promotion of conservation programs and alternative fuels, the government could provide incentives to elevate the supplier base and its reliability to an adequate level. The planning models outlined above may assist in determining the desired targets; much additional thought needs to be given to effective, market-driven incentive structures to guide the country toward these goals.
Awi Federgruen is the Charles E. Exley Professor of Management and chair of the Decision, Risk and Operations Division. To read more about his research, visit Columbia Ideas at Work.