Unprepared: Global Health in a Time of Emergency

by Andrew Lakoff

Paperback (First Edition)

$29.95

Overview

Recent years have witnessed an upsurge in global health emergencies—from SARS to pandemic influenza to Ebola to Zika. Each of these occurrences has sparked calls for improved health preparedness. In Unprepared, Andrew Lakoff follows the history of health preparedness from its beginnings in 1950s Cold War civil defense to the early twenty-first century, when international health authorities carved out a global space for governing potential outbreaks. Alert systems and trigger devices now link health authorities, government officials, and vaccine manufacturers, all of whom are concerned with the possibility of a global pandemic. Funds have been devoted to cutting-edge research on pathogenic organisms, and a system of post hoc diagnosis analyzes sites of failed preparedness to find new targets for improvement. Yet, despite all these developments, the project of global health security continues to be unsettled by the prospect of surprise.

Product Details

ISBN-13: 9780520295766
Publisher: University of California Press
Publication date: 08/01/2017
Edition description: First Edition
Pages: 240
Product dimensions: 5.90(w) x 8.90(h) x 0.60(d)

About the Author

Andrew Lakoff is Professor of Sociology and Communication at the University of Southern California. He is the author of Pharmaceutical Reason: Knowledge and Value in Global Psychiatry and coeditor of Biosecurity Interventions: Global Health and Security in Question.

Read an Excerpt

CHAPTER 1

A Continuous State of Readiness

One evening the week after Hurricane Katrina struck the Gulf Coast in August 2005, the television news anchor Anderson Cooper was interviewed by talk show host Charlie Rose. Cooper was still on the scene in New Orleans, with the inundated city in the background and a look of harried concern on his face. Cooper had been among the first reporters to challenge official accounts that hurricane relief operations were functioning smoothly, based on the stark contradiction between disturbing images on the ground and governmental claims of a competent response effort. He was shocked and dismayed by what he was finding in New Orleans, but also seemed moved, even transformed, by his role as a chronicler of domestic catastrophe. He had covered disasters in Somalia, Sri Lanka, and elsewhere, he said, but never expected to see images like these in the United States: hungry refugees, widespread looting, corpses left on the street to decompose. Toward the end of the interview, Rose asked him what he had learned from covering the event. Cooper paused, reflected for a moment, and then answered: "We are not as ready as we can be."

There were a number of possible lessons that could have been drawn from the hurricane and its aftermath: concerning the role of urban poverty in increasing vulnerability to disaster, the social isolation of the elderly, the deterioration of flood protection infrastructure, and so on. But Cooper's intuitive, if inchoate, interpretation of the hurricane's meaning — in terms of a state of collective "readiness" — was common among authorized observers of the event, from journalists to public officials. Notably, this shared interpretation was prospective and generic, alluding to a range of potential future disasters rather than focusing on the hurricane itself.

In the aftermath of Katrina, it was common to see comparisons made between the federal government's failed response to the hurricane and its ostensibly more successful response to the terrorist attacks of September 11, 2001. To an observer a decade earlier, it might have been surprising that a natural disaster and a terrorist attack would be considered part of a shared problem space. And the image, three weeks after Katrina struck, of President George W. Bush flying to the headquarters of USNORTHCOM, a military installation designated for use in national security crises, to monitor the progress of Hurricane Rita as it hurtled toward Texas would have been equally perplexing. The failed governmental response to Katrina also pointed toward the onset of other possible emergencies, such as the outbreak of a novel and deadly infectious disease. "The danger of a major hurricane hitting the Gulf Coast was ignored until it was too late," declared Senator Ted Kennedy in early October. "We must not make the same mistake with pandemic flu. Other nations have taken effective steps to prepare, and America cannot afford to continue to lag behind."

Six months after the hurricane, the White House released its official assessment of the "lessons learned" from the federal government's response to the event. What Cooper's response and the White House report had in common was their understanding of the occurrence of a catastrophic event in terms of the failure of a preparedness system. Preparedness, noted the report, was essential not only in anticipation of natural disasters but also for managing intentional acts of malice: the task of the White House report was to help the nation to become "better prepared for any challenge of nature or act of evil men that could threaten our people." The report interpreted the hurricane, in retrospect, as a test of the nation's preparedness system. It sought to assess "the key failures during the Federal response to Hurricane Katrina," not to affix blame but rather "to identify systemic gaps and improve our preparedness for the next disaster — natural or man-made." According to the report, "four critical flaws in our national preparedness became evident" in the government's response to the hurricane. Each of these flaws concerned the political administration of emergency: the unified management of the national response, command and control structures, knowledge of preparedness plans, and regional planning and coordination. In sum, the report outlined a schema of governmental preparedness designed to serve as the basis for corrective action.

This chapter describes the historical emergence and gradual consolidation of "national preparedness" as a governmental approach to managing perceived threats. As a normative rationality coupled to a set of administrative techniques, national preparedness provides authorities with tools for grasping uncertain future events and bringing them into a space of present intervention. This analysis helps to explain an otherwise puzzling aspect of contemporary governmental practice: how a range of seemingly disparate potential threats — including terrorist attacks, natural disasters, and pandemics — have been brought into a common framework of collective security. It also points forward to the broader theme of the book: how the norm of preparedness came to be applied to the category of "emerging disease" at a global scale.

Preparedness marks out a limited but generally agreed-on terrain for the management of threats to collective life, making it possible to gather together a range of possible events under a common rubric. Its techniques work to bring these events into the present as potential future disasters that expose current vulnerabilities. In making future disasters into objects of present reflection and action, preparedness also generates responsibility: the fact that one might have prepared means that, if the anticipated disaster does occur and the response is insufficient, one should in retrospect have been better prepared. The techniques of preparedness, then, are a response to the political demand posed by the contemporary category of emergency. The chapter begins with a schematic contrast between two styles of reasoning about potential future threats: risk and preparedness. It argues that preparedness is an especially salient approach to events that seem to exceed the capacities of the tools of risk assessment — threats whose likelihood is difficult to calculate using statistical means but whose consequences could be catastrophic.

THE LIMITS OF RISK ASSESSMENT

In its technical sense, the term "risk" does not signify a danger or peril per se, but rather a "specific mode of treatment of certain events capable of happening to a group of individuals," as the historian François Ewald writes. In the field of private insurance, where tools of risk assessment were invented, these events might include accidents, illnesses, or unemployment. Risk as a "mode of treatment" of such events involves, first, tracking the historical incidence of such events over time within a given population and then, based on this data, calculating the anticipated rate of occurrence of such events in the future. As Ewald notes, the assessment of risk is a way of reordering reality: what was previously understood as a singular event that disrupted the normal order comes to be seen as a normal, relatively predictable occurrence. Knowledge of this rate of incidence, gathered through actuarial tables, has enabled insurers to rationally distribute risk across a population.

The assessment of risk has also been central to governmental efforts to know and improve the welfare of national populations, beginning in the nineteenth century when European governments began to adopt the actuarial tools that were initially developed in the context of private insurance. Such efforts have focused in particular on phenomena occurring with regularity over a population such as disease, poverty, and industrial accidents. Through programs such as public health and social insurance, modern governments have sought to manage the risks faced by national populations.

This chapter takes up the history of governmental risk management at a novel conjuncture, in the second half of the twentieth century. As social theorists such as Ulrich Beck and Anthony Giddens have argued, this period saw the appearance of a new problem for the government of risk: how to approach threats whose probability was difficult or impossible to calculate but whose consequences could be catastrophic. In the decades after World War II, experts and policy-makers became increasingly concerned with the challenge posed by such catastrophic threats — from nuclear accidents, to mass casualty terrorism, to anthropogenic climate change. These dangers, Beck writes, "shape a perception that uncontrollable risk is now irredeemable and deeply engineered into all the processes that sustain life in advanced societies." For such observers, the existing knowledge practices and institutions of risk governance were inadequate for understanding and managing the new catastrophic threats. Indeed, some of the very technological systems that had been designed to improve collective well-being were now identified as sources of vulnerability.

In recent years, a number of analysts have pursued the question of how governments and the public should assess and manage the risk posed by catastrophic threats. Much of this work has focused on the limits of calculative rationality in approaching the threat of occurrences that are either uncertain — because knowledge about their frequency or harm is lacking — or that would be so devastating in their consequences that the usual calculus of cost and benefit cannot be applied. This has led to questions such as the following: Can formal techniques of risk assessment be applied to uncertain or potentially catastrophic threats? And what mechanisms of risk governance might be relevant where formal risk assessment reaches a limit?

For one line of scholarship on risk and rationality, which arose from social anthropology, psychology, and economics in the 1970s, the central issue concerns the divide between expert and lay evaluations of risk and the implications of this divide for public policy. Classic studies sought to explain why individuals and social groups either over- or undervalued certain risks, through the examination of collective values, common cognitive heuristics, or structures of bounded rationality. Such approaches often share an assumption, whether implicit or explicit, that lay "misperceptions" of risk can be contrasted with neutral expert understandings. The role of government, in this view, is to use objective methods of risk assessment to shape rational public policy, even in the face of highly uncertain or potentially catastrophic threats.

Scholars in the social studies of science have cast doubt on any assumption that there is an objective and secure position of expert knowledge about risk that can be reliably used to correct public misperceptions. Observing disagreement among experts on issues such as drug safety or the regulation of environmental toxins, and broad uncertainties in areas like emerging pathogens and terrorism, they have argued that any framework of risk governance must acknowledge the limits of technical risk assessment. Moreover, these scholars have also pointed out that certain members of the public — or "lay experts" — may have important insights into the nature of risk that authorized experts miss, given their narrow framings of cost and benefit. The limits of expert risk assessment, they argue, point toward a new politics of precaution in the face of catastrophic threats, or toward a democratization of risk governance.

In what follows, I build on this work on the role of expert knowledge in governing catastrophic risk but pose a different set of questions. My analysis builds from the observation that, over the past several decades and in multiple arenas of scientific and governmental activity, experts have invented an array of technical practices that are designed to assess and manage uncertain and potentially catastrophic threats. Here I consider risk assessment and preparedness as distinctive "styles of reasoning" about potential threats, following Ian Hacking's analysis of the plurality of ways of making truth claims in the sciences. In other words, there is not a singular or unified method for technically and politically approaching security problems. Rather, there are different ways of understanding and managing threats that may be incommensurable with one another and that may be associated with particular institutional settings, forms of professional training, or political stances. In looking at the practices of experts charged with managing catastrophic risk, the task for the critical observer is not to judge which approach to future threats is most valid, but rather to explain how a given style of reasoning has emerged in a specific context and then extended into new arenas.

FROM PRECAUTION TO PREPAREDNESS

The framework of risk assessment does not assume that the future will necessarily turn out as calculated. Rather, having performed a risk assessment provides a technically defensible rationale for a given decision so that future blame can be avoided. Niklas Luhmann describes the vantage point of risk assessment as one of "provisional foresight": the present, he writes, can calculate a future that "can always turn out otherwise." So long as one has calculated correctly, one cannot be blamed later for having made the wrong decision. However, Luhmann notes, this approach to the uncertain future is potentially undermined by the prospect of the incalculable but catastrophic threat. Such an event "is the occurrence that no one wants and for which neither probability calculations nor expert opinions are acceptable." One alternative is the principle of precaution, which claims that in the face of uncertainty, one must act to prevent the occurrence of a catastrophic outcome. From the perspective of precaution, having made a risk assessment does not insulate the decision-maker from future blame. According to this logic, in approaching the uncertain future, one must take into account not what is probable or improbable but what is most feared. "I must, out of precaution, imagine the worst possible," writes Ewald.

The principle of precaution has been an influential response to certain hazards, especially the threat of ecological disaster posed by industrial and technological developments such as nuclear power and agricultural biotechnology. Although it operates at the limit point of risk assessment, the principle of precaution is addressed to the same question: whether the potential benefits of a given action outweigh its potential harms. Precaution answers by saying that because we cannot determine the likelihood of the catastrophic event or because its potential consequences are so dire that they cannot be mitigated, we must take action to avoid its occurrence. As we will see, the framework of preparedness poses the question differently. Like precaution, it is applied to threats that "as measured by the existing institutional yardsticks — are neither calculable nor controllable," as Beck puts it. In contrast to precaution, however, preparedness does not prescribe avoidance of the threatening event. Rather, preparedness assumes that the occurrence of the event may not be avoidable and so generates knowledge about its potential consequences through imaginative practices like simulation and scenario planning. Such practices make it possible to gauge vulnerabilities in the present, which can then be the target of anticipatory intervention.

Risk assessment and preparedness are both ways of making the uncertain future available to present intervention, but they require different types of expert knowledge, and they generate different kinds of response (see Table 1.1). From the perspective of preparedness, it may not be possible to evade the onset of a disastrous event. Although the likelihood of its occurrence is not known, one must act as though it were going to happen. The task for preparedness planners, then, is to mitigate present vulnerabilities and to put in place response measures that will prevent the disastrous event from spiraling into a catastrophe. As a governmental strategy, preparedness organizes a set of techniques meant to sustain order and preserve life in a future time of emergency. These techniques include early warning systems, scenario-based exercises, stockpiling of essential supplies, and the capacity for crisis communications. The duration of intensive response by a preparedness apparatus is limited to the immediate onset and aftermath of crisis, but the requirement of vigilant attention to the prospect of catastrophe is ongoing.

As illustrated by the case of Hurricane Katrina, governmental preparedness measures face a number of challenges. First, there is the question of how to prioritize among disparate threats, given a wide range of potential disasters and a limited amount of resources available to address them: should a public health agency, for example, focus on the possibility of a smallpox attack, an influenza pandemic, or an outbreak of drug-resistant tuberculosis? Second, there is the problem of how to sustain a condition of ongoing vigilance, over an indefinite period of time, for an event that may or may not occur: how can the fatigue of sustained anticipation be avoided, especially when the anticipated event continually fails to appear? Third, there is the question of who is in charge of preparedness in a system of dispersed sovereignty: what are the respective responsibilities of federal, state, and local officials, what authority does the military have, and what is the role of nongovernmental organizations? Given these challenges, governmental preparedness efforts often remain in unstable and highly fragmentary form; however, this condition of "unpreparedness" typically becomes apparent only in the wake of the potential event's actual occurrence and a failure of the response apparatus.

(Continues…)



Excerpted from "Unprepared"
by Andrew Lakoff.
Copyright © 2017 Andrew Lakoff.
Excerpted by permission of UNIVERSITY OF CALIFORNIA PRESS.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
Excerpts are provided by Dial-A-Book Inc. solely for the personal use of visitors to this web site.

Table of Contents

Introduction

1. A Continuous State of Readiness
2. The Generic Biological Threat
3. Two Regimes of Global Health
4. Real-Time Biopolitics
5. A Fragile Assemblage
6. Diagnosing Failure
Epilogue

Acknowledgments
Notes
Bibliography
Index