The Myth of Panic


The year is 1950. A dead body floats along the New Orleans waterfront. The coroner who examines the body realizes something terrifying: this nameless man died sick. The corpse is infected with pneumonic plague. The city authorities now have 48 hours to find and inoculate every person who came in contact with the man before his death, or New Orleans will become the epicenter of a terrible epidemic. At a crisis meeting of the city council, one councilor argues that the only way to save the city is to announce to the public what has happened and seek their cooperation. But the local public health officer—the hero of this story—begs the mayor not to go public with the news. The citizens of New Orleans must be kept in the dark. The press must be kept quiet. The title of the film reveals what he fears will occur if the public discovers the truth: Panic in the Streets.

The story beats charted out in the 1950 film Panic in the Streets have been repeated in every disaster film that has followed it. Experts discover a looming catastrophe of incredible proportions. They race to solve the problem as covertly as possible; to do otherwise would invite a panic more disastrous than the disaster itself. If they fail, audiences get to see images of an unnerved public up close. Society descends into a Hobbesian scramble for resources or open riot against the powers that be. The lesson is clear: the key to disaster response is ensuring the public does not feel fear. Normal citizens who understand the danger they are in will pose a threat to everyone else in calamity’s path. Panic is the true disaster. Disaster management is thus, at its core, a problem of narrative control.

This understanding of disaster is not limited to Hollywood blockbusters. Over the last year, we have seen the consequences of prioritizing panic prevention over disaster response in one country after another. The pattern was set early in Wuhan, China. There, provincial and municipal officials muzzled early warnings of a novel respiratory illness from doctors, virologists, and health officials. They feared what might happen if normal citizens became aware of the disease. “When we first discovered it could be transmitted between people, our hospital head, chairman, medical affairs department, they sat and made endless calls to the city government, the health commission,” wrote one Wuhan nurse in January of 2020. “[But] they said we still can’t wear protective clothing, because it might stir up panic.”

Similar concerns prompted China’s National Health Commission to issue a confidential notice forbidding labs that had sequenced the new virus from publishing their data without government authorization. Even as China’s top health official warned the Chinese health system to prepare for the “most severe challenge since SARS in 2003” and ordered the Chinese CDC to declare the highest emergency level possible, public-facing officials were still reporting that the likelihood of sustained transmission between humans was low.

The Chinese continually stalled WHO teams trying to gather information on the pandemic; it was not until the last week of January that Chinese health officials told the WHO the reason for their stonewalling. These officials conceded to the WHO team that they required help “communicating this to the public, without causing panic.” The WHO was sensitive to Beijing’s concerns and delayed its declaration of a global health emergency for several days. “You’ve got to remember this was a novel virus,” one member of a WHO delegation then tasked with the China response would say. “You don’t want to push the panic button until you’ve got reasonable confidence in your diagnosis.”

Unlike Chinese news sites, ordered to censor sensitive words in their reports to prevent coverage of the new disease from fomenting “societal panic,” American newspapers did not operate under an official censorship regime. But they too were afraid to “push the panic button.” With headlines like “Should You Panic About the Coronavirus? Experts Say No” (The LA Times), “The Flu is a Bigger Threat” (NPR), “The Cognitive Bias That Makes Us Panic About the Coronavirus” (Bloomberg), and “The Pandemic Risks Bringing out the Worst In Humanity” (CNN), American magazines and newspapers led the charge to downplay the seriousness of the outbreak and delegitimize fear of it. In a piece from The New York Times titled “Beware the Pandemic Panic,” Farhad Manjoo described the reasoning behind this push: “What worries me more than the new disease is that fear of a vague and terrifying new illness might spiral into panic.”

This attitude was widely shared by the public servants responsible for preparing America for the pandemic to come. Most famously, Donald Trump was aware that the coronavirus was more dangerous than the flu, but refused to raise the alarm because, as he told journalist Bob Woodward, “I don’t want people to be frightened, I don’t want to create panic, as you say, and certainly I’m not going to drive this country or the world into a frenzy.”

But Trump was hardly the only politician to take this stance. As New York City became the center of the American pandemic, the city’s health commissioner successfully argued against lockdowns on the grounds that if New Yorkers became “fearful due to messaging, we could have more permanent harm than we currently have with Covid-19.” Around the same time, California public health officials argued against wearing masks for fear they might “add to a climate of alarm.” The same argument would reappear a month later when White House officials worried that a mask mandate “might cause panic.”

Perhaps they remembered attacks levied at them by Chicago mayor Lori Lightfoot a few weeks earlier: “I will candidly tell you that I was very disappointed with the comments of the CDC yesterday and members of the Trump administration around coronavirus,” she remarked after the CDC announced that Americans should prepare for the worst. “I want to make sure that people understand they should continue to go about their normal lives… we don’t want to get ahead of ourselves and suggest to the public that there’s a reason for them to be fearful.”

Why this fear of panic? What would have been wrong with allowing the public to feel afraid? Contrary to Lightfoot’s reassurances, there was a reason for the citizens of Chicago—and the rest of us—to “be fearful.” Yet leaders on both sides of the Pacific, at both the local and national levels, among both the politicians and the opinion-makers, were determined to keep their people as far away from fear as possible.

Events proved the anxieties of these elites unfounded: when cities in China, Europe, and finally the United States descended into lockdown, there was no mass panic. There was fear, yes, plenty of it—but that fear did not lead to irrational, hysterical, or violent group behavior. Our fear did not lead to looting, pogroms, or unrest. The fearful of Wuhan did not rise up in rebellion against the Communist Party; even when Italian doctors began rationing medical equipment and supplies, the fearful of Milan did not loot stores or disrupt the medical system; the fearful of New York did not duel each other to the death over toilet paper rolls.

The social “panic” that disturbed mayors, presidents, columnists, and Communists never materialized. It never does. Time and resources that could have been devoted to combating a very real pandemic were wasted combating an imaginary social phenomenon. In 2020, we all learned the perils of the myth of panic.

The Fear of the Crowd

Acute fear of the mass of common men by the elites who govern them is not novel. To one extent or another, such distrust exists in any society divided by lines of class and caste. But as a great sea of humanity poured into the new industrial centers of Europe and America in the late 19th century, a distinctly dark vision of mass behavior took root. No work captured this vision more completely than Gustave Le Bon’s 1895 work The Crowd: A Study of the Popular Mind. By the time Le Bon published The Crowd, he was already regarded as a pioneering figure in social science, famous for his anthropological accounts of India and Arabia. Violent industrial strikes and the bloody course of the Paris Commune turned the focus of his studies back towards his own people. In the concept of “the crowd,” Le Bon found an explanation for the troubles of his country. This explanation was enormously influential with Western intellectuals in the early 20th century. It took root just as disaster response began to be studied as a discrete field—and it still stalks us today.

For Le Bon, a crowd was any group of individuals overtaken by a common purpose or emotion. This “does not always involve the simultaneous presence of a number of individuals on one spot,” he wrote, for “thousands of isolated individuals may acquire at certain moments, and under the influence of certain violent emotions—such, for example, as a great national event—the characteristics of a psychological crowd.” The chief psychological characteristic of a man caught up in a crowd was the loss of rational, independent judgment. Members of a crowd “feel, think, and act in a manner quite different from that in which each individual of them would feel, think, and act, were he in a state of isolation,” for the “critical spirit” and “conscious personality” of an individual will be lost once he is “sublimated” into a crowd’s collective consciousness.

In this irrational and impulse-driven crowd-state, humans are vulnerable to what Le Bon labeled “contagion in the brain.” Infectious emotions like rage, fear, or a spirit of heroic self-sacrifice would race through the collective like a disease, leaving crowds of men “at the mercy of all external exciting causes…the slave of every impulse which they receive.” Merely by joining a crowd, Le Bon’s most famous passage argues:

[A] man descends several rungs in the ladder of civilization. Isolated, he may be a cultivated individual; in a crowd, he is a barbarian — that is, a creature acting by instinct. He possesses the spontaneity, the violence, the ferocity, and also the enthusiasm and heroism of primitive beings, whom he further tends to resemble by the facility with which he allows himself to be impressed by words and images — which would be entirely without action on each of the isolated individuals composing the crowd — and to be induced to commit acts contrary to his most obvious interests and his best-known habits. An individual in a crowd is a grain of sand amid other grains of sand, which the wind stirs up at will.

While Le Bon allowed that courage and heroism could be found in man en masse, he saw undirected mass behavior as a fundamentally destructive force. Civilized order requires “fixed rules, discipline… forethought for the future, an elevated degree of culture—all of these conditions that crowds have invariably shown themselves incapable of realizing.” Thus in a society where order and authority have been weakened, actions by the masses will invariably affect society the way a secondary bacterial infection “hastens the dissolution of enfeebled bodies.” This graphic metaphor captured Le Bon’s great fear: unleashed masses may not be the ailment that sends a social order tottering, but “it is always the masses that bring about its downfall.” For weakened authorities fearful of usurpation, it is not the liberated individual that poses a threat, but the collective consciousness of the crowd.

This pessimistic conclusion was taken seriously by one of the first groups to devote real attention to the problems of disaster management: military strategists responsible for defending civilian targets from the ravages of the air raid. In the aftermath of the First World War, air theorists understood that the potential of aerial campaigns had barely been explored; for guidance, they turned to Le Bon’s ideas. The early air war theorists saw the bombing run as a way around the terrible attrition campaigns of the Western front. In the next war, they predicted, military aircraft would wreak devastation upon civilian targets mere days or hours after hostilities began. These attacks would be enough to end the war altogether.

In the words of military historian Lawrence Freedman, Le Bon’s work provided a “quasi-scientific basis” for this new theory of destruction. The strategists, tutored on Le Bon’s grim view of the crowd, proposed that civilians unused to military discipline would surely meet the destruction of their cities with “panic on such a scale [that] their governments would have to abandon the war.” British strategist J.F.C. Fuller described their visions of future warfare vividly: “if a fleet of 500 aeroplanes” came to London, he wrote, it would “throw the whole city into a panic within half an hour of their arrival.” For several days, London would become “one vast raving Bedlam,” where “the hospitals will be stormed, traffic will cease, [and] the homeless will shriek for help.” As for the government, “it will be swept away by an avalanche of terror. Then will the enemy dictate his terms, which will be grasped at like a straw by a drowning man.” For elites, the disastrous element of this hypothetical panic would be the loss of political control over the masses.

This belief was soon entrenched in militaries across the world—at least those militaries that had no firsthand experience as a target of enemy bombing. Great Britain’s Marshal of the Royal Air Force channeled the new airpower consensus when he argued in 1937 that the Royal Air Force must reorient itself around air defense. There was no other defense against the potential “panic [caused] by indiscriminate attacks on London.” If Britain could not defeat enemy bombers before they released their payloads, he wrote, “we might possibly be defeated in a fortnight or less.” Similar beliefs drove the RAF’s offensive thinking. The purpose of the British bombing campaign of German cities and civilian targets, commanding officer Arthur Harris wrote in a 1943 memo to his subordinates at Bomber Command, was “the breakdown of morale at home and on the battlefront by fear.”

Washington defense planners made similar assumptions about the power of mass panic. As Rebecca Solnit chronicles in her history of disaster response, it was the threat of atomic attack, carried first by long-range bombers and later by intercontinental missiles, that forced Americans to contemplate the possibility of a homeland attack for the first time. The Truman-era civil defense initiative known as Project East River concluded that “the prevention and control of panics in the time of attack are important tasks of civil defense. For the possibility always occurs that where people panic under attack, more death and injury may occur from that cause than from the direct effects of military weapons.” Eisenhower’s chief of the Federal Civil Defense Administration made clear exactly what “military weapons” the threat of panic outclassed:

Like the A-bomb, panic is fissionable. It can produce a chain reaction more deeply destructive than any explosive known. If there is an ultimate weapon, it may well be mass panic—not the A-bomb. Mass panic—not the A-bomb—may well be the easiest way to win a battle, the cheapest way to win a war… Just as a single match can burn a dry forest, so a trivial incident can set off a monstrous disaster when the confusion and uneasiness of the population have reached tinder point.

The extraordinary claim that panicked Americans were more dangerous than an actual nuclear attack had powerful purchase in America’s Cold War culture. Soon it was influencing not only civil defense plans, but popular depictions of mass behavior and disaster planning for non-military crises. By the 1980s, both Hollywood directors and disaster management professionals were convinced that every city in America was only one catastrophe away from barbarity.

Unveiling the Myths of Fear

The first man to question the myth of panic was Charles Fritz. As an observer for the United States Army Air Corps during the Blitz, Fritz watched how the English responded to the bombing of London. He discovered that the air theorists drawing on Le Bon were wrong: in the face of death, the British public had not been reduced to hysteria. A few years later, when Fritz was pulled into the United States Strategic Bombing Survey—a commission created to assess the failures and successes of American bombing campaigns in Germany and Japan—he came to a similar conclusion. Fire-bombing may have had important economic effects on the enemy war machine, but it did not result in mass panic. The survey’s field research suggested the opposite: many bombed cities had higher morale than those spared attack.

Fritz continued this line of research once he returned to America. Ensconced as associate director of the University of Chicago’s Disaster Research Project, Fritz pored over accounts of one natural disaster and industrial accident after another. He concluded that disasters “result however temporarily in what may be regarded as a kind of social utopia.” “While the natural or human forces that precipitated the disaster appear hostile and punishing,” Fritz observed, “the people who survive become more friendly, sympathetic, and helpful than in normal times.” In disaster scenarios, shared fear and suffering create “an intimate group solidarity among the survivors.” When the world suddenly falls apart, people do not grow more selfish, violent, or irrational, but more altruistic, caring, and calm.

Modern disaster sociology was born out of Fritz’s research. Surveys conducted since then confirm his results. One literature review concludes—after describing the results of multiple surveys covering hundreds of building fires and floods, as well as dozens of hurricanes, tornadoes, airplane crashes, and terrorist attacks—that “systematic studies of human behavior in disasters have failed to support news accounts of widespread panic… Rather than panic or irrational behavior, what they found was that occupants became involved in protective activities such as warning others, calling the fire department, and rescuing or assisting others.” Victims and bystanders of disaster tend to rush towards disaster zones, not flee mindlessly away from them; the greatest challenge facing many disaster response teams is not terrorized crowds, but an overload of volunteers who arrive to aid the response.

If any aspect of an unfolding disaster is marked by panic, disaster sociologists Caron Chess and Lee Clarke observe, it is the behavior of elites. Catastrophe presents a leadership class with a terrible contradiction. On the one hand, the perception that leadership is not equal to the unfolding calamity erodes the legitimacy of any ruling class. Leaders understand that Heaven’s Mandate rests on their effective prevention of and response to crisis. On the other hand, the chaos inherent to disaster inevitably reduces leadership’s ability to control—or even stay aware of—the events by which they will be judged.

Further, the high morale and solidarity that citizens exhibit during a disaster dissolve the individualist outlook that elites have long learned to control and maintain. The seemingly positive and prosocial solidarity response of the population is itself a threat to the mechanisms of elite power in our society. Just as disasters empower normal citizens on the ground, who have no choice but to take fate into their own hands, they leave elites feeling distant and helpless. Chess and Clarke call this state of affairs “elite panic”: a fearful distrust of the populace that prompts leaders to restrict information, over-concentrate resources, and use coercive methods to reassert authority in the face of temporary breakdowns in public order. This style of response poses an active danger to disaster survivors and, ironically, creates the very resistance to authority that leaders fear most. Nowhere is this clearer than in the history of epidemic response.

On August 14th, 1918, Royal S. Copeland, New York City’s health commissioner, told The New York Times that there was not “the slightest danger of an epidemic of Spanish influenza in New York.” Copeland would further reassure the public that Americans, whose health put them in favorable contrast with the malnourished masses of Europe, need not worry about the disease. Copeland maintained this jolly tone over the next few months. On October 1st, well before the epidemic had reached its highest point in the city, Copeland announced that it “had been checked” and “warned the public against undue alarm.” The close to 2,000 new cases announced over the next two days had him once again reassuring the newspaper that the flu would in no way “swamp” the city’s medical system.

On October 4th, he downplayed the 999 new cases announced that morning with his own announcement that “there are no alarming symptoms about the spread of influenza in New York.” Notwithstanding his reassurances, by October 6th the rate of daily new cases hit a new high of 2,070 flu cases in the preceding 24 hours. Copeland again called for public calm: “Considering the population here,” he told the Times, “I do not consider that the city is stricken.” It would be another week before Copeland would finally admit that the city had reached an official state of emergency and request that an Emergency Advisory Committee be formed to coordinate a response to the virus.

In her history of the Spanish flu’s march through America, Nancy Bristow narrates this story to illustrate a broader pattern: over the course of the epidemic American newspapers, political leaders, and public health officials “did not exploit what might seem the most obvious message for motivating the public—the direct expression of the danger posed by influenza and the seriousness of the burgeoning epidemic.” Pamphlets and government releases described the disease in milquetoast language that did not invoke “the horrible scenes witnessed as the epidemic struck” or “the appalling symptoms that had frightened even the most experienced physicians.” Most officials were determined to mobilize the public to fight the disease—but not if that meant the possibility of panic.

Epidemics differ from other disasters in an important way: in an epidemic, strangers do not just share grief and trauma—they also share diseases. Whether the sunny verdicts of disaster sociology apply in a social context where strangers may be the vector for another’s death is a fair question. Disaster sociology literature reviews frankly admit that this is a question the field does not yet have the data to answer. For that answer, we must turn to the field of medical history.

Samuel Cohn’s 2018 work Epidemics: Hate and Compassion from the Plague of Athens to AIDS is the most comprehensive attempt to address the problem. Cohn surveys hundreds of epidemics over 2,000 years of Western history, using digital tools and search technology to scour hundreds of annals and thousands of newspapers for every reference to plagues and pox that can be found in the Western historical record. Cohn finds that with the sole exception of the Black Death—and there only in Germany and the Low Countries—no premodern epidemic spawned violence, persecution, or chaos. Instead, historical records speak of the normal compassion, solidarity, and self-sacrifice the sociologists find in most post-disaster communities. Only in the 19th century do epidemics become associated with civil disorder—and then only with three specific diseases: the bubonic plague, cholera, and smallpox.

More important than the diseases themselves may have been the treatments. Public health was a 19th-century innovation; a scientific understanding of how pathogens spread was a 19th-century discovery. These parallel developments transformed epidemic response. If past centuries endured epidemics as acts of God, 19th-century elites decided to use ambulances and centralized hospitals, segregation camps, district and neighborhood quarantines, inspection teams, and inoculation campaigns to contain and curtail the threats posed by disease. Violence arose in the communities contained and curtailed by these new public health tools.

The 63 plague riots that rocked British India between 1896 and 1902 are a case in point. The British officials in charge of plague control did not speak any native languages, had little respect for Hindu religious beliefs, and had little knowledge of local burial customs. In their fight against the plague, male doctors would burst into local houses for inspections regardless of whether the family’s menfolk were home. Sanitation campaigns would result in the wanton destruction of Indian property; those found sick with the plague would be whisked off to segregation camps on the spot. In plague-struck areas, the disruption to daily life caused by inspections and seizures was as unnerving as it was unending. Nothing the British did, one Pune newspaper complained, “interfered so largely and in such a systematic way with the domestic, social and religious habits of the people as the enforcement of the measures adopted for stamping out the plague.” Soon, Indians were assassinating public health commissioners, attacking hospitals, and mobbing doctors.

This disorder mirrored the enormous cholera riots that shut down Russian, English, Italian, and American cities earlier in the century, as well as smaller incidents of smallpox-related disorder, such as the month-long Milwaukee smallpox riot of 1894. But this violence was not a product of mass panic or an expression of irrational terror. The 19th-century riots were a strategic expression of popular anger: rioters used violence to prevent the forced separation of the sick from their loved ones and the imposition of medical practices the population did not understand or trust, and to dissuade state officials from violating their privacy, property, or religious norms. It was the disease response, not the disease itself, that led to attacks on the social order.

The Uses of Fear

Historian Judith Leavitt finds a lesson in the different outcomes of Milwaukee’s riot-spawning smallpox outbreak and New York City’s orderly response to a smallpox outbreak fifty years later. New York contained the disease quickly, vaccinating six million New Yorkers in four weeks. Leavitt credits this success to the transparency of the city’s public health officials, who kept the entire system informed of every new case as soon as it was identified. “Leaders themselves must respect the public’s need to know,” Leavitt concludes, “and provide the kinds of knowledge people need. The more they try to hide, the more suspect they will become.”

A commentary published on the WeChat account of the Supreme People’s Court of the People’s Republic of China came to a similar conclusion. Reprimanding Wuhan police for silencing doctors for “spreading rumors” in December 2019 and January 2020, the account notes: “It might have been a better way to prevent the novel coronavirus if the public had believed the ‘rumor’ then and started to wear masks and carry out sanitary measures.” Here was a belated recognition that lives were lost because of undue worry over panic.

Post-hoc admissions like these have a hollow ring. We need leaders who recognize that mass panic is largely a myth before disaster strikes, not after they have mismanaged one. This is a hard thing to ask of a governing class. One reason this myth has persisted despite decades of evidence to the contrary is that narratives of panic are a useful crutch for leaders under pressure. By projecting their own insecurities onto the masses they lead, elites find a ready scapegoat for their own failings. A leader who does not measure up to the demands of disaster will find it easier to blame the crowd for panic than to accept the crowd’s harsh judgment of his own performance.

This principle applies just as well to regimes as to the individuals who comprise them. The wisdom of the Chinese sages who identified the loss of Heaven’s Mandate with natural calamity echoes this sentiment. They understood that crises sift the worthy from the base. It is natural for a leader to fear such a test. It is dangerous for him to deny that such a test is taking place. This is the temptation of disaster: to retreat from disaster management into perception management, and to worry more about the fear the people feel than about the dangers the people face. The source of temptation is not hard to spot: it is easy for a governing class to take credit for preventing panic when panic so rarely takes place. It is easier still to define success in terms of emotional states, for those are intangible and unmeasurable. Lives lost and dollars spent are different matters. Few men are eager to gamble their personal authority or the legitimacy of their regime on something so easily assessed.

Yet the test cannot be wished away. As the world discovered in 2020, perception management only goes so far. If disaster blooms into catastrophe, governing elites focused on panic prevention find their disaster response suddenly dependent on the very public they are frightened of. What is needed then is honesty and leadership: leaders have no choice but to rely on, assist, and lead the uncoordinated, self-directed efforts of the masses. What these efforts need to succeed is information and good strategy—and the trust that the people will use this information well.

The people must be trusted with fear, and the governing class must be comfortable leading in times of crisis. Fear is an unpleasant emotion—but at times, a useful one. Fear lends urgency to action. Fear forces the afraid to focus on what matters. This is the great lesson of the 2020 coronavirus pandemic: we should have been allowed to fear. Alas, our leaders feared our fear more than they feared our deaths. The world bears the consequences of this stark faith in the myth of panic.

Tanner Greer is the director of the Center for Strategic Translation and a correspondent for Palladium Magazine. His writing focuses on contemporary security issues in the Asia-Pacific and the military history of East and Southeast Asia.