The U.S. Approach to Public Health: Neglect, Panic, Repeat
A once-in-a-century public health crisis is unfolding, and the richest country in the world is struggling to mount an effective response. Hospitals don’t have enough gowns or masks to protect doctors and nurses, nor enough intensive care beds to treat the surge of patients. Laboratories don’t have the equipment to diagnose cases quickly or in bulk, and state and local health departments across the country don’t have the manpower to track the disease’s spread. Perhaps worst of all, urgent messages about the importance of social distancing and the need for temporary shutdowns have been muddied by politics.
Nearly all of these problems might have been averted by a strong, national public health system, but in America, no such system exists.
It’s a state of affairs that belies the country’s long public health tradition. Before the turn of the previous century, when yellow fever, tuberculosis and other plagues ravaged the country’s largest cities at regular intervals, public health was generally accepted as a key component of the social contract. Even before scientists identified the microbes that cause such diseases, governments and individuals understood that a combination of leadership, planning and cooperation was needed to keep them at bay. Some of the nation’s oldest public health departments — in Boston, New York and Baltimore — were built on that premise.
By pushing infectious disease outbreaks to the margins, those health departments helped usher in what scientists refer to as the epidemiological transition: the remarkable decline in preventable deaths among children and working-age adults. That decline continued in the second half of the 20th century, as new federal laws ensured the protection of food, air and water from contamination, and national campaigns brought the scourges of nicotine addiction and sexually transmitted infections under control.