
What Was Healthcare Like in the 1800s?


Nineteenth-century British depiction of a cholera outbreak. Credit: University of St. Andrews.

Because it appeared in the midst of the ongoing debate over the workings of the Affordable Care Act, “Obamacare,” my latest book, Lotions, Potions, Pills, and Magic: Health Care in Early America, received some surprising attention from the public. That curiosity has focused on connecting early health care with current issues. As all medical historians recognize, both the treatments and the delivery of health care in that era stand in stark contrast to contemporary methods. But there are a few carry-overs in the ideas and behaviors of both medical personnel and the public toward health care, and that continuity has some lessons to teach about the pitfalls of the American system.

Before modern medicine, the understanding of disease and other bodily afflictions rested on ideas that were at least 2,000 years old and lacked any scientific basis. All people in the Western world, and not just medical personnel, assumed that disease was caused by an imbalance or disturbance within the body. A miasma (a foul odor in the air), an evil spirit, a contagious disease, or any number of outside influences could bring on that imbalance. The cure lay in restoring the balance of the body's elements, called humors, by driving the offending substance out through some bodily orifice (the mouth, nose, rectum, or the skin) with various drugs, or by letting blood. Such were the major therapies that had prevailed in both orthodox medicine and folk practice for centuries. There was no understanding of germs invading the body, of mosquito vectors, or of fleas causing disease.

During the colonial era, most American doctors were trained in Europe or had been apprenticed to those who had. They followed procedures that were universally accepted and fairly moderate. Letting nature heal and ameliorating symptoms had become hallmarks of the best trained. Most were educated men of elite status who could convey a sense of authority and competence because of their social class. Nonetheless, most people did not consult doctors, who charged high fees, relying instead on home remedies, midwives, local folk healers, or, in the case of African Americans, the obeah or conjurer. Such healers charged less and offered remedies that mimicked the orthodox. But all admired and respected the physician in the years before the Revolution.

After independence the character of the medical profession changed. Physicians lost their special social status. Few went to Europe to study, and thus they were cut off from advances on the other side of the Atlantic. Fewer still came from the educated population. Standards of medical education in this country declined dramatically. Minimally trained doctors opened their own medical schools as moneymaking ventures, encouraged by a growing commercial and acquisitive social climate. To entice students they eliminated most of the traditional academic requirements. They seldom offered laboratory experience or taught anatomy, and some did not even require literacy for admission. To compete, even the colleges with medical schools reduced their requirements.

The diploma mills were encouraged by a public that abhorred government regulation or any interference with the right of the common man to do as he wished. There were no licensing requirements for medical personnel and no professional oversight. In the face of declining respectability, physicians anxious to reestablish their credentials began to use more extreme depletion methods. Their model was Benjamin Rush, who, as a leading physician at the turn of the century, had championed ever more aggressive bleeding and purging. The poorly trained could point to the dramatic effects of such therapies as a form of success.

But not all people accepted this “heroic” medicine. The result was a proliferation of competing health initiatives: the growth of medical sects such as homeopathy, hydropathy, and new botanical systems such as Thomsonianism, as well as fitness gurus such as Sylvester Graham and John Harvey Kellogg. The sugar-coated pills advertised by a variety of entrepreneurs also competed freely; their makers had only to patent the shape of the bottles, and there was no control over the ingredients. The medical scene in the nineteenth century was a chaotic free-for-all.

As American doctors moved to prove themselves through their heroic therapies, European doctors were moving in the opposite direction by drawing on scientific methods. Laboratory studies had begun to isolate the key ingredients of herbal remedies, extracting quinine from cinchona bark, one of the very few genuinely curative remedies available, in that case for malaria. In France doctors were using autopsies to evaluate particular therapies while investigating mortality rates for those same procedures. They concluded that the time-honored therapies did not work and could cause harm. The European studies put science to use in evaluating traditional practice and found it wanting. Thus Europeans drastically moderated their interventions in the face of disease.

Americans rejected both the science and the idea of moderation. Even the most forward-looking physician in America, Oliver Wendell Holmes, Sr. (the father of the future Supreme Court justice), a proponent of clean hands, ridiculed the idea that science could have any practical value for the medical profession. In the absence of verifiable cures, doctors who wanted to follow the European trend of letting nature heal were accused of a “therapeutic nihilism” that could destroy the profession. Americans, the orthodox argued, were superior and did not have to follow the practices of their weaker European forebears.

When the idea of germs causing disease was first introduced in Europe in the second half of the nineteenth century, especially with the work of Pasteur and Koch, American doctors vigorously denied such a notion. Science did not apply to American medicine. Americans, they insisted, were an exceptional people.

The parochial, anti-scientific, and highly commercial atmosphere that prevailed in the nineteenth century was a major factor in retarding American medicine and in the decline of the profession. As the social status of doctors deteriorated, so too did their political clout. They were not wanted on local Boards of Health or as city inspectors. Nor did the few who were aware of public health concerns have any power to change American attitudes toward poverty and disease. The public blamed the poor themselves for their ailments rather than seeing disease as the result of poverty and poor living conditions. Political leaders believed that low morals predisposed people to bad health; thus the poor were responsible for their own sicknesses. In such a climate, little could be done to help the denizens of urban slums that lacked clean water or any means of disposing of waste. The rich refused to spend money that might alleviate those awful living conditions and halt the spread of contagious diseases.

The few attempts to enforce educational standards or license medical personnel failed in the face of general public opposition to laws that might restrict individual freedom. The forces limiting government power over health, the proponents of American exceptionalism, and the rejection of the needs of the poor won the day in nineteenth-century American medicine. Will they do so again?